

Title:
CONTROL OF ONE OR MORE DEVICES IN A VEHICLE
Document Type and Number:
WIPO Patent Application WO/2023/003877
Kind Code:
A1
Abstract:
In various embodiments, methods, systems, software, and apparatuses for controlling devices in vehicles are provided, e.g., depending on gestures of vehicle occupants, a present environment of a vehicle, and/or a predicted future environment of the vehicle.

Inventors:
GORMLY NIGEL CHARLES (US)
MAKKER TANYA (US)
TRIKHA NITESH (US)
MULPURI RAO P (US)
MENDENHALL MARK DAVID (US)
SHRIVASTAVA DHAIRYA (US)
BROWN STEPHEN CLARK (US)
GUPTA ANURAG (US)
MALIK AJAY (US)
SUI SIYAO (US)
WANG CHUQING (US)
Application Number:
PCT/US2022/037593
Publication Date:
January 26, 2023
Filing Date:
July 19, 2022
Assignee:
VIEW INC (US)
International Classes:
B60R21/015; B60H1/00; B60J3/04; B60K35/00; B60R16/037; B60R21/01; G06N20/00
Foreign References:
US20190265868A12019-08-29
US20200150602A12020-05-14
US20180106098A12018-04-19
US20140324299A12014-10-30
EP3521166A12019-08-07
Attorney, Agent or Firm:
SRINIVASAN, Arthi G. et al. (US)
Claims:
CLAIMS

What is claimed is:

1. An apparatus for controlling at least one device in a vehicle, the apparatus comprising at least one controller that is configured to: operatively couple to a device ensemble disposed in or on the vehicle, receive, or direct receipt of, data from the device ensemble, wherein the device ensemble comprises (A) a plurality of sensors or (B) a sensor and an emitter; and alter, or direct alteration of, at least one of: (i) a present state of the at least one device in the vehicle based at least in part on the data from the device ensemble; and/or (ii) a present environment of the vehicle based at least in part on the data from the device ensemble, to constitute an alteration.

2. The apparatus of claim 1, wherein the plurality of sensors comprises two or more of: a geolocation sensor, an occupancy sensor, a motion sensor, a temperature sensor, a photosensor, an airflow sensor, an oxygen sensor, and a carbon dioxide sensor.

3. The apparatus of claim 1, wherein the device ensemble comprises a transceiver, a processor, a memory device, a network adaptor, or a controller.

4. The apparatus of claim 1, wherein at least one component of the device ensemble is reversibly removable, wherein reversibly removable includes an ability to remove the at least one component from, and insert it into, the device ensemble in a reversible manner.

5. The apparatus of claim 1, wherein the at least one device comprises (i) a tintable window, (ii) a heating ventilation and air conditioning (HVAC) component, (iii) a media display, or (iv) a safety component, of the vehicle.

6. The apparatus of claim 1, wherein the at least one device comprises two or more devices of the vehicle, and wherein altering, or directing alteration of, the present state of the at least one device comprises altering the present states of each of the two or more devices of the vehicle differently.

7. The apparatus of claim 6, wherein the two or more devices of the vehicle are on opposite sides of the vehicle.

8. The apparatus of claim 1, wherein the alteration is based at least in part on an output of a machine learning model.

9. The apparatus of claim 1, wherein the alteration is based at least in part on a requested state of an occupant of the vehicle.

10. The apparatus of claim 9, wherein the requested state is determined based at least in part on an output of a machine learning model.

11. The apparatus of claim 9, wherein the requested state is determined based at least in part on historical data associated with the occupant of the vehicle.

12. The apparatus of claim 1, wherein the at least one device comprises a tintable window, and wherein a tint state of the tintable window is based at least in part on data received from the device ensemble indicating that the vehicle is unoccupied.

13. The apparatus of claim 1, wherein the at least one device comprises a tintable window, and wherein a tint state of the tintable window is based at least in part on a projected route of the vehicle.

14. A method for controlling at least one device in a vehicle, the method comprising: receiving data from a device ensemble, wherein the device ensemble comprises (A) a plurality of sensors or (B) a sensor and an emitter; and altering at least one of: (i) a present state of the at least one device in the vehicle based at least in part on the data from the device ensemble; and/or (ii) a present environment of the vehicle based at least in part on the data from the device ensemble, to constitute an alteration.

15. A non-transitory computer-readable program instructions for controlling at least one device in a vehicle, the non-transitory computer-readable program instructions, when read by one or more processors, cause the one or more processors to execute operations comprising: receiving, or directing receipt of, data from a device ensemble, wherein the device ensemble comprises (A) a plurality of sensors or (B) a sensor and an emitter; and altering, or directing alteration of, at least one of: (i) a present state of the at least one device in the vehicle based at least in part on the data from the device ensemble; and/or (ii) a present environment of the vehicle based at least in part on the data from the device ensemble, to constitute an alteration.

16. An apparatus for media viewing in a vehicle, the apparatus comprising at least one controller that is configured to:

(a) operatively couple to a tintable window of the vehicle and to a media display; and

(b) control, or direct control of, (i) an optical state of the tintable window; and/or (ii) presentation of media content on the media display of the vehicle.

17. The apparatus of claim 16, wherein the media display is an organic light emitting diode (OLED) device.

18. The apparatus of claim 16, wherein the at least one controller is configured to control, or direct control of, a transparency of the media display and/or media projected by the media display.

19. The apparatus of claim 16, wherein the at least one controller is configured to control, or direct control of, the optical state of the tintable window at least in part by increasing a contrast between the tintable window and the media display.

20. The apparatus of claim 19, wherein the at least one controller is configured to increase the contrast between the tintable window and the media display synergistically with the presentation of the media content on the media display.

21. The apparatus of claim 19, wherein the at least one controller is configured to increase the contrast between the tintable window and the media display in response to controlling, or directing control of, the presentation of the media content on the media display.

22. The apparatus of claim 16, wherein the media content comprises video content from a video, a television show, a movie, a video conference, and/or an advertisement.

23. The apparatus of claim 16, wherein the media content comprises image content and/or text content from one or more documents.

24. The apparatus of claim 16, wherein the media content is streamed from a remote server.

25. The apparatus of claim 16, wherein the media display comprises one or more speaker devices, and wherein the at least one controller is configured to control, or direct control of, presentation of audio content by the one or more speaker devices.

26. The apparatus of claim 16, wherein the media display is connected to the tintable window, or is incorporated in an integrated unit forming the tintable window.

27. A method for media viewing in a vehicle, the method comprising: controlling (i) an optical state of a tintable window of the vehicle; and/or (ii) presentation of media content on a media display of the vehicle.

28. A non-transitory computer readable program instructions for media viewing in a vehicle, the non-transitory computer readable program instructions, when read by one or more processors, cause the one or more processors to execute operations comprising: controlling, or directing control of, (i) an optical state of a tintable window of the vehicle; and/or (ii) presentation of media content on a media display of the vehicle.

29. An apparatus for controlling at least one device in a vehicle, the apparatus comprising at least one controller that is configured to: identify, or direct identification of, at least one gesture by an occupant of the vehicle; identify, or direct identification of, a state of the at least one device in the vehicle based at least in part on the gesture identified, wherein the at least one device comprises a tintable window, a media display associated with the tintable window, or an environmental control device; and cause the at least one device to transition to the state identified.

30. The apparatus of claim 29, wherein the at least one controller is part of a hierarchical control system.

31. The apparatus of claim 29, wherein at least one controller of the hierarchical control system is remote from the vehicle.

32. The apparatus of claim 29, wherein the at least one device comprises the tintable window, and wherein the state identified comprises a tint state of the tintable window.

33. The apparatus of claim 29, wherein the at least one device comprises the media display, and wherein the state identified comprises initiating presentation of a media content item by the media display.

34. The apparatus of claim 29, wherein the at least one device comprises the media display, and wherein the state identified comprises stopping presentation of a media content item by the media display.

35. The apparatus of claim 29, wherein the at least one controller is configured to identify, or direct identification of, the at least one gesture by the occupant of the vehicle at least in part by identifying, or directing identification of, a gesture pattern of the occupant of the vehicle.

36. The apparatus of claim 35, wherein the gesture pattern comprises (i) a facial expression pattern, (ii) a voice pattern, or (iii) a motion pattern.

37. The apparatus of claim 29, wherein the at least one controller is configured to receive, or direct receipt of, data from a device ensemble disposed in or on the vehicle, wherein the device ensemble comprises (A) a plurality of sensors or (B) a sensor and an emitter.

38. The apparatus of claim 37, wherein the plurality of sensors comprises two or more of: a geolocation sensor, an occupancy sensor, a motion sensor, a temperature sensor, a visible light photosensor, an airflow sensor, an oxygen sensor, a sound sensor, a pressure sensor, a volatile organic compound (VOC) sensor, a particulate matter sensor, a humidity sensor, and a carbon dioxide sensor.

39. The apparatus of claim 29, wherein the environmental control device comprises: a heating ventilation and air conditioning (HVAC) system, lighting, a tintable window, an olfactory compound dispenser, a humidity adjuster, or a music player.

40. The apparatus of claim 39, wherein the lighting comprises a media display.

41. A method for controlling at least one device in a vehicle, the method comprising: identifying at least one gesture by an occupant of the vehicle; identifying a state of the at least one device in the vehicle based at least in part on the gesture identified, wherein the at least one device comprises a tintable window, a media display associated with the tintable window, or an environmental control device; and causing the at least one device to transition to the state identified.

42. A non-transitory computer-readable program instructions for controlling at least one device in a vehicle, the non-transitory computer-readable program instructions, when read by one or more processors, cause the one or more processors to execute operations comprising: identifying, or directing identification of, at least one gesture by an occupant of the vehicle; identifying, or directing identification of, a state of the at least one device in the vehicle based at least in part on the gesture identified, wherein the at least one device comprises a tintable window, a media display associated with the tintable window, or an environmental control device; and causing, or directing causation of, the at least one device to transition to the state identified.

43. An apparatus for media viewing in a vehicle, the apparatus comprising at least one controller that is configured to: operatively couple to a media display of the vehicle and/or to a tintable window of the vehicle; obtain, or direct obtainment of, information associated with one or more occupants of the vehicle; identify, or direct identification of: (i) media content to be presented by the media display, and/or (ii) a tintable window setting for the tintable window based at least in part on the information obtained; and cause, or direct causing of, (i) the media display to present the media content identified and/or (ii) the tintable window setting to be applied to the tintable window.

44. The apparatus of claim 43, wherein the information associated with the one or more occupants of the vehicle comprises a stored and/or a historical preference.

45. The apparatus of claim 44, wherein the at least one controller is configured to (i) use, or direct usage of, the stored and/or historical preferences as a learning set for a machine learning scheme utilized to predict preference of the user at a future time and/or (ii) operatively couple to a processor performing the machine learning scheme.

46. The apparatus of claim 43, wherein the information associated with the one or more occupants of the vehicle comprises at least one user input.

47. The apparatus of claim 46, wherein the at least one user input comprises a gesture by at least one occupant of the one or more occupants of the vehicle.

48. The apparatus of claim 46, wherein the at least one user input comprises input received via a graphical user interface.

49. The apparatus of claim 48, wherein the graphical user interface is presented via the media display.

50. The apparatus of claim 43, wherein the media content identified comprises: a television show, a movie, an advertisement, a video call, or safety information.

51. The apparatus of claim 43, wherein the media content is identified based at least in part on a third party media content application.

52. The apparatus of claim 43, wherein the at least one controller is configured to determine, or direct determination of, whether the vehicle is in motion, wherein causing, or directing causing of, the media display to present the media content identified is based at least in part (i) on a determination of whether the vehicle is in motion and/or (ii) on location of the media display in the vehicle.

53. The apparatus of claim 52, wherein the at least one controller is configured to inhibit, or direct inhibition of, presentation of media content in response to a determination (i) that the vehicle is in motion and/or (ii) of location of the media display in the vehicle.

54. A method for media viewing in a vehicle, the method comprising: obtaining information associated with one or more occupants of the vehicle; identifying (i) media content to be presented by a media display of the vehicle, and/or (ii) a tintable window setting for a tintable window of the vehicle based at least in part on the information obtained; and causing (i) the media display to present the media content identified and/or (ii) the tintable window setting to be applied to the tintable window.

55. A non-transitory computer-readable program instructions for media viewing in a vehicle, the non-transitory computer-readable program instructions, when read by one or more processors, cause the one or more processors to execute operations comprising: obtaining, or directing obtainment of, information associated with one or more occupants of the vehicle; identifying, or directing identification of, (i) media content to be presented by the media display, and/or (ii) a tintable window setting for the tintable window based at least in part on the information obtained; and causing, or directing causation of, (i) the media display to present the media content identified and/or (ii) the tintable window setting to be applied to the tintable window.

Description:
CONTROL OF ONE OR MORE DEVICES IN A VEHICLE

CROSS-REFERENCE TO RELATED APPLICATIONS

[0001] This application claims priority to U.S. Provisional Application No. 63/224,377, filed on July 21, 2021, which is hereby incorporated by reference herein in its entirety for all purposes. This application relates as a Continuation-in-Part to International Patent Application Serial No. PCT/US21/27418, filed April 15, 2021, titled “INTERACTION BETWEEN AN ENCLOSURE AND ONE OR MORE OCCUPANTS,” which claims priority from U.S. Provisional Patent Application Serial No. 63/080,899, filed September 21, 2020, titled “INTERACTION BETWEEN AN ENCLOSURE AND ONE OR MORE OCCUPANTS,” from U.S. Provisional Application Serial No. 63/052,639, filed July 16, 2020, titled “INDIRECT INTERACTIVE INTERACTION WITH A TARGET IN AN ENCLOSURE,” and from U.S. Provisional Application Serial No. 63/010,977, filed April 16, 2020, titled “INDIRECT INTERACTION WITH A TARGET IN AN ENCLOSURE.” This application is also a Continuation-in-Part of U.S. Patent Application Serial No. 17/249,148, filed February 22, 2021, titled “CONTROLLING OPTICALLY-SWITCHABLE DEVICES,” which is a Continuation of U.S. Patent Application Serial No. 16/096,557, filed October 25, 2018, titled “CONTROLLING OPTICALLY-SWITCHABLE DEVICES,” which is a National Stage Entry of International Patent Application Serial No. PCT/US 17/29476, filed April 25, 2017, titled “CONTROLLING OPTICALLY-SWITCHABLE DEVICES,” which claims priority from U.S. Provisional Application Serial No. 62/327,880, filed April 26, 2016, titled “CONTROLLING OPTICALLY-SWITCHABLE DEVICES,” which is a Continuation-in-Part of U.S. Patent Application Serial No. 14/391,122, filed October 7, 2014, now U.S. Patent No. 10,365,531, issued July 30, 2019, titled “APPLICATIONS FOR CONTROLLING OPTICALLY SWITCHABLE DEVICES,” which is a National Stage Entry of International Patent Application Serial No. PCT/US 13/36456, filed April 12, 2013, titled “APPLICATIONS FOR CONTROLLING OPTICALLY SWITCHABLE DEVICES,” which claims priority from U.S. Provisional Application Serial No. 61/624,175, filed April 13, 2012, titled “APPLICATIONS FOR CONTROLLING OPTICALLY SWITCHABLE DEVICES.” This application is also a Continuation-in-Part of U.S. Patent Application Serial No. 16/946,947, filed July 13, 2020, titled “AUTOMATED COMMISSIONING OF CONTROLLERS IN A WINDOW NETWORK,” which is a Continuation of U.S. Patent Application Serial No. 16/462,916, filed May 21, 2019, titled “AUTOMATED COMMISSIONING OF CONTROLLERS IN A WINDOW NETWORK,” which is a Continuation of U.S. Patent Application Serial No. 16/082,793, filed September 6, 2018, now U.S. Patent No. 10,935,864, issued March 1, 2021, titled “METHOD OF COMMISSIONING ELECTROCHROMIC WINDOWS.” U.S. Patent Application Serial No. 16/462,916, filed May 21, 2019, titled “AUTOMATED COMMISSIONING OF CONTROLLERS IN A WINDOW NETWORK,” is also a National Stage Entry of International Patent Application Serial No. PCT/US17/62634, filed November 20, 2017, titled “AUTOMATED COMMISSIONING OF CONTROLLERS IN A WINDOW NETWORK,” which claims priority from U.S. Provisional Patent Application Serial No. 62/551,649, filed August 29, 2017, titled “AUTOMATED COMMISSIONING OF CONTROLLERS IN A WINDOW NETWORK,” and from U.S. Provisional Patent Application Serial No. 62/426,126, filed November 23, 2016, titled “AUTOMATED COMMISSIONING OF CONTROLLERS IN A WINDOW NETWORK.” This application is also a Continuation-in-Part of U.S. Patent Application Serial No. 16/950,774, filed November 17, 2020, titled “DISPLAYS FOR TINTABLE WINDOWS,” which is a Continuation of U.S. Patent Application Serial No. 
16/608,157, filed October 24, 2019, titled “DISPLAYS FOR TINTABLE WINDOWS,” which is a National Stage Entry of International Patent Application Serial No. PCT/US 18/29476, filed April 25, 2018, titled “DISPLAYS FOR TINTABLE WINDOWS,” which claims priority to (i) U.S. Provisional Patent Application Serial No. 62/607,618, filed December 19, 2017, titled “ELECTROCHROMIC WINDOWS WITH TRANSPARENT DISPLAY TECHNOLOGY FIELD,” (ii) U.S. Provisional Patent Application Serial No. 62/523,606, filed June 22, 2017, titled “ELECTROCHROMIC WINDOWS WITH TRANSPARENT DISPLAY TECHNOLOGY,” (iii) U.S. Provisional Patent Application Serial No. 62/507,704, filed May 17, 2017, titled “ELECTROCHROMIC WINDOWS WITH TRANSPARENT DISPLAY TECHNOLOGY,” (iv) U.S. Provisional Patent Application Serial No. 62/506,514, filed May 15, 2017, titled “ELECTROCHROMIC WINDOWS WITH TRANSPARENT DISPLAY TECHNOLOGY,” and (v) U.S. Provisional Patent Application Serial No. 62/490,457, filed April 26, 2017, titled “ELECTROCHROMIC WINDOWS WITH TRANSPARENT DISPLAY TECHNOLOGY.” This application is also a Continuation-In-Part of U.S. Patent Application Serial No. 17/083,128, filed October 28, 2020, titled “BUILDING NETWORK,” which is a Continuation of U.S. Patent Application Serial No. 16/664,089, filed October 25, 2019, titled “BUILDING NETWORK,” that is a National Stage Entry of International Patent Application Serial No.
PCT/US19/30467, filed May, 2, 2019, titled “EDGE NETWORK FOR BUILDING SERVICES,” which claims priority to U.S. Provisional Patent Application Serial No. 62/666,033, filed May 02, 2018, U.S. Patent Application Serial No. 17/083,128, is also a Continuation-In-Part of International Patent Application Serial No. PCT/US18/29460, filed April 25, 2018, that claims priority to U.S. Provisional Patent Application Serial No. 62/607,618, to U.S. Provisional Patent Application Serial No. 62/523,606, to U.S. Provisional Patent Application Serial No. 62/507,704, to U.S. Provisional Patent Application Serial No. 62/506,514, and to U.S. Provisional Patent Application Serial No. 62/490,457. This application is also a Continuation-In-Part of U.S. Patent Application Serial No. 17/081,809, filed October 27, 2020, titled “TINTABLE WINDOW SYSTEM COMPUTING PLATFORM,” which is a Continuation of U.S. Patent Application Serial No. 16/608,159, filed October 24, 2019, titled “TINTABLE WINDOW SYSTEM COMPUTING PLATFORM,” that is a National Stage Entry of International Patent Application Serial No. PCT/US18/29406, filed April, 25, 2018, titled “TINTABLE WINDOW SYSTEM COMPUTING PLATFORM,” which claims priority to U.S. Provisional Patent Application Serial No. 62/607,618, U.S. Provisional Patent Application Serial No. 62/523,606, U.S. Provisional Patent Application Serial No.
62/507,704, U.S. Provisional Patent Application Serial No. 62/506,514, and U.S. Provisional Patent Application Serial No. 62/490,457. This application is also a Continuation-In-Part of International Patent Application Serial No. PCT/US20/53641, filed September 30, 2020, titled “TANDEM VISION WINDOW AND MEDIA DISPLAY,” which claims priority to U.S. Provisional Patent Application Serial No. 62/911,271, filed October 5, 2019, titled “TANDEM VISION WINDOW AND TRANSPARENT DISPLAY,” to U.S. Provisional Patent Application Serial No. 62/952,207, filed December 20, 2019, titled “TANDEM VISION WINDOW AND TRANSPARENT DISPLAY,” to U.S. Provisional Patent Application Serial No. 62/975,706, filed February 12, 2020, titled “TANDEM VISION WINDOW AND MEDIA DISPLAY,” to U.S. Provisional Patent Application Serial No. 63/085,254, filed September 30, 2020, titled “TANDEM VISION WINDOW AND MEDIA DISPLAY.” This application is also a Continuation-In-Part of U.S. Provisional Patent Application Serial No. 63/170,245, filed April 2, 2021, titled “DISPLAY CONSTRUCT FOR MEDIA PROJECTION AND WIRELESS CHARGING,” of U.S. Provisional Patent Application Serial No. 63/154,352, filed February 26, 2021, titled “DISPLAY CONSTRUCT FOR MEDIA PROJECTION AND WIRELESS CHARGING,” and of U.S. Provisional Patent Application Serial No. 63/115,842, filed November 19, 2020, titled “DISPLAY CONSTRUCT FOR MEDIA PROJECTION.” Each of the above recited patent applications is entirely incorporated herein by reference.

BACKGROUND

[0002] Vehicles are associated with various dangers (e.g., accidentally leaving a child or pet in the vehicle on a hot day, accidentally leaving valuable items in the vehicle, not being able to see due to glare or fogging windows, etc.) and discomforts (e.g., being bored on a long trip, being too hot or too cold, etc.). Addressing these dangers and discomforts would lead to improved safety (e.g., safety of passengers while the vehicle is in motion, while the vehicle is parked, etc.) and/or improved comfort. However, addressing these dangers or discomforts may be difficult and/or resource intensive.

SUMMARY

[0003] Various aspects disclosed herein alleviate at least part of the above-referenced shortcomings.

[0004] As disclosed herein, one or more devices associated with a vehicle may be controlled, for example, to achieve a target environment within the vehicle and/or to achieve a target state for one or more occupants of the vehicle (e.g., to increase health, safety, and/or comfort). The one or more devices may comprise a tintable window, a media display, an HVAC component, a sound emitter (e.g., a speaker), a smell emitter, or the like. A target environment and/or a target state for one or more occupants of the vehicle may be determined, e.g., based at least in part on (i) sensor data that indicates present conditions within and/or around the vehicle, (ii) predicted conditions within and/or around the vehicle, and/or (iii) inferred and/or explicitly indicated user preferences (e.g., through user input). State(s) of one or more devices may be altered automatically (e.g., without manual user input), thereby improving vehicle safety and/or occupant comfort.
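
The following Python sketch is a non-limiting illustration of the control loop described in this paragraph: readings from the device ensemble are compared against a target environment and device states are adjusted without manual user input. The sensor fields, thresholds, and device names are assumptions made for illustration only.

```python
# Hypothetical sketch of the control loop described above: sensor readings are
# compared against a target environment and device states are adjusted
# automatically. All names, thresholds, and device interfaces are assumptions.
from dataclasses import dataclass

@dataclass
class Reading:
    cabin_temp_c: float      # from a temperature sensor in the device ensemble
    exterior_lux: float      # from a photosensor
    occupied: bool           # from an occupancy sensor

@dataclass
class Targets:
    temp_c: float = 22.0     # target cabin temperature
    max_lux: float = 2000.0  # glare threshold before tinting the windows

def control_step(reading: Reading, targets: Targets) -> dict:
    """Return the device-state changes implied by one sensor reading."""
    actions = {}
    if reading.occupied:
        # HVAC: heat or cool toward the target temperature.
        if reading.cabin_temp_c > targets.temp_c + 1.0:
            actions["hvac"] = "cool"
        elif reading.cabin_temp_c < targets.temp_c - 1.0:
            actions["hvac"] = "heat"
        # Tintable window: darken when exterior light exceeds the glare threshold.
        actions["window_tint"] = "dark" if reading.exterior_lux > targets.max_lux else "clear"
    else:
        # Unoccupied vehicle: darken windows (privacy / heat rejection) and idle HVAC.
        actions["window_tint"] = "dark"
        actions["hvac"] = "off"
    return actions

# Example: a hot, bright, occupied cabin triggers cooling and a dark tint state.
print(control_step(Reading(cabin_temp_c=29.0, exterior_lux=4000.0, occupied=True), Targets()))
```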

[0005] In another aspect, an apparatus for controlling at least one device in a vehicle, the apparatus comprises at least one controller that is configured to: operatively couple to a device ensemble disposed in or on the vehicle, receive, or direct receipt of, data from the device ensemble, wherein the device ensemble comprises (A) a plurality of sensors or (B) a sensor and an emitter; and alter, or direct alteration of, at least one of: (i) a present state of the at least one device in the vehicle based at least in part on the data from the device ensemble; and/or (ii) a present environment of the vehicle based at least in part on the data from the device ensemble, to constitute an alteration.

[0006] In some embodiments, the plurality of sensors comprises two or more of: a geolocation sensor, an occupancy sensor, a motion sensor, a temperature sensor, a photosensor, an airflow sensor, an oxygen sensor, and a carbon dioxide sensor. In some embodiments, the device ensemble comprises a transceiver, a processor, a memory device, a network adaptor, or a controller. In some embodiments, at least one component of the device ensemble is reversibly removeable, wherein reversibly removable includes an ability to remove and insert the at least one component from the device ensemble in a reversible manner. In some embodiments, the at least one device comprises (i) a tintable window, (i) a heating ventilation and air conditioning (HVAC) component, a media display, or a safety component, of the vehicle. In some embodiments, the at least one device comprises two or more devices of the vehicle, and wherein altering, or directing alteration of, the present state of the at least one device comprises altering the present states of each of the two or more devices of the vehicle differently. In some embodiments, the two or more devices of the vehicle are on opposite sides of the vehicle. In some embodiments, the alteration is based at least in part on an output of a machine learning model. In some embodiments, the alteration is based at least in part on a requested state of an occupant of the vehicle. In some embodiments, the requested state is determined based at least in part on an output of a machine learning model. In some embodiments, the requested state is determined based at least in part on historical data associated with the occupant of the vehicle. In some embodiments, the at least one device comprises a tintable window, and wherein a tint state of the tintable window is based at least in part on data received from the device ensemble indicating that the vehicle is unoccupied. In some embodiments, the at least one device comprises a tintable window, and wherein a tint state of the tintable window is based at least in part on a projected route of the vehicle.
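
As a non-limiting illustration of setting tint states from a projected route, the sketch below darkens whichever side of the vehicle faces the sun along upcoming route segments, so devices on opposite sides of the vehicle receive different states. The geometry, function names, and thresholds are assumptions, not a prescribed implementation.

```python
# Illustrative sketch (assumed names and geometry): per-side tint states are
# chosen from a projected route so that the sun-facing windows are darkened.
def per_side_tint(route_headings_deg, sun_azimuth_deg):
    """Return a per-segment plan of 'left'/'right' tint states."""
    plan = []
    for heading in route_headings_deg:
        # Bearing of the sun relative to the vehicle's direction of travel.
        relative = (sun_azimuth_deg - heading) % 360.0
        # The sun is on the right side of the vehicle if the relative bearing is 0..180 deg.
        sun_on_right = 0.0 < relative < 180.0
        plan.append({
            "heading": heading,
            "right": "dark" if sun_on_right else "clear",
            "left": "clear" if sun_on_right else "dark",
        })
    return plan

# Example: heading east then north with the sun in the south-west (azimuth 225 deg).
for segment in per_side_tint([90.0, 0.0], sun_azimuth_deg=225.0):
    print(segment)
```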

[0007] In another aspect, a method for controlling at least one device in a vehicle, the method comprises: receiving data from a device ensemble, wherein the device ensemble comprises (A) a plurality of sensors or (B) a sensor and an emitter; and altering at least one of: (i) a present state of the at least one device in the vehicle based at least in part on the data from the device ensemble; and/or (ii) a present environment of the vehicle based at least in part on the data from the device ensemble, to constitute an alteration.

[0008] In some embodiments, the plurality of sensors comprises two or more of: a geolocation sensor, an occupancy sensor, a motion sensor, a temperature sensor, a photosensor, an airflow sensor, an oxygen sensor, and a carbon dioxide sensor. In some embodiments, the device ensemble comprises a transceiver, a processor, a memory device, a network adaptor, or a controller. In some embodiments, at least one component of the device ensemble is reversibly removeable, wherein reversibly removable includes an ability to remove and insert the at least one component from the device ensemble in a reversible manner. In some embodiments, the at least one device comprises (i) a tintable window, (i) a heating ventilation and air conditioning (HVAC) component, a media display, or a safety component, of the vehicle. In some embodiments, the at least one device comprises two or more devices of the vehicle, and wherein altering the present state of the at least one device comprises altering the present states of each of the two or more devices of the vehicle differently. In some embodiments, the two or more devices of the vehicle are on opposite sides of the vehicle. In some embodiments, the alteration is based at least in part on an output of a machine learning model. In some embodiments, the alteration is based at least in part on a requested state of an occupant of the vehicle. In some embodiments, the requested state is determined based at least in part on an output of a machine learning model. In some embodiments, the requested state is determined based at least in part on historical data associated with the occupant of the vehicle. In some embodiments, the at least one device comprises a tintable window, and wherein a tint state of the tintable window is based at least in part on data received from the device ensemble indicating that the vehicle is unoccupied. In some embodiments, the at least one device comprises a tintable window, and wherein a tint state of the tintable window is based at least in part on a projected route of the vehicle.

[0009] In another aspect, a non-transitory computer-readable program instructions for controlling at least one device in a vehicle, the non-transitory computer-readable program instructions, when read by one or more processors, cause the one or more processors to execute operations comprising: receiving, or directing receipt of, data from a device ensemble, wherein the device ensemble comprises (A) a plurality of sensors or (B) a sensor and an emitter; and altering, or directing alteration of, at least one of: (i) a present state of the at least one device in the vehicle based at least in part on the data from the device ensemble; and/or (ii) a present environment of the vehicle based at least in part on the data from the device ensemble, to constitute an alteration.

[0010] In some embodiments, the plurality of sensors comprises two or more of: a geolocation sensor, an occupancy sensor, a motion sensor, a temperature sensor, a photosensor, an airflow sensor, an oxygen sensor, and a carbon dioxide sensor. In some embodiments, the device ensemble comprises a transceiver, a processor, a memory device, a network adaptor, or a controller. In some embodiments, at least one component of the device ensemble is reversibly removeable, wherein reversibly removable includes an ability to remove and insert the at least one component from the device ensemble in a reversible manner. In some embodiments, the at least one device comprises (i) a tintable window, (i) a heating ventilation and air conditioning (HVAC) component, a media display, or a safety component, of the vehicle. In some embodiments, the at least one device comprises two or more devices of the vehicle, and wherein altering the present state of the at least one device comprises altering the present states of each of the two or more devices of the vehicle differently. In some embodiments, the two or more devices of the vehicle are on opposite sides of the vehicle. In some embodiments, the alteration is based at least in part on an output of a machine learning model. In some embodiments, the alteration is based at least in part on a requested state of an occupant of the vehicle. In some embodiments, the requested state is determined based at least in part on an output of a machine learning model. In some embodiments, the requested state is determined based at least in part on historical data associated with the occupant of the vehicle. In some embodiments, the at least one device comprises a tintable window, and wherein a tint state of the tintable window is based at least in part on data received from the device ensemble indicating that the vehicle is unoccupied. In some embodiments, the at least one device comprises a tintable window, and wherein a tint state of the tintable window is based at least in part on a projected route of the vehicle.

[0011] In another aspect, a system for controlling at least one device in a vehicle, the system comprises: a network configured to: operatively couple to a device ensemble disposed in or on the vehicle; operatively couple to the at least one device; receive data from the device ensemble, wherein the device ensemble comprises (A) a plurality of sensors or (B) a sensor and an emitter; and transmit instructions that cause alteration of at least one of: (i) a present state of the at least one device in the vehicle based at least in part on the data from the device ensemble; and/or (ii) a present environment of the vehicle based at least in part on the data from the device ensemble.

[0012] In some embodiments, the plurality of sensors comprises two or more of: a geolocation sensor, an occupancy sensor, a motion sensor, a temperature sensor, a photosensor, an airflow sensor, an oxygen sensor, and a carbon dioxide sensor. In some embodiments, at least a portion of the network is disposed in an envelope of the vehicle. In some embodiments, the device ensemble comprises a transceiver, a processor, a memory device, a network adaptor, or a controller. In some embodiments, at least one component of the device ensemble is reversibly removeable, wherein reversibly removable includes an ability to remove and insert the at least one component from the device ensemble in a reversible manner. In some embodiments, the at least one device comprises (i) a tintable window, (i) a heating ventilation and air conditioning (HVAC) component, a media display, or a safety component, of the vehicle. In some embodiments, the at least one device comprises two or more devices of the vehicle, and wherein the transmitted instructions cause alteration of the present states of each of the two or more devices of the vehicle differently. In some embodiments, the two or more devices of the vehicle are on opposite sides of the vehicle. In some embodiments, the alteration is based at least in part on an output of a machine learning model. In some embodiments, the alteration is based at least in part on a requested state of an occupant of the vehicle. In some embodiments, the requested state is determined based at least in part on an output of a machine learning model. In some embodiments, the requested state is determined based at least in part on historical data associated with the occupant of the vehicle. In some embodiments, the at least one device comprises a tintable window, and wherein a tint state of the tintable window is based at least in part on data received from the device ensemble indicating that the vehicle is unoccupied. In some embodiments, the at least one device comprises a tintable window, and wherein a tint state of the tintable window is based at least in part on a projected route of the vehicle. In some embodiments, the network is configured to receive the data from the device ensemble using a first communication protocol, and wherein the network is configured to transmit the instructions using a second communication protocol. In some embodiments, the first communication protocol is different than the second communication protocol. In some embodiments, the network is configured to translate information between the first communication protocol and the second communication protocol. In some embodiments, the first communication protocol and/or the second communication protocol comprise a wireless communication protocol. In some embodiments, the network is configured to facilitate transmission of signals abiding by control protocol, cellular communication protocol, and/or media protocol.
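
The following sketch is one hypothetical way the network could translate between a first communication protocol (used to receive device-ensemble data) and a second communication protocol (used to transmit instructions). The JSON frame format and the "DEVICE=STATE" command format are placeholders for whichever concrete protocols an implementation adopts.

```python
# Minimal, assumption-laden sketch of protocol translation: ensemble data is
# received in one (hypothetical) message format and re-emitted as control
# instructions in another. The formats below merely stand in for real protocols.
import json

def parse_ensemble_frame(frame: bytes) -> dict:
    """First protocol (assumed): JSON frames published by the device ensemble."""
    return json.loads(frame.decode("utf-8"))

def build_control_message(device: str, state: str) -> bytes:
    """Second protocol (assumed): compact 'DEVICE=STATE' command lines."""
    return f"{device}={state}\n".encode("ascii")

def bridge(frame: bytes) -> list:
    """Translate one sensor frame into zero or more control instructions."""
    data = parse_ensemble_frame(frame)
    commands = []
    if not data.get("occupied", True):
        commands.append(build_control_message("window_tint", "dark"))
    if data.get("cabin_temp_c", 0) > 27:
        commands.append(build_control_message("hvac", "cool"))
    return commands

# Example: an unoccupied, hot cabin yields two translated control instructions.
for cmd in bridge(b'{"occupied": false, "cabin_temp_c": 31}'):
    print(cmd)
```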

[0013] In another aspect, an apparatus for media viewing in a vehicle, the apparatus comprises at least one controller that is configured to: (a) operatively couple to a tintable window of the vehicle and to a media display; and (b) control, or direct control of, (i) an optical state of the tintable window; and/or (ii) presentation of media content on the media display of the vehicle.

[0014] In some embodiments, the media display is an organic light emitting diode (OLED) device. In some embodiments, the at least one controller is configured to control, or direct control of, a transparency of the media display and/or media projected by the media display. In some embodiments, the at least one controller is configured to control, or direct control of, the optical state of the tintable window at least in part by adjusting a transparency of the tintable window. In some embodiments, the at least one controller is configured to control, or direct control of, the optical state of the tintable window at least in part by increasing a contrast between the tintable window and the media display. In some embodiments, the at least one controller is configured to increase the contrast between the tintable window and the media display synergistically with the presentation of the media content on the media display. In some embodiments, the at least one controller is configured to increase the contrast between the tintable window and the media display in response to controlling, or directing control of, the presentation of the media content on the media display. In some embodiments, the media content comprises video content from a video, a television show, a movie, a video conference, and/or an advertisement. In some embodiments, the media content comprises image content and/or text content from one or more documents. In some embodiments, the media content is streamed from a remote server. In some embodiments, the media display comprises one or more speaker devices, and wherein the at least one controller is configured to control, or direct control of, presentation of audio content by the one or more speaker devices. In some embodiments, the media display is connected to the tintable window, or is incorporated in an integrated unit forming the tintable window. In some embodiments, the integrated unit is an insulated glass unit (IGU) and/or wherein the tintable window comprises an electrochromic construct. [0015] In another aspect, a method for media viewing in a vehicle, the method comprises: controlling (i) an optical state of a tintable window of the vehicle; and/or (ii) presentation of media content on a media display of the vehicle.
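
A minimal sketch of the contrast coordination described above, assuming a simple numeric tint scale: when presentation of media content starts, the associated tintable window is driven to a darker tint to increase contrast, and it is released when presentation stops.

```python
# Hypothetical sketch: darken the tintable window in response to media playback
# starting, restore it when playback ends. Tint levels and interfaces are assumed.
class WindowMediaCoordinator:
    def __init__(self, idle_tint: int = 0, media_tint: int = 3):
        self.idle_tint = idle_tint    # clearest tint level (assumed scale 0..3)
        self.media_tint = media_tint  # darkest tint level, used for viewing contrast
        self.window_tint = idle_tint
        self.playing = False

    def start_media(self, content_id: str) -> None:
        # Increase contrast together with presentation of the media content.
        self.playing = True
        self.window_tint = self.media_tint
        print(f"playing {content_id!r}; window tint -> {self.window_tint}")

    def stop_media(self) -> None:
        # Restore the idle optical state once presentation ends.
        self.playing = False
        self.window_tint = self.idle_tint
        print(f"stopped; window tint -> {self.window_tint}")

coordinator = WindowMediaCoordinator()
coordinator.start_media("video-call")  # contrast increased for viewing
coordinator.stop_media()               # window returns to its clear state
```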

[0016] In some embodiments, the media display is an organic light emitting diode (OLED) device. In some embodiments, the method further comprises controlling a transparency of the media display and/or media projected by the media display. In some embodiments, controlling the optical state of the tintable window comprises adjusting a transparency of the tintable window. In some embodiments, controlling the optical state of the tintable window comprises increasing a contrast between the tintable window and the media display. In some embodiments, increasing the contrast between the tintable window and the media display is synergistic with the presentation of the media content on the media display. In some embodiments, increasing the contrast between the tintable window and the media display is in response to controlling, or directing control of, the presentation of the media content on the media display. In some embodiments, the media content comprises video content from a video, a television show, a movie, a video conference, and/or an advertisement. In some embodiments, the media content comprises image content and/or text content from one or more documents. In some embodiments, the media content is streamed from a remote server. In some embodiments, the media display comprises one or more speaker devices, and wherein the at least one controller is configured to control, or direct control of, presentation of audio content by the one or more speaker devices. In some embodiments, the media display is connected to the tintable window, or is incorporated in an integrated unit forming the tintable window. In some embodiments, the integrated unit is an insulated glass unit (IGU) and/or wherein the tintable window comprises an electrochromic construct.

[0017] In another aspect, a non-transitory computer readable program instructions for media viewing in a vehicle, the non-transitory computer readable program instructions, when read by one or more processors, cause the one or more processors to execute operations comprising: controlling, or directing control of, (i) an optical state of a tintable window of the vehicle; and/or (ii) presentation of media content on a media display of the vehicle.

[0018] In some embodiments, the media display is an organic light emitting diode (OLED) device. In some embodiments, the operations comprise controlling, or directing control of, a transparency of the media display and/or media projected by the media display. In some embodiments, the operations comprise controlling, or directing control of, the optical state of the tintable window at least in part by adjusting a transparency of the tintable window. In some embodiments, the operations comprise controlling, or directing control of, the optical state of the tintable window at least in part by increasing a contrast between the tintable window and the media display. In some embodiments, the operations comprise increasing, or directing increase of, the contrast between the tintable window and the media display synergistically with the presentation of the media content on the media display. In some embodiments, the operations comprise increasing, or directing increase of, the contrast between the tintable window and the media display in response to controlling, or directing control of, the presentation of the media content on the media display. In some embodiments, the media content comprises video content from a video, a television show, a movie, a video conference, and/or an advertisement. In some embodiments, the media content comprises image content and/or text content from one or more documents. In some embodiments, the media content is streamed from a remote server. In some embodiments, the media display comprises one or more speaker devices, and wherein the at least one controller is configured to control, or direct control of, presentation of audio content by the one or more speaker devices. In some embodiments, the media display is connected to the tintable window, or is incorporated in an integrated unit forming the tintable window. In some embodiments, the integrated unit is an insulated glass unit (IGU) and/or wherein the tintable window comprises an electrochromic construct.

[0019] In another aspect, a system for media viewing in a vehicle, the system comprises: a network configured to: operatively couple to a tintable window of the vehicle and to a media display; and transmit instructions that control (i) an optical state of a tintable window of the vehicle; and/or (ii) presentation of media content on a media display of the vehicle.

[0020] In some embodiments, the media display is an organic light emitting diode (OLED) device. In some embodiments, the network is configured to transmit instructions that control a transparency of the media display and/or media projected by the media display. In some embodiments, the network is configured to transmit instructions that control the optical state of the tintable window at least in part by transmitting instructions that adjust a transparency of the tintable window. In some embodiments, the network is configured to transmit instructions that control the optical state of the tintable window at least in part by transmitting instructions that increase a contrast between the tintable window and the media display. In some embodiments, the network is configured to transmit the instructions that increase the contrast between the tintable window and the media display synergistically with the presentation of the media content on the media display. In some embodiments, the network is configured to transmit the instructions that increase the contrast between the tintable window and the media display in response to transmitting instructions to control the presentation of the media content on the media display. In some embodiments, the media content comprises video content from a video, a television show, a movie, a video conference, and/or an advertisement. In some embodiments, the media content comprises image content and/or text content from one or more documents. In some embodiments, the media content is streamed from a remote server. In some embodiments, the media display comprises one or more speaker devices, and wherein the at least one controller is configured to control, or direct control of, presentation of audio content by the one or more speaker devices. In some embodiments, the media display is connected to the tintable window, or is incorporated in an integrated unit forming the tintable window. In some embodiments, the integrated unit is an insulated glass unit (IGU) and/or wherein the tintable window comprises an electrochromic construct. In some embodiments, the network is configured to facilitate transmission of signals abiding by control protocol, cellular communication protocol, and/or media protocol.

[0021] In another aspect, an apparatus for controlling at least one device in a vehicle, the apparatus comprising at least one controller that is configured to: identify, or direct identification of, at least one gesture by an occupant of the vehicle; identify, or direct identification of, a state of the at least one device in the vehicle based at least in part on the gesture identified, wherein the at least one device comprises a tintable window, a media display associated with the tintable window, or an environmental control device; and cause the at least one device to transition to the state identified.

[0022] In some embodiments, the at least one controller is part of a hierarchical control system. In some embodiments, at least one controller of the hierarchical control system is remote from the vehicle. In some embodiments, at least one controller of the hierarchical control system is in the cloud or in a building. In some embodiments, the at least one device comprises the tintable window, and wherein the state identified comprises a tint state of the tintable window. In some embodiments, the at least one device comprises the media device, and wherein the state identified comprises initiating presentation of a media content item by the media device. In some embodiments, the at least one device comprises the media device, and wherein the state identified comprises stopping presentation of a media content item by the media device. In some embodiments, the at least one controller is configured to identify, or direct identification of, the at least one gesture by the occupant of the vehicle at least in part by identifying, or directing identification of, a gesture pattern of the occupant of the vehicle. In some embodiments, the gesture pattern comprises (i) facial expression pattern, (ii) voice pattern, or (iii) motion pattern. In some embodiments, the at least one controller is configured to receive, or direct receipt of, data from a device ensemble disposed in or on the vehicle, wherein the device ensemble comprises (A) a plurality of sensors or (B) a sensor and an emitter. In some embodiments, the plurality of sensors comprises two or more of: a geolocation sensor, an occupancy sensor, a motion sensor, a temperature sensor, a visible light photosensor, an airflow sensor, an oxygen sensor, a sound sensor, a pressure sensor, a volatile organic compound (VOC) sensor, a particulate matter sensor, a humidity sensor, and a carbon dioxide sensor. In some embodiments, the device ensemble comprises a transceiver, a processor, a memory device, network adaptors, or a controller. In some embodiments, the processor comprises a Graphics Processing Unit (GPU), a plurality of processors, or a microcontroller. In some embodiments, the environmental control device comprises: a heating ventilation and air conditioning HVAC system, lighting, at tintable window, an olfactory compound dispenser, a humidity adjuster, or a music player. In some embodiments, the lighting comprises a media display.
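
By way of a hypothetical example, the sketch below maps identified gesture patterns (facial expression, voice, or motion) to target device states and causes the corresponding transition. The gesture vocabulary and device identifiers are illustrative assumptions.

```python
# Illustrative mapping from an identified gesture pattern to a target device state.
GESTURE_TO_STATE = {
    ("motion", "swipe_down"): ("tintable_window", "darker"),
    ("motion", "swipe_up"):   ("tintable_window", "clearer"),
    ("voice",  "play"):       ("media_display", "start_presentation"),
    ("voice",  "stop"):       ("media_display", "stop_presentation"),
    ("facial", "squint"):     ("tintable_window", "darker"),  # e.g., glare response
}

def identify_state(pattern_type: str, pattern: str):
    """Return (device, state) for a recognized gesture pattern, else None."""
    return GESTURE_TO_STATE.get((pattern_type, pattern))

def handle_gesture(pattern_type: str, pattern: str) -> None:
    target = identify_state(pattern_type, pattern)
    if target is None:
        return  # unrecognized gestures are ignored
    device, state = target
    # A real controller would transmit this over the vehicle network / control hierarchy.
    print(f"transition {device} -> {state}")

handle_gesture("voice", "play")         # starts presentation on the media display
handle_gesture("motion", "swipe_down")  # darkens the tintable window
```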

[0023] In another aspect, a method for controlling at least one device in a vehicle, the method comprises: identifying at least one gesture by an occupant of the vehicle; identifying a state of the at least one device in the vehicle based at least in part on the gesture identified, wherein the at least one device comprises a tintable window, a media display associated with the tintable window, or an environmental control device; and causing the at least one device to transition to the state identified.

[0024] In some embodiments, the at least one device comprises the tintable window, and wherein the state identified comprises a tint state of the tintable window. In some embodiments, the at least one device comprises the media device, and wherein the state identified comprises initiating presentation of a media content item by the media device. In some embodiments, the at least one device comprises the media device, and wherein the state identified comprises stopping presentation of a media content item by the media device. In some embodiments, identifying the at least one gesture by the occupant of the vehicle comprises identifying a gesture pattern of the occupant of the vehicle. In some embodiments, the gesture pattern comprises (i) facial expression pattern, (ii) voice pattern, or (iii) motion pattern. In some embodiments, the method further comprises receiving data from a device ensemble disposed in or on the vehicle, wherein the device ensemble comprises (A) a plurality of sensors or (B) a sensor and an emitter. In some embodiments, the plurality of sensors comprises two or more of: a geolocation sensor, an occupancy sensor, a motion sensor, a temperature sensor, a visible light photosensor, an airflow sensor, an oxygen sensor, a sound sensor, a pressure sensor, a volatile organic compound (VOC) sensor, a particulate matter sensor, a humidity sensor, and a carbon dioxide sensor.

In some embodiments, the device ensemble comprises a transceiver, a processor, a memory device, network adaptors, or a controller. In some embodiments, the processor comprises a Graphics Processing Unit (GPU), a plurality of processors, or a microcontroller. In some embodiments, the environmental control device comprises: a heating ventilation and air conditioning (HVAC) system, lighting, a tintable window, an olfactory compound dispenser, a humidity adjuster, or a music player. In some embodiments, the lighting comprises a media display.

[0025] In another aspect, a non-transitory computer-readable program instructions for controlling at least one device in a vehicle, the non-transitory computer-readable program instructions, when read by one or more processors, cause the one or more processors to execute operations comprising: identifying, or directing identification of, at least one gesture by an occupant of the vehicle; identifying, or directing identification of, a state of the at least one device in the vehicle based at least in part on the gesture identified, wherein the at least one device comprises a tintable window, a media display associated with the tintable window, or an environmental control device; and causing, or directing causation of, the at least one device to transition to the state identified.

[0026] In some embodiments, the at least one device comprises the tintable window, and wherein the state identified comprises a tint state of the tintable window. In some embodiments, the at least one device comprises the media device, and wherein the state identified comprises initiating presentation of a media content item by the media device. In some embodiments, the at least one device comprises the media device, and wherein the state identified comprises stopping presentation of a media content item by the media device. In some embodiments, identifying, or directing identification of, the at least one gesture by the occupant of the vehicle comprises identifying, or directing identification of, a gesture pattern of the occupant of the vehicle. In some embodiments, the gesture pattern comprises (i) facial expression pattern, (ii) voice pattern, or (iii) motion pattern. In some embodiments, the operations further comprise receiving data from a device ensemble disposed in or on the vehicle, wherein the device ensemble comprises (A) a plurality of sensors or (B) a sensor and an emitter. In some embodiments, the plurality of sensors comprises two or more of: a geolocation sensor, an occupancy sensor, a motion sensor, a temperature sensor, a visible light photosensor, an airflow sensor, an oxygen sensor, a sound sensor, a pressure sensor, a volatile organic compound (VOC) sensor, a particulate matter sensor, a humidity sensor, and a carbon dioxide sensor. In some embodiments, the device ensemble comprises a transceiver, a processor, a memory device, network adaptors, or a controller. In some embodiments, the processor comprises a Graphics Processing Unit (GPU), a plurality of processors, or a microcontroller. In some embodiments, the environmental control device comprises: a heating ventilation and air conditioning HVAC system, lighting, at tintable window, an olfactory compound dispenser, a humidity adjuster, or a music player. In some embodiments, the lighting comprises a media display. [0027] In another aspect, a system for controlling at least one device in a vehicle, the system comprises: a network configured to: receive data indicative of an identification of at least one gesture by an occupant of the vehicle; receive data indicative of an identification of a state of the least one device in the vehicle based at least in part on the gesture identified, wherein the at least one device comprises a tintable window, a media display associated with the tintable window, or an environmental control device; and transmit instructions that cause the at least one device to transition to the state identified.

[0028] In some embodiments, the at least one device comprises the tintable window, and wherein the state identified comprises a tint state of the tintable window. In some embodiments, the at least one device comprises the media device, and wherein the state identified comprises initiating presentation of a media content item by the media device. In some embodiments, the at least one device comprises the media device, and wherein the state identified comprises stopping presentation of a media content item by the media device. In some embodiments, the identification of the at least one gesture by the occupant of the vehicle is based at least in part on an identification of a gesture pattern of the occupant of the vehicle. In some embodiments, the gesture pattern comprises (i) facial expression pattern, (ii) voice pattern, or (iii) motion pattern. In some embodiments, the network is configured to receive data from a device ensemble disposed in or on the vehicle, wherein the device ensemble comprises (A) a plurality of sensors or (B) a sensor and an emitter. In some embodiments, the plurality of sensors comprises two or more of: a geolocation sensor, an occupancy sensor, a motion sensor, a temperature sensor, a visible light photosensor, an airflow sensor, an oxygen sensor, a sound sensor, a pressure sensor, a volatile organic compound (VOC) sensor, a particulate matter sensor, a humidity sensor, and a carbon dioxide sensor. In some embodiments, the device ensemble comprises a transceiver, a processor, a memory device, network adaptors, or a controller. In some embodiments, the processor comprises a Graphics Processing Unit (GPU), a plurality of processors, or a microcontroller. In some embodiments, the environmental control device comprises: a heating ventilation and air conditioning HVAC system, lighting, at tintable window, an olfactory compound dispenser, a humidity adjuster, or a music player. In some embodiments, the lighting comprises a media display. In some embodiments, the network is configured to receive the data indicative of the identification of the at least one gesture using a first communication protocol, and wherein the network is configured to transmit the instructions using a second communication protocol. In some embodiments, the first communication protocol is different than the second communication protocol. In some embodiments, the network is configured to translate information between the first communication protocol and the second communication protocol. In some embodiments, the first communication protocol and/or the second communication protocol comprise a wireless communication protocol. In some embodiments, the network is configured to facilitate transmission of signals abiding by control protocol, cellular communication protocol, and/or media protocol.

[0029] In another aspect, an apparatus for media viewing in a vehicle, the apparatus comprises at least one controller that is configured to: operatively couple to a media display of the vehicle and/or to a tintable window of the vehicle; obtain, or direct obtainment of, information associated with one or more occupants of the vehicle; identify, or direct identification of: (i) media content to be presented by the media display, and/or (ii) a tintable window setting for the tintable window based at least in part on the information obtained; and cause, or direct causing of, (i) the media display to present the media content identified and/or (ii) the tintable window setting to be applied to the tintable window.

[0030] In some embodiments, the information associated with the one or more occupants of the vehicle comprises a stored and/or a historical preference. In some embodiments, the at least one controller is configured to (i) use, or direct usage of, the stored and/or historical preferences as a learning set for a machine learning scheme utilized to predict preference of the user at a future time and/or (ii) operatively couple to a processor performing the machine learning scheme. In some embodiments, the information associated with the one or more occupants of the vehicle comprises at least one user input. In some embodiments, the at least one user input comprises a gesture by at least one occupant of the one or more occupants of the vehicle. In some embodiments, the at least one user input comprises input received via a graphical user interface. In some embodiments, the graphical user interface is presented via the media display. In some embodiments, the media content identified comprises: a television show, a movie, an advertisement, a video call, or safety information. In some embodiments, the media content is identified based at least in part on user preferences. In some embodiments, the media content is identified based at least in part on a third party media content application. In some embodiments, the at least one controller is configured to determine, or direct determination of, whether the vehicle is in motion, wherein causing, or directing causing of, the media display to present the media content identified is based at least in part (i) on a determination of whether the vehicle is in motion and/or (ii) on location of the media display in the vehicle. In some embodiments, the at least one controller is configured to inhibit, or direct inhibition of, presentation of media content in response to a determination (i) that the vehicle is in motion and/or (ii) of location of the media display in the vehicle. [0031] In another aspect, a method for media viewing in a vehicle, the method comprises: obtaining information associated with one or more occupants of the vehicle; identifying (i) media content to be presented by a media display of the vehicle, and/or (ii) a tintable window setting for a tintable window of the vehicle based at least in part on the information obtained; and causing (i) the media display to present the media content identified and/or (ii) the tintable window setting to be applied to the tintable window.
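
As a minimal sketch of using stored and/or historical preferences as a learning set (paragraph [0030]), the following example trains a small classifier to predict a preferred tint state at a future time. It assumes scikit-learn is available; the feature columns (hour of day, exterior brightness) and the numeric tint levels are hypothetical choices, not part of the disclosure.

```python
# Minimal sketch: historical occupant preferences serve as a learning set for a
# machine learning scheme that predicts a preferred tint state at a future time.
from sklearn.tree import DecisionTreeClassifier

# Hypothetical historical records: [hour_of_day, exterior_brightness_lux]
X_history = [
    [8, 20000], [12, 80000], [17, 30000], [21, 50],
    [9, 40000], [13, 90000], [18, 10000], [22, 10],
]
# Hypothetical preferred tint levels previously chosen by the occupant (0-3).
y_history = [1, 3, 2, 0, 2, 3, 1, 0]

model = DecisionTreeClassifier(max_depth=3).fit(X_history, y_history)

# Predict the likely preferred tint level for a future condition.
predicted_tint = model.predict([[12, 85000]])[0]
print(f"predicted preferred tint level: {predicted_tint}")
```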

[0032] In some embodiments, the information associated with the one or more occupants of the vehicle comprises a stored and/or a historical preference. In some embodiments, the method further comprises using the stored and/or historical preferences as a learning set for a machine learning scheme utilized to predict preference of the user at a future time. In some embodiments, the information associated with the one or more occupants of the vehicle comprises at least one user input. In some embodiments, the at least one user input comprises a gesture by at least one occupant of the one or more occupants of the vehicle. In some embodiments, the at least one user input comprises input received via a graphical user interface. In some embodiments, the graphical user interface is presented via the media display. In some embodiments, the media content identified comprises: a television show, a movie, an advertisement, a video call, or safety information. In some embodiments, the media content is identified based at least in part on user preferences. In some embodiments, the media content is identified based at least in part on a third party media content application. In some embodiments, the method further comprises determining whether the vehicle is in motion, wherein causing the media display to present the media content identified is based at least in part (i) on a determination of whether the vehicle is in motion and/or (ii) on location of the media display in the vehicle. In some embodiments, the method further comprises inhibiting presentation of media content in response to a determination (i) that the vehicle is in motion and/or (ii) of location of the media display in the vehicle.
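
The motion- and location-based gating of media presentation described above can be sketched as a simple predicate. The policy below (inhibit driver-visible displays while the vehicle is in motion) is a hypothetical example only; an actual implementation may apply different rules and location categories.

```python
# Illustrative sketch of motion/location gating: media presentation may be
# inhibited when the vehicle is in motion, depending on where the display is
# located. The location names and the policy itself are hypothetical.
DRIVER_VISIBLE_LOCATIONS = {"windshield", "driver_side_window"}

def media_allowed(display_location: str, vehicle_in_motion: bool) -> bool:
    """Return True if the display at `display_location` may present media content."""
    if vehicle_in_motion and display_location in DRIVER_VISIBLE_LOCATIONS:
        return False   # inhibit content that could distract the driver
    return True

print(media_allowed("windshield", vehicle_in_motion=True))             # False
print(media_allowed("rear_passenger_window", vehicle_in_motion=True))  # True
```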

[0033] In another aspect, non-transitory computer-readable program instructions for media viewing in a vehicle, the non-transitory computer-readable program instructions, when read by one or more processors, cause the one or more processors to execute operations comprising: obtaining, or directing obtainment of, information associated with one or more occupants of the vehicle; identifying, or directing identification of, (i) media content to be presented by a media display of the vehicle, and/or (ii) a tintable window setting for a tintable window of the vehicle based at least in part on the information obtained; and causing, or directing causation of, (i) the media display to present the media content identified and/or (ii) the tintable window setting to be applied to the tintable window.

[0034] In some embodiments, the information associated with the one or more occupants of the vehicle comprises a stored and/or a historical preference. In some embodiments, the operations further comprise using the stored and/or historical preferences as a learning set for a machine learning scheme utilized to predict preference of the user at a future time. In some embodiments, the information associated with the one or more occupants of the vehicle comprises at least one user input. In some embodiments, the at least one user input comprises a gesture by at least one occupant of the one or more occupants of the vehicle. In some embodiments, the at least one user input comprises input received via a graphical user interface. In some embodiments, the graphical user interface is presented via the media display. In some embodiments, the media content identified comprises: a television show, a movie, an advertisement, a video call, or safety information. In some embodiments, the media content is identified based at least in part on user preferences. In some embodiments, the media content is identified based at least in part on a third party media content application. In some embodiments, the operations further comprise determining whether the vehicle is in motion, wherein causing, or directing causing of, the media display to present the media content identified is based at least in part (i) on a determination of whether the vehicle is in motion and/or (ii) on location of the media display in the vehicle. In some embodiments, the operations further comprise inhibiting presentation of media content in response to a determination (i) that the vehicle is in motion and/or (ii) of location of the media display in the vehicle.

[0035] In another aspect, a system for media viewing in a vehicle, the system comprises: a network configured to: operatively couple to a media display of the vehicle and/or to a tintable window of the vehicle; receive information associated with one or more occupants of the vehicle; transmit an identification of (i) media content to be presented by the media display, and/or (ii) a tintable window setting for the tintable window based at least in part on the information obtained; and transmit instructions that cause (i) the media display to present the media content identified and/or (ii) the tintable window setting to be applied to the tintable window. In some embodiments, the information associated with the one or more occupants of the vehicle comprises a stored and/or a historical preference. In some embodiments, the network is configured to operatively couple to a processor that performs a machine learning scheme that uses the stored and/or historical preferences as a learning set to predict preference of the user at a future time. In some embodiments, the information associated with the one or more occupants of the vehicle comprises at least one user input. In some embodiments, the at least one user input comprises a gesture by at least one occupant of the one or more occupants of the vehicle. In some embodiments, the at least one user input comprises input received via a graphical user interface. In some embodiments, the graphical user interface is presented via the media display. In some embodiments, the media content identified comprises: a television show, a movie, an advertisement, a video call, or safety information. In some embodiments, the media content is identified based at least in part on user preferences. In some embodiments, the media content is identified based at least in part on a third party media content application. In some embodiments, the network is further configured to receive a determination of whether the vehicle is in motion, wherein transmitting the instructions that cause the media display to present the media content identified is based at least in part (i) on the determination of whether the vehicle is in motion and/or (ii) on location of the media display in the vehicle. In some embodiments, the network is configured to transmit instructions that inhibit presentation of media content in response to a determination (i) that the vehicle is in motion and/or (ii) of location of the media display in the vehicle. In some embodiments, the network is configured to receive the information associated with the one or more occupants of the vehicle using a first communication protocol, and wherein the network is configured to transmit the instructions using a second communication protocol. In some embodiments, the first communication protocol is different than the second communication protocol. In some embodiments, the network is configured to translate information between the first communication protocol and the second communication protocol. In some embodiments, the first communication protocol and/or the second communication protocol comprise a wireless communication protocol. In some embodiments, the network is configured to facilitate transmission of signals abiding by control protocol, cellular communication protocol, and/or media protocol.

[0036] In another aspect, the present disclosure provides systems, apparatuses (e.g., controllers), and/or non-transitory computer-readable medium (e.g., software) that implement any of the methods disclosed herein.

[0037] In another aspect, the present disclosure provides methods that use any of the systems and/or apparatuses disclosed herein, e.g., for their intended purpose.

[0038] In another aspect, an apparatus comprises at least one controller that is programmed to direct a mechanism used to implement (e.g., effectuate) any of the methods disclosed herein, wherein the at least one controller is operatively coupled to the mechanism.

[0039] In another aspect, an apparatus comprises at least one controller that is configured (e.g., programmed) to implement (e.g., effectuate) the method disclosed herein. The at least one controller may implement any of the methods disclosed herein.

[0040] In another aspect, a system comprises at least one controller that is programmed to direct operation of at least one other apparatus (or component thereof), and the apparatus (or component thereof), wherein the at least one controller is operatively coupled to the apparatus (or to the component thereof). The apparatus (or component thereof) may include any apparatus (or component thereof) disclosed herein. The at least one controller may direct any apparatus (or component thereof) disclosed herein.

[0041] In another aspect, a computer software product, comprising a non-transitory computer-readable medium in which program instructions are stored, which instructions, when read by a computer, cause the computer to direct a mechanism disclosed herein to implement (e.g., effectuate) any of the methods disclosed herein, wherein the non-transitory computer-readable medium is operatively coupled to the mechanism. The mechanism can comprise any apparatus (or any component thereof) disclosed herein.

[0042] In another aspect, the present disclosure provides a non-transitory computer-readable medium comprising machine-executable code that, upon execution by one or more computer processors, implements any of the methods disclosed herein.

[0043] In another aspect, the present disclosure provides a non-transitory computer-readable medium comprising machine-executable code that, upon execution by one or more computer processors, effectuates directions of the controller(s) (e.g., as disclosed herein).

[0044] In another aspect, the present disclosure provides a computer system comprising one or more computer processors and a non-transitory computer-readable medium coupled thereto. The non-transitory computer-readable medium comprises machine-executable code that, upon execution by the one or more computer processors, implements any of the methods disclosed herein and/or effectuates directions of the controller(s) disclosed herein.

[0045] The content of this summary section is provided as a simplified introduction to the disclosure and is not intended to be used to limit the scope of any invention disclosed herein or the scope of the appended claims.

[0046] Additional aspects and advantages of the present disclosure will become readily apparent to those skilled in this art from the following detailed description, wherein only illustrative embodiments of the present disclosure are shown and described. As will be realized, the present disclosure is capable of other and different embodiments, and its several details are capable of modifications in various obvious respects, all without departing from the disclosure. Accordingly, the drawings and description are to be regarded as illustrative in nature, and not as restrictive.

[0047] These and other features and embodiments will be described in more detail with reference to the drawings.

INCORPORATION BY REFERENCE

[0048] All publications, patents, and patent applications mentioned in this specification are herein incorporated by reference to the same extent as if each individual publication, patent, or patent application was specifically and individually indicated to be incorporated by reference.

BRIEF DESCRIPTION OF THE DRAWINGS

[0049] The novel features of the invention are set forth with particularity in the appended claims. A better understanding of the features and advantages of the present invention will be obtained by reference to the following detailed description that sets forth illustrative embodiments, in which the principles of the invention are utilized, and the accompanying drawings or figures (also “Fig.” and “Figs.” herein), of which:

[0050] Fig. 1 shows a perspective view of an enclosure (e.g., a vehicle) and a control system;

[0051] Fig. 2 schematically depicts a processing system;

[0052] Fig. 3 shows a block diagram of an example master controller (MC);

[0053] Fig. 4 shows a block diagram of an example network controller (NC);

[0054] Fig. 5 illustrates an example control system;

[0055] Fig. 6 shows an apparatus including a device ensemble and its components and connectivity options;

[0056] Fig. 7 is a block diagram showing example modules that may be used for implementing voice control;

[0057] Fig. 8 is a flowchart of a control method;

[0058] Fig. 9 is a flowchart of a control method;

[0059] Fig. 10 shows a configuration of components that may be used to implement certain control methods described herein;

[0060] Figs. 11A-C show various configurations of components that may be used to implement certain control methods described herein;

[0061] Fig. 12 depicts an electrochromic (EC) window lite, or IGU or laminate, with a transparent display;

[0062] Fig. 13 depicts an electrochromic insulated glass unit with an on-glass transparent display;

[0063] Fig. 14 depicts an enclosure communicatively coupled to its digital twin representation;

[0064] Fig. 15 is a flowchart for a control method;

[0065] Fig. 16 depicts user interaction with a digital twin to control a target (e.g., a device);

[0066] Fig. 17 is a schematic representation of a message diagram related to communications between system components;

[0067] Fig. 18 is a flowchart for a control method;

[0068] Fig. 19 depicts an example vehicle;

[0069] Fig. 20 is a flowchart for a control method;

[0070] Fig. 21 is a flowchart for a control method;

[0071] Fig. 22 schematically shows an electrochromic device;

[0072] Fig. 23 shows a cross-sectional view of an example electrochromic window;

[0073] Fig. 24 illustrates a voltage profile as a function of time; and

[0074] Fig. 25 shows a flow chart for a control method.

[0075] The figures and various components therein may not be drawn to scale.

DETAILED DESCRIPTION

[0076] While various embodiments of the invention have been shown and described herein, it will be obvious to those skilled in the art that such embodiments are provided by way of example only. Numerous variations, changes, and substitutions may occur to those skilled in the art without departing from the invention. It should be understood that various alternatives to the embodiments of the invention described herein might be employed.

[0077] Terms such as “a,” “an,” and “the” are not intended to refer to only a singular entity but include the general class of which a specific example may be used for illustration. The terminology herein is used to describe specific embodiments of the invention(s), but their usage does not delimit the invention(s).

[0078] When ranges are mentioned, the ranges are meant to be inclusive, unless otherwise specified. For example, a range between value 1 and value 2 is meant to be inclusive and include value 1 and value 2. The inclusive range will span any value from about value 1 to about value 2. The term “adjacent” or “adjacent to,” as used herein, includes “next to,” “adjoining,” “in contact with,” and “in proximity to.”

[0079] The term “operatively coupled” or “operatively connected” refers to a first element (e.g., mechanism) that is coupled (e.g., connected) to a second element, to allow the intended operation of the second and/or first element. The coupling may comprise physical or non-physical coupling. The non-physical coupling may comprise signal-induced coupling (e.g., wireless coupling). Coupled can include physical coupling (e.g., physically connected), or non-physical coupling (e.g., via wireless communication). Additionally, in the following description, the phrases “operable to,” “adapted to,” “configured to,” “designed to,” “programmed to,” or “capable of” may be used interchangeably where appropriate.

[0080] An element (e.g., mechanism) that is “configured to” perform a function includes a structural feature that causes the element to perform this function. A structural feature may include an electrical feature, such as a circuitry or a circuit element. A structural feature may include a circuitry (e.g., comprising electrical or optical circuitry). Electrical circuitry may comprise one or more wires. Optical circuitry may comprise at least one optical element (e.g., beam splitter, mirror, lens and/or optical fiber). A structural feature may include a mechanical feature. A mechanical feature may comprise a latch, a spring, a closure, a hinge, a chassis, a support, a fastener, or a cantilever, and so forth. Performing the function may comprise utilizing a logical feature. A logical feature may include programming instructions. Programming instructions may be executable by at least one processor. Programming instructions may be stored or encoded on a medium accessible by one or more processors.

[0081] The following detailed description is directed to specific example implementations for purposes of disclosing the subject matter. Although the disclosed implementations are described in sufficient detail to enable those of ordinary skill in the art to practice the disclosed subject matter, this disclosure is not limited to particular features of the specific example implementations described herein. On the contrary, the concepts and teachings disclosed herein can be implemented and applied in a multitude of different forms and ways without departing from their spirit and scope. For example, while the disclosed implementations focus on electrochromic windows (also referred to as smart windows), some of the systems, devices and methods disclosed herein can be made, applied or used without undue experimentation to incorporate, or while incorporating, other types of optically switchable devices that are actively switched/controlled, rather than passive coatings such as thermochromic coatings or photochromic coatings that tint passively in response to the sun’s rays. Some other types of actively controlled optically switchable devices include liquid crystal devices, suspended particle devices, and micro-blinds, among others. For example, some or all of such other optically switchable devices can be powered, driven or otherwise controlled or integrated with one or more of the disclosed implementations of controllers described herein.

[0082] In some embodiments, an enclosure corresponds to a vehicle (e.g., a car, a truck, a van, a bus, an airplane, a train or a train car, a rocket ship, or the like). The vehicle may be an autonomous vehicle, a semi-autonomous vehicle, or a non-autonomous vehicle (e.g., manually driven vehicle). In some embodiments, an enclosure comprises an area defined by at least one structure (e.g., fixture). The at least one structure may comprise at least one partition (e.g., wall). An enclosure may comprise and/or enclose one or more sub-enclosures. The at least one partition may comprise metal (e.g., steel), clay, stone, plastic, glass, plaster (e.g., gypsum), polymer (e.g., polyurethane, styrene, or vinyl), asbestos, fiber-glass, concrete (e.g., reinforced concrete), wood, paper, or a ceramic. The at least one wall may comprise wire, bricks, blocks (e.g., cinder blocks), tile, drywall, or frame (e.g., steel frame and/or wooden frame).

[0083] In some embodiments, the enclosure comprises one or more openings. The one or more openings may be reversibly closable. The one or more openings may be permanently open. A fundamental length scale of the one or more openings may be smaller relative to the fundamental length scale of the wall(s) that define the enclosure. A fundamental length scale may comprise a diameter of a bounding circle, a length, a width, or a height. The fundamental length scale may be abbreviated herein as “FLS.” A surface of the one or more openings may be smaller relative to the surface of the wall(s) that define the enclosure. The opening surface may be a percentage of the total surface of the wall(s). For example, the opening surface can measure about 30%, 20%, 10%, 5%, or 1% of the wall(s). The wall(s) may comprise a floor, a ceiling or a side wall. The closable opening may be closed by at least one window or door. In some embodiments, an enclosure may be stationary and/or movable (e.g., a train, a plane, a ship, a vehicle, or a rocket).

[0084] In some embodiments, the enclosure encloses an atmosphere. The atmosphere may comprise one or more gases. The gases may include inert gases (e.g., argon or nitrogen) and/or non-inert gases (e.g., oxygen or carbon dioxide). The enclosure atmosphere may resemble an atmosphere external to the enclosure (e.g., ambient atmosphere) in at least one external atmosphere characteristic that includes: temperature, relative gas content, gas type (e.g., humidity, and/or oxygen level), debris (e.g., dust and/or pollen), and/or gas velocity. The enclosure atmosphere may be different from the atmosphere external to the enclosure in at least one external atmosphere characteristic that includes: temperature, relative gas content, gas type (e.g., humidity, and/or oxygen level), debris (e.g., dust and/or pollen), and/or gas velocity. For example, the enclosure atmosphere may be less humid (e.g., drier) than the external (e.g., ambient) atmosphere. For example, the enclosure atmosphere may contain the same (e.g., or a substantially similar) oxygen-to-nitrogen ratio as the atmosphere external to the enclosure. The velocity and/or content of the gas in the enclosure may be (e.g., substantially) similar throughout the enclosure. The velocity and/or content of the gas in the enclosure may be different in different portions of the enclosure (e.g., by flowing gas through to a vent that is coupled with the enclosure). The gas content may comprise relative gas ratio.

[0085] In some embodiments, a network infrastructure is provided in the enclosure (e.g., in a vehicle). The network infrastructure is available for various purposes such as for providing communication and/or power services. The communication services may comprise high bandwidth (e.g., wireless and/or wired) communications services. The communication services can be to occupants of the vehicle and/or to people outside of the vehicle. The network infrastructure may work in concert with, or as a partial replacement of, the infrastructure of one or more cellular carriers. The network may comprise one or more levels of encryption. The network may be communicatively coupled to the cloud and/or to one or more servers external to the facility. The network may support at least a fourth generation wireless (4G) or a fifth generation wireless (5G) communication. The network may support cellular signals external and/or internal to the facility. The downlink communication network speeds may have a peak data rate of at least about 5 Gigabits per second (Gb/s), 10 Gb/s, or 20 Gb/s. The uplink communication network speeds may have a peak data rate of at least about 2 Gb/s, 5 Gb/s, or 10 Gb/s. The network infrastructure can be provided in a facility that includes electrically switchable windows. Examples of components of the network infrastructure include a high speed backhaul. The network infrastructure may include at least one cable, switch, (e.g., physical) antenna, transceiver, sensor, transmitter, receiver, radio, processor and/or controller (that may comprise a processor). The network infrastructure may be operatively coupled to, and/or include, a wireless network. The network infrastructure may comprise wiring (e.g., comprising an optical fiber, twisted cable, or coaxial cable). One or more devices (e.g., sensors and/or emitters) can be deployed (e.g., installed) in an environment, e.g., as part of installing the network infrastructure and/or after installing the network infrastructure. The device(s) may be communicatively coupled to the network. The network may comprise a power and/or communication network. The device can be self-discovered on the network, e.g., once it couples (e.g., on its attempt to couple) to the network. The network structure may comprise a peer-to-peer network structure or a client-server network structure. The network may or may not have a central coordination entity (e.g., server(s) or another stable host).

[0086] In some embodiments, a network uses one or more communication protocols to transmit communications between devices operatively coupled to the network. The devices may include user devices (e.g., mobile phones, tablet computers, wearable computers, desktop computers, laptop computers, gaming consoles, vehicle information and/or entertainment systems, or the like), controllable devices (e.g., one or more tintable windows, a media display, components of an HVAC system, components of a lighting system, or the like), intermediate controllers that transmit communications to and/or receive communications from other controllers (e.g., a master controller, an end (e.g., local) controller, etc.), or the like. In some embodiments, the communication protocols may include automation control protocols. In some embodiments, control protocols may be used to transmit instructions to devices that may be controlled, for example, instructions to perform particular actions. Examples of control protocols include: a vehicle bus standard that allows controllers and/or devices to communicate with each other without use of an intermediate host computer, such as Controller Area Network (CAN) protocol or CAN-based protocols (e.g., J1939, ISO 11783, etc.); an automotive network communications protocol, such as FlexRay; a vehicle-based or automotive-based high-speed serial communications and/or isochronous real-time data transfer protocol, such as IDB-1394; a communication bus protocol for devices within a vehicle, such as IEBus; a serial communications control protocol (e.g., J1708, or other serial communications control protocols); a communications protocol for on-board diagnostics of vehicle components, such as Keyword Protocol 2000 (KWP2000); a serial network protocol for communication between components of a vehicle, such as Local Interconnect Network (LIN); a high-speed multimedia network technology for communication of audio, voice, video and/or other data, such as Media Oriented Systems Transport (MOST); a real-time distributed computing and/or communication protocol for intravehicular communication, such as UAVCAN; a vehicle bus protocol that uses a serial protocol, such as Vehicle Area Network (VAN), or the like. In some embodiments, the communication protocols may include cellular communication protocols. Cellular communication protocols may comprise a fourth generation (4G) communication protocol and/or a fifth generation (5G) communication protocol. In some embodiments, the communication protocols may include media protocols. Media protocols may include one or more protocols used to stream, present, and/or control presentation of media content on media devices and/or displays. Examples of media protocols include: a network control protocol to control streaming media servers and/or to establish and/or control media sessions, such as real-time streaming protocol (RTSP); a network protocol for delivering media content (e.g., video and/or audio) over Internet Protocol (IP) networks, such as real-time transport protocol (RTP); a protocol associated with the IP suite, such as transmission control protocol (TCP); unicast protocols used for one-to-one communication between devices; multicast protocols for one-to-many communication between devices, or the like. In some embodiments, a network may be configured to use combinations of different communication protocols.
For example, the network may be configured to receive data from a first device using a first communications protocol and transmit data to a second device using a second communications protocol. In one example, the network can receive data from the first device using a wireless communications protocol (e.g., 4G cellular communications protocol, 5G cellular communications protocol, or the like) and transmit data to the second device using an automation control protocol (e.g., a CAN protocol or a CAN-based protocol, VAN, etc.). In some embodiments, the network may be configured to translate instructions and/or other data from one protocol to another, different protocol. In some embodiments, the network may be configured to identify a protocol to be used based at least in part on an identity of a device (or a type of device) that is sending and/or receiving communications.
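
A minimal sketch of such protocol translation is shown below: a command received as JSON (standing in for data arriving over a wireless link) is re-encoded as a compact frame suitable for a CAN-style in-vehicle bus. The arbitration IDs, payload layout, and device codes are hypothetical and do not correspond to any real vehicle's message map.

```python
# Minimal sketch of protocol translation at the network layer. The JSON input
# stands in for data received over a wireless/cellular link; the output is a
# compact CAN-style (id, data) pair. All codes and layouts are hypothetical.
import json
import struct

DEVICE_IDS = {"tintable_window_left": 0x21, "tintable_window_right": 0x22}  # hypothetical

def translate_to_can(json_payload: str) -> tuple[int, bytes]:
    """Translate a JSON command into (arbitration_id, data) for a CAN-style bus."""
    request = json.loads(json_payload)
    arbitration_id = DEVICE_IDS[request["device"]]
    # Hypothetical 2-byte payload: command code (1 = set tint) and tint level.
    data = struct.pack("BB", 1, int(request["tint_level"]))
    return arbitration_id, data

frame_id, frame_data = translate_to_can('{"device": "tintable_window_left", "tint_level": 3}')
print(hex(frame_id), frame_data.hex())
```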

[0087] In another embodiment, the present disclosure provides networks that are configured for transmission of any communication (e.g., signal) and/or (e.g., electrical) power facilitating any of the operations disclosed herein. The communication may comprise control communication, cellular communication, media communication, and/or data communication. The data communication may comprise sensor data communication and/or processed data communication. The networks may be configured to abide by one or more protocols facilitating such communication. For example, a communications protocol used by the network (e.g., with a BMS) can comprise a building automation and control networks protocol (BACnet). The network may be configured for (e.g., include hardware facilitating) communication protocols comprising BACnet (e.g., BACnet/SC), LonWorks, Modbus, KNX, European Home Systems Protocol (EHS), BatiBUS, European Installation Bus (EIB or Instabus), Zigbee, Z-Wave, Insteon, X10, Bluetooth, or WiFi. The network may be configured to transmit the control-related protocol. A communication protocol may facilitate cellular communication abiding by at least a 2nd, 3rd, 4th, or 5th generation cellular communication protocol. The (e.g., cabling) network may comprise a tree, line, or star topology. The network may comprise interworking and/or distributed application models for various tasks of the building automation. The control system may provide schemes for configuration and/or management of resources on the network. The network may permit binding of parts of a distributed application in different nodes operatively coupled to the network. The network may provide a communication system with a message protocol and models for the communication stack in each node capable of hosting distributed applications (e.g., having a common kernel). The control system may comprise programmable logic controller(s) (PLC(s)). The network may utilize a vehicle bus standard (e.g., by utilizing a CAN bus protocol, a CANopen protocol, or the like) that allows various controllers and/or devices to communicate (e.g., transmit and/or receive messages) with each other directly (e.g., without using an intermediate host computer). Messages may be transmitted sequentially.

Messages may be prioritized for transmission on the bus. Prioritization may be based at least in part on priorities of devices. Messages may be transmitted via the bus based at least in part on priorities of the two or more devices (e.g., in instances in which two or more devices transmit messages simultaneously and/or concurrently).
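
Priority-based transmission on a shared bus can be sketched with a priority queue, loosely mirroring CAN arbitration in which a lower identifier wins. The identifiers and messages below are hypothetical and purely illustrative.

```python
# Illustrative sketch of priority-based message transmission on a shared bus:
# when several devices queue messages concurrently, the message with the
# highest priority (lowest numeric value) is transmitted first.
import heapq

bus_queue: list[tuple[int, str]] = []   # (priority, message)

heapq.heappush(bus_queue, (0x100, "HVAC status"))
heapq.heappush(bus_queue, (0x010, "collision warning"))   # higher priority
heapq.heappush(bus_queue, (0x200, "media heartbeat"))

while bus_queue:
    priority, message = heapq.heappop(bus_queue)
    print(f"transmit (id={priority:#05x}): {message}")
# -> collision warning, then HVAC status, then media heartbeat
```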

[0088] In some embodiments, the control system is a computer-based control system. The control system can be installed in a facility (e.g., vehicle) to monitor and otherwise control (e.g., regulate, manipulate, restrict, direct, monitor, adjust, modulate, vary, alter, restrain, check, guide, or manage) the facility. For example, the control system may control one or more devices communicatively coupled to the network. The one or more devices may include mechanical and/or electrical equipment such as ventilation, lighting, power systems, fire systems, and/or security systems. Controllers (e.g., nodes and/or processors) may be suited for integration with the control system. The hardware of the control system may include interconnections by communication channels to one or more processors (e.g., and associated software), e.g., for maintaining one or more conditions in the facility. The one or more conditions in the facility may be according to preference(s) set by a user (e.g., an occupant, a facility owner, and/or a facility manager). The software can utilize, e.g., internet protocols and/or open standards. A node can be any addressable circuitry. For example, a node can be a circuitry that has an Internet Protocol (IP) address.

[0089] In some embodiments, a control system of the facility (e.g., of the vehicle) may be implemented in a vehicle (e.g., a car, a truck, a bus, a van, an airplane, or the like). The vehicle device control system functions, e.g., to control one or more characteristics of an environment of the vehicle. The one or more characteristics may comprise: temperature, carbon dioxide levels, gas flow, various volatile organic compounds (VOCs), and/or humidity in the facility. There may be mechanical devices that are controlled by the control system such as one or more heaters, air conditioners, blowers, and/or vents. To control the vehicle environment, a control system may turn these various devices on and/or off, e.g., under defined conditions. A core function of a control system of the facility may be to maintain a comfortable environment for occupants of the facility (e.g., vehicle), e.g., while minimizing heating and cooling costs and/or demand. A control system can be used to control one or more of the various systems. A control system may be used to optimize the synergy between various systems. For example, the control system may be used to conserve energy by controlling when an air conditioning system (or component thereof) is turned on within the vehicle. As another example, a control system may be used to improve safety by determining which (if any) media displays of the vehicle are permitted to display media content based at least in part on a location of the media displays and/or whether the vehicle is in motion.
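
Turning devices on and/or off under defined conditions can be as simple as a thermostat-style rule with hysteresis. The sketch below is illustrative only; the thresholds and the device being switched are hypothetical.

```python
# Illustrative sketch of on/off device switching under defined conditions: a
# simple thermostat-style rule with hysteresis decides when the air conditioner
# is switched on or off. Thresholds are hypothetical.
def update_ac(cabin_temp_c: float, ac_on: bool, on_above: float = 25.0, off_below: float = 23.0) -> bool:
    """Return the new A/C state given the current cabin temperature."""
    if cabin_temp_c > on_above:
        return True                  # too warm: switch the air conditioner on
    if cabin_temp_c < off_below:
        return False                 # cool enough: switch it off to save energy
    return ac_on                     # within the hysteresis band: keep current state

state = False
for temp in (22.0, 24.0, 26.0, 24.5, 22.5):
    state = update_ac(temp, state)
    print(f"temp={temp:.1f} C -> A/C {'on' if state else 'off'}")
```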

[0090] In some embodiments, a local controller (e.g., window controller) is integrated with a control system. The local controller may be directly connected to the device (e.g., without a lower-hierarchy intervening controller between the local controller and the device). For example, the local controller can be configured to control one or more devices of the vehicle. For example, the window controller can be configured to control one or more tintable windows (e.g., electrochromic windows) of the vehicle. In one embodiment, the one or more electrochromic windows include at least one all solid state and inorganic electrochromic device, but may include more than one electrochromic device, e.g., where each lite or pane of an IGU is tintable. In one embodiment, the one or more electrochromic windows include only all solid state and inorganic electrochromic devices. In one embodiment, the electrochromic windows are multistate electrochromic windows. Examples of tintable windows can be found in U.S. patent application Ser. No. 12/851,514, filed on August 5, 2010, and titled "Multipane Electrochromic Windows," which is incorporated herein by reference in its entirety.

[0091] In some embodiments, one or more devices such as sensors, emitters, and/or actuators, are operatively coupled to at least one controller and/or processor. Sensor readings may be obtained by one or more processors and/or controllers. A controller may comprise a processing unit (e.g., CPU or GPU). A controller may receive an input (e.g., from at least one device or projected media). The controller may comprise circuitry, electrical wiring, optical wiring, socket, and/or outlet. A controller may receive an input and/or deliver an output. A controller may comprise multiple (e.g., sub-) controllers. An operation (e.g., as disclosed herein) may be performed by a single controller or by a plurality of controllers. At least two operations may each be performed by a different controller. At least two operations may be performed by the same controller. A device and/or media may be controlled by a single controller or by a plurality of controllers. At least two devices and/or media may be controlled by a different controller. At least two devices and/or media may be controlled by the same controller. The controller may be a part of a control system. The control system may comprise a master controller, and/or a local controller. The local controller may be a target controller. The control system may have two hierarchy layers (e.g., a master controller and local controllers). The control system may have three or more hierarchy layers (e.g., a master controller, compartment controllers, and local controllers).

For example, a vehicle that is a train may have a master controller of the train, a compartment controller for each train car, and local controllers that are directly coupled to one or more devices in a train car (e.g., railway wagon). For example, a vehicle that is a ship may have a master controller of the ship, a compartment controller for each story of the ship, and local controllers that are directly coupled to one or more devices in a story of the ship. For example, a vehicle that is an airplane may have a master controller of the airplane, a suite controller for each class suite (e.g., first class, business class, economy class), and local controllers that are directly coupled to one or more devices in a suite of the airplane.
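
A three-level hierarchy of this kind (master controller, compartment controllers, local controllers) can be sketched as follows, using a train as the example. The class and method names are hypothetical and are not drawn from the disclosure.

```python
# Illustrative sketch of a three-level control hierarchy: a master controller
# fans an instruction out to compartment controllers, each of which forwards it
# to the local controllers directly coupled to devices. Names are hypothetical.
class LocalController:
    def __init__(self, device: str):
        self.device = device

    def apply(self, command: str) -> None:
        print(f"{self.device}: {command}")

class CompartmentController:
    def __init__(self, name: str, locals_: list[LocalController]):
        self.name, self.locals = name, locals_

    def forward(self, command: str) -> None:
        for local in self.locals:          # fan out to directly coupled devices
            local.apply(command)

class MasterController:
    def __init__(self, compartments: list[CompartmentController]):
        self.compartments = compartments

    def issue(self, command: str) -> None:
        for compartment in self.compartments:
            compartment.forward(command)

car_1 = CompartmentController("car-1", [LocalController("window-1A"), LocalController("window-1B")])
car_2 = CompartmentController("car-2", [LocalController("window-2A")])
MasterController([car_1, car_2]).issue("set tint level 2")
```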

For example, the local controller may be a window controller (e.g., controlling an optically switchable window), enclosure controller, or component controller. The controller may be a part of a hierarchical control system. The hierarchical control system may comprise a main controller that directs one or more controllers, e.g., local controllers (e.g., window controllers), enclosure controllers, and/or component controllers. The target may comprise a device such as a media display. The device may comprise an electrochromic window, a sensor, an emitter, an antenna, a receiver, a transceiver, or an actuator.

[0092] In some embodiments, the network infrastructure is operatively coupled to one or more controllers. In some embodiments, a physical location of the controller type in the hierarchical control system changes. A controller may control one or more devices (e.g., be directly coupled to the devices). A controller may be disposed proximal to the one or more devices it is controlling. For example, a controller may control an optically switchable device (e.g., IGU), an antenna, a sensor, and/or an output device (e.g., a light source, sound source, smell source, gas source, HVAC outlet, a stereo system or speaker system, an olfactory compound dispenser, and/or heater). In one embodiment, an intermediate controller may direct one or more window controllers, one or more enclosure controllers, one or more component controllers, or any combination thereof. The compartment controller may comprise a network or intermediate controller. For example, the compartment (e.g., comprising network) controller may control a plurality of local (e.g., comprising window) controllers. A plurality of local controllers may be disposed in a portion of a facility (e.g., in a portion of a building). The portion of the facility may be a portion of a facility (e.g., vehicle), such as a suite of an airplane, a train car of a train, a region of an automobile, etc. For example, a compartment controller may be assigned to a portion of a vehicle, such as a left side of a car, a right side of a car, a top deck of a double-decker bus, a train car of a train, or the like. In some embodiments, a portion of a vehicle may comprise a plurality of compartment controllers, e.g., depending on the size and/or the number of local controllers coupled to the compartment controller. For example, a compartment controller may be assigned to a portion of a vehicle. For example, a compartment controller may be assigned to a portion of the local controllers disposed in the facility (e.g., vehicle). For example, a compartment controller may be assigned to a portion of the regions of a vehicle. A master controller may be coupled to one or more compartment controllers. The compartment controller may be disposed in the facility. A master controller may be disposed in the vehicle, or external to the vehicle. The master controller may be disposed in the cloud. A controller may be a part of, or be operatively coupled to, a control system. A controller may receive one or more inputs. A controller may generate one or more outputs. The controller may be a single input single output controller (SISO) or a multiple input multiple output controller (MIMO). A controller may interpret an input signal received. A controller may acquire data from the one or more components (e.g., sensors). Acquire may comprise receive or extract. The acquisition of data may comprise measurement, estimation, determination, generation, or any combination thereof. A controller may comprise feedback control. A controller may comprise feed-forward control. Control may comprise on-off control, proportional control, proportional-integral (PI) control, or proportional-integral-derivative (PID) control. Control may comprise open loop control or closed loop control. A controller may comprise closed loop control. A controller may comprise open loop control. A controller may comprise a user interface. A user interface may comprise (or be operatively coupled to) a keyboard, keypad, mouse, touch screen, microphone, speech recognition package, camera, imaging system, or any combination thereof.
Outputs may include a display (e.g., screen), speaker, or printer. In some embodiments, a local controller controls one or more devices and/or media (e.g., media for projection). For example, a local controller can control one or more IGUs, one or more sensors, one or more output devices (e.g., one or more emitters), one or more media displays, one or more media projectors, or any combination thereof.
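
As a minimal sketch of the closed-loop (e.g., PID) feedback control mentioned above, the following example regulates a single environmental characteristic (cabin temperature) against a setpoint. The gains, time step, and toy plant model are hypothetical.

```python
# Minimal sketch of discrete PID feedback control regulating cabin temperature.
# Gains, time step, and the toy plant model are purely illustrative.
def pid_step(setpoint, measurement, state, kp=2.0, ki=0.1, kd=0.5, dt=1.0):
    """One discrete PID update; `state` carries (integral, previous_error)."""
    integral, prev_error = state
    error = setpoint - measurement
    integral += error * dt
    derivative = (error - prev_error) / dt
    output = kp * error + ki * integral + kd * derivative
    return output, (integral, error)

temperature, state = 18.0, (0.0, 0.0)      # start at 18 C, no accumulated error
for _ in range(10):
    heater_power, state = pid_step(setpoint=22.0, measurement=temperature, state=state)
    temperature += 0.05 * heater_power      # toy plant: heating raises cabin temperature
    print(f"temperature={temperature:.2f} C, heater_power={heater_power:.2f}")
```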

[0093] In some embodiments, a control system includes a multipurpose controller. By incorporating feedback (e.g., of the controller), a control system may provide enhanced: (1) environmental control, (2) energy savings, (3) security, (4) flexibility in control options, (5) improved reliability and usable life of other systems (e.g., due to decreased reliance thereon and/or reduced maintenance thereof), (6) information availability and/or diagnostics, or (7) various combinations thereof. These enhancements may facilitate and/or assist automatically controlling any of the devices disclosed herein. In some embodiments, a control system may not be present in the facility. In some embodiments, a control system may be present in the facility while the master controller may be outside of the facility (e.g., located remotely such as in the cloud). In some embodiments, a control system may communicate with a portion of the levels in the hierarchy of controllers. For example, the control system may communicate (e.g., at a high level) with a master controller. In some embodiments, a control system may (e.g., temporarily) not communicate with a portion of the levels in the hierarchy of controllers of the control system. For example, the control system may (e.g., temporarily) not communicate with the local controller and/or intermediate controller. In certain embodiments, maintenance on the control system would not interrupt control of the devices of the facility that are communicatively coupled to the control system.

In some embodiments, the control system comprises at least one controller that may or may not be part of the hierarchical control system.

[0094] Fig. 1 shows an example of a control system architecture 100 disposed at least partly in an enclosure (e.g., vehicle) 150. Control system architecture 100 comprises a master controller 108 that controls network controllers 106, that in turn control local controllers 104.

In the example shown in Fig. 1, a master controller 108 is operatively coupled (e.g., wirelessly and/or wired) to a control system 124 and to a database 120. Arrows in Fig. 1 represent communication pathways. A controller may be operatively coupled (e.g., directly/indirectly and/or wired and/or wirelessly) to an external source 110. Master controller 108 may control network controllers, such as network controllers 106, that may in turn control local controllers such as window controllers 104. In some embodiments, the local controllers (e.g., 104) control one or more devices 102a-102j such as IGUs, one or more sensors, one or more output devices (e.g., one or more emitters), media displays, or any combination thereof. At least two of the devices 102a-102j may be of different types. At least two of the devices 102a-102j may be of the same type. The device may be any device disclosed herein. The external source may comprise a network. The external source may comprise one or more sensors or output devices. The external source may comprise a cloud-based application and/or database. The communication may be wired and/or wireless. The external source may be disposed external to the vehicle. For example, the external source may comprise one or more sensors and/or antennas disposed, e.g., on a wall, ceiling, or roof of a vehicle. The communication may be monodirectional or bidirectional. In the example shown in Fig. 1, all communication arrows are meant to be bidirectional (e.g., 118, 122, 114, and 112).

[0095] The methods, systems and/or the apparatus described herein may comprise a control system. The control system can be in communication with any of the apparatuses (e.g., sensors) described herein. The sensors may be of the same type or of different types, e.g., as described herein. For example, the control system may be in communication with the first sensor and/or with the second sensor. A plurality of devices (e.g., sensors and/or emitters) may be disposed in a container and may constitute an ensemble (e.g., device ensemble, and/or a digital architectural element). The ensemble may comprise at least two devices of the same type. The ensemble may comprise at least two devices of a different type. The devices in the ensemble may be operatively coupled to the same electrical board. The electrical board may comprise circuitry. The electrical board may comprise, or be operatively coupled to, a controller (e.g., a local controller). The control system may control the one or more devices (e.g., sensors). The control system may control one or more components of a control system (e.g., lighting, security, media display, and/or air conditioning system). The controller may regulate at least one (e.g., environmental) characteristic of the enclosure. The control system may regulate the enclosure environment using any component of the control system. For example, the control system may regulate the energy supplied by a heating element and/or by a cooling element. For example, the control system may regulate velocity of an air flowing through a vent to and/or from the enclosure. The control system may comprise a processor. The processor may be a processing unit. The controller may comprise a processing unit. The processing unit may be central. The processing unit may comprise a central processing unit (abbreviated herein as “CPU”). The processing unit may comprise a graphic processing unit (abbreviated herein as “GPU”). The controller(s) or control mechanisms (e.g., comprising a computer system) may be programmed to implement one or more methods of the disclosure. The processor may be programmed to implement methods of the disclosure (e.g., by executing a computer readable code). The controller may control at least one component of the forming systems and/or apparatuses disclosed herein. Examples of digital architectural elements, control systems, networks, and associated methods and computer readable media can be found in International Patent Applications Serial Nos. PCT/US18/29476, PCT/US19/30467,
PCT/US18/29460, PCT/US18/29406, and PCT/US20/70123 filed June 4, 2020, titled “SECURE BUILDING SERVICES NETWORK,” each of which is incorporated herein by reference in its entirety.

[0096] Fig. 2 shows a schematic example of a computer system 200 that is programmed or otherwise configured to perform one or more operations of any of the methods provided herein. The computer system can control (e.g., direct, monitor, and/or regulate) various features of the methods, apparatuses and systems of the present disclosure, such as, for example, controlling heating, cooling, lighting, and/or venting of an enclosure, or any combination thereof. The computer system can be part of, or be in communication with, any device ensemble disclosed herein (also referred to herein as “digital architectural element”). The computer may be coupled to one or more mechanisms disclosed herein, and/or any part of the mechanism(s). For example, the computer may be coupled to one or more sensors, valves, switches, lights, windows (e.g., IGUs), motors, pumps, optical components, or any combination thereof.

[0097] The computer system can include a processing unit (e.g., 206) (also “processor,” “computer,” and “computer processor” herein). The computer system may include memory or memory location (e.g., 202) (e.g., random-access memory, read-only memory, flash memory), electronic storage unit (e.g., 204) (e.g., hard disk), communication interface (e.g., 203) (e.g., network adapter) for communicating with one or more other systems, and peripheral devices (e.g., 205), such as cache, other memory, data storage and/or electronic display adapters. In the example shown in Fig. 2, the memory 202, storage unit 204, interface 203, and peripheral devices 205 are in communication with the processing unit 206 through a communication bus (solid lines), such as a motherboard. The storage unit can be a data storage unit (or data repository) for storing data. The computer system can be operatively coupled to a computer network (“network”) (e.g., 201) with the aid of the communication interface. The network can be the Internet, an internet and/or extranet, or an intranet and/or extranet that is in communication with the Internet. In some cases, the network is a telecommunication and/or data network. The network can include one or more computer servers, which can enable distributed computing, such as cloud computing. The network, in some cases with the aid of the computer system, can implement a peer-to-peer network, which may enable devices coupled to the computer system to behave as a client or a server.

[0098] The processing unit can execute a sequence of machine-readable instructions, which can be embodied in a program or software. The instructions may be stored in a memory location, such as the memory 202. The instructions can be directed to the processing unit, which can subsequently program or otherwise configure the processing unit to implement methods of the present disclosure. Examples of operations performed by the processing unit can include fetch, decode, execute, and write back. The processing unit may interpret and/or execute instructions. The processor may include a microprocessor, a data processor, a central processing unit (CPU), a graphical processing unit (GPU), a system-on-chip (SOC), a co-processor, a network processor, an application specific integrated circuit (ASIC), an application specific instruction-set processor (ASIP), a controller, a programmable logic device (PLD), a chipset, a field programmable gate array (FPGA), or any combination thereof. The processing unit can be part of a circuit, such as an integrated circuit. One or more other components of the system 200 can be included in the circuit.

[0099] The storage unit can store files, such as drivers, libraries and saved programs. The storage unit can store user data (e.g., user preferences and user programs). In some cases, the computer system can include one or more additional data storage units that are external to the computer system, such as located on a remote server that is in communication with the computer system through an intranet or the Internet.

[0100] The computer system can communicate with one or more remote computer systems through a network. For instance, the computer system can communicate with a remote computer system of a user (e.g., operator). Examples of remote computer systems include personal computers (e.g., portable PC), slate or tablet PCs (e.g., Apple® iPad, Samsung® Galaxy Tab), telephones, Smart phones (e.g., Apple® iPhone, Android-enabled device, Blackberry®), or personal digital assistants. A user (e.g., client) can access the computer system via the network.

[0101] Methods as described herein can be implemented by way of machine (e.g., computer processor) executable code stored on an electronic storage location of the computer system, such as, for example, on the memory 202 or electronic storage unit 204. The machine executable or machine-readable code can be provided in the form of software. During use, the processor 206 can execute the code. In some cases, the code can be retrieved from the storage unit and stored on the memory for ready access by the processor. In some situations, the electronic storage unit can be precluded, and machine-executable instructions are stored on memory.

[0102] The code can be pre-compiled and configured for use with a machine that has a processor adapted to execute the code, or the code can be compiled during runtime. The code can be supplied in a programming language that can be selected to enable the code to execute in a pre-compiled or as-compiled fashion. In some embodiments, the processor comprises a code. The code can be program instructions. The program instructions may cause the at least one processor (e.g., computer) to direct a feed forward and/or feedback control loop. In some embodiments, the program instructions cause the at least one processor to direct a closed loop and/or open loop control scheme. The control may be based at least in part on one or more sensor readings (e.g., sensor data). One controller may direct a plurality of operations. At least two operations may be directed by different controllers. In some embodiments, a different controller may direct each of at least two of operations (a), (b), and (c). In some embodiments, the same controller may direct at least two of operations (a), (b), and (c). In some embodiments, a non-transitory computer-readable medium causes a different computer to direct each of at least two of operations (a), (b), and (c).

In some embodiments, different non-transitory computer-readable mediums cause each a different computer to direct at least two of operations (a), (b) and (c). The controller and/or computer readable media may direct any of the apparatuses or components thereof disclosed herein. The controller and/or computer readable media may direct any operations of the methods disclosed herein. The controller may be operatively (communicatively) coupled to control logic (e.g., code embedded in a software) in which its operation(s) are embodied. [0103] In some embodiments, a distributed network of controllers can be used to control one or more optically switchable windows of a facility (e.g., vehicle). For example, a network system may be operable to control a plurality of IGUs. One primary function of the network system is controlling the optical states of electrochromic devices (ECDs) (or other optically switchable devices) within the IGUs. In some implementations, one or more windows can be multi-zoned windows, for example, where each window includes two or more independently controllable ECDs or zones. In some embodiments, the network system 300 (of Fig. 3) is operable to control the electrical characteristics of the power signals provided to the IGUs. For example, the network system can generate and communicate tinting instructions (also referred to herein as “tint commands”) which control voltages applied to the ECDs within the IGUs.
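
By way of a non-limiting illustration, the sketch below shows one way a sensor-driven (closed-loop) control cycle of the kind described above could be expressed in Python: a routine periodically reads a photosensor and issues a tint command when the reading crosses illustrative thresholds. The names and thresholds (e.g., read_photosensor, send_tint_command) are hypothetical placeholders and do not correspond to any interface defined in this disclosure.

```python
# Minimal sketch of a sensor-driven (closed-loop) tint control cycle.
# All names here (read_photosensor, send_tint_command, control_loop) are
# hypothetical placeholders, not interfaces defined by this disclosure.

import random
import time


def read_photosensor() -> float:
    """Stand-in for an exterior photosensor reading, in lux."""
    return random.uniform(0.0, 100_000.0)


def send_tint_command(window_id: str, tint_value: int) -> None:
    """Stand-in for transmitting a tint command toward a window controller."""
    print(f"window {window_id}: commanded tint value {tint_value}")


def select_tint(lux: float) -> int:
    """Map an exterior light level to one of four discrete tint values."""
    if lux > 50_000:
        return 3  # darkest
    if lux > 20_000:
        return 2
    if lux > 5_000:
        return 1
    return 0  # clearest


def control_loop(window_id: str, cycles: int = 3, period_s: float = 0.1) -> None:
    """Feedback loop: read the sensor, decide, and act only on a change."""
    last_tint = None
    for _ in range(cycles):
        lux = read_photosensor()
        tint = select_tint(lux)
        if tint != last_tint:
            send_tint_command(window_id, tint)
            last_tint = tint
        time.sleep(period_s)


if __name__ == "__main__":
    control_loop("front-left")
```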

[0104] In some embodiments, another function of the network system is to acquire status information from the IGUs (hereinafter “information” is used interchangeably with “data”). For example, the status information for a given IGU can include an identification of, or information about, a current tint state of the ECD(s) within the IGU. The network system also can be operable to acquire data from various sensors, such as temperature sensors, photosensors (also referred to herein as light sensors), humidity sensors, air flow sensors, occupancy sensors, gas sensors (e.g., carbon dioxide sensors, oxygen sensors, or the like), motion sensors, geolocation sensors, or any combination thereof, whether integrated on or within the IGUs or located at various other positions in, on or around the facility (e.g., vehicle).

[0105] The network system can include any suitable number of distributed controllers having various capabilities or functions. In some implementations, the functions and arrangements of the various controllers are defined hierarchically. For example, the network system can include a plurality of distributed window controllers (WCs), a plurality of network controllers (NCs), and a master controller (MC). In some implementations, the MC can communicate with and control tens or hundreds of NCs. In various implementations, the MC issues high level instructions to the NCs over one or more wired and/or wireless links. The instructions can include, for example, tint commands for causing transitions in the optical states of the IGUs controlled by the respective NCs. Each NC can, in turn, communicate with and control a number of WCs over one or more wired and/or wireless links. For example, each NC can control tens or hundreds of the WCs. Each WC can, in turn, communicate with, drive or otherwise control one or more respective IGUs over one or more wired and/or wireless links.

[0106] In some embodiments, the MC issues communications including tint commands, status request commands, data (for example, sensor data) request commands or other instructions. The MC 308 may issue such communications periodically, at certain predefined times of day (which may change based at least in part on the day of week or year), or based at least in part on the detection of particular events, conditions or combinations of events or conditions (for example, as determined by acquired sensor data or based at least in part on the receipt of a request initiated by a user or by an application or a combination of such sensor data and such a request). In some embodiments, when the MC determines to cause a tint state change in a set of one or more IGUs, the MC generates or selects a tint value corresponding to the desired tint state. In some embodiments, the set of IGUs is associated with a first protocol identifier (ID) (for example, a BACnet ID). The MC then generates and transmits a communication — referred to herein as a “primary tint command” — including the tint value and the first protocol ID over the link via a first communication protocol (for example, a BACnet compatible protocol). The MC may address the primary tint command to the particular NC that controls the particular one or more WCs that, in turn, control the set of IGUs to be transitioned.

[0107] In some embodiments, the NC receives the primary tint command including the tint value and the first protocol ID and maps the first protocol ID to one or more second protocol IDs. Each of the second protocol IDs may identify a corresponding one of the WCs. The NC may subsequently transmit a secondary tint command including the tint value to each of the identified WCs over the link via a second communication protocol. For example, each of the WCs that receives the secondary tint command can then select a voltage or current profile from an internal memory based at least in part on the tint value to drive its respectively connected IGUs to a tint state consistent with the tint value. Each of the WCs may then generate and provide voltage or current signals over the link to its respectively connected IGUs to apply the voltage or current profile, for example.
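
As a non-limiting sketch of the primary-to-secondary tint command fan-out just described, the following Python example maps a first protocol ID to the local controllers it names and copies the tint value into one secondary command per controller. The dataclasses, ID values, and lookup table are illustrative assumptions; an actual system would use BACnet- and CAN-specific encodings not shown here.

```python
# Sketch of the primary-to-secondary tint command fan-out described above.
# The dataclasses, the ID values, and the lookup table are illustrative
# placeholders; a real system would use BACnet- and CAN-specific encodings.

from dataclasses import dataclass
from typing import List


@dataclass
class PrimaryTintCommand:
    first_protocol_id: str  # e.g., an ID naming a group of WCs
    tint_value: int


@dataclass
class SecondaryTintCommand:
    second_protocol_id: int  # e.g., an ID naming one WC
    tint_value: int


# Hypothetical lookup table held by the NC: first protocol ID -> WC IDs.
ID_MAP = {
    "group-left": [0x101, 0x102],
    "group-right": [0x201],
}


def fan_out(cmd: PrimaryTintCommand) -> List[SecondaryTintCommand]:
    """Map the first protocol ID to its WCs and copy the tint value to each."""
    return [
        SecondaryTintCommand(second_protocol_id=wc_id, tint_value=cmd.tint_value)
        for wc_id in ID_MAP.get(cmd.first_protocol_id, [])
    ]


if __name__ == "__main__":
    for sec in fan_out(PrimaryTintCommand("group-left", tint_value=3)):
        print(f"secondary command -> WC 0x{sec.second_protocol_id:X}, tint {sec.tint_value}")
```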

[0108] In some embodiments, the various targets (devices, e.g., sensors, emitters, IGUs, displays, and any other devices disclosed herein) are (e.g., advantageously) grouped into zones of targets (e.g., of EC windows). At least one zone (e.g., each of which zones) can include a subset of the targets (e.g., IGUs). For example, at least one (e.g., each) zone of targets (e.g., IGUs) may be controlled by one or more respective intermediate controllers (e.g., NCs) and one or more respective local controllers (e.g., WCs) controlled by these intermediate controllers (e.g., NCs). In some examples, at least one (e.g., each) zone can be controlled by a single intermediate controller (e.g., NC) and two or more local controllers (e.g., WCs) controlled by the single intermediate controller (e.g., NC). For example, a zone can represent a logical grouping of the targets (e.g., IGUs). Each zone may correspond to a set of targets (e.g., IGUs) in a specific location or area of the vehicle that are driven together based at least in part on their location. For example, a vehicle may have four faces or sides (e.g., a front side, a rear side, a left side, and a right side). In such a didactic example, each zone may correspond to the set of electrochromic windows on a particular side. For example, a left-side zone may include tintable (e.g., electrochromic) windows corresponding to a front left-side window and a back left-side window. In some embodiments, a zone may correspond to interior tintable windows, such as one or more tintable windows between rows of seats in the vehicle. At least one (e.g., each) zone may correspond to a set of targets (e.g., IGUs) that share one or more physical characteristics (for example, device parameters such as size or age). In some embodiments, a zone of targets (e.g., IGUs) is grouped based at least in part on one or more non-physical characteristics such as, for example, whether the targets (e.g., IGUs) are adjacent to a driver or operator of the vehicle, whether the targets (e.g., IGUs) are between an interior and an exterior of the vehicle, or the like. In some embodiments, a zone may correspond to a group of interior windows, such as interior windows between particular rows of a vehicle or the like. In one example, a zone corresponds to interior windows between a front row and a rear (i.e., passenger) row of a car. In another example, a zone corresponds to interior windows in an airplane between two rows of passenger seats. In yet another example, a zone corresponds to interior windows between particular rows (e.g., rows 20-25, rows 1-5, or the like) on a particular side of an airplane (e.g., the left side, the right side, etc.).
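
For illustration only, a zone grouping such as the one described above could be represented as a simple table keyed by zone name, as in the following Python sketch. The zone names and window identifiers are hypothetical examples and are not identifiers defined by this disclosure.

```python
# Illustrative zone table grouping a vehicle's tintable windows into logical
# zones, as described above. The zone names and window identifiers are
# hypothetical examples only.

from typing import Dict, List

ZONES: Dict[str, List[str]] = {
    "left": ["front-left-window", "rear-left-window"],
    "right": ["front-right-window", "rear-right-window"],
    "front": ["windshield"],
    "rear": ["rear-window"],
    "interior": ["partition-row1-row2"],  # interior window between seat rows
}


def windows_in_zone(zone: str) -> List[str]:
    """Return the windows that are driven together for the given zone."""
    return ZONES.get(zone, [])


if __name__ == "__main__":
    print(windows_in_zone("left"))  # both left-side windows transition together
```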

[0109] In some embodiments, at least one (e.g., each) intermediate controller (e.g., NC) is able to address all of the targets (e.g., IGUs) in at least one (e.g., each) of one or more respective zones. For example, the MC can issue a primary tint command to the intermediate controller (e.g., NC) that controls a target zone. The primary tint command can include an (e.g., abstract) identification of the target zone (hereinafter also referred to as a “zone ID”). For example, the zone ID can be a first protocol ID such as that just described in the example above. In such cases, the intermediate controller (e.g., NC) receives the primary tint command including the tint value and the zone ID and maps the zone ID to the second protocol IDs associated with the local controllers (e.g., WCs) within the zone. In some embodiments, the zone ID is a higher level abstraction than the first protocol IDs. In such cases, the intermediate controller (e.g., NC) can first map the zone ID to one or more first protocol IDs, and subsequently map the first protocol IDs to the second protocol IDs.

[0110] In some embodiments, the MC is coupled to one or more outward-facing networks via one or more wired and/or wireless links. For example, the MC can communicate acquired status information or sensor data to remote computers, mobile devices, servers, databases in or accessible by the outward-facing network. In some embodiments, various applications, including third party applications or cloud-based applications, executing within such remote devices are able to access data from or provide data to the MC. In some embodiments, authorized users or applications communicate requests to modify the tint states of various IGUs to the MC via the network. For example, the MC can first determine whether to grant the request (for example, based at least in part on power considerations or based at least in part on whether the user has the appropriate authorization) prior to issuing a tint command. The MC may then calculate, determine, select or otherwise generate a tint value and transmit the tint value in a primary tint command to cause the tint state transitions in the associated IGUs.

[0111] In some embodiments, a user submits such a request from a computing device, such as a desktop computer, laptop computer, tablet computer or mobile device (for example, a smartphone). The user’s computing device may execute a client-side application that is capable of communicating with the MC, and in some examples, with a master controller application executing within the MC. In some embodiments, the client-side application may communicate with a separate application, in the same or a different physical device or system as the MC, which then communicates with the master controller application to effect the desired tint state modifications. For example, the master controller application or other separate application can be used to authenticate the user to authorize requests submitted by the user. The user may select a target to be manipulated (e.g., the IGUs to be tinted), and directly or indirectly inform the MC of the selections, e.g., by entering an enclosure ID (e.g., vehicle identifier) via the client-side application.

[0112] In some embodiments, a mobile circuitry of a user (e.g., mobile electronic device or other computing device) can communicate, e.g., wirelessly with various local controllers (e.g., WCs). For example, a client-side application executing within a mobile circuitry of a user (e.g., mobile device) can transmit wireless communications including control signals related to a target to the local controller to control the target, which target is communicatively coupled to the local controller (e.g., via the network). For example, a user may initiate directing tint state control signals to a WC to control the tint states of the respective IGUs connected to the WC. For example, the user can use the client-side application to control (e.g., maintain or modify) the data sampling rate of a particular sensor. For example, a user may initiate directing light intensity change control signals to a local controller to control the light of a lighting device (e.g., an interior light or other lighting device) communicatively coupled to the local controller. For example, a user may initiate directing media projection change control signals to a local controller to control the media displayed by a media device coupled to the local controller. The wireless communications can be generated, formatted and/or transmitted using various wireless network topologies and protocols, for example.

[0113] In some embodiments, the control signals sent to the local controller (e.g., WC) from a mobile circuitry (e.g., device) of a user (or other computing device) override a previously sent signal (e.g., a tint value previously received by the WC from the respective NC). The previously sent signal may be automatically generated, e.g., by the control system. In other words, the local controller (e.g., WC) may provide the applied voltages to the target (e.g., IGUs) based at least in part on the control signals from the mobile circuitry of the user (e.g., user’s computing device), e.g., rather than based at least in part on the predetermined signal (e.g., the tint value). For example, a control algorithm or rule set stored in and executed by the local controller (e.g., WC) may dictate that one or more control signals from a mobile device of a user (e.g., an authorized user’s computing device) take precedence over a respective signal received from the control system (e.g., a tint value received from the NC). In some embodiments, such as in high demand cases, control signals (such as a tint value from the NC) take precedence over any control signals received by the local controller (e.g., WC) from a mobile circuitry of a user (e.g., a user’s computing device). A control algorithm or rule set may dictate that control signal (e.g., relating to tint) overrides from only certain users (or groups or classes of users) may take precedence based at least in part on permissions granted to such users. In some instances, other factors including time of day or the location of the target (e.g., IGUs) may influence the permission to override a predetermined signal of the control system.
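
One hedged, non-limiting way to express such a precedence rule set is sketched below: the local controller decides whether a user-requested value overrides the automatically generated value based on the requester's role and a demand flag. The role names, the "high demand" condition, and the function itself are assumptions for illustration; this disclosure does not fix any particular rule set.

```python
# Sketch of an override precedence rule set for a local controller (WC).
# The role names, the "high demand" flag, and the function are assumptions
# for illustration; the disclosure does not fix a specific rule set.

from typing import Optional

AUTHORIZED_ROLES = {"driver", "owner"}


def resolve_tint(auto_tint: int,
                 user_tint: Optional[int],
                 user_role: str,
                 high_demand: bool) -> int:
    """Decide which tint value the local controller should apply."""
    if user_tint is None:
        return auto_tint  # nothing to override
    if high_demand:
        return auto_tint  # control-system value wins in high demand cases
    if user_role in AUTHORIZED_ROLES:
        return user_tint  # authorized user's request takes precedence
    return auto_tint      # request from an unauthorized user is ignored


if __name__ == "__main__":
    print(resolve_tint(auto_tint=2, user_tint=0, user_role="driver", high_demand=False))     # 0
    print(resolve_tint(auto_tint=2, user_tint=0, user_role="passenger", high_demand=False))  # 2
```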

[0114] In some embodiments, based at least in part on the receipt of a control signal from a mobile circuitry of a user (e.g., an authorized user’s computing device), the MC uses information about a combination of known parameters to calculate, determine, select and/or otherwise generate a command signal (e.g., relating to a tint value) that provides (e.g., lighting) conditions requested by a (e.g., typical) user, e.g., while in some instances also using power efficiently. For example, the MC may determine a state of a target based at least in part on preset preferences defined by or for the particular user that requested the target status change via the mobile circuitry (e.g., via the computing device). For example, the MC may determine the tint value based at least in part on preset preferences defined by or for the particular user that requested the tint state change via the computing device. For example, the user may be required to enter a password or otherwise log in or obtain authorization to request a change in a state of a target (e.g., tint state change). The MC may determine the identity of the user based at least in part on a password, a security token and/or an identifier of the particular mobile circuitry (e.g., mobile device or other computing device). In some embodiments, the MC may determine the identity of the user based on facial recognition, fingerprint recognition, and/or other biometric signatures. In some embodiments, biometric signatures may be determined automatically (e.g., without explicit user input) for example by a sensor disposed on and/or in the vehicle (e.g., on and/or near a steering wheel of the vehicle, on and/or near a dashboard of the vehicle, etc.). Example sensors that may be used for collecting data used to obtain biometric signatures for determining an identity of an occupant of a vehicle include infrared sensors, cameras, motion sensors, or the like. After determining the identity of the user, the MC may then retrieve preset preferences for the user, and use the preset preferences alone or in combination with other parameters (such as power considerations and/or information from various sensors) to generate and transmit a status change of the target (e.g., tint value for use in tinting the respective IGUs, a level of airflow of a particular vent, etc.).
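
For illustration only, identification of the requesting occupant and lookup of preset preferences could be sketched as follows in Python. The preference store, device identifiers, and the way a biometric match is represented are hypothetical placeholders, not structures defined by this disclosure.

```python
# Sketch of identifying a requesting occupant and applying preset preferences,
# as described above. The preference store, device identifiers, and the way a
# biometric match is represented are hypothetical placeholders.

from typing import Dict, Optional

PRESET_PREFERENCES: Dict[str, Dict] = {
    "alice": {"tint_value": 1, "vent_airflow": "medium"},
    "bob": {"tint_value": 3, "vent_airflow": "low"},
}

DEVICE_OWNERS = {"device-1234": "alice", "device-5678": "bob"}


def identify_user(device_id: Optional[str] = None,
                  biometric_match: Optional[str] = None) -> Optional[str]:
    """Resolve a user identity from a biometric match or a device identifier."""
    if biometric_match in PRESET_PREFERENCES:
        return biometric_match
    return DEVICE_OWNERS.get(device_id)


def target_state_for(user: Optional[str]) -> Dict:
    """Return the requested device states, falling back to neutral defaults."""
    default = {"tint_value": 2, "vent_airflow": "medium"}
    return PRESET_PREFERENCES.get(user, default)


if __name__ == "__main__":
    user = identify_user(device_id="device-1234")
    print(user, target_state_for(user))
```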

[0115] In some embodiments, the network system includes wall switches, dimmers, or other (e.g., tint-state) controlling devices. A wall switch generally refers to an electromechanical interface connected to a local controller (e.g., WC). The wall switch can convey a target status change (e.g., tint) command to the local controller (e.g., WC), which can then convey the target status change (e.g., tint) command to an upper level controller such as an intermediate controller (e.g., NC). Such control devices can be collectively referred to as “wall devices,” although such devices need not be limited to wall-mounted implementations (for example, such devices also can be located on a ceiling of a vehicle, a dashboard of a vehicle, an inside of a door of a vehicle, or the like). For example, such a switch may be used in controlling the state of a target (e.g., tint states of the adjoining IGUs). For example, the IGUs in a particular region of a vehicle can be grouped into a zone (e.g., a left-side zone, a right-side zone, a zone corresponding to a particular row of seats, or the like). Each of the switches can be operated by an end user (for example, an occupant of the vehicle) to control the state of grouped targets (e.g., to control tint state or other functions or parameters of the IGUs associated with the zone). For example, at certain times of the day, the adjoining IGUs may be tinted to a dark state to reduce the amount of light energy entering the vehicle from the outside (for example, to reduce a need for AC for cooling). For example, at certain times of the day, the adjoining heaters may be turned on to a warmer temperature to facilitate occupant comfort. In some embodiments, a user can operate the wall device to communicate one or more control signals to cause a (e.g., tint state) transition from one state of a target to another state (e.g., from the dark state to a lighter tint state of an IGU).

[0116] In some embodiments, each wall device includes one or more switches, buttons, dimmers, dials, or other physical user interface controls enabling the user to select a particular tint state or to increase or decrease a current tinting level of the IGUs associated with the wall device. The wall device may include a display having a touchscreen interface enabling the user to select a particular tint state (for example, by selecting a virtual button, selecting from a dropdown menu or by entering a tint level or tinting percentage) or to modify the tint state (for example, by selecting a “darken” virtual button, a “lighten” virtual button, or by turning a virtual dial or sliding a virtual bar). In some embodiments, the wall device includes a docking interface enabling a user to physically and communicatively dock a mobile circuitry (e.g., portable device such as a smartphone, multimedia device, remote controller, virtual reality device, tablet computer, or other portable computing device (for example, an IPHONE, IPOD or IPAD produced by Apple, Inc. of Cupertino, CA)). The mobile circuitry may be embedded in a vehicle (e.g., car, motorcycle, drone, airplane). The mobile circuitry may be embedded in a robot. A circuitry may be embedded in (e.g., be part of) a virtual assistant AI technology or a speaker (e.g., a smart speaker such as Google Nest or Amazon Echo Dot). Coupling of the mobile circuitry to the network may be initiated by a user’s presence in the enclosure, or by a user’s coupling (e.g., whether remote or local) to the network. Coupling of the user to the network may be secure (e.g., having one or more security layers, and/or requiring one or more security tokens (e.g., keys)). The presence of the user in the enclosure may be sensed (e.g., automatically) by using the sensor(s) that are coupled to the network. The minimum distance from the sensor at which the user is coupled to the network may be predetermined and/or adjusted. A user may override its coupling to the network. The user may be a driver, owner, occupant, etc. of the vehicle. The user may be the user of the mobile circuitry. The ability to couple the mobile circuitry to the network may or may not be overridden by the user. The ability to alter the minimum coupling distance between the mobile circuitry and the network may or may not be overridden by the user. There may be a hierarchy of overriding permissions. The hierarchy may depend on the type of user and/or type of mobile circuitry. For example, an employee may be permitted to allow or prevent coupling of her/his personal cellular phone and/or car to the network. For example, a visitor (e.g., passenger of the vehicle) may be prevented from having the visitor’s mobile circuitry connected to the network. The coupling to the network may be automatic and seamless (e.g., after the initial preferences have been set). Seamless coupling may be without requiring input from the user.

[0117] In such an example, the user can control the tinting levels via input to the mobile circuitry (e.g., portable device), which is then received by the wall device through the docking interface and subsequently communicated to the control system (e.g., to the MC, NC, or WC). The mobile circuitry (e.g., portable device) may include an application for communicating with an API presented by the wall device.

[0118] In some embodiments, the wall device can transmit a request for a status change of a target (e.g., a tint state change) to the control system (e.g., to the MC). The control system (e.g., MC) might first determine whether to grant the request (for example, based at least in part on power considerations and/or based at least in part on whether the user has the appropriate authorizations or permissions). The control system (e.g., MC) could calculate, determine, select, and/or otherwise generate a status change (e.g., tint) value and transmit the status change (e.g., tint) value in a primary status change (e.g., tint) command to cause the target to change (e.g., cause the tint state transitions in the adjoining IGUs). For example, each wall device may be connected with the control system (e.g., the MC therein) via one or more wired links (for example, over communication lines such as CAN or Ethernet compliant lines and/or over power lines using power line communication techniques). For example, each wall device could be connected with the control system (e.g., the MC therein) via one or more wireless links. The wall device may be connected (via one or more wired and/or wireless connections) with an outward-facing network, which may communicate with the control system (e.g., the MC therein) via the link.

[0119] In some embodiments, the control system identifies the target (e.g., target device) associated with the wall device based at least in part on previously programmed or discovered information associating the wall device with the target. For example, the MC identifies the IGUs associated with the wall device based at least in part on previously programmed or discovered information associating the wall device with the IGUs. A control algorithm or rule set can be stored in and executed by the control system (e.g., the MC therein) to dictate that one or more control signals from a wall device take precedence over a tint value previously generated by the control system (e.g., the MC therein), for example. At certain times, a control algorithm or rule set stored in and executed by the control system (e.g., the MC therein) may be used to dictate that the tint value previously generated by the control system (e.g., the MC therein) takes precedence over any control signals received from a wall device. For example, the control algorithm or rule set may dictate that previously generated tint values are to override user specified tint values in instances in which the user specified tint values may cause a safety issue. For example, user specified tint values may be overridden in favor of tint values determined by the control system and/or algorithm in an instance in which a tint state darker than is allowed by safety regulations is requested for one or more windows of the vehicle and at a time in which the vehicle is determined by the control system to be in motion.
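
As a minimal sketch of the safety override just described, the following Python function clamps a user-specified tint value to a limit whenever the vehicle is determined to be in motion. The limit is an assumed illustrative value, not a regulatory figure, and the function name is hypothetical.

```python
# Sketch of a safety rule that overrides a user-specified tint value while the
# vehicle is in motion, as in the example above. The maximum tint permitted in
# motion is an assumed illustrative limit, not a regulatory value.

MAX_TINT_WHILE_MOVING = 2  # hypothetical darkest tint value permitted in motion


def apply_safety_rules(requested_tint: int, vehicle_in_motion: bool) -> int:
    """Clamp the requested tint value to the safety limit when moving."""
    if vehicle_in_motion and requested_tint > MAX_TINT_WHILE_MOVING:
        return MAX_TINT_WHILE_MOVING
    return requested_tint


if __name__ == "__main__":
    print(apply_safety_rules(requested_tint=3, vehicle_in_motion=True))   # 2
    print(apply_safety_rules(requested_tint=3, vehicle_in_motion=False))  # 3
```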

[0120] In some embodiments, based at least in part on the receipt of a request or control signal to change a state of a target (e.g., tint-state-change request or control signal) from a wall device, the control system (e.g., the MC therein) uses information about a combination of known parameters to generate a state change (e.g., tint) value that provides lighting conditions desirable for a typical user. Accordingly, the control system (e.g., the MC therein) may use power more efficiently. In some embodiments, the control system (e.g., the MC therein) can generate the state change (e.g., tint) value based at least in part on preset preferences defined by or for the particular user that requested the (e.g., tint) state change of the target via the wall device. For example, the user may be required to enter a password into the wall device or to use a security token or security fob such as the IBUTTON or other 1-Wire device to gain access to the wall device. The control system (e.g., the MC therein) may then determine the identity of the user, based at least in part on the password, biometric signals, security token and/or security fob. The control system (e.g., the MC therein) may retrieve preset preferences for the user. The control system (e.g., the MC therein) may use the preset preferences alone or in combination with other parameters (such as power considerations or information from various sensors, historical data, whether the vehicle is currently in motion, and/or user preference) to calculate, determine, select and/or otherwise generate a tint value for the respective IGUs.

[0121] In some embodiments, the wall device transmits a tint state change request to the appropriate control system (e.g., to the NC therein). A lower level of the control system (e.g., to the NC therein) may communicate the request, or a communication based at least in part on the request, to a higher level of the control system (e.g., to the MC). For example, each wall device can be connected with a corresponding NC via one or more wired links. In some embodiments, the wall device transmits a request to the appropriate NC, which then itself determines whether to override a primary tint command previously received from the MC or a primary or secondary tint command previously generated by the NC. As described below, the NC may generate tint commands without first receiving a tint command from an MC. In some embodiments, the wall device communicates requests or control signals directly to the WC that controls the adjoining IGUs. For example, each wall device can be connected with a corresponding WC via one or more wired links such as those just described for the MC or via a wireless link.

[0122] In some embodiments, the NC or the MC determines whether the control signals from the wall device should take priority over a tint value previously generated by the NC or the MC. As described above, the wall device is able to communicate directly with the NC. However, in some examples, the wall device can communicate requests directly to the MC or directly to a WC, which then communicates the request to the NC. In some embodiments, the wall device is able to communicate requests to a customer-facing network, which then passes the requests (or requests based therefrom) to the NC either directly or indirectly by way of the MC. For example, a control algorithm or rule set stored in and executed by the NC or the MC can dictate that one or more control signals from a wall device take precedence over a tint value previously generated by the NC or the MC. In some embodiments (e.g., such as in times in which a user-specified control signal would violate a safety standard), a control algorithm or rule set stored in and executed by the NC or the MC dictates that the tint value previously generated by the NC or the MC takes precedence over any control signals received from a wall device.

[0123] In some embodiments, based at least in part on the receipt of a tint-state-change request or control signal from a wall device, the NC can use information about a combination of known parameters to generate a tint value that provides lighting conditions desirable for a typical user. In some embodiments, the NC or the MC generates the tint value based at least in part on preset preferences defined by or for the particular user that requested the tint state change via the wall device. For example, the user may be required to enter a password into the wall device or to use a security token or security fob such as the IBUTTON or other 1-Wire device to gain access to the wall device. In this example, the NC can communicate with the MC to determine the identity of the user, or the MC can alone determine the identity of the user, based at least in part on the password, biometric signatures, security token or security fob. The MC may then retrieve preset preferences for the user, and use the preset preferences alone or in combination with other parameters (such as power considerations or information from various sensors) to calculate, determine, select, or otherwise generate a tint value for the respective IGUs.

[0124] In some embodiments, the control system (e.g., the MC therein) is coupled to an external database (or “data store” or “data warehouse”). The database can be a local database coupled with the control system (e.g., the MC therein) via a wired hardware link, for example. In some embodiments, the database is a remote database or a cloud-based database accessible by the control system (e.g., the MC therein) via an internal private network or over the outward-facing network. Other computing devices, systems, or servers also can have access to read the data stored in the database, for example, over the outward-facing network. One or more control applications or third party applications could also have access to read the data stored in the database via the outward-facing network. In some embodiments, the control system (e.g., the MC therein) stores in the database a record of all tint commands including the corresponding tint values issued by the control system (e.g., the MC therein). The control system (e.g., the MC therein) may also collect status and sensor data and store it in the database (which may constitute historical data).

The local controllers (e.g., WCs) may collect the sensor data and/or status data from the enclosure and/or from other devices (e.g., IGUs) or media disposed in the enclosure, and communicate the sensor data and/or status data to the respective higher level controller (e.g., NCs) over the communication link. The data may move up the control chain, e.g., to the MC. For example, the controllers (e.g., NCs or the MC) may themselves be communicatively coupled (e.g., connected) to various sensors (such as light, temperature, occupancy, air flow, gas concentration, geolocation, or other sensors) within the vehicle, as well as (e.g., light, temperature, and/or geolocation) sensors positioned on, around, or otherwise external to the vehicle (for example, on a roof of the vehicle, an outside frame of the vehicle, or the like). In some embodiments, the control system (e.g., the NCs or the WCs) may also transmit status and/or sensor data (e.g., directly) to the database for storage.

[0125] In some embodiments, the network system is suited for integration with a smart thermostat service, alert service (for example, fire detection), security service and/or other appliance automation service. One example of a home automation service is NEST®, made by Nest Labs of Palo Alto, California (NEST® is a registered trademark of Google, Inc. of Mountain View, California). As used herein, references to a vehicle device control system can in some implementations also encompass, or be replaced with, such other automation services.

[0126] In some embodiments, the control system (e.g., the MC therein) and a separate automation service, such as a vehicle device control system, can communicate via an application programming interface (API). For example, the API can execute in conjunction with a (e.g., master) controller application (or platform) within the controller (e.g., MC), and/or in conjunction with a vehicle management application (or platform) within the vehicle device control system. The controller (e.g., MC) and the vehicle device control system can communicate over one or more wired links and/or via the outward-facing network. For example, the vehicle device control system may communicate instructions for controlling the IGUs to the controller (e.g., MC), which then generates and transmits primary status (e.g., tint) commands of the target to the appropriate lower level controller(s) (e.g., to the NCs). The lower hierarchical level controllers (e.g., the NCs or the WCs) could communicate directly with the vehicle device control system (e.g., through a wired/hardware link and/or wirelessly through a wireless data link). In some embodiments, the vehicle device control system also receives data, such as sensor data, status data, and associated timestamp data, collected by one or more of the controllers in the control system (e.g., by the MC, the NCs, and/or the WCs). For example, the controller (e.g., MC) can publish such data over the network. In some embodiments in which such data is stored in a database, the vehicle device control system can have access to some or all of the data stored in the database.

[0127] In some embodiments, the controller (e.g., “the MC”) collectively refers to any suitable combination of hardware, firmware and software for implementing the functions, operations, processes, or capabilities described. For example, the MC can refer to a computer that implements a master controller application (also referred to herein as a “program” or a “task”). For example, the controller (e.g., MC) may include one or more processors. The processor(s) can be or can include a central processing unit (CPU), such as a single core or a multi-core processor. The processor can additionally include a digital signal processor (DSP) or a network processor in some examples. The processor could also include one or more application-specific integrated circuits (ASICs). The processor is coupled with a primary memory, a secondary memory, an inward-facing network interface, and an outward-facing network interface. The primary memory can include one or more high-speed memory devices such as, for example, one or more random-access memory (RAM) devices including dynamic-RAM (DRAM) devices. Such DRAM devices can include, for example, synchronous DRAM (SDRAM) devices and double data rate SDRAM (DDR SDRAM) devices (including DDR2 SDRAM, DDR3 SDRAM, and DDR4 SDRAM), thyristor RAM (T-RAM), and zero-capacitor (Z-RAM®), among other suitable memory devices.

[0128] In some embodiments, the secondary memory can include one or more hard disk drives (HDDs) or one or more solid-state drives (SSDs). In some embodiments, the memory can store processor-executable code (or “programming instructions”) for implementing a multi-tasking operating system such as, for example, an operating system based at least in part on a Linux® kernel. The operating system can be a UNIX®- or Unix-like-based operating system, a Microsoft Windows®-based operating system, or another suitable operating system. The memory may also store code executable by the processor to implement the master controller application described above, as well as code for implementing other applications or programs. The memory may also store status information, sensor data, or other data collected from network controllers, window controllers and various sensors.

[0129] In some embodiments, the controller (e.g., MC) is a “headless” system; that is, a computer that does not include a display monitor or other user input device. For example, an administrator or other authorized user can log in to or otherwise access the controller (e.g., MC) from a remote computer or mobile computing device over a network to access and retrieve information stored in the controller (e.g., MC), to write or otherwise store data in the controller (e.g., MC), and/or to control various functions, operations, processes and/or parameters implemented or used by the controller (e.g., MC). In other embodiments, the controller (e.g., MC) can include a display monitor and a direct user input device (for example, a mouse, a keyboard and/or a touchscreen).

[0130] In some embodiments, the inward-facing network interface enables one controller (e.g., MC) of the control system to communicate with various distributed controllers and/or various targets (e.g., sensors). The inward-facing network interface can collectively refer to one or more wired network interfaces and/or one or more wireless network interfaces (including one or more radio transceivers). For example, the inward-facing network interface can enable communication with downstream controllers (e.g., NCs) over the link. Downstream may refer to a lower level of control in the control hierarchy.

[0131] In some embodiments, the outward-facing network interface enables the controller (e.g., MC) to communicate with various computers, mobile circuitry (e.g., mobile devices), servers, databases, and/or cloud-based database systems, over one or more networks. The outward-facing network interface can collectively refer to one or more wired network interfaces and/or one or more wireless network interfaces (including one or more radio transceivers). In some embodiments, the various applications, including third party applications and/or cloud-based applications, executing within such remote devices can access data from or provide data to the controller (e.g., MC) or to the database via the controller (e.g., MC). For example, the controller (e.g., MC) may include one or more application programming interfaces (APIs) for facilitating communication between the controller (e.g., MC) and various third party applications. Some examples of APIs that controller(s) (e.g., MC) can enable can be found in PCT Patent Application No. PCT/US15/64555 (Attorney Docket No. VIEWP073WO) filed December 8, 2015, and titled MULTIPLE INTERACTING SYSTEMS AT A SITE, which is incorporated herein by reference in its entirety. For example, third-party applications can include various monitoring services including thermostat services, alert services (e.g., fire detection, window breakage detection, etc.), security services, and/or other appliance automation services. Additional examples of monitoring services and systems can be found in PCT Patent Application No. PCT/US2015/019031 (Attorney Docket No. VIEWP061WO) filed March 5, 2015 and titled MONITORING SITES CONTAINING SWITCHABLE OPTICAL DEVICES AND CONTROLLERS, which is incorporated herein by reference in its entirety.

[0132] In some embodiments, one or both of the inward-facing network interface and the outward-facing network interface can include a Building Automation and Control network (BACnet) compatible interface. BACnet is a communications protocol typically used in building automation and control networks and defined by the ASHRAE/ANSI 135 and ISO 16484-5 standards. The BACnet protocol broadly provides mechanisms for computerized building automation systems and devices to exchange information, e.g., regardless of the particular services they perform. For example, BACnet can be used to enable communication among (i) heating, ventilating, and air-conditioning control (HVAC) systems, (ii) lighting control systems, (iii) access and/or security control systems, (iv) fire detection systems, or (v) any combination thereof, as well as their associated equipment. In some examples, one or both of the inward-facing network interface and the outward-facing network interface can include an oBIX (Open Building Information Exchange) compatible interface or another RESTful Web Services-based interface.

[0133] In some embodiments, the controller (e.g., MC) can calculate, determine, select and/or otherwise generate a preferred state for the target (e.g., a tint value for one or more IGUs) based at least in part on a combination of parameters. For example, the combination of parameters can include time and/or calendar information such as the time of day, day of year or time of season. The combination of parameters may include solar calendar information such as, for example, the direction of the sun relative to a current location of a vehicle and/or target (e.g., IGUs). The direction of the sun relative to the vehicle and/or target (e.g., IGUs) may be determined by the controller (e.g., MC) based at least in part on time and/or calendar information, e.g., together with information known about the geographical location of the vehicle on Earth and the direction that the target (e.g., IGUs) faces (e.g., in a North-East-Down coordinate system). It should be noted that geographical location information associated with a vehicle may be determined using various geolocation techniques, such as using Global Positioning System (GPS) or other satellite-based techniques, or the like. In some embodiments, the geographical location information may be based at least in part on a planned route of the vehicle (e.g., provided by and/or obtained from a navigation application, provided by and/or obtained from a driver of the vehicle, or the like). The combination of parameters also can include exterior and/or interior environmental conditions. For example, the environmental conditions can include the outside temperature (external to the vehicle), the inside temperature (within the vehicle, or within the vehicle adjacent to the target IGUs), or the temperature within the interior volume of the IGUs. The combination of parameters may include information about the weather (for example, whether it is clear, sunny, overcast, cloudy, raining or snowing). Parameters such as the time of day, day of year, and/or direction of the sun can be programmed into and tracked by the control system (e.g., the MC therein). Parameters (such as the outside temperature, inside temperature, and/or IGU temperature) can be obtained from sensors in, on or around the vehicle or sensors integrated with the target (e.g., on or within the IGUs). At times the target can comprise a sensor. Examples of algorithms, routines, modules, or other means for generating IGU tint values are described in U.S. Patent Application No. 13/772,969, filed February 21, 2013 and titled CONTROL METHOD FOR TINTABLE WINDOWS, and in PCT Patent Application No. PCT/US15/029675, filed May 7, 2015 and titled CONTROL METHOD FOR TINTABLE WINDOWS, each of which is hereby incorporated by reference in its entirety.
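
As a non-limiting sketch of how a few of the parameters listed above might be combined, the following Python function folds sun orientation relative to a window, weather, and interior temperature into a tint value. The thresholds and weighting are arbitrary illustrations and are not the control methods incorporated by reference above.

```python
# Rule-based sketch combining a few of the parameters listed above (whether a
# window faces the sun, the weather, and the interior temperature) into a tint
# value. The thresholds and weighting are arbitrary illustrations and are not
# the control methods incorporated by reference above.

def select_tint(sun_facing: bool, weather: str, interior_temp_c: float) -> int:
    """Return a tint value in 0 (clearest) .. 3 (darkest)."""
    if weather in ("overcast", "cloudy", "raining", "snowing"):
        return 0  # little glare or solar gain expected
    tint = 2 if sun_facing else 1  # darker on the sun-facing side
    if interior_temp_c > 27.0:
        tint = min(tint + 1, 3)  # add tint to reduce solar heat gain
    return tint


if __name__ == "__main__":
    print(select_tint(sun_facing=True, weather="clear", interior_temp_c=30.0))      # 3
    print(select_tint(sun_facing=False, weather="overcast", interior_temp_c=22.0))  # 0
```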

[0134] In some embodiments, at least one (e.g., each) device (e.g., ECD) within each IGU is capable of being tinted, e.g., responsive to a suitable driving voltage applied across the EC stack. The tint may be to (e.g., virtually) any tint state within a continuous tint spectrum defined by the material properties of the EC stack. However, the control system (e.g., the MC therein) may be programmed to select a tint value from a finite number of discrete tint values (e.g., tint values specified as integer values). In some such implementations, the number of available discrete tint values can be at least 2, 4, 8, 16, 32, 64, 128 or 256, or more. For example, a 2-bit binary number can be used to specify any one of four possible integer tint values, a 3-bit binary number can be used to specify any one of eight possible integer tint values, a 4-bit binary number can be used to specify any one of sixteen possible integer tint values, a 5-bit binary number can be used to specify any one of thirty-two possible integer tint values, and so on. At least one (e.g., each) tint value can be associated with a target tint level (e.g., expressed as a percentage of maximum tint, maximum safe tint, and/or maximum desired or available tint). For didactic purposes, consider an example in which the MC selects from among four available tint values: 0, 5, 10 and 15 (using a 4-bit or higher binary number). The tint values 0, 5, 10 and 15 can be respectively associated with target tint levels of 60%, 40%, 20% and 4%, or 60%, 30%, 10% and 1%, or another desired, advantageous, or suitable set of target tint levels.
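
The didactic mapping above can be expressed directly as a lookup from discrete tint values to target tint levels, together with the bit width needed to carry a given maximum tint value, as in the following Python sketch. The percentages follow the first example set given above and are illustrative only.

```python
# The didactic mapping above expressed directly: four discrete tint values,
# each associated with a target tint level, plus the bit width needed to carry
# a given maximum tint value. The percentages follow the first example set
# above and are illustrative only.

TINT_LEVELS = {0: 60, 5: 40, 10: 20, 15: 4}  # tint value -> target tint level (%)


def target_level(tint_value: int) -> int:
    """Look up the target tint level for a discrete tint value."""
    try:
        return TINT_LEVELS[tint_value]
    except KeyError:
        raise ValueError(f"unsupported tint value: {tint_value}") from None


def bits_for_value(max_tint_value: int) -> int:
    """Bits needed to encode integer tint values up to max_tint_value."""
    return max(1, max_tint_value.bit_length())


if __name__ == "__main__":
    print(target_level(10))    # 20 (% target tint level)
    print(bits_for_value(15))  # 4: a 4-bit number can carry tint values 0..15
```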

[0135] Fig. 3 shows a block diagram of an example master controller (MC) 300. The MC 300 can be implemented in or as one or more computers, computing devices or computer systems (herein used interchangeably where appropriate unless otherwise indicated). For example, the MC 300 includes one or more processors 302 (also collectively referred to hereinafter as “the processor 302”). Processor 302 can be or can include a central processing unit (CPU), such as a single core or a multi-core processor. The processor 302 can additionally include a digital signal processor (DSP) or a network processor in some examples. The processor 302 could also include one or more application-specific integrated circuits (ASICs). The processor 302 is coupled with a primary memory 304, a secondary memory 306, an inward-facing network interface 308 and an outward-facing network interface 310. The primary memory 304 can include one or more high-speed memory devices such as, for example, one or more random-access memory (RAM) devices including dynamic-RAM (DRAM) devices. Such DRAM devices can include, for example, synchronous DRAM (SDRAM) devices and double data rate SDRAM (DDR SDRAM) devices (including DDR2 SDRAM, DDR3 SDRAM, and DDR4 SDRAM), thyristor RAM (T-RAM), and zero-capacitor (Z-RAM®), among other suitable memory devices.

[0136] In some embodiments, the MC and the NC are implemented as a master controller application and a network controller application, respectively, executing within respective physical computers or other hardware devices. For example, each of the master controller application and the network controller application can be implemented within the same physical hardware. Each of the master controller application and the network controller application can be implemented as a separate task executing within a single computer device that includes a multi-tasking operating system such as, for example, an operating system based at least in part on a Linux® kernel or another suitable operating system.

[0137] In some embodiments, the master controller application and the network controller application can communicate via an application programming interface (API). In some embodiments, the master controller and network controller applications communicate over a loopback interface. By way of reference, a loopback interface is a virtual network interface, implemented through an operating system, which enables communication between applications executing within the same device. A loopback interface is typically identified by an IP address (often in the 127.0.0.0/8 address block in IPv4, or the 0:0:0:0:0:0:0:1 address (also expressed as ::1) in IPv6). For example, the master controller application and the network controller application can each be programmed to send communications targeted to one another to the IP address of the loopback interface. In this way, when the master controller application sends a communication to the network controller application, or vice versa, the communication does not need to leave the computer.
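
For illustration only, the following Python sketch shows two applications on the same machine exchanging one message over the loopback interface (127.0.0.1), standing in for the master controller and network controller applications described above. The port number and message format are arbitrary choices and are not part of this disclosure.

```python
# Sketch of two applications on the same machine exchanging one message over
# the loopback interface (127.0.0.1), standing in for the master controller
# and network controller applications described above. The port number and
# message format are arbitrary choices for illustration.

import socket
import threading

HOST, PORT = "127.0.0.1", 50555  # loopback address; traffic never leaves the device
ready = threading.Event()


def network_controller_app() -> None:
    """Listen for one primary tint command and print it."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as srv:
        srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
        srv.bind((HOST, PORT))
        srv.listen(1)
        ready.set()  # signal that the listener is up
        conn, _ = srv.accept()
        with conn:
            print("NC app received:", conn.recv(1024).decode())


def master_controller_app() -> None:
    """Send one primary tint command to the NC app over loopback."""
    ready.wait()
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as cli:
        cli.connect((HOST, PORT))
        cli.sendall(b"primary tint command: zone=left tint=3")


if __name__ == "__main__":
    nc_thread = threading.Thread(target=network_controller_app)
    nc_thread.start()
    master_controller_app()
    nc_thread.join()
```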

[0138] In some embodiments wherein the MC and the NC are implemented as master controller and network controller applications, respectively, there are generally no restrictions limiting the available protocols suitable for use in communication between the two applications. This generally holds true regardless of whether the master controller application and the network controller application are executing as tasks within the same or different physical computers. For example, there is no need to use a broadcast communication protocol, such as BACnet, which limits communication to one network segment as defined by a switch or router boundary. For example, the oBIX communication protocol can be used in some implementations for communication between the MC and the NCs.

[0139] In some embodiments, each of the NCs is implemented as an instance of a network controller application executing as a task within a respective physical computer. In some embodiments, at least one of the computers executing an instance of the network controller application also executes an instance of a master controller application to implement the MC. For example, while only one instance of the master controller application may be actively executing in the network system at any given time, two or more of the computers that execute instances of the network controller application can have an instance of the master controller application installed. In this way, redundancy is added such that the computer currently executing the master controller application is no longer a single point of failure of the entire system. For example, if the computer executing the master controller application fails or if that particular instance of the master controller application otherwise stops functioning, another one of the computers having an instance of the master controller application installed can begin executing the master controller application to take over for the other failed instance. In some embodiments, more than one instance of the master controller application may execute concurrently. For example, the functions, processes, or operations of the master controller application can be distributed to two (or more) instances of the master controller application.

[0140] Fig. 4 shows a block diagram of an example network controller (NC) 400, which can be implemented in or as one or more network components, networking devices, computers, computing devices, or computer systems (herein used interchangeably where appropriate unless otherwise indicated). Reference to “the NC 400” collectively refers to any suitable combination of hardware, firmware, and software for implementing the functions, operations, processes or capabilities described. For example, the NC 400 can refer to a computer that implements a network controller application (also referred to herein as a “program” or a “task”). NC 400 includes one or more processors 402 (also collectively referred to hereinafter as “the processor 402”). In some embodiments, the processor 402 is implemented as a microcontroller or as one or more logic devices including one or more application-specific integrated circuits (ASICs) or programmable logic devices (PLDs), such as field-programmable gate arrays (FPGAs) or complex programmable logic devices (CPLDs).

When implemented in a PLD, the processor can be programmed into the PLD as an intellectual property (IP) block or permanently formed in the PLD as an embedded processor core. The processor 402 may be or may include a central processing unit (CPU), such as a single core or a multi-core processor. The processor 402 is coupled with a primary memory 404, a secondary memory 406, a downstream network interface 408, and an upstream network interface 410. In some embodiments, the primary memory 404 can be integrated with the processor 402, for example, as a system-on-chip (SOC) package, or in an embedded memory within a PLD itself. The NC 400 may include one or more high-speed memory devices such as, for example, one or more RAM devices. In some embodiments, the secondary memory 406 can include one or more solid-state drives (SSDs) storing one or more lookup tables or arrays of values. The secondary memory 406 may store a lookup table that maps first protocol IDs (for example, BACnet IDs) received from the MC to second protocol IDs (for example, CAN IDs) each identifying a respective one of the WCs, and vice versa. In some embodiments, the secondary memory 406 stores one or more arrays or tables. The downstream network interface 408 enables the NC 400 to communicate with distributed WCs and/or various sensors. The upstream network interface 410 enables the NC 400 to communicate with the MC and/or various other computers, servers, or databases.

[0141] In some embodiments, when the MC determines to tint one or more IGUs, the MC writes a specific tint value to the AV in the NC associated with the one or more respective WCs that control the target IGUs. For example, the MC may generate a primary tint command communication including a BACnet ID associated with the WCs that control the target IGUs. The primary tint command also can include a tint value for the target IGUs. The MC may direct the transmission of the primary tint command to the NC using a network address such as, for example, an IP address or a MAC address. Responsive to receiving such a primary tint command from the MC through the upstream interface, the NC may unpackage the communication, map the BACnet ID (or other first protocol ID) in the primary tint command to one or more CAN IDs (or other second protocol IDs), and write the tint value from the primary tint command to a first one of the respective AVs associated with each of the CAN IDs.

[0142] In some embodiments, the NC then generates a secondary tint command for each of the WCs identified by the CAN IDs. Each secondary tint command may be addressed to a respective one of the WCs by way of the respective CAN ID. For example, each secondary tint command also can include the tint value extracted from the primary tint command. The NC may transmit the secondary tint commands to the target WCs through the downstream interface via a second communication protocol (for example, via the CANOpen protocol). In some embodiments, when a WC receives such a secondary tint command, the WC transmits a status value back to the NC indicating a status of the WC. For example, the tint status value can represent a “tinting status” or “transition status” indicating that the WC is in the process of tinting the target IGUs, an “active” or “completed” status indicating that the target IGUs are at the target tint state or that the transition has been finished, or an “error status” indicating an error. After the status value has been stored in the NC, the NC may publish the status information or otherwise make the status information accessible to the MC or to various other authorized computers or applications. In some embodiments, the MC requests status information for a particular WC from the NC based at least in part on intelligence, a scheduling policy, or a user override. For example, the intelligence can be within the MC. A scheduling policy can be stored in the MC, another storage location within the network system, or within a cloud-based system.
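
As a hedged illustration of the status reporting just described, the following Python sketch enumerates the kinds of status values a WC might return and shows the NC recording them for later publication. The enum names and the in-memory store are assumptions made only for this example.

```python
# Sketch of the status values a WC might report back to the NC after a
# secondary tint command, and of the NC recording them for later publication.
# The enum names and the in-memory store are illustrative assumptions.

from enum import Enum
from typing import Dict


class TintStatus(Enum):
    TINTING = "tinting"      # transition in progress
    COMPLETED = "completed"  # target tint state reached
    ERROR = "error"          # transition failed


# Hypothetical in-memory status store keyed by a WC identifier (e.g., a CAN ID).
STATUS_STORE: Dict[int, TintStatus] = {}


def record_status(wc_id: int, status: TintStatus) -> None:
    """Store the latest status so it can be made accessible to the MC."""
    STATUS_STORE[wc_id] = status


def publish_status() -> Dict[int, str]:
    """Return a snapshot of per-WC statuses, e.g., for the MC to query."""
    return {wc_id: status.value for wc_id, status in STATUS_STORE.items()}


if __name__ == "__main__":
    record_status(0x101, TintStatus.TINTING)
    record_status(0x101, TintStatus.COMPLETED)
    print(publish_status())  # {257: 'completed'}
```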

[0143] In some embodiments, the NC handles some of the functions, processes, or operations that are described above as being responsibilities of the MC. In some embodiments, the NC can include additional functionalities or capabilities not described with reference to the MC. For example, the NC may also include a data logging module (or “data logger”) for recording data associated with the IGUs controlled by the NC. In some embodiments, the data logger records the status information included in each of some or all of the responses to the status requests. For example, the status information that the WC communicates to the NC responsive to each status request can include a tint status value (S) for the IGUs, a value indicating a particular stage in a tinting transition (for example, a particular stage of a voltage control profile), a value indicating whether the WC is in a sleep mode, a tint value (C), a set point voltage set by the WC based at least in part on the tint value (for example, the value of the effective applied voltage VEff), an actual voltage level VAct measured, detected or otherwise determined across the ECDs within the IGUs, an actual current level IAct measured, detected or otherwise determined through the ECDs within the IGUs, and various sensor data, for example, collected from photosensors or temperature sensors integrated on or within the IGUs. The NC 500 may collect and queue status information in a messaging queue like RabbitMQ, ActiveMQ or Kafka and stream the status information to the MC for subsequent processing such as data reduction/compression, event detection, etc., as further described herein.

[0144] In some embodiments, the data logger within the NC collects and stores the various information received from the WCs in the form of a log file such as a comma-separated values (CSV) file or via another table-structured file format. For example, each row of the CSV file can be associated with a respective status request, and can include the values of C, S, VEff, VAct and IAct as well as sensor data (or other data) received in response to the status request. In some implementations, each row is identified by a timestamp corresponding to the respective status request (for example, when the status request was sent by the NC, when the data was collected by the WC, when the response including the data was transmitted by the WC, or when the response was received by the NC). In some embodiments, each row also includes the CAN ID or other ID associated with the respective WC.

[0145] In some embodiments, each row of the CSV file includes the requested data for all of the WCs controlled by the NC. The NC may sequentially loop through all of the WCs it controls during each round of status requests. In some embodiments, each row of the CSV file is identified by a timestamp (for example, in a first column), but the timestamp can be associated with a start of each round of status requests, rather than each individual request. In one specific example, columns 2-6 can respectively include the values C, S, VEff, VAct and IAct for a first one of the WCs controlled by the NC, columns 7-11 can respectively include the values C, S, VEff, VAct and IAct for a second one of the WCs, columns 12-16 can respectively include the values C, S, VEff, VAct and IAct for a third one of the WCs, and so on and so forth through all of the WCs controlled by the NC. The subsequent row in the CSV file may include the respective values for the next round of status requests. In some embodiments, each row also includes sensor data obtained from photosensors, temperature sensors, or other sensors integrated with the respective IGUs controlled by each WC. For example, such sensor data values can be entered into respective columns after the values of C, S, VEff, VAct and IAct for one of the WCs but before the values of C, S, VEff, VAct and IAct for the next one of the WCs in the row. Each row can include sensor data values from one or more external sensors, for example, positioned on a vehicle. The NC may send a status request to the external sensors at the end of each round of status requests.
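
The following is a minimal Python sketch of appending one CSV row per round of status requests, with a leading timestamp followed by the C, S, VEff, VAct and IAct values for each WC in polling order. The column layout and field names are assumptions for illustration only.

    # Hypothetical sketch: one CSV row per round of status requests.
    import csv, time

    FIELDS = ("C", "S", "VEff", "VAct", "IAct")

    def append_round(path, wc_statuses):
        """wc_statuses: list of per-WC status dicts, in the NC's polling order."""
        row = [time.time()]                        # timestamp for the round
        for status in wc_statuses:
            row.extend(status.get(f) for f in FIELDS)
            row.extend(status.get("sensors", []))  # optional per-IGU sensor readings
        with open(path, "a", newline="") as f:
            csv.writer(f).writerow(row)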

[0146] In some embodiments, the NC translates between various upstream and downstream protocols, for example, to enable the distribution of information between WCs and the MC or between the WCs and the outward-facing network. For example, the NC may include a protocol conversion module responsible for such translation or conversion services. The protocol conversion module may be programmed to perform translation between any of a number of upstream protocols and any of a number of downstream protocols. For example, such upstream protocols can include UDP protocols such as BACnet, TCP protocols such as oBIX, other protocols built over these protocols as well as various wireless protocols. Downstream protocols can include, for example, CANopen, other CAN-compatible protocols, and various wireless protocols including, for example, protocols based at least in part on the IEEE 802.11 standard (for example, WiFi), protocols based at least in part on the IEEE 802.15.4 standard (for example, ZigBee, 6LoWPAN, ISA100.11a, WirelessHART or MiWi), protocols based at least in part on the Bluetooth standard (including the Classic Bluetooth, Bluetooth high speed and Bluetooth low energy protocols and including the Bluetooth v4.0, v4.1 and v4.2 versions), or protocols based at least in part on the EnOcean standard (ISO/IEC 14543-3-10).
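
A minimal sketch of such a protocol conversion step is shown below: an upstream message is normalized into a neutral form and re-encoded into one downstream frame per CAN ID. Real BACnet/oBIX parsing and CANopen framing are omitted; the message shapes and mapping are assumptions for illustration.

    # Hypothetical sketch: normalize an upstream message, then re-encode it downstream.
    BACNET_TO_CAN = {"zone-1-windows": ["can-101", "can-102"]}   # upstream ID -> CAN IDs

    def upstream_to_neutral(msg):
        # e.g., msg = {"bacnet_id": "zone-1-windows", "tint": 3} from a BACnet/oBIX link
        return {"targets": BACNET_TO_CAN[msg["bacnet_id"]], "value": msg["tint"]}

    def neutral_to_downstream(neutral):
        # One downstream frame per CAN ID; actual CANopen encoding is not shown.
        return [{"can_id": t, "tint": neutral["value"]} for t in neutral["targets"]]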

[0147] In some embodiments, the NC uploads the information logged by the data logger (for example, as a CSV file) to the MC on a periodic basis, for example, every 24 hours. For example, the NC can transmit a CSV file to the MC via the File Transfer Protocol (FTP) or another suitable protocol over an Ethernet data link 316. The status information may be stored in a database or made accessible to applications over the outward-facing network.

[0148] In some embodiments, the NC includes functionality to analyze the information logged by the data logger. For example, an analytics module can be provided in the NC to receive and/or analyze the raw information logged by the data logger (e.g., in real time). Real time may include within at most 15 seconds (sec.), 30 sec., 45 sec., 1 minute (min), 2 min., 3 min., 4 min., 5 min., 10 min., 15 min., or 30 min. from receipt of the logged information by the data logger, and/or from initiation of the operation (e.g., from receipt and/or from start of analysis). In some embodiments, the analytics module is programmed to make decisions based at least in part on the raw information from the data logger. In some embodiments, the analytics module communicates with the database to analyze the status information logged by the data logger after it is stored in the database. For example, the analytics module can compare raw values of electrical characteristics such as VEff, VAct and IAct with expected values or expected ranges of values and flag special conditions based at least in part on the comparison. For example, such flagged conditions can include power spikes indicating a failure such as a short, an error, or damage to an ECD. The analytics module may communicate such data to a tint determination module or to a power management module in the NC.
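
A minimal sketch of such range checking is shown below: readings outside an expected window are flagged and can be forwarded to a tint determination or power management module. The threshold values are placeholders, not values taken from this disclosure.

    # Hypothetical sketch: flag electrical readings that fall outside expected ranges.
    EXPECTED = {"VEff": (0.0, 5.0), "VAct": (0.0, 5.0), "IAct": (0.0, 2.0)}  # volts / amps (illustrative)

    def flag_conditions(record):
        flags = []
        for key, (lo, hi) in EXPECTED.items():
            value = record.get(key)
            if value is not None and not (lo <= value <= hi):
                flags.append((key, value))   # e.g., a power spike suggesting a short or ECD damage
        return flags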

[0149] In some embodiments, the analytics module filters the raw data received from the data logger to more intelligently or efficiently store information in the database. For example, the analytics module can be programmed to pass only “interesting” information to a database manager for storage in the database. For example, interesting information can include anomalous values, values that otherwise deviate from expected values (such as based at least in part on empirical or historical values), or values recorded during specific periods when transitions are happening. Examples of data manipulation (e.g., filtering, parsing, temporarily storing, and efficiently storing long term in a database) can be found in PCT Patent Application No. PCT/US15/029675 (Attorney Docket No. VIEWP049X1WO) filed May 7, 2015 and titled CONTROL METHOD FOR TINTABLE WINDOWS that is hereby incorporated by reference in its entirety.

[0150] In some embodiments, a database manager module (or “database manager”) in the control system (e.g., in the NC) is configured to store information logged by the data logger to a database on a periodic basis, for example, at least every hour, every few hours, or every 24 hours. The database can be an external database such as the database described above. In some embodiments, the database can be internal to the controller (e.g., the NC). For example, the database can be implemented as a time-series database such as a Graphite database within the secondary memory of the controller (e.g., of the NC) or within another long term memory within the controller (e.g., the NC). For example, the database manager can be implemented as a Graphite Daemon executing as a background process, task, sub-task or application within a multi-tasking operating system of the controller (e.g., the NC). A time-series database can be advantageous over a relational database such as an SQL database because a time-series database is more efficient for data analyzed over time.

[0151] In some embodiments, the database can collectively refer to two or more databases, each of which can store some or all of the information obtained by some or all of the NCs in the network system. For example, it can be desirable to store copies of the information in multiple databases for redundancy purposes. The database can collectively refer to a multitude of databases, each of which is internal to a respective controller (e.g., NC), e.g., such as a Graphite or other time-series database. It can be beneficial to store copies of the information in multiple databases such that requests for information from applications including third party applications can be distributed among the databases and handled more efficiently. For example, the databases can be periodically or otherwise synchronized, e.g., to maintain consistency.

[0152] In some embodiments, the database manager filters data received from the analytics module to more intelligently and/or efficiently store information, e.g., in an internal and/or external database. For example, the database manager can be programmed to store (e.g., only) “interesting” information to a database. Interesting information can include anomalous values, values that otherwise deviate from expected values (such as based at least in part on empirical or historical values), and/or values recorded during specific periods when transitions are happening. More detailed examples of data manipulation (e.g., how raw data can be filtered, parsed, temporarily stored, and efficiently stored long term in a database) can be found in PCT Patent Application No. PCT/US15/029675 (Attorney Docket No. VIEWP049X1WO) filed May 7, 2015 and titled CONTROL METHOD FOR TINTABLE WINDOWS that is hereby incorporated by reference herein in its entirety.

[0153] In some embodiments, a status determination module of a target is included in the controller (e.g., the NC, the MC, or the WC), e.g., for calculating, determining, selecting, or otherwise generating status values for the target. For example, a tint determination module can be included in the controller (e.g., the NC, the MC, or the WC) for calculating, determining, selecting, or otherwise generating tint values for the IGUs. For example, the status (e.g., tint) determination module can execute various algorithms, tasks, or subtasks to generate tint values based at least in part on a combination of parameters. The combination of parameters can include, for example, the status information collected and stored by the data logger. The combination of parameters also can include time or calendar information such as the time of day, day of year or time of season. The combination of parameters can include solar calendar information such as, for example, the direction of the sun relative to the target (e.g., IGUs). The combination of parameters can include one or more characteristics of the enclosure environment that comprise gaseous concentration (e.g., VOC, humidity, carbon dioxide, or oxygen), debris, gas type, gas flow velocity, gas flow direction, gas (e.g., atmosphere) temperature, humidity, noise level, or light level (e.g., brightness). The combination of parameters can include the outside parameters (e.g., temperature) external to the enclosure (e.g., vehicle), the inside parameter (e.g., temperature) within the enclosure (e.g., vehicle), and/or the temperature within the interior volume of the IGUs. The combination of parameters can include information about the weather (for example, whether it is clear, sunny, overcast, cloudy, raining or snowing). Parameters such as the time of day, day of year, and/or direction of the sun, can be programmed into and tracked by the control system (e.g., that includes the NC). Parameters such as the outside temperature, inside temperature, and/or IGU temperature, can be obtained from sensors in, on or around the vehicle or sensors integrated on or within the IGUs, for example. In some embodiments, various parameters are provided by, or determined based at least in part on, information provided by various applications including third party applications that can communicate with the controller(s) (e.g., NC) via an API. For example, the network controller application, or the operating system in which it runs, can be programmed to provide the API.
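
The following is a minimal sketch of a tint determination step that combines a few of the parameters named above (time of day, weather, and interior temperature, with a user override taking precedence). The specific rules, thresholds, and tint scale are illustrative assumptions only, not rules stated in this disclosure.

    # Hypothetical sketch: derive a discrete tint value from a combination of parameters.
    def determine_tint(hour_of_day, weather, inside_temp_c, user_override=None):
        if user_override is not None:          # a user override takes precedence
            return user_override
        tint = 1                               # near-clear default
        if weather == "sunny":
            tint += 1
        if 10 <= hour_of_day <= 16:            # midday sun on the glazing
            tint += 1
        if inside_temp_c > 24:                 # darken to reduce heat gain
            tint += 1
        return min(tint, 4)                    # clamp to the darkest discrete level

    # e.g., determine_tint(13, "sunny", 26) -> 4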

[0154] In some embodiments, the target status (e.g., tint) determination module determines status (e.g., tint) value(s) of the target based at least in part on user overrides, e.g., received via various mobile circuitry (e.g., device) applications, wall devices and/or other devices. In some embodiments, the status (e.g., tint) determination module determines status (e.g., tint) values based at least in part on command(s) or instruction(s) received by various applications, e.g., including third party applications and/or cloud-based applications. For example, such third party applications can include various monitoring services including thermostat services, alert services (e.g., fire detection), security services and/or other appliance automation services. Additional examples of monitoring services and systems can be found in PCT/US2015/019031 (Attorney Docket No. VIEWP061WO) filed 5 March 2015 and titled MONITORING SITES CONTAINING SWITCHABLE OPTICAL DEVICES AND CONTROLLERS that is incorporated herein by reference in its entirety. Such applications can communicate with the status (e.g., tint) determination module and/or other modules within the controller(s) (e.g., NC) via one or more APIs. Some examples of APIs that the controller(s) (e.g., NC) can enable are described in PCT Patent Application No. PCT/US15/64555 (Attorney Docket No. VIEWP073WO) filed December 8, 2015 and titled MULTIPLE INTERFACING SYSTEMS AT A SITE, that is incorporated herein by reference in its entirety.

[0155] In some embodiments, the analytics module compares values of VEff, VAct and IAct as well as sensor data obtained in real time and/or previously stored within the database with expected values or expected ranges of values and flags special conditions based at least in part on the comparison. For example, the analytics module can pass such flagged data, flagged conditions or related information to a power management module. For example, such flagged conditions can include power spikes indicating a short, an error, or damage to a smart window (e.g., an ECD). In some embodiments, the power management module modifies operations based at least in part on the flagged data or conditions. For example, the power management module can delay status (e.g., tint) commands of a target until power demand has dropped, stop commands to troubled controller(s) (e.g., local controller such as WC) (and put them in idle state), start staggering commands to controllers (e.g., lower hierarchy controllers such as WCs), manage peak power, and/or signal for help.
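
A minimal sketch of one such power management reaction is shown below: commands to flagged (troubled) controllers are deferred and the remaining commands are staggered to manage peak power. The timing, helper names, and structure are assumptions made for the example.

    # Hypothetical sketch: defer commands to flagged WCs and stagger the rest.
    import time

    def send_tint_command(wc_id, tint):
        print(f"sending tint {tint} to {wc_id}")     # stand-in for the real downstream send

    def distribute_commands(commands, flagged_wcs, stagger_s=0.5):
        for wc_id, tint in commands:
            if wc_id in flagged_wcs:
                print(f"{wc_id} flagged; command deferred and controller idled")
                continue
            send_tint_command(wc_id, tint)
            time.sleep(stagger_s)                    # stagger commands to limit peak power draw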

[0156] Fig. 5 shows an example network controller (NC) 500 including a plurality of modules. NC 500 is coupled to an MC 502 and a database 504 by an interface 510, and to a WC 506 by an interface 508. In the example, internal modules of NC 500 include data logger 512, protocol conversion module 514, analytics module 516, database manager 518, tint determination module 520, power management module 522, and commissioning module 524.

[0157] In some embodiments, a controller (e.g., WC) or other network device includes, or is operatively coupled to, a sensor or device ensemble (e.g., that includes at least one sensor). For example, a plurality of sensors or a device ensemble may be organized into a sensor ensemble (also referred to herein as a “digital architectural element”). A device ensemble may comprise a circuit board, such as a printed circuit board, e.g., in which a number of sensors are adhered or affixed to the circuit board. Sensor(s) can be reversibly removed from a sensor module. For example, a sensor may be plugged into, and/or unplugged from, the circuit board. Sensor(s) may be individually activated and/or deactivated (e.g., using a switch). The circuit board may comprise a polymer. The circuit board may be transparent or non-transparent. The circuit board may comprise metal (e.g., elemental metal and/or metal alloy). The circuit board may comprise a conductor. The circuit board may comprise an insulator. The circuit board may comprise any geometric shape (e.g., rectangle or ellipse). The circuit board may be configured (e.g., may be of a shape) to allow the ensemble to be disposed in a frame portion such as a framing portion (e.g., of a window). The circuit board may be configured (e.g., may be of a shape) to allow the ensemble to be disposed in a frame (e.g., door frame and/or window frame). The circuit board may be configured (e.g., may be of a shape) to allow the ensemble to be disposed in or on a portion of a vehicle, such as a dashboard of a vehicle, a frame of a door of a vehicle, an interior ceiling of a vehicle, or the like. The frame may comprise one or more holes, e.g., to allow the sensor(s) to obtain (e.g., accurate) readings. The circuit board may be enclosed in a wrapping. The wrapping may comprise flexible or rigid portions. The wrapping may be flexible. The wrapping may be rigid (e.g., composed of a hardened polymer, of glass, or of a metal (e.g., comprising elemental metal or metal alloy)). The wrapping may comprise a composite material. The wrapping may comprise carbon fibers, glass fibers, and/or polymeric fibers. The wrapping may have one or more holes, e.g., to allow the sensor(s) to obtain (e.g., accurate) readings. The circuit board may include an electrical connectivity port (e.g., socket). The circuit board may be connected to a power source (e.g., to electricity). The power source may comprise a renewable and/or non-renewable power source.

[0158] Fig. 6 shows diagram 600 having an example of an ensemble of sensors organized into a sensor module. Sensors 610A, 610B, 610C, and 610D are shown as included in device ensemble 605. An ensemble of sensors organized into a sensor module may include at least 1, 2, 4, 5, 8, 10, 20, 50, or 500 sensors. The sensor module may include a number of sensors in a range between any of the aforementioned values (e.g., from about 1 to about 1000, from about 1 to about 500, or from about 500 to about 1000). Sensors of a sensor module may comprise sensors configured and/or designed for sensing a parameter comprising: temperature, humidity, carbon dioxide, particulate matter (e.g., between 2.5 µm and 10 µm), total volatile organic compounds (e.g., via a change in a voltage potential brought about by surface adsorption of volatile organic compound), ambient light, audio noise level, pressure (e.g., gas and/or liquid), acceleration, time, radar, lidar, radio signals (e.g., ultra-wideband radio signals), passive infrared, glass breakage, or movement detectors. The device ensemble (e.g., 605) may comprise non-sensor devices (e.g., emitters), such as buzzers and light emitting diodes. Examples of device ensembles and their uses can be found in U.S. Patent Application Serial Number 16/447169 filed June 20, 2019, titled “SENSING AND COMMUNICATIONS UNIT FOR OPTICALLY SWITCHABLE WINDOW SYSTEMS” that is incorporated herein by reference in its entirety.
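
A minimal sketch of a device ensemble as a collection of sensor and non-sensor devices that can be plugged in, unplugged, and individually activated or deactivated is shown below. The class and device names are illustrative assumptions only.

    # Hypothetical sketch: an ensemble holding reversibly removable, switchable devices.
    class DeviceEnsemble:
        def __init__(self):
            self.devices = {}                        # name -> {"kind": ..., "active": ...}

        def plug_in(self, name, kind):
            self.devices[name] = {"kind": kind, "active": True}

        def unplug(self, name):
            self.devices.pop(name, None)             # reversibly removable from the board

        def set_active(self, name, active):
            self.devices[name]["active"] = active    # e.g., toggled by a switch

    ensemble = DeviceEnsemble()
    ensemble.plug_in("co2", "sensor")
    ensemble.plug_in("temperature", "sensor")
    ensemble.plug_in("buzzer", "emitter")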

[0159] In some embodiments, an increase in the number and/or types of sensors may be used to increase a probability that one or more measured property is accurate and/or that a particular event measured by one or more sensor has occurred. In some embodiments, sensors of a device ensemble may cooperate with one another. In an example, a radar sensor of a device ensemble may determine the presence of a number of individuals in an enclosure (e.g., vehicle). A processor (e.g., processor 615) may determine that detection of presence of a number of individuals in an enclosure is positively correlated with an increase in carbon dioxide concentration. In an example, the processor may determine (e.g., using processor-accessible memory) that an increase in detected infrared energy is positively correlated with an increase in temperature as detected by a temperature sensor. In some embodiments, network interface (e.g., 650) may communicate with other device ensembles similar to the device ensemble. The network interface may additionally communicate with a controller.

[0160] Individual sensors (e.g., sensor 610A, sensor 610D, etc.) of a device ensemble may comprise and/or utilize at least one dedicated processor. A device ensemble may utilize a remote processor (e.g., 654) utilizing a wireless and/or wired communications link. A device ensemble may utilize at least one processor (e.g., processor 652), which may represent a cloud-based processor coupled to a device ensemble via the cloud (e.g., 650). Processors (e.g., 652 and/or 654) may be located within the vehicle, external to the vehicle (e.g., in a facility owned by the manufacturer of the window/controller/device ensemble, a facility owned by a cloud-services provider, and/or at any other facility), or at any other location. In various embodiments, as indicated by the dotted lines of Fig. 6, device ensemble 605 is not required to comprise a separate processor and network interface. These entities may be separate entities and may be operatively coupled to ensemble 605. The dotted lines in Fig. 6 designate optional features. In some embodiments, onboard processing and/or memory of one or more ensemble of sensors may be used to support other functions (e.g., via allocation of the ensemble(s)’ memory and/or processing power to the network infrastructure of a building).

[0161] In some embodiments, sensor data is exchanged among various network devices and controllers. The sensor data may also be accessible to remote users (e.g., inside the same vehicle) for retrieval using personal electronic devices, for example. Applications executing on remote devices to access sensor data may also provide commands for controllable functions such as tint commands for a window controller. Example window controller(s) are described in PCT Patent Application No. PCT/US16/58872, titled CONTROLLERS FOR OPTICALLY-SWITCHABLE DEVICES, filed October 26, 2016, and in US Patent Application No. 15/334,832, titled CONTROLLERS FOR OPTICALLY-SWITCHABLE DEVICES, filed October 26, 2016, each of which is herein incorporated by reference in its entirety.

[0162] In some embodiments, the controller (e.g., NC) periodically requests status information from lower hierarchy controller(s) (e.g., from the WCs it controls). For example, the controller (e.g., NC) can communicate a status request to at least one (e.g., each) of the lower hierarchy controller(s) (e.g., from the WCs it controls) at a frequency of at least every few seconds, every few tens of seconds, every minute, every few minutes, or after any requested period of time. In some embodiments, at least one (e.g., each) status request is directed to a respective one of the lower hierarchy controllers (e.g., WCs) using the CAN ID or other identifier of the respective lower hierarchy controller(s) (e.g., WCs). In some embodiments, the controller (e.g., NC) proceeds sequentially through all of the lower hierarchy controllers (e.g., WCs) it controls during at least one (e.g., each) round of status acquisition. The controller (e.g., NC) can loop through at least two (e.g., all) of the lower hierarchy controllers (e.g., WCs) it controls such that a status request is sent to these lower hierarchy controllers (e.g., WCs) sequentially in the round of status acquisition. After a status request has been sent to a given lower hierarchy controller (e.g., WC), the upper hierarchy level controller (e.g., NC) may wait to receive the status information from one lower hierarchy controller (e.g., WC), e.g., before sending a status request to the next one of the lower hierarchy controllers (e.g., WC) in the round of status acquisition.

[0163] In some embodiments, after status information has been received from all of the lower hierarchy controllers (e.g., WCs) that the upper hierarchy controller (e.g., NC) controls, the upper hierarchy controller (e.g., NC) performs a round of status change (e.g., tint) command distribution to the target (e.g., to the IGU). For example, in some implementations, at least one (e.g., each) round of status acquisition is followed by a round of tint command distribution, which is then followed by a next round of status acquisition and a next round of tint command distribution, and so on. In some embodiments, during a round of status (e.g., tint) command distribution to the controller of the target, the controller (e.g., NC) proceeds to send a tint command to the lower hierarchy controller (e.g., WC) that the higher hierarchy controller (e.g., NC) controls. In some embodiments, the higher hierarchy controller (e.g., NC) proceeds sequentially through all of the lower hierarchy controllers (e.g., WCs) it controls during the round of tint command distribution. In other words, the higher hierarchy (e.g., NC) controller loops through (e.g., all of) the lower hierarchy controllers (e.g., WCs) it controls such that a status (e.g., tint) command is sent to (e.g., each of) the lower hierarchy controllers (e.g., WCs) sequentially in the round of status (e.g., tint) command distribution to change the status of the target (e.g., change the tint state of the IGU).
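
A minimal sketch of these alternating rounds is shown below: the controller polls each lower hierarchy controller in turn, waiting for each reply before the next request, and then performs a round of command distribution. The helper functions are stand-ins, not interfaces defined in this disclosure.

    # Hypothetical sketch: a round of status acquisition followed by a round of command distribution.
    def request_status(wc_id):
        # Stand-in: in practice this blocks until the WC replies over the downstream link.
        return {"S": "completed", "C": 2}

    def send_tint_command(wc_id, tint):
        print(f"tint {tint} -> {wc_id}")             # stand-in for the real downstream send

    def run_round(wc_ids, choose_tint):
        statuses = {}
        for wc_id in wc_ids:                         # round of status acquisition
            statuses[wc_id] = request_status(wc_id)  # wait for each reply before the next request
        for wc_id in wc_ids:                         # round of tint command distribution
            send_tint_command(wc_id, choose_tint(wc_id, statuses[wc_id]))
        return statuses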

[0164] In some embodiments, a status request includes one or more instructions indicating what status information is being requested from the respective lower hierarchy controller (e.g., local controller such as a WC). In some embodiments, responsive to the receipt of such a request, the respective lower hierarchy controller (e.g., WC) responds by transmitting the requested status information to the higher hierarchy controller (e.g., NC) (e.g., via the communication lines in an upstream set of cables). In some other embodiments, each status request by default causes the lower hierarchy controller (e.g., WC) to transmit a predefined set of information for the set of targets (e.g., IGUs, sensors, emitters, or media) it controls. The status information that the lower hierarchy controller (e.g., WC) communicates to the upper hierarchy controller (e.g., NC) responsive to the status request, can include a (e.g., tint) status value (S) for the target (e.g., IGUs), for example, indicating whether the target (e.g., IGUs) is undergoing a status change (e.g., tinting transition) or has finished a status change (e.g., tinting transition, or light intensity change). The tint status value S or another value can indicate a particular stage in a tinting transition (for example, a particular stage of a voltage control profile). In some embodiments, the status value S or another value indicates whether the lower hierarchy controller (e.g., WC) is in a sleep mode. The status information communicated in response to the status request also can include the status (e.g., tint) value (C) for the target (e.g., IGUs), for example, as set by the controller (e.g., MC or the NC). The response also can include a set point voltage set by the lower hierarchy controller (e.g., WC) based at least in part on the status (e.g., tint) value (e.g., the value of the effective applied voltage VEff). In some embodiments, the response includes a near real-time actual voltage level VAct measured, detected, or otherwise determined across the ECDs within the IGUs (for example, via the amplifier and the feedback circuit). In some embodiments, the response includes a near real-time actual current level IAct measured, detected, or otherwise determined through the ECDs within the IGUs (for example, via the amplifier and the feedback circuit). The response also can include various near real-time sensor data, for example, collected from photosensors or temperature sensors integrated on or within the IGUs.

[0165] In some embodiments, voice and/or gesture control is used to interact with a target (e.g., an optically switchable device). Such control methods may be more convenient compared to more conventional control methods, e.g., that may require a user to touch or otherwise physically interact with a particular component (e.g., switch, knob, keypad, touchscreen, etc.). Voice control may be beneficial for users, e.g., for a driver of a vehicle, and/or for users with certain disabilities.

[0166] In some embodiments, voice and/or gesture control is used to implement any type of manipulation of a target (e.g., any type of command on an optically switchable device, any type of command on a media display, or the like). For example, voice and/or gesture control may be used to implement tinting commands for a target, or for a group or zone of targets. For example, the command may be for a single optically switchable device (e.g., “change window 1 to tint 4” or “make window 1 darker”), or for a group or zone of optically switchable devices (e.g., “change the windows in zone 1 to tint 4” or “make the windows in zone 1 darker” or “make the windows in zone 1 much darker,” or “make the left side windows darker,” etc.). The commands may relate to discrete optical states to which the relevant optically switchable device(s) should change (e.g., discrete tint levels, or other discrete optical states) or relative changes in the optical states of the optically switchable device(s) (e.g., darker, lighter, more reflective, less reflective, e.g., or “the car is too dark, please lighten it up” or “it’s hot in here” (letting the system know to darken the windows and block heat gain) etc.). Where relative changes are used, the control system may be designed and/or configured to implement incremental (e.g., step) changes (e.g., 10% darker or lighter) in the optical state of the optically switchable device to carry out the command. The degree of each incremental (e.g., step) change may be pre-defined. In some embodiments, the control system is designed and/or configured to implement incremental (e.g., step) changes of a size and/or degree specified by the user. Such command(s) may be modified by any relative words used in the command (e.g., “very” or “a little bit,” or “lighter” or “darker” etc.).
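
The following is a minimal sketch of turning a relative voice command into an incremental (step) change of a discrete tint level, with relative modifiers such as “much” mapping to larger steps. The vocabulary, step sizes, and tint range are illustrative assumptions only.

    # Hypothetical sketch: map a relative phrase to an incremental tint-level change.
    STEP = {"much darker": +2, "much lighter": -2, "darker": +1, "lighter": -1}

    def apply_relative_command(current_tint, phrase, min_tint=0, max_tint=4):
        delta = STEP.get(phrase, 0)                       # unknown phrases leave the state unchanged
        return max(min_tint, min(max_tint, current_tint + delta))

    # e.g., apply_relative_command(2, "much darker") -> 4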

[0167] In some embodiments, voice control can also be used to set a schedule for the target (e.g., optically switchable device). For example, a user may direct the optically switchable device(s) to tint at particular times/days (e.g., “make the right side windows darker Monday through Friday from 8 to 9 am” (e.g., at a time the user may use the vehicle to drive to work) or “the morning sun makes it hot in here” (letting the system know to tint the windows during the morning hours when the sun impinges on that side of the vehicle) or “make the windows dark when I am parked” (letting the system know that the windows are to be tinted while the vehicle is parked, e.g., for security)). Similarly, voice control can be used to implement tinting rules for the optically switchable device (e.g., “tint the windows in zone 1 to tint 4 when it’s sunny outside” or “tint the windows if the temperature inside this vehicle is above 70°F”). In some embodiments, any rules that can be implemented on a network of optically switchable devices (including any other networked components such as a thermostat, vehicle device control system, electronic device, etc.) can be initiated via voice control.

[0168] In some embodiments, voice control is implemented on various components of control architecture for the target (e.g., smart window system), e.g., onboard window controllers or other window controllers, network controllers, master controllers, wall switches (e.g., interfaces with control components) and/or a separate device that interfaces with any or all of the aforementioned devices and/or components.

[0169] In some embodiments, gesture control is used to control the target. The gesture control may or may not use a limited command set (e.g., at times due to a lesser number of movements that would need to be recognized compared to the more expansive dictionary of words that can be recognized when using voice control). For example, gesture control can be used to implement many types of commands. For example, gesture control can be used to indicate that a particular target (e.g., window) or group of targets (e.g., windows) should change their state (e.g., change to a lighter or darker state (or other optical states if non-electrochromic optically switchable devices are used)). The user may indicate the target(s) (e.g., window(s)) to be changed, e.g., by pointing to the relevant target(s) (e.g., window(s)). Indication of the target may trigger coupling of the gesture with the target. The user may indicate the desired change by raising or lowering their hands or arms, by opening or closing their palms, or by pinching or expanding their fingers, for instance. In some embodiments, a gesture may implicitly indicate a change of state of a target (e.g., window) or group of targets (e.g., windows). For example, a gesture corresponding to a user wiping their brow or fanning their face may indicate that the user is hot, and therefore, that one or more windows are to be tinted darker. A gesture may include a facial expression pattern (e.g., comprising particular sequences of movements of a portion of a person’s face, such as eyebrows, a mouth, lips, nose, etc.), a motion pattern (e.g., comprising particular sequences of movements of portions of a person’s body, such as hands, fingers, head, etc.), and/or a voice pattern (e.g., comprising sequences of words or emitted sounds). A dictionary of recognized gestures may be created to define the types of commands that can be accomplished via gesture control. More expansive gesture dictionaries may enable finer, more complex control of the optically switchable devices. There may be some degree of tradeoff in terms of ease of use, with smaller gesture dictionaries being potentially easier for users to master.
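
A minimal sketch of a small gesture dictionary, mapping recognized gesture labels to a command applied to the pointed-at target, is shown below. The gesture labels and commands are illustrative assumptions made for the example.

    # Hypothetical sketch: a small dictionary of recognized gestures.
    GESTURES = {
        "palm_lower": {"action": "darker"},
        "palm_raise": {"action": "lighter"},
        "wipe_brow": {"action": "darker"},     # implicit "I am hot" gesture
        "pinch": {"action": "darker"},
        "expand": {"action": "lighter"},
    }

    def gesture_to_command(gesture_label, target_id):
        entry = GESTURES.get(gesture_label)
        if entry is None:
            return None                        # gesture not in the recognized dictionary
        return {"target": target_id, **entry}  # couple the gesture to the indicated target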

[0170] In some embodiments, the gestures are detected using at least one sensor. The sensor may be communicatively coupled to the network. The sensor may be an optical sensor (e.g., a camera such as a video camera). In some embodiments, the sensor may be a sensor capable of performing WiFi sensing and/or RF sensing, an infrared sensor, a motion sensor, a microphone, or the like. The sensor(s) (e.g., camera) may be provided on any available device, and in some examples is provided as part of a wall unit, as part of a device that interfaces with a wall unit (e.g., a smartphone, tablet, or other electronic device), as part of a hand-held device (e.g., smartphone, tablet, or other electronic device), on an electrochromic window or frame, or as part of any other device that is configured to control an electrochromic or other optically switchable window. For example, a user may gesture while holding, wearing, or otherwise moving a sensing device that is configured to sense movement, and/or acceleration, etc. The readings on the sensing device may be used to help determine what gesture a user has made. The movement sensing device may include one or more accelerometers (e.g., 3-axis accelerometer), gyroscopes, magnetometers, cameras, or the like (and may be included in a virtual reality (VR) interface, such as the Oculus Quest or Oculus Rift available from Facebook Technologies, LLC, of Menlo Park, California). The mobile circuitry may be, or be included in, a user controller, a character controller, and/or a player controller. It should be noted that, in some embodiments, gestures may be detected and/or identified without a sensing device that a user holds, wears, and/or moves. For example, in some embodiments, gestures are detected using at least one sensor that directly senses a gesture pattern (e.g., a facial expression pattern, a motion pattern, and/or a voice pattern) produced by the user by capturing motions and/or vocalizations produced or generated by the user.

[0171] In some embodiments, the sensing device is a fitness device (e.g., any of various wearable devices from Fitbit Inc. or Jawbone, each in San Francisco, CA), watch (e.g., from Apple Inc. of Cupertino, CA or Pebble Technology Corporation in Palo Alto, CA), or similar wearable device. In some embodiments, relative positioning, velocity, acceleration, and/or Doppler effect is used to determine changes in gesture as commands to change the status of the target. In some embodiments, image recognition software is used to determine changes in gesture as commands to change the status of the target. In some embodiments, facial recognition software is used to determine changes in facial expressions as commands to change the tint level of windows. The gesture may comprise a facial or bodily gesture (e.g., of limbs or parts of limbs). The gesture may comprise kinesthetic movement. The gesture may comprise a physical movement of a body part. The gesture may comprise a corporal and/or anatomic movement. The movement may comprise a muscular movement. The movement may comprise a movement of one or more bones (e.g., by moving their adjoining muscle(s)).
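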

[0172] In some embodiments, a type of command that may be initiated via voice control is to turn off “listening mode.” The sound sensor (e.g., listening device) may be operatively (e.g., communicatively) coupled to the network. When listening mode is on, the device that listens for commands is able to pick up oral commands. When listening mode is off, the device that listens for commands is not able to pick up, hear, and/or record such commands. For example, the device that listens for commands may be part of a (e.g., window) controller, IGU, wall device, and/or another electronic device (e.g., phone, tablet, etc.). A user may request to turn listening mode off for increased privacy, and/or energy savings, etc. In some cases, the user may request that listening mode turn off for a specified time period (e.g., the duration of a meeting), for example. In order to turn listening mode back on, the user may press a button/touchscreen (e.g., on the device that listens for commands, on the window controller, IGU, wall device, or other electronic device) or otherwise indicate that listening mode should turn back on. Devices may indicate when listening mode is on and/or off. In one example, one or more lights (e.g., LEDs) may indicate whether listening mode is on or off. The light may be turned on to indicate that listening mode is on, and off to indicate that listening mode is off (or vice versa). In another example, a first light or light color may indicate that listening mode is on, and a second light or light color may indicate that listening mode is off. In another example, devices can use an audio cue, e.g., may emit a tone, e.g., periodically, as a reminder to the user that listening mode is inactive (or active). In certain implementations, listening mode may be deactivated for a period of time (e.g., for at least about 1 minute, 10 minutes, 30 minutes, 1 hour, 2 hours, 3 hours, 1 day, etc.), after which listening mode may automatically be reactivated. The period of time over which listening mode remains deactivated may be chosen by the user, or may be preset, for example. In some embodiments, listening mode is activated by default. Listening mode may be on unless it is turned off (e.g., permanently turned off, or turned off for a period of time, as mentioned herein). In some embodiments, the default setting is that listening mode is off (e.g., listening mode does not activate unless a command is received to turn listening mode on).

[0173] In some embodiments, where gesture command is used, the user can control whether a relevant device that interprets gesture commands is in a “watching mode.” Like the listening mode, the watching mode can be turned on and off. When a device is in watching mode, it is able to sense and interpret gesture commands, for example. When the watching mode is off, the device is not able to sense, record, and/or process gesture commands. Details provided herein related to listening mode may similarly apply to watching mode. The device that interprets the gesture may or may not be part of the control system. The gesture interpreting device may comprise circuitry (e.g., may comprise a processor). The gesture interpreting device may be communicatively coupled to the network and/or to the control system. The gestures may be interpreted with respect to a virtual image of the enclosure in which the controllable target (e.g., an IGU, a sensor, a light, or a media) is disposed. The gestures may be interpreted with respect to a target they are coupled to (e.g., pointed at).

[0174] In some embodiments, one or more voice commands are used to ask a question to the system controlling the target (e.g., optically switchable device (or some component on the network on which the optically switchable device is installed)). The questions may relate directly to the target (e.g., actuator, or optically switchable device), or more generally, to any target (e.g., optically switchable device) or group of targets (e.g., devices) communicatively coupled to (e.g., on) the network, for example. For instance, a user may ask what the current optical state is for a particular optically switchable device (e.g., “what’s the tint level of window 17?”). Similarly, a user may ask what the upcoming behavior will be for a particular optically switchable device (e.g., “when is the next time the windows in the vehicle will begin to get darker?”). The questions may also relate to any other information to which the network has access. For instance, a user may ask about weather data (e.g., temperature data, cloud data, precipitation data, forecast data, etc.), location data (e.g., “where am I?” or “how long until I arrive at my destination?” or “is there a coffee shop on my route?” etc.), access data (e.g., “am I allowed to control the tint level of the windows in this vehicle?”), etc. A user may ask about any environmental characteristic of the enclosure (e.g., as delineated herein). A user may ask for an explanation of why the target (e.g., optically switchable device) is performing in a certain way. In one example, a user might ask, “why is window 1 tinting?” and the system may explain in response to the query, “clouds expected to clear in 20 minutes, tinting in anticipation of bright sun.” This feature may be particularly useful in cases where the optically switchable device is programmed to execute rules that might not be immediately observable and/or understandable to a user. The answer may be provided visually (e.g., on a screen), as a printed material, or aurally (e.g., through a speaker).

[0175] In some embodiments, a voice command is used to control the degree of privacy in the enclosure (e.g., room), e.g., with respect to (e.g., wireless) communications. In some embodiments, optically switchable windows are patterned to include one or more antenna that may be used to block or allow particular wavelengths to pass through the windows. When activated, these patterned antennae can provide increased security/privacy by blocking cell phone communications, Wi-Fi communications, etc. Examples of patterned antennae and related privacy considerations can be found in PCT Application No. PCT/US15/62387, filed November 24, 2015, and titled WINDOW ANTENNAS that is incorporated herein by reference in its entirety.

[0176] In some embodiments where voice and/or gesture control are used, one or more dictionaries are defined. For voice control, the dictionaries may define a set of words and/or phrases that the system is configured to interpret/understand. Similarly, for gesture control, the dictionaries may define a set of gestures that the system is configured to interpret/understand. Dictionaries may be tiered, e.g., given a command in a first level dictionary, a new dictionary at a second level may be initiated for receiving commands, and once received, yet another level dictionary may be actuated. In this way, individual dictionaries need not be overly complex, and the end user can quickly get to the command structure they desire. In some embodiments, (e.g., when the target is a media display) the gestures are interpreted as cursor movement on a media projection.

[0177] Examples of words or phrases that may be defined include names/identifications for each optically switchable device or group of devices (e.g., “window 1,” “group 1,” “zone 1,” “left side windows,” “right side windows,” “rear windows,” “front windows,” “windshield,” “windows between rows 1 and 2,” etc.). Such names/identifications may also be based at least in part on the location of the optically switchable devices. In this respect, the dictionaries may be defined to include words that identify optically switchable devices based at least in part on location (e.g., “rear,” “front,” “left side,” “right side,” etc.), and/or words that provide a relation between the user (or some other person) and the optically switchable device being identified (e.g., “driver’s window,” “front seat passenger window,” etc.). Words or phrases may be defined that indicate names/identifications for other controllable target devices, such as particular air vents (e.g., “driver air vent,” “front passenger air vent,” “back row air vents,” etc.), particular media displays (e.g., “driver’s media display,” “back row media display,” “Deepa’s media display,” etc.), particular speakers or audio devices (e.g., “front speakers,” “rear speakers,” etc.), lighting devices (e.g., “rear row lights,” “front row lights,” etc.) or other controllable devices.

[0178] In some embodiments, the dictionaries also define words related to the desired commands that can be instructed. For example, the dictionaries may include words like “tint,” “clear,” “clearest,” “darker,” “darkest,” “lighter,” “lightest,” “more,” “less,” “very,” “a little,” “tint level,” “tint 1,” “tint 2,” etc. Any words likely to be used by a person when instructing the optically switchable device when using verbal commands may be included in the dictionary. In cases where the system is configured to allow a user to set a schedule or rules for the behavior of the optically switchable device, the dictionary or dictionaries can include any words needed to understand such commands (e.g., “Monday,” “Tuesday through Friday,” “morning,” “afternoon,” “bedtime,” “sunrise,” “if,” “then,” “when,” “don’t,” “cloudy,” “sunny,” “degrees,” “someone,” “no one,” “movement,” “only,” etc.). Similarly, in cases where the system is configured to allow a user to ask a question, the dictionary or dictionaries can include any words needed to understand the types of questions the system is designed to answer.

[0179] In some embodiments, there is a tradeoff between larger dictionaries, which may enable finer control, more natural and/or flexible commands, and more complex functions (e.g., answering any question where the answer is available on the internet), compared to smaller dictionaries, which may be easier for people to master, and which may enable faster and/or more local processing. Smaller dictionaries may be used in a tiered format, where access to successive dictionaries is afforded by a user providing the proper voice or gesture command in one dictionary in order to be allowed access to the next dictionary.

[0180] In some embodiments, a single dictionary may be used. In other embodiments, two or more dictionaries may be used, and the dictionary that is used at a particular time depends on what type of command, or what portion of a command a user is trying to convey. For example, a first dictionary may be used when a user is identifying which optically switchable device and/or other controllable device they wish to control, and a second dictionary may be used when the user is identifying what they want the optically switchable device and/or other controllable device to do. The first dictionary could include any words needed to identify the relevant optically switchable device and/or other controllable device, while the second dictionary could include any words needed to interpret what the user wants the optically switchable device and/or other controllable device to do. Such contextual dictionaries can provide a limited sub-set of words that the system is configured to understand and/or interpret whenever the particular dictionary is being used. This may make it easier to interpret a user’s commands.
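
The following is a minimal sketch of such contextual dictionaries: a first dictionary resolves which device or zone is meant, and a second resolves what the user wants that device to do. The vocabulary, identifiers, and command encodings are illustrative assumptions only.

    # Hypothetical sketch: two-stage, contextual dictionary lookup.
    DEVICE_DICT = {"driver's window": "win-1", "windshield": "win-0",
                   "left side windows": "zone-left"}
    ACTION_DICT = {"much darker": ("tint_step", +2), "darker": ("tint_step", +1),
                   "lighter": ("tint_step", -1), "tint 4": ("set_tint", 4)}

    def parse(utterance):
        text = utterance.lower()
        device = next((v for k, v in DEVICE_DICT.items() if k in text), None)
        action = next((v for k, v in ACTION_DICT.items() if k in text), None)
        if device is None or action is None:
            return None                        # unrecognized; fall back or offer "help"
        return {"target": device, "command": action}

    # e.g., parse("make the driver's window much darker") -> {"target": "win-1", "command": ("tint_step", 2)}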

[0181] In some embodiments, one or more dictionaries may be tailored to particular users. The dictionaries for defining and/or determining which electrochromic window(s) a user desires to switch may be limited based at least in part on which windows the user is authorized to switch, for instance. In one example, user A is allowed to switch windows 1-5, while user B is allowed to switch windows 6-10. The dictionary or dictionaries used to transcribe and/or interpret commands from user A may be limited to identifying windows 1-5, while the dictionary or dictionaries used to transcribe and/or interpret commands from user B may be limited to identifying windows 6-10.

[0182] In some embodiments, each dictionary includes certain keywords that allow the user to navigate through the system more easily. Such keywords may include phrases such as “help,” “back,” “go back,” “previous,” “undo,” “skip,” “restart,” “start over,” “stop,” “abort,” etc. When a user requests help, the system may be configured to communicate to the user (e.g., visually and/or aurally) the words, phrases, commands, windows, controllable devices, etc. that the system is currently configured to accept/understand based at least in part on the dictionary that is being used at a given time. For instance, if a user requests help while the system is accessing a dictionary that defines the different windows available for switching, the system may communicate that the available inputs at that time are, e.g., “window 1,” “window 2,” “window 3,” “group 1,” etc.

[0183] In some embodiments, the system acts to ensure that a user is authorized to make a particular command before the command is executed. This can prevent unauthorized users from making changes to the optically switchable devices and/or other controllable devices. For example, permissions may be set such that only a driver, owner, lessor, etc. of a vehicle may make changes to controllable devices. It may be desirable to ensure that people who do not have authority to change the optical state of the optically switchable devices are prevented from doing so, e.g. for security or safety purposes. For example, it may be beneficial to ensure that the (e.g., only) people who are able to initiate a change in the target (e.g., optical transitions) via voice or gesture command are authorized to do so.

[0184] In some embodiments, authorization is done by having a user “log in” to the system to identify himself or herself. This may be done by logging into an application on an electronic device (e.g., smartphone, tablet, etc.), by keying in a code, electronically recognizing a code, by fingerprinting, eye pattern identification, facial identification, or voicing a passcode, etc. In another example, voice recognition may be used to confirm the identity of a user. In a further example, facial recognition, fingerprint scanning, retinal scanning, or other biometric-based methods may be used to confirm the identity of a user. Different authorization procedures may be best suited for different applications and/or contexts. In a particular example, a user may be automatically authorized. Such authorization may be based at least in part on a physical authorization token (e.g., an RFID badge, a BLE beacon, UWB beacon, etc. having appropriate identification information), and the proximity of the physical authorization token to a sensor that reads the token. The sensor may be provided on an optically switchable device or adjacent thereto (e.g., in a frame portion of the IGU such as in a mullion), on a controller in communication with the optically switchable device, on a wall unit in communication with the optically switchable device, etc. The verification may occur locally (e.g., on the sensor that reads the token, on an optically switchable device, on a controller, on a wall unit, etc.), and/or in the cloud.

[0185] In some embodiments, authorization occurs whenever it is needed, and authorization may expire after a set amount of time has passed, or after the user has been idle for a set amount of time (e.g., after 24 hours, or after 1 hour, or after 10 minutes). In some embodiments, authorization may occur when a driver or passenger first sits in a vehicle, and the authorization may expire when the driver or passenger exits the vehicle. The time period used for auto-logging out may depend on the setting in which the target(s) (e.g., windows) are installed or projected, for example, whether the target(s) (e.g., windows) are in a public area or a private area. In some cases, authorization may not expire until a user logs out (e.g., using any available method including, but not limited to, orally requesting a logout, pressing a logout button, etc.). In some embodiments, authorization occurs each time a command is made. In some embodiments, authorization occurs in stages even when interpreting a single command. In a first authorization stage, it may be determined whether the user has authorization to make any changes on the network, and in a second authorization stage, it may be determined whether the user has authorization to make the particular change that the user has requested and/or initiated.

[0186] In some embodiments, the authorization process is used to limit the dictionaries used to interpret the voice and/or gesture commands. For example, the dictionary or dictionaries for a particular user may exclude one or more specified targets (e.g., optically switchable devices (or groups/zones of such devices)) that the user is not authorized to control. In one example, a user may be only authorized to control the optically switchable devices in zone 1 and zone 2, so the dictionary or dictionaries used to interpret commands for this user may include “zone 1” and “zone 2” while excluding “zone 3.” For example, in some embodiments, users corresponding to passengers of a vehicle may be authorized to control passenger side windows or controllable devices and may not be authorized to control driver side windows or controllable devices. Any other words needed to interpret and/or understand the command may also be included in the dictionary.

[0187] In some embodiments, a voice and/or gesture control system includes several modules that may be used when practicing the disclosed voice and/or gesture control embodiments. These modules may be implemented separately or together, as appropriate for a particular application. The modules may be provided in separate pieces of hardware, and/or may control a variety of processors. The modules may be executed concurrently or non-concurrently (e.g., sequentially). A module may be independently implemented on a controller (e.g., the window controller, the network controller, and/or the master controller), an optically switchable device, a wall device, a router, a remote processor, and/or any other target (e.g., as disclosed herein). In some embodiments, one or more of the modules are implemented on a processor and/or a processing unit of a media controller or of a window controller. Within each module, any relevant processing may be done locally and/or remotely. The processing may be done in a central location and/or device, or it may be distributed throughout a number of locations and/or devices.

[0188] In some embodiments, the voice and/or gesture control system includes a voice recognition module which converts and/or transcribes speech to text. In other words, the input to this module may be speech (spoken by a user and captured/recorded by a microphone), and the output from this module may be a text string or file. This module may be implemented using a number of commercially available speech to text products, services, and/or libraries. As one example, Carnegie Mellon University of Pittsburgh, PA provides a number of open source speech software resources that may be used such as CMU Sphinx. Additional examples include various Dragon products available from Nuance Communications, Inc. in Burlington, MA, and Tazti, available from Voice Tech Group, Inc. of Cincinnati, OH. The voice recognition module may also be implemented using custom software designed specifically for voice control related to optically switchable devices and/or other controllable devices.

[0189] In some embodiments, the voice and/or gesture control system includes a command processing module which interprets text in order to determine the desired command instruction. In other words, the input to this module may be a text file (which may be generated by the voice recognition module), while the output may be a set of commands and/or instructions that can be interpreted by the window controller (or by another controller on the network) to cause the relevant target (e.g., sensor, emitter, media, media display, or optically switchable device) to initiate the requested command. This function may also be referred to as language processing or natural language processing. Similar to the speech recognition module, the command processing module may be implemented using a number of available products and/or services, or using software specifically developed for the particular application.

[0190] In some embodiments, the voice and/or gesture control system includes an authentication module which is used to practice the authorization and/or security techniques discussed herein. For example, the authorization module may be used to ensure that the person giving the command is authorized to make the command. The authentication module may comprise a blockchain procedure and/or embedded encryption key(s). The blockchain procedure may comprise (e.g., peer-to-peer) voting. The encryption key(s) may be linked to a target (e.g., a device). The authentication module may be designed to ensure that only authorized devices can connect to a given network, facility, and/or service. The module may compare the optically switchable device and/or other controllable device identified in the command to a list of optically switchable devices and/or other controllable device that the user is authorized to control. In cases where a user tries to control an optically switchable device and/or other controllable device that they are not authorized to control, the authentication module may be configured to notify the user (e.g., visually, in print, and/or aurally) that they are not authorized to control the relevant optically switchable device and/or other controllable device. In other cases, no action is taken when an un-authorized command is given (e.g., no notification to the user, and no change to the target status (e.g., no switching of the optically switchable device, no status change to a controllable device, etc.)). The authentication may consider the identification of the user data such as age of the user, whether the user is licensed to operate the vehicle, whether the user is currently a driver or operator of the vehicle or is a passenger of the vehicle, or the like. The identification of the user may be provided to the authentication module. Examples of authentication (e.g., using blockchain procedure) can be found in PCT patent application serial number PCT/US20/70123 that is incorporated herein by reference in its entirety.
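A hedged sketch of the authorization check follows: the target named in a command is compared against the devices a given user is allowed to control, and an unauthorized request is either silently dropped or answered with a notification. The roles, device names, and helper functions are hypothetical, not the application's actual data model.

```python
# Hypothetical authorization table: user role -> devices the role may control.
AUTHORIZED_TARGETS = {
    "rear_passenger": {"window 3", "window 4", "rear display"},
    "driver": {"window 1", "window 2", "window 3", "window 4", "front display"},
}

def is_authorized(user_role: str, target: str) -> bool:
    return target in AUTHORIZED_TARGETS.get(user_role, set())

def authorize(user_role: str, target: str, notify=print) -> bool:
    """Return True if allowed; otherwise notify the user (or take no action)."""
    if is_authorized(user_role, target):
        return True
    notify(f"You are not authorized to control {target}.")
    return False

authorize("rear_passenger", "window 1")   # prints a notification and returns False
```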

[0191] In some embodiments, the voice and/or gesture control system includes a command execution module which executes the commands on the relevant optically switchable device(s) and/or other controllable devices. The command may be executed on a master controller, network controller(s), and/or window controller(s). In one example, the command may be executed by instructing the master controller to send all windows in a particular group or zone to a desired tint level. Generally, the command may be executed on and/or by any of the control apparatus, or by any of the control methods described herein.

[0192] In some embodiments, the voice and/or gesture control system includes a response generation module that generates a response. The response can be communicated to the user by a response communication module. The response generated by the response generation module may be a text response (e.g., displayed optically, displayed in print, and/or sounded). The text response may be displayed to the user, e.g., optically on a screen, using the response communication module. For example, the response communication module may convert the text response into a speech response (e.g., in a sound file) that is played to the user. Any appropriate text-to-speech methods may be used to accomplish this. For example, the response communication module may convert the text response to hard print, e.g., on a paper. Generally, the response generation module and the response communication module may work together to generate and/or communicate a response to the user.

[0193] In some embodiments, a response may be provided to a query of the communication module (e.g., automatically, for example, by the control system), which response may be communicated via a response generation module. One purpose of the response generation module and/or the response communication module may be to notify the user what command has been understood by the control system. Similarly, any of these modules can be used to notify the user, e.g., regarding any action that the optically switchable device and/or other controllable device is taking in response to the user’s command. In one example, the response generation module may generate a response that repeats the basic command given by the user to alter a status of a target (e.g., “window 1 to tint 4” or “tint window 1 to tint 4 when it becomes sunny”). The response may then be communicated to the user via the response communication module. The response generation module and/or response communication module may be used to ask for clarification from the user. For instance, if it is unclear whether the user wants to change window 1 or window 2, the response generation module may be used to prompt the user for clarification and/or further information.
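As an illustrative sketch only, a response generation step might echo the understood command back to the user or ask for clarification when the target is ambiguous; the wording, the generate_response helper, and its reliance on the TintCommand sketch above are assumptions for illustration.

```python
def generate_response(command, candidates=()):
    """Return a text response confirming the command or asking for clarification."""
    if command is not None:
        return f"Switching {command.target} to tint {command.tint_level}."
    if candidates:
        return "Did you mean " + " or ".join(candidates) + "?"
    return "Sorry, I did not understand that command."

# Hypothetical usage with the TintCommand sketch above:
# generate_response(TintCommand("window 1", 4))            -> "Switching window 1 to tint 4."
# generate_response(None, ("window 1", "window 2"))        -> "Did you mean window 1 or window 2?"
```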

[0194] Fig. 7 shows an example voice and/or gesture control system 700 comprising various modules. Functional modules within control system 700 include voice recognition module 702, command processing module 704, authentication module 706, command execution module 708, response generation module 710, and response communication module 712.

[0195] In operation of some embodiments, the voice and/or gesture control system implements a method for controlling (e.g., altering) a status of a target, e.g., controlling one or more devices using voice control. At least one microphone may be configured and positioned to receive voice commands. The microphone may be located at any portion of a vehicle in which the target is disposed, for example, in an enclosure where the target is disposed, on the target itself (e.g., on an optically switchable device), on a wall device, or on another electronic device such as a smartphone, tablet, laptop, PC, etc. One example command is “turn window 1 to tint 4.” If listening mode is on, the microphone is able to listen for and/or record voice commands from a user. Once recorded, the voice command may be converted and/or transcribed into a text command.

[0196] In some embodiments, the voice-to-text conversion is influenced by one or more dictionaries as described above. For example, words or phrases that sound similar to words or phrases stored in the relevant dictionary may be converted to the words/phrases stored in the dictionary, even if not exactly the same. In a particular example, a user gives the command to “switch window 1 to tint 4,” but the voice recognition module initially interprets the command as “switch window 1 to tint floor.” If the relevant dictionary or dictionaries associated with the voice recognition module defines phrases such as “window 1,” “window 2,” “tint 1,” “tint 2,” “tint 3,” and “tint 4,” but does not include any phrases with the word “floor,” the voice recognition module may recognize that the user likely said “tint 4” rather than the initially understood “tint floor,” which has no relevant meaning in the associated dictionary or dictionaries. In other words, the results of the speech-to-text operation may be limited or otherwise influenced by the relevant dictionaries being used.
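One illustrative way to bias recognition toward in-dictionary phrases is to snap each transcribed phrase to its closest dictionary entry. The sketch below uses difflib purely as a textual stand-in for whatever acoustic or phonetic scoring an actual recognizer performs; the dictionary contents and cutoff are assumptions.

```python
import difflib

DICTIONARY = ["window 1", "window 2", "tint 1", "tint 2", "tint 3", "tint 4"]

def snap_to_dictionary(phrase: str, cutoff: float = 0.6) -> str:
    """Replace a transcribed phrase with the closest in-dictionary phrase, if any.

    Note: difflib compares spellings, not sounds; resolving "floor" to "4" as in
    the example above would require phonetic/acoustic scoring, which this
    textual stand-in does not attempt.
    """
    matches = difflib.get_close_matches(phrase, DICTIONARY, n=1, cutoff=cutoff)
    return matches[0] if matches else phrase

print(snap_to_dictionary("windoe 1"))         # -> "window 1"
print(snap_to_dictionary("lower the blinds")) # no close match -> returned unchanged
```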

[0197] In some embodiments, the text command is interpreted. This interpretation may be done by the command processing module. Like the voice-to-text conversion, the interpretation of the text command may be influenced by the dictionary or dictionaries being used. This operation may involve specifically identifying which target or targets (e.g., optically switchable device or devices) the user is requesting to change, and/or identifying the particular requested change.

[0198] In some embodiments, it is determined whether the user is authorized to make the requested command. The authorization may be done by the authentication module, for example. If the user is not authorized to make the requested command, operation may end where either (1) nothing happens, or (2) a response is generated to notify the user that they are unauthorized to make the command. The response may be provided visually (e.g., through a visual display (e.g., on or adjacent to an optically switchable window), a wall device, or other electronic device), in print form, and/or aurally (e.g., by playing a sound file via speakers on an optically switchable device, wall device, or other electronic device).

[0199] In some embodiments, a response to the user is generated if the user is authorized to make the requested command. The response may be generated by the response generation module. The response may confirm that the requested command is taking place. The response may be communicated to the user by the response communication module. The response may be presented to the user visually (e.g., on a display), in print form (e.g., hard print), and/or aurally (e.g., via speakers). The display and/or speakers may be provided on an optically switchable device, a wall device, or other electronic device (e.g., smartphone, tablet, laptop, PC, etc.).

[0200] Fig. 8 illustrates a flowchart for a method 800 of controlling one or more optically switchable devices (e.g., electrochromic windows) and/or other controllable devices using voice control. The method 800 begins at operation 801, when a user provides a voice command. The voice command may be given in a variety of ways depending on the configuration of the voice control system and the robustness of the voice control processing, for instance.

[0201] Next, at operation 803 it is determined whether listening mode is on. When listening mode is on, the microphone can listen for and/or record voice commands from a user. When listening mode is off, the microphone can be off or otherwise not accepting voice commands related to the optically switchable devices. One example where the microphone can remain “on” while listening mode is “off,” is when the microphone is located in a user’s cell phone and the user is making an unrelated call on their cell phone. The determination in operation 803 may be made passively. If listening mode is not on (e.g., is “off”), the microphone will not pick up and/or record the voice command that was made in operation 801, and nothing will happen, as indicated at operation 804. In some embodiments, a user may optionally activate listening mode manually, as indicated at operation 802. Where this is the case, the method may continue at operation 801 where the user repeats the command. If listening mode is on at operation 803, the method continues with operation 805, where the voice command is converted/transcribed into a text command. The voice-to-text conversion may be done by the voice recognition module.

[0202] Next, at operation 807, the text command is interpreted. This interpretation may be done by the command processing module. Like the voice-to-text conversion discussed in relation to operation 805, the interpretation of the text command in operation 807 may be influenced by the dictionary or dictionaries being used. This operation may involve specifically identifying which optically switchable device or devices the user is requesting to change and identifying the particular requested change. For instance, if the command provided by the user is “switch window 1 to tint 4,” the interpretation may involve determining (1) that the user is requesting a change for window 1, and (2) that the requested change relates to switching the window to tint state 4.

[0203] The text command interpretation at operation 807 (as well as the voice-to-text conversion at operation 805) may be influenced by user preferences and/or user permissions. For instance, if a user makes a voice command to “make the windows darker,” the system may interpret which windows are desired to be switched based at least in part on which windows the user typically switches and/or based at least in part on which windows the user is allowed to switch. As another example, if a user issues a voice command to “play Movie X,” the system may determine which media displays are to be used to present the identified media content (e.g., a rear passenger media display, etc.) based at least in part on a location of the user that issued the voice command, whether the vehicle is in motion, or the like.

[0204] At operation 809, it is determined whether the user is authorized to make the requested command. The authorization may be done by the authentication module, for example. If the user is not authorized to make the requested command, the method ends at operation 810 where either (1) nothing happens, or (2) a response is generated to notify the user that they are unauthorized to make the command. The response may be provided visually (e.g., through a visual display on an optically switchable window, a wall device, or other electronic device) and/or aurally (e.g., by playing a sound file via speakers on an optically switchable device, wall device, or another electronic device). Further details related to response generation are provided below.

[0205] If the user is authorized to make the requested command, the method can continue at operation 811, where the text command is executed. The command may be executed using any of the methods and systems described herein. The command may be executed using the command execution module. In some embodiments, the command may be executed over a network on which the optically switchable device is installed, and may involve one or more window controllers, network controllers, and/or master controllers. For example, operation 811 involves carrying out the command requested by the user in operation 801.

[0206] At operation 813, a response to the user is generated. The response may be generated by the response generation module. The response may confirm that the requested command is taking place. The response may specifically indicate the content of the command such that the user knows whether she was understood correctly. One example response may be “switching window 1 to tint 4.” A simpler positive response such as “ok,” or a green light and/or a tone may let the user know she was heard, without specifically repeating the content of the command (e.g., using the response generation module and/or the response communication module). In a particular example, the response may include a request that the user confirm that the system has correctly understood the desired command. In such a case, the command may not be executed until such confirmation is received from the user.

[0207] At operation 815, the response is communicated to the user. The response may be communicated to the user by the response communication module. The response may be presented to the user visually (e.g., on a display) and/or aurally (e.g., via speakers). The display and/or speakers may be provided on an optically switchable device, a wall device, or other electronic device (e.g., smartphone, tablet, laptop, PC, etc.). The display and/or speakers may be provided in the same unit as the microphone, or they may be provided in separate units. In certain cases where an aural response is provided, the response generation may involve generating the desired text of the response (e.g., using the response generation module), and then generating and playing a sound file that corresponds to the desired text (e.g., using response communication module). The method 800 may be practiced in a variety of ways. In some embodiments, certain operations occur in a different order from what is shown in Fig. 8.
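As an editorial sketch of how the operations of Fig. 8 might be wired together in software, the following function assumes pluggable transcribe/interpret/is_authorized/execute/respond callables standing in for the modules described above; none of these names come from the application itself.

```python
def handle_voice_command(audio, listening_mode, transcribe, interpret,
                         is_authorized, execute, respond):
    """Rough analogue of operations 801-815: transcribe, interpret, authorize,
    execute, and confirm a voice command."""
    if not listening_mode:                        # operations 803/804: nothing happens
        return None
    text = transcribe(audio)                      # operation 805: voice to text
    command = interpret(text)                     # operation 807: text to command
    if command is None:
        return respond("Sorry, I did not understand that command.")
    if not is_authorized(command):                # operations 809/810
        return respond("You are not authorized to make that command.")
    execute(command)                              # operation 811
    return respond(f"Switching {command.target} to tint {command.tint_level}.")  # 813/815
```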

[0208] In some embodiments, the voice control method involves using two or more dictionaries. Fig. 9 illustrates a flowchart for an example of a method 900 for controlling one or more optically switchable devices and/or other controllable devices using two or more voice-control-related dictionaries. The method 900 of Fig. 9 is similar to the method 800 of Fig. 8, except that the command is interpreted in a piecemeal fashion, with different dictionaries applying to different portions of the command. Many of the operations illustrated in Fig. 9 are the same as those presented in Fig. 8, and for the sake of brevity the description will not be repeated.

[0209] In an embodiment of method 900, after it is determined that the listening mode is on in operation 903, part 1 of the voice command is converted to part 1 of the text command using a first dictionary in operation 925. The particular dictionary that is used may correspond to the part of the text that is being interpreted. Next, it is determined whether there are additional parts of the voice command to interpret/convert to text in operation 926. If there are additional parts of the voice command to interpret, the method continues at operation 927, where the dictionary is optionally switched to another dictionary. The next dictionary that is chosen may correspond to the next part of the command that is to be interpreted. The method then continues back at operation 925, where part 2 of the voice command is converted to part 2 of the text command, optionally using a different dictionary than was used in connection with part 1 of the command. The loop of operations 925, 926, and 927 continues until all of the parts of the command have been converted to text using the appropriate dictionaries.

[0210] In one example, the full voice command is “switch window 1 to tint 4.” One part of the voice command (e.g., part 1) may relate to identifying which optically switchable devices the user desires to switch, in this case “window 1.” Another part of the voice command (e.g., part 2) may relate to identifying what the desired command/ending optical state is, in this case switching to “tint 4.” The different parts of the command may be structured as desired for a particular system. More structured commands may be easier to process and/or interpret, which may make local processing a more attractive option. Less structured commands may be harder to process and/or interpret, which may make remote processing a more attractive option.

[0211] In some embodiments, after all parts of the voice command have been converted to text, the different parts of the text command are joined together to define the full text command, and the method continues at operation 907. The remaining portions of the method are the same as those described in relation to Fig. 8.
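The loop of operations 925 through 927, followed by joining the parts at operation 907, could be sketched as below. The split into a target dictionary and a tint-state dictionary, and the use of difflib for matching, are assumptions made only for illustration.

```python
import difflib

def convert_piecemeal(voice_parts, dictionaries):
    """Convert each part of a spoken command with its own dictionary, then join
    the parts into the full text command (operations 925-927, then 907)."""
    text_parts = []
    for part, dictionary in zip(voice_parts, dictionaries):   # dictionary switches per part
        matches = difflib.get_close_matches(part, dictionary, n=1, cutoff=0.0)
        text_parts.append(matches[0] if matches else part)
    return " ".join(text_parts)

# Hypothetical usage: part 1 names the target, part 2 names the requested tint state.
target_dictionary = ["window 1", "window 2"]
state_dictionary = ["to tint 1", "to tint 2", "to tint 3", "to tint 4"]
print(convert_piecemeal(["windoe 1", "to tint 4 please"],
                        [target_dictionary, state_dictionary]))   # -> "window 1 to tint 4"
```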

[0212] The examples in Figs. 8 and 9 were provided for a target that is an IGU and the status change is a tint change of the IGU. Any status change to any target can be implemented in a similar manner.

[0213] In some embodiments where gesture command is used in place of voice command, a mobile circuitry, or a sensor (e.g., of a camera) may be used instead of (or in addition to) a microphone, in order to perceive and record the user’s command. The mobile circuitry may be communicatively coupled to the network that is communicatively coupled to a digital twin of the facility (e.g., enclosure) in which the target is disposed. Instead of a voice recognition module, a gesture recognition module may be employed for analyzing the mobile circuitry and/or sensor (e.g., camera) data. For example, a user may be positioned within a field of view of a camera so that movements of the user can be captured which are carried out according to a desired control action to be taken in connection with controllable targets (e.g., devices) such as tintable windows. For example, movements of the user can be captured by the mobile device manipulated by the user (e.g., moved by the user) which are carried out according to a desired control action to be taken in connection with controllable targets (e.g., devices) such as tintable windows.
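As a purely illustrative sketch of the gesture path, a gesture recognition module might map classified gestures (from a camera-based classifier or from motion data reported by the mobile circuitry) to the same kind of command structures used for voice; the gesture labels and mapping below are hypothetical.

```python
# Hypothetical mapping from recognized gesture labels to control commands.
GESTURE_COMMANDS = {
    "swipe_down": {"target": "window 1", "action": "tint", "level": 4},
    "swipe_up":   {"target": "window 1", "action": "tint", "level": 1},
    "palm_hold":  {"target": "rear display", "action": "pause_media"},
}

def interpret_gesture(gesture_label):
    """Return the command associated with a recognized gesture, if any."""
    return GESTURE_COMMANDS.get(gesture_label)

print(interpret_gesture("swipe_down"))  # -> {'target': 'window 1', 'action': 'tint', 'level': 4}
```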

[0214] Fig. 10 shows an example of a user device 1005 connected to an access point 1010, which is further connected to a switch 1015. Switch 1015 may be connected to both router 1020 and controller (i.e., control unit) 1025. Router 1020 may include firewall protection to enhance security. The controller 1025 may be a window controller, network controller, or master controller. If the controller 1025 is not a window controller, the controller 1025 may relay instructions to relevant window controllers over the network.

[0215] Fig. 11A shows an example wherein the device 1105 is connected to access point 1110, which is connected to controller 1125. Each of these connections may be wired and/or wireless. Fig. 11B shows an example wherein the device 1105 is directly connected to the controller 1125. This connection may be wired and/or wireless. Fig. 11C shows an example wherein device 1105 is connected to the cloud 1130 (e.g., the Internet). The cloud 1130 is also connected with router 1120, which is connected to switch 1115, which is connected to controller 1125. The connections may be wired and/or wireless, as appropriate for a particular application. In a particular example, the device 1105 can be a smartphone, which connects wirelessly (e.g., via a communication network that is capable of transmitting at least a third, fourth, or fifth generation communication (e.g., 3G, 4G, or 5G communication)) with the cloud 1130.

[0216] In some embodiments, the interactive systems to be controlled by a user include media (e.g., visual and/or audio content) for display, e.g., to vehicle occupants. The display may include stills or video projection arrangements. The display may include transparent organic light-emitting devices (TOLED). The display may be integrated as a display construct with window panel(s) (e.g., frame(s)). Examples of display constructs can be found in U.S. provisional patent application serial number 62/975,706 filed on February 12, 2020, titled “TANDEM VISION WINDOW AND MEDIA DISPLAY,” that is incorporated herein in its entirety.

[0217] In some embodiments, a display construct is coupled with a viewing (e.g., a tintable viewing) window. The viewing window may include an integrated glass unit (IGU). The display construct may include one or more glass panes. The display (e.g., display matrix) may comprise a light emitting diode (LED). The LED may comprise an organic material (e.g., organic light emitting diode abbreviated herein as “OLED”). The OLED may comprise a transparent organic light emitting diode display (abbreviated herein as “TOLED”), which TOLED is at least partially transparent. The display may have at its fundamental length scale 2000, 3000, 4000, 5000, 6000, 7000, or 8000 pixels. The display may have at its fundamental length scale any number of pixels between the aforementioned number of pixels (e.g., from about 2000 pixels to about 4000 pixels, from about 4000 pixels to about 8000 pixels, or from about 2000 pixels to about 8000 pixels). A fundamental length scale may comprise a diameter of a bounding circle, a length, a width, or a height. The fundamental length scale may be abbreviated herein as “FLS.” The display construct may comprise a high resolution display. For example, the display construct may have a resolution of at least about 550, 576, 680, 720, 768, 1024, 1080, 1920, 1280, 2160, 3840, 4096, 4320, or 7680 pixels, by at least about 550, 576, 680, 720, 768, 1024, 1080, 1280, 1920, 2160, 3840, 4096, 4320, or 7680 pixels (at 30Hz or at 60Hz). The first number of pixels may designate the height of the display and the second number of pixels may designate the length of the display. For example, the display may be a high resolution display having a resolution of 1920 x 1080, 3840 x 2160, 4096 x 2160, or 7680 x 4320. The display may be a standard definition display, enhanced definition display, high definition display, or an ultra-high definition display. The display may be rectangular. The image projected by the display matrix may be refreshed at a frequency (e.g., at a refresh rate) of at least about 20 Hz, 30 Hz, 60 Hz, 70 Hz, 75 Hz, 80 Hz, 100 Hz, or 120 Hertz (Hz). The FLS of the display construct may be at least 20”, 25”, 30”, 35”, 40”, 45”, 50”, 55”, 60”, 65”, 80”, or 90 inches (“). The FLS of the display construct can be of any value between the aforementioned values (e.g., from about 20” to about 55”, from about 55” to about 100”, or from about 20” to about 100”).

[0218] In some embodiments, at least a portion of a window surface in a vehicle is utilized to display the various media using the glass display construct. The display may be utilized for (e.g., at least partial) viewing an environment external to the window (e.g., outdoor environment), e.g. when the display is not operating. The display may be used to display media (e.g., as disclosed herein), to augment the external view with (e.g., optical) overlays, augmented reality, and/or lighting (e.g., the display may act as a light source). The media display may be used for entertainment and non-entertainment purposes. The media display may be used for video conferencing. For example, the media display may be used for work (e.g., data analysis, drafting, and/or video conferencing). For example, the media display may be used for educational, health, safety, purchasing, monetary, or entertainment purposes. The media may present occupants of the vehicle in which the media display is disposed and remote users, e.g., in a collage, overlaid, and/or bifurcated display. The media may be manipulated (e.g., by utilizing the display construct). Utilizing the display construct can be direct or indirect. Indirect utilization of the media may be using an input device such as an electronic mouse, or a keyboard. The input device may be communicatively (e.g., wired and/or wirelessly) coupled to the media display. Direct utilization may be by using the display construct as a touch screen using a user (e.g., finger) or a contacting device (e.g., an electronic pen or stylus).

[0219] In some embodiments, the media may be displayed by a transparent media display construct. The transparent display construct that is configured to display media, may be disposed on, or coupled (e.g., attached) to, a window, a door, a wall, a divider, a seat or seat back, a dashboard, or to any other physical element of a vehicle. The physical element may be a fixture or a non-fixture. The physical element (e.g., window, wall, or divider) may be static or mobile (e.g., a moving window, door, or partition). The physical element may comprise a tintable window. The physical element may comprise a tintable substance (e.g., an optically switchable device such as an electrochromic device). The optically switchable device may alter its transparency, absorbance, or color, e.g., at least in the visible spectrum. A user may control the usage of the media display and/or tint state of the tintable window, e.g., separately or as linked to each other. A user in one enclosure looking out of the enclosure through the transparent media display, may optionally see both the media, and the external environment of the enclosure through the media display.

[0220] Embodiments described herein relate to vision windows with a tandem (e.g., transparent) display construct. In certain embodiments, the vision window is an electrochromic window. The electrochromic window may comprise a solid state and/or inorganic electrochromic (EC) device. The vision window may be in the form of an insulated glass unit (IGU). When the IGU includes an electrochromic (abbreviated herein as “EC”) device, it may be termed an “EC IGU.” The EC IGU can tint (e.g., darken) an enclosure (e.g., vehicle) in which it is disposed and/or provide a tinted (e.g., darker) background as compared to a non-tinted IGU. The tinted IGU can provide a background preferable (e.g., necessary) for acceptable (e.g., good) contrast on the (e.g., transparent) display construct.

In another example, windows with (e.g., transparent) display constructs can replace televisions (abbreviated herein as “TVs”) in commercial and residential applications. Together, the (e.g., transparent) display construct and EC IGU can provide visual privacy glass function, e.g. because the display can augment the privacy provided by EC glass alone.

[0221] One embodiment, depicted in FIG. 12, includes an electrochromic (EC) window lite, or IGU or laminate, combined with a transparent display. The transparent display area may be co-extensive with the EC window viewable area. An electrochromic lite, 1210, including a transparent pane with an electrochromic device coating thereon and bus bars for applying driving voltage for tinting and bleaching, is combined with a transparent display panel, 1220, in a tandem fashion. In this example, 1210 and 1220 are combined using a sealing spacer, 1230, to form an IGU, 1200. The transparent display may be a standalone lite for the IGU, or may be, e.g., a flexible panel laminated or otherwise attached to a glass lite, and that combination is the other lite of the IGU. In some embodiments, the transparent display is the, or is on the, inboard lite of the IGU, for use by the vehicle occupants. In other embodiments, an electrochromic device coating and transparent display mechanism are combined on a single substrate. In some embodiments, a laminate, rather than an IGU, is formed from 1210 and 1220, without a sealing spacer.

[0222] The transparent display can be used for many purposes. For example, the display can be used for conventional display or projection screen purposes, such as displaying video, presentations, digital media, teleconferencing, web-based meetings including video, security warnings to occupants and/or people outside the vehicle (e.g., emergency response personnel), and the like. The transparent display can also be used for displaying controls for the display, the electrochromic window, an electrochromic window control system, a security system, and/or the like. In certain embodiments, the transparent display can be used as a physical alarm element. That is, the electrochromic lite of an IGU can be used as a breakage detector to indicate a security breach of the vehicle (e.g., a smashed window). The transparent display could also, alone or in combination with the electrochromic lite, serve this function. In one example, the electrochromic lite is used as a breakage detection sensor, i.e., breaking the EC pane triggers an alarm. The transparent display may also serve this function, and/or be used as a visual alarm indicator, e.g., displaying information to occupants and/or external emergency personnel. For example, in certain implementations, a transparent display may have a faster electrical response than the electrochromic lite, and thus could be used to indicate alarm status, for example, externally to firefighters, etc., or internally to occupants, e.g., to indicate the nature of the threat and/or escape routes. In one embodiment, breakage of the outboard electrochromic lite sends a signal to the transparent display, via the window controller, such that the transparent display conveys a security breach. In one embodiment, the transparent display flashes a warning message and/or flashes red, e.g., the entire transparent display pane may flash brightly in red to indicate trouble and be easily seen, e.g., a large window flashing in this manner would be easily noticeable to occupants and/or outside personnel. In another example, one or more neighboring windows may indicate damage to a window. In certain embodiments, one or more transparent displays may be used to display a message to first responders, indicating both the location and nature of the emergency.

[0223] The electrochromic window can be used as a contrast element to aid visualization of the transparent display, e.g., by tinting the EC pane the transparent display will have higher contrast. In turn, the transparent display can be used to augment the color, hue, %T, switching speed, etc. of the electrochromic device. There are many novel symbiotic relationships that can be exploited by the combination of EC window and transparent display technology. When the EC pane and the transparent display are both in their clear state, IGU 1200 appears and functions as a conventional window. Transparent display 1220 may have some visually discernable conductive grid pattern but otherwise is transparent, and can be uni- or bidirectional in the display function. One of ordinary skill in the art would appreciate that as transparent display technology advances, the clarity and transparency of such devices will improve. Improvements in micro and nanostructured addressable grids, as well as transparent conductor technology, allow for transparent displays where there is no visually discernable conductive grid.

[0224] Figure 13 depicts an electrochromic insulated glass unit, 1350, with an on-glass transparent display, 1375, used as a control interface for IGU 1350. Display 1375 may be wired to an onboard controller which is, e.g., housed in the secondary sealing volume of the IGU. The wiring for the transparent display 1375 may pass through the glass, around the edge of the glass, or may be wirelessly connected to the onboard (or offboard) controller (not shown). When the transparent display 1375 is not in use, it is essentially transparent and colorless, so as not to detract from the aesthetics of the IGU’s viewable area. Transparent display 1375 may be adhesively attached to the glass of the IGU. Wiring to the control unit of the window may pass around or through the glass upon which the display is attached. The display may communicate with a window controller or control system wirelessly via one or more antenna, which may also be transparent.

[0225] In some embodiments, the display construct comprises a hardened transparent material such as plastic or glass. The glass may be in the form of one or more glass panes. For example, the display construct may include a display matrix (e.g., an array of lights) disposed between two glass panes. The array of lights may include an array of colored lights. For example, an array of red, green, and blue colored lights. For example, an array of cyan, magenta, and yellow colored lights. The array of lights may include light colors used in electronic screen display. The array of lights may comprise an array of LEDs (e.g., OLEDs, e.g., TOLEDs). The matrix display (e.g., array of lights) may be at least partially transparent (e.g., to an average human eye). The transparent OLED may facilitate transition of a substantial portion (e.g., greater than about 30%, 40%, 50%, 60%, 80%, 90% or 95%) of the intensity and/or wavelength to which an average human eye senses. The matrix display may form minimal disturbance to a user looking through the array. The array of lights may form minimal disturbance to a user looking through a window on which the array is disposed. The display matrix (e.g., array of lights) may be maximally transparent. At least one glass pane of the display construct may be of a regular glass thickness. The regular glass may have a thickness of at least about 1 millimeters (mm), 2mm, 3mm, 4mm, 5mm, or 6 mm. The regular glass may have a thickness of a value between any of the aforementioned values (e.g., from 1mm to 6mm, from 1mm to 3mm, from 3mm to about 4mm, or from 4mm to 6mm). At least one glass pane of the display construct may be of a thin glass thickness. The thin glass may have a thickness of at most about 0.4 millimeters (mm), 0.5 mm, 0.6 mm, 0.7 mm, 0.8mm, or 0.9mm thick. The thin glass may have a thickness of a value between any of the aforementioned values (e.g., from 0.4mm to 0.9mm, from 0.4mm to 0.7mm, or from 0.5mm to 0.9mm). The glass of the display construct may be at least transmissive (e.g., in the visible spectrum). For example, the glass may be at least about 80%, 85%, 90%, 95%, or 99% transmissive. The glass may have a transmissivity percentage value between any of the aforementioned percentages (e.g., from about 80% to about 99%). The display construct may comprise one or more panes (e.g., glass panes). For example, the display construct may comprise a plurality (e.g., two) of panes. The glass panes may have (e.g., substantially) the same thickness, or different thickness. The front facing pane may be thicker than the back facing pane. The back facing pane may be thicker than the front facing pane. Front may be in a direction of a prospective viewer (e.g., in front of display construct 101, looking at display construct 101). Back may be in the direction of a (e.g., tintable) window (e.g., 102). One glass may be thicker relative to another glass. The thicker glass may be at least about 1.25*, 1.5*, 2*, 2.5*, 3*, 3.5*, or 4* thicker than the thinner glass. The symbol “*” designates the mathematical operation of “times.” The transmissivity of the display construct (that including the one or more panes and the display matrix (e.g., light-array or LCD)) may be of at least about 20%, 30%, 35%, 40%, 45%, 50%, 60%, 70%, 80%, or 90%. 
The display construct may have a transmissivity percentage value between any of the aforementioned percentages (e.g., from about 20% to about 90%, from about 20% to about 50%, from about 20% to about 40%, from about 30% to about 40%, from about 40% to about 80%, or from about 50% to about 90%). A higher transmissivity percentage refers to a higher intensity and/or broader spectrum of light that passes through a material (e.g., glass). The transmissivity may be of visible light. The transmissivity may be measured as visible transmittance (abbreviated herein as “Tvis”) referring to the amount of light in the visible portion of the spectrum that passes through a material. The transmissivity may be relative to the intensity of incoming light. The display construct may transmit at least about 80%, 85%, 90%, 95%, or 99% of the visible spectrum of light (e.g., wavelength spectrum) therethrough. The display construct may transmit a percentage value between any of the aforementioned percentages (e.g., from about 80% to about 99%). In some embodiments, instead of an array of lights, a liquid crystal display is utilized.

[0226] In some embodiments, diverse types of interfaces are employed for providing user control of interactive targets (e.g., systems, devices, and/or media). The interactive targets can be controlled, e.g., using control interface(s). The control interface may be local and/or remote. The control interface may be communicated through the network. The control system may be communicatively coupled to the network, to which the target(s) are communicatively coupled. An example of a control interface comprises manipulating a digital twin (e.g., representative model) of a vehicle. For example, one or more interactive devices (e.g., optically switchable windows, sensors, emitters, and/or media displays) may be controlled using a mobile circuitry. The mobile circuitry may comprise a gaming-type controller (e.g., a pointing device) or a virtual reality (VR) user interface. When an additional new device is installed in the vehicle and is coupled to the network, the new target (e.g., device) may be detected (e.g., and included into the digital twin). The detection of the new target and/or inclusion of the new target into the digital twin may be done automatically and/or manually. For example, the detection of the new target and/or inclusion of the new target into the digital twin may be without requiring (e.g., any) manual intervention.

[0227] In some embodiments, a digital twin comprises a digital model of the vehicle. The digital twin is comprised of a virtual three dimensional (3D) model of the vehicle. The vehicle may include static and/or dynamic elements. For example, the static elements may include representations of a structural feature of the vehicle and the dynamic elements may include representations of an interactive device with a controllable feature. The 3D model may include visual elements. The visual elements may represent vehicle fixture(s). The fixture may comprise a partition, a wall, a floor, a door, a baggage compartment, a fixed lamp, an electrical panel, a window, one or more seats, one or more drink holders, a front windshield, a rear windshield, mirrors (e.g., rearview and/or side view mirrors), or the like. In some instances (e.g., in a case of a cruise ship), the fixture may comprise a structural (e.g., walk-in) closet, a fixed lamp, an electrical panel, or an elevator shaft. The fixtures may be affixed to the structure. The visual elements may represent non-fixture(s). The non-fixtures may comprise a person or a media projection. The visual elements may represent features comprising a floor, wall, door, window, people, and/or interactive target(s). The digital twin may be similar to virtual worlds used in computer gaming and simulations, representing the environment of the real vehicle. The 3D model may comprise structural details related to the design of the vehicle, such as a 3D model, elevation details, floor plans, and/or project settings related to the vehicle. The 3D model may comprise annotation (e.g., with two dimensional (2D) drafting element(s)). The 3D model may facilitate access to information from a model database of the vehicle. The 3D model may be updated during the lifecycle of the vehicle. The update may occur periodically, intermittently, on occurrence of an event, in real time, on availability of manpower, and/or at a whim. The digital twin may comprise the 3D model, and may be updated in relation to (e.g., when) the 3D model of the vehicle is updated. The digital twin may be linked to the 3D model (e.g., and thus linked to its updates). In real time may include within at most 15 seconds (sec.), 30 sec., 45 sec., 1 minute (min), 2 min., 3 min., 4 min., 5 min., 10 min., 15 min., or 30 min. from the occurrence of a change in the enclosure (e.g., a change initiated by the user).

[0228] In some embodiments, the digital twin (e.g., 3D model of the vehicle) is defined at least in part by using one or more sensors (e.g., optical, acoustic, pressure, gas velocity, and/or distance measuring sensor(s)) to determine the layout of the real vehicle. Sensor data can be used exclusively to model the environment of the enclosure. Sensor data can be used in conjunction with a 3D model of the vehicle to model the environment of the enclosure. The model of the vehicle may be obtained before, during, and/or after the vehicle has been constructed. The model of the vehicle can be updated (e.g., manually and/or using the sensor data) during operation of the vehicle (e.g., in real time). In real time may include during occurrence of a change of, or in, the vehicle. In real time may include within at most 2h, 4h, 6h, 8h, 12h, 24h, 36h, 48h, 60h, or 72h from the occurrence of a change of, or in, the vehicle.

[0229] In some embodiments, dynamic elements in the digital twin include target (e.g., device) settings. The target setting may comprise (e.g., existing and/or predetermined): tint values, temperature settings, and/or light switch settings. The target settings may comprise available actions in media displays. The available actions may comprise menu items or hotspots in displayed content. The available actions may comprise initiating presentation of media content by a media display and/or manipulating playback of media content by the media display (e.g., pausing, stopping, adjusting volume, rewinding, fast-forwarding, and/or any other type of manipulation of playback). The digital twin may include virtual representation of the target and/or of movable objects (e.g., doors, a rearview mirror, side mirrors, seat backs, neck rests, arm rests, air vents, etc.), and/or occupants (actual images from a camera or from stored avatars). In some embodiments, the dynamic elements can be targets (e.g., devices) that are newly plugged into the network, and/or disappear from the network (e.g., due to a malfunction or relocation). The digital twin can reside in any circuitry (e.g., processor) operatively coupled to the network. The circuitry in which the digital twin resides may be in the vehicle, outside of the vehicle, and/or in the cloud. In some embodiments, a two-way link is maintained between the digital twin and a real circuitry. The real circuitry may be part of the control system. The real circuitry may be included in the master controller, network controller, intermediate controller, local controller, or in any other node in a processing system (e.g., in the vehicle or outside of the vehicle). For example, the two-way link can be used by the real circuitry to inform the digital twin of changes in the dynamic and/or static elements so that the 3D representation of the enclosure can be updated, e.g., in real time. In real time may include during occurrence of a change of, or in, the enclosure. In real time may include within at most 15 seconds (sec.), 30 sec., 45 sec., 1 minute (min), 2 min., 3 min., 4 min., 5 min., 10 min., 15 min., or 30 min. from the occurrence of a change in the enclosure. The two-way link may be used by the digital twin to inform the real circuitry of manipulative (e.g., control) actions entered by a user on a mobile circuitry. The mobile circuitry can be a remote controller (e.g., comprising a handheld pointer, manual input buttons, or touchscreen).
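The two-way link described above could be sketched, purely for illustration, as a pair of update paths: one for state changes reported by the real circuitry and one for user actions entered against the twin. The DigitalTwin class, its method names, and the FakeControlSystem stand-in are assumptions, not components of the application.

```python
class DigitalTwin:
    """Minimal sketch of a digital twin holding dynamic element state."""

    def __init__(self, control_system):
        self.control_system = control_system   # the real circuitry (two-way link)
        self.elements = {}                     # element id -> current state

    def report_state(self, element_id, state):
        """Real circuitry -> twin: keep the 3D representation current."""
        self.elements[element_id] = state

    def user_action(self, element_id, requested_state):
        """Twin -> real circuitry: forward a manipulation entered by a user."""
        self.control_system.send_command(element_id, requested_state)

class FakeControlSystem:
    def send_command(self, element_id, state):
        print(f"command: set {element_id} to {state}")

twin = DigitalTwin(FakeControlSystem())
twin.report_state("window 1", "tint 2")   # real device reports its present state
twin.user_action("window 1", "tint 4")    # user request relayed to the real window controller
```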

[0230] In some embodiments, one or more mobile circuitry devices of a user are aligned with (e.g., linked to) the virtual 3D “digital twin” model of the vehicle (or any portion thereof), e.g., via WiFi or other network connections. The mobile circuitry may comprise a remote (e.g., mobile) control interface. The mobile circuitry may include a pointer, gaming controller, and/or virtual reality (VR) controller. For example, the mobile circuitry may have no interaction with the physical vehicle, e.g., other than forwarding network communications via the aligned communication channel to and/or from the digital twin. The user interaction may not be direct and/or physical with any device being controlled in the enclosure. The interaction of the user with the target may be indirect. The interaction of the user with the target may be devoid of tactile touch, optical ray projection, and/or vocal sound. The control actions taken by the user to control the target may be based at least in part on a relative position of the digital circuitry manipulated by a user, relative to the modeled space in the digital twin (e.g., virtual movement within the modeled enclosure). The control actions taken by the user to control the target may not be based on (e.g., and may be oblivious to) the spatial relationship between the user and the digital twin. For example, a user may use a remote control pointing device, and point to a presentation portion. The presentation may be displayed on a TOLED display construct disposed in the line of sight between a user and a window (e.g., smart window). The coupling between the mobile circuitry and the target may be time based and/or may be action based. For example, the user may point the remote controller at the presentation and thereby couple with the presentation. The coupling may initiate when pointing persists for a duration that exceeds a duration threshold. The coupling may initiate by clicking the remote controller while pointing. The user may then point to a position that triggers a dropdown menu in the presentation. The dropdown menu may be visible (i) when the pointing exceeds a time threshold, (ii) when the user presses button(s) on the remote controller (e.g., action based), and/or (iii) when the user performs a gesture (e.g., as disclosed herein). The user may then choose from the menu. The choice may be initiated (i) when the pointing exceeds a time threshold, (ii) when the user presses button(s) on the remote controller (e.g., action based), and/or (iii) when the user performs a gesture (e.g., as disclosed herein). The actions of the user done in conjunction with the mobile circuitry (e.g., remote controller) may be communicated to the network, and thereby to the digital twin, which in turn communicates with the target. Thus, the user may indirectly communicate with the target through the digital twin. The mobile circuitry (e.g., remote controller) may be located with respect to the enclosure at one time, at time intervals, and/or continuously. Once a relative location of the mobile circuitry (e.g., remote controller) with respect to the enclosure is determined, the user may use the remote controller anywhere (e.g., inside the enclosure, or outside of the enclosure). Outside of the enclosure may comprise in the vehicle or outside of the vehicle.

[0231] In some embodiments, the mobile circuitry (e.g., remote controller) can control a (e.g., any) interactive and/or controllable target (e.g., device) in the vehicle or any portion thereof, as long as (i) the target and (ii) the mobile circuitry (e.g., remote controller) are communicatively coupled to the digital twin (e.g., using the network). For example, the vehicle may comprise interactive targets comprising one or more sensors, emitters, tintable windows, or media displays, which devices are coupled to a communication network. In some embodiments, the user interacts with the digital twin from within the vehicle or from an (e.g., arbitrary) location outside the vehicle. For example, a remote controller device can comprise a virtual reality (VR) device, e.g., having a headset (e.g., a binocular display) and/or a handheld controller (e.g., motion sensor with or without input buttons). The mobile circuitry may comprise an Oculus Virtual Reality Player Controller (OVRPlayerController). In some embodiments, a remote control interface may be used which provides (i) visual representation to the user of the digital twin for navigation in the virtual vehicle, and/or (ii) user input actions for movement within the 3D model. The user input actions may include (1) pointing to an intended interactive target to be controlled (e.g., to alter a status of the target), (2) gestures, and/or (3) button presses, to indicate a selection action to be taken with the mobile circuitry (e.g., remote controller). The remote controller may be used to manipulate an interactive target by pointing towards it (e.g., for coupling), gesturing in other directions, and/or pressing one or more buttons operatively coupled to the mobile circuitry (e.g., buttons disposed on an envelope of the mobile circuitry). Interfacing between the mobile circuitry and the digital twin may not be carried out through a screen depicting the digital twin. Interfacing between the user and the digital twin may not be carried out through a screen showing the digital twin. Interfacing between the mobile circuitry and the digital model may not require (e.g., any) optical sensor as a facilitator. Some embodiments employ a different mode of input from augmented reality applications that operate through interaction with a screen (e.g., by using an optical sensor such as a camera).

[0232] In some embodiments, a mobile circuitry (e.g., handheld controller) without any display or screen is used, which display or screen may depict a digital representation of the enclosure and/or the target. For example, instead of virtual navigation within the enclosure by the user, the actual location of the user can be determined in order to establish the location of the user in the digital twin, e.g., to use as a reference in connection with a pointing action by the user. For example, the mobile circuitry (e.g., handheld controller) may include geographic tracking capability (e.g., GPS, UWB, BLE, and/or dead-reckoning) so that location coordinates of the mobile circuitry can be transmitted to the digital twin using any suitable network connection established by the user between the mobile circuitry and the digital twin. For example, a network connection may at least partly include the transport links used by a hierarchical controller network within a vehicle. The network connection may be separate from the controller network of the vehicle (e.g., using a wireless network such as a cellular network).

[0233] In some embodiments, a user may couple to a requested target. The coupling may comprise a gesture using the mobile circuitry. The coupling may comprise an electronic trigger in the mobile circuitry. The coupling may comprise a movement, pointing, clicking gesture, or any combination thereof. For example, the coupling may initiate at least in part by pointing to the target for a period of time above a threshold (e.g., that is predetermined). For example, the coupling may initiate at least in part by clicking a button (e.g., a target selection button) on a remote controller that includes the mobile circuitry. For example, the coupling may initiate at least in part by moving the mobile circuitry towards a direction of the target. For example, the coupling may initiate at least in part by pointing a frontal portion of the mobile circuitry in a direction of the target (e.g., for a time above a first threshold) and clicking a button (e.g., for a time above a second threshold). The first and second thresholds can be (e.g., substantially) the same or different.
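The dwell-time or button-press coupling described above might be tracked as in the following sketch; the CouplingTracker class, the threshold value, and the state names are illustrative assumptions only.

```python
import time

class CouplingTracker:
    """Couple to a target after it has been pointed at for longer than a threshold,
    or immediately if a selection button is pressed while pointing."""

    def __init__(self, dwell_threshold_s=1.5):
        self.dwell_threshold_s = dwell_threshold_s
        self._pointed_target = None
        self._pointed_since = None

    def update(self, pointed_target, button_pressed=False, now=None):
        now = time.monotonic() if now is None else now
        if pointed_target != self._pointed_target:   # pointing at a new (or no) target
            self._pointed_target = pointed_target
            self._pointed_since = now
        if pointed_target is None:
            return None
        if button_pressed or (now - self._pointed_since) >= self.dwell_threshold_s:
            return pointed_target                     # coupled
        return None                                   # still waiting for dwell or click

tracker = CouplingTracker()
tracker.update("window 1", now=0.0)          # start pointing
print(tracker.update("window 1", now=2.0))   # -> "window 1" (dwell threshold exceeded)
```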

[0234] Fig. 14 shows an example embodiment of a control system in which a real, physical enclosure (e.g., vehicle) 1400 includes a controller network for managing interactive network devices under control of a processor 1401 (e.g., a master controller). The structure and contents of enclosure 1400 are represented in a 3-D model digital twin 1402 as part of a modeling and/or simulation system executed in a computing asset. The computing asset may be co-located with or remote from enclosure 1400 and processor (e.g., master controller) 1401. A network link 1403 in enclosure 1400 connects processor 1401 with a plurality of network nodes including an interactive target 1405. Interactive target 1405 is represented as a virtual object 1406 within digital twin 1402. A network link 1404 connects processor 1401 with digital twin 1402.

[0235] In the example of Fig. 14, a user located in enclosure 1400 carries a handheld controller 1407 having a pointing capability (e.g., to couple with the target 1405). The location of handheld controller 1407 may be tracked, for example, via a network link with digital twin 1402 (not shown). The link may include some transport media contained within network 1403. Handheld controller 1407 is represented as a virtual handheld controller 1408 within digital twin 1402. Based at least in part on the tracked location and pointing capability of handheld controller 1407, when the user initiates a pointing event (e.g., aiming at a particular target and/or pressing an action button on the handheld controller), the event is transmitted to digital twin 1402. Accordingly, digital twin 1402 couples with the target (e.g., the pointing event is represented as a digital ray 1409 cast from the tracked location within digital twin 1402). Digital ray 1409 intersects with virtual device 1406 at a point of intersection 1410. A resulting interpretation of actions made by the user in the digital twin 1402 is reported by digital twin 1402 to processor 1401 via network link 1404. In response, processor 1401 relays a control message to interactive device 1405 to initiate a commanded action in accordance with a gesture (or other input action) made by the user.

[0236] Fig. 15 shows an example method corresponding to the embodiment of Fig. 14. For example, a user carrying a mobile circuitry (e.g., handheld remote controller) in an enclosure (e.g., a vehicle) represented by the digital twin, may wish to interact with a particular interactive target. In operation 1500, the user couples to the target, e.g., by pointing and/or clicking with the tracked remote controller to signify a requested control action. The mobile circuitry may couple to the target by pointing towards it (e.g., for a period of time longer than a threshold time). The mobile circuitry may couple to the target by a coupling command. The coupling command may comprise tactile, oral, visual, and/or written command. The coupling may comprise any voice and/or gesture command disclosed herein. The coupling may comprise pressing a button that is operatively (e.g., communicatively) coupled to the mobile circuitry, to the target, and/or to the digital twin.

[0237] In some embodiments, the mobile circuitry may be directional in at least two directions. For example, the mobile circuitry may have a front direction and a back direction. For example, the mobile circuitry may be able to distinguish between at least two, three, four, five, or six spatial directions. The directions may comprise up, down, front, back, right, or left. The directions may comprise north, south, east, and west. The directions may be relative directions, e.g., relative to the previous position of the mobile circuitry. The directions may be absolute directions (e.g., within a measurable error range). The directions may be in accordance with a Global Positioning System (GPS). Coupling of the mobile circuitry (e.g., remote controller) and the target (e.g., media projection) may comprise pointing a front direction of the mobile circuitry towards the target, e.g., for a time above a threshold. Using a network communication route from the remote controller to the digital twin, an intersection between the mobile circuitry and the target may be mapped digitally in the digital twin. The intersection may be from the tracked location of the mobile circuitry (e.g., handheld controller) along a digital ray indicated by pointing direction, e.g., to identify any requested interactive target (e.g., device and/or control element on a device). In the example shown in Fig. 15, a remote controller that is communicatively coupled to the digital twin (e.g., and tracked through a network communication route) points to a target disposed in an enclosure in operation 1501. A virtual digital ray can be envisioned from the pointed remote controller to the target towards which the remote controller directionally points. The network communication route may comprise a (e.g., separate) network connection. In operation 1502, it is determined whether any predetermined event (e.g., any control event) is associated with the point of intersection at the interactive target. For example, the point of intersection may indicate a particular air vent target. An event associated with pointing and/or clicking on the air vent may be a change in the on/off state of the air vent. If no associated event is found for the point of intersection, then no action is taken, and the method ends at an operation 1503. If an associated event is found, then the method proceeds to operation 1504 to send an event command from the digital twin to a processor (e.g., controller) operatively coupled with the air vent in the enclosure. In operation 1505, the processor receives the event command and triggers the associated event in the corresponding physical enclosure. Triggering the associated event may be by sending the command to an appropriate controller for the interactive device (e.g., a tint command sent to a corresponding window controller).
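Operations 1501 through 1505 could be approximated, for illustration only, by casting a ray from the tracked controller position and testing it against bounding volumes of virtual objects in the twin, then looking up the event associated with whatever the ray hits. The geometry, object names, and event table below are simplified assumptions.

```python
import numpy as np

# Virtual objects: axis-aligned bounding boxes in the digital twin, plus the
# event associated with pointing at each one (hypothetical values).
OBJECTS = {
    "air vent": {"min": np.array([1.0, 0.5, 0.0]), "max": np.array([1.2, 0.7, 0.2]),
                 "event": "toggle_airflow"},
    "window 1": {"min": np.array([0.0, 0.0, 1.0]), "max": np.array([0.8, 0.6, 1.1]),
                 "event": "cycle_tint"},
}

def ray_hits_box(origin, direction, box_min, box_max):
    """Slab test: does the ray from origin along direction intersect the box?"""
    direction = np.where(direction == 0, 1e-9, direction)   # avoid division by zero
    t1 = (box_min - origin) / direction
    t2 = (box_max - origin) / direction
    t_near = np.max(np.minimum(t1, t2))
    t_far = np.min(np.maximum(t1, t2))
    return t_far >= max(t_near, 0.0)

def pointing_event(origin, direction):
    """Return the event for the first object hit by the digital ray, if any."""
    for name, obj in OBJECTS.items():
        if ray_hits_box(origin, direction, obj["min"], obj["max"]):
            return name, obj["event"]   # operations 1502/1504: event found, dispatch it
    return None                         # operation 1503: no associated event, no action

print(pointing_event(np.array([0.4, 0.3, 0.0]), np.array([0.0, 0.0, 1.0])))
# -> ('window 1', 'cycle_tint')
```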

[0238] In some embodiments, social interaction and/or communication is provided via the digital twin. When the digital twin is coupled to a communication network, it (e.g., intrinsically) allows for a social experience where remote participants join the vehicle and interact with targets (e.g., devices or media) therein via the digital twin. The concept of the digital twin may enable multi-user participation in manipulating an interactive target disposed in an enclosure, whether the participants are in the enclosure or not, and/or whether the participants are local or remote. For example, a plurality of users may access (e.g., interact with) the digital twin at the same time in a way that is perceptible to the other users. For example, when users employ VR headsets with visual displays and audio communication, they may see and/or hear one another in the virtual space represented by the digital twin.

For example, when users employ video conferencing tools, they may see and/or hear one another in the virtual space represented by the digital twin. For example, a tracked user may be represented as an avatar placed within the corresponding location in the digital twin and displayed to other users. In one example, the avatar may be used to provide virtual company to a driver of a vehicle on a long road trip. The avatar may be generic and/or may include photographic data that may be stored in advance or captured during an interaction of the user with the digital twin (e.g., using a camera or other personal identifier). The personal identifier may comprise facial recognition, fingerprint scanning, retinal scanning, or other biometric-based methods used to confirm an identity of a user.

[0239] Fig. 16 shows an example in which multiple users interact socially via a digital twin which provides access to controllable features of interactive target(s) within an enclosure environment. For example, a vehicle network 1600 may include a network communication link between a master controller, network controllers, window controllers, and interactive targets such as sensors, actuators, emitters, media displays, computing devices, and/or electrochromic windows. Fig. 16 represents a group of individuals meeting, in which a mobile circuitry (e.g., a mobile phone) 1602 is connected to a vehicle network 1600 by a communication link (e.g., WiFi) 1601 for providing a media presentation. A media display 1605 (e.g., on a display construct) is coupled to vehicle network 1600 by a communication link 1603. Media display 1605 may be associated with (e.g., fastened to, adhered to, disposed in, on, or adjacent to, or the like) a tintable window and/or other optically switchable device. The tintable window and/or the other optically switchable device may be disposed in or on a vehicle (e.g., an interior window between rows of seats, a window between an interior and an exterior of the vehicle, or the like). Thus, media content for a presentation (e.g., a computer application such as a spreadsheet or slideshow) generated by device 1602 can be transmitted to media display 1605 for display. The media content can also be sent to a digital twin 1610 over a link 1611 so that it can be represented as a visible element in digital twin 1610. The media content can instead be transmitted over a direct link (e.g., Bluetooth (BLE) or WiFi) between device 1602 and media display 1605. There may be a parallel connection of device 1602 to network 1600 so that the media content can be provided to digital twin 1610, or the simulation model can be maintained without including the media content in the digital twin.

[0240] Digital twin 1610 is accessible to a user 1613 via a communication link 1612 between digital twin 1610 and user interface equipment. For example, the user interface equipment can include a VR headset 1614 and a VR handheld controller 1615. Another user 1621 accesses digital twin 1610 at the same time via a communication link 1620. User 1621 may have a VR headset 1622 and a VR handheld controller 1623. In some embodiments, the digital twin 1610 may include dynamic elements for the room containing the group meeting 1606 (e.g., representations of persons seated within the vehicle, and/or instantaneous views of the media content being displayed in the vehicle). Digital twin 1610 may provide for exchanging audio signals captured by microphones (e.g., disposed in the room and/or the VR equipment) for reproduction for the other participants.

[0241] In some embodiments, network communication among a controller (e.g., MC), digital twin, user mobile circuitry (e.g., remote controller), and local interactive devices includes mono- or bi-directional messaging capability. For example, a combination of local area networks and/or wide area networks with appropriate gateways may be configured to facilitate (i) exchanging messages, (ii) updating of a digital twin, and/or (iii) user remote interaction with a target (e.g., for remotely controlling the interactive target). The messages may be relevant to a status change of the target, and/or to users of a meeting (without or with relation to the target, without or with relation to the enclosure in which the target is disposed, and with or without relation to the subject matter of the meeting). The controller may be configured (e.g., by appropriate software programming) to interact with the digital twin. The interaction may be for providing data identifying changes to static elements and the states of dynamic elements included in the digital twin. The digital twin may be configured to provide (i) intuitive capabilities to manipulate a target remotely, (ii) a virtual reality experience for at least one user to navigate a virtual 3D model of the enclosure, (iii) an ability to investigate various dynamic states in the digital twin, and/or (iv) an exchange of interactive (e.g., control) actions (e.g., events) related to the target, which actions are initiated by at least one user, e.g., via a virtual-reality interface. The remote manipulation may or may not comprise an electromagnetic and/or acoustic beam directed from the remote controller to the target. In some embodiments, remote manipulation may be devoid of an electromagnetic and/or acoustic beam directed from the remote controller to the target. In some embodiments, the communication coupling of the remote controller with the target may be (e.g., only) through the network that is communicatively coupled to the digital twin. In some embodiments, the communication coupling of the remote controller with the target may be (e.g., only) through the digital twin (e.g., using the network as a communication pathway that communicatively couples the target, the digital twin, and the remote controller comprising the mobile circuitry). The communication coupling may comprise wired and/or wireless communication. The digital twin may be configured to process a user input event, e.g., (i) to identify whether it corresponds to a valid command related to the target (e.g., from a predetermined list of valid control actions of the target) and/or (ii) to forward valid commands (e.g., to at least one controller or directly to the target) for manipulating the target (e.g., manipulating a state of the target that is manipulatable). In some embodiments, at least one controller monitors its ongoing exchange of data and/or commands with the local interactive target, e.g., to collect and/or forward updated information for the digital twin. The updated information may include any dynamic change of state, e.g., resulting from remote event(s) initiated by the user(s).

[0242] In some embodiments, messaging sequences include one or more data messages and one or more command messages exchanged between (i) one or more local targets and the processor, (ii) the processor and the digital twin, and/or (iii) the digital twin and the mobile circuitry. For example, a processor (e.g., a controller such as a master controller) may send a data message to the digital twin when one or more new targets join the network from time to time. The data may represent new static and/or dynamic elements for inclusion in the digital twin 3D model of the vehicle. The data may represent changes in a (e.g., system) state for a dynamic element of a target.

[0243] In some embodiments, the mobile circuitry and the digital twin exchange one or more messages that enable a user to control (including to monitor and/or alter) operation of real targets (e.g., by manipulating their virtual twin elements in the digital twin). For example, a user may activate their mobile circuitry (e.g., a remote gaming controller such as a VR headset and handheld VR controller (e.g., a point and click button)) to create a link with the digital twin. In some embodiments, upon an initial connection the digital twin and mobile circuitry exchange data messages with data for displaying a simulated scene in the digital twin, e.g., according to a default starting position. For example, a virtual simulation may begin at an entrance to the enclosure, or at any other point of interest (e.g., chosen by a user). In some embodiments, when the user is actually located in the enclosure being represented, the starting position may correspond to the current location of the user (e.g., an initiate message may provide geographic coordinates of a GPS-equipped user remote controller). Data or commands within messages between the mobile circuitry and the digital twin may include navigation actions (resulting in updated views being returned from the digital twin) and/or control actions (e.g., point and click) to indicate a desired change in an alterable state of a target.

[0244] In some embodiments, the digital twin validates a received control action, e.g., by mapping the control action to an indicated location in the digital twin and/or checking against a list of valid actions. For example, the digital twin may only send a message to the processor (e.g., controller) when the control action event of the user corresponds to an identifiable and authorized interaction. When a valid interaction is found, a command message may be transmitted from the digital twin to the processor (e.g., controller), and forwarded to the affected target. After executing the command, one or more acknowledgement messages may propagate back to the digital twin and the 3D model of the digital twin may optionally be updated accordingly. For example, after executing a change in a tint value of an insulated glass unit (IGU), the digital twin model of the IGU may be adjusted to show a corresponding change in tint level.
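
A minimal sketch of the validation step described in this paragraph is given below; the list of valid actions, the message format, and the function names are hypothetical and serve only to illustrate mapping a control action to a target and checking it against an allow-list before forwarding.

```python
# Hypothetical illustration of [0244]: the digital twin validates a control
# action against a per-target list of valid actions before sending a command
# message to the controller, and mirrors the new state after acknowledgement.

# Per-target allow-list of control actions (assumed for illustration).
VALID_ACTIONS = {
    "igu_rear_left": {"set_tint"},
    "air_vent_front": {"toggle_on_off", "set_flow"},
}

def validate_and_forward(target_id, action, payload, send_to_controller):
    """Return a command dict if the action is valid for the target, else None."""
    if action not in VALID_ACTIONS.get(target_id, set()):
        return None  # unidentifiable or unauthorized interaction: no message sent
    command = {"target": target_id, "action": action, "payload": payload}
    send_to_controller(command)  # command message to the processor/controller
    return command

def on_acknowledgement(model, command):
    """After the target executes the command, reflect the new state in the twin model."""
    # e.g., after a tint change, the twin's IGU element shows the new tint level
    model.setdefault(command["target"], {})[command["action"]] = command["payload"]
    return model

# Usage sketch:
# cmd = validate_and_forward("igu_rear_left", "set_tint", 3, send_to_controller=print)
# twin_model = on_acknowledgement({}, cmd) if cmd else {}
```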

[0245] Fig. 17 is an example messaging sequence during operation of a control system in an enclosure (e.g., a vehicle for which a digital twin has been constructed) including a controller and/or processor 1700, one or more interactive and interconnected targets (e.g., devices) 1702. One or more new targets may join the network from time to time. For example, a new target sends a joining message 1704 to the processor and/or controller 1700 upon its interconnection. The new target may, for example, represent new static and/or dynamic elements for inclusion within the digital twin 3-D model. For example, when a new static element has been added, then a new static element message 1705 is transmitted from processor and/or controller 1700 to digital twin 1701. The processor and/or controller 1700 and targets 1702 may (e.g., continuously or intermittently) exchange data and/or command messages 1706, e.g., as part of their normal operation. In some embodiments, controller and/or processor 1700 may identify changes manifested with the exchange of data and commands and/or messages (e.g., 1706) that result in a changed system state for a dynamic element. Accordingly, processor and/or controller 1700 may send a new dynamic element message 1707 to digital twin 1701. Digital twin 1701 may then update the digital twin (e.g., 3D model of the enclosure) to reflect the new state (e.g., tint state of a window or contents of a display screen in a media presentation).

[0246] In the example of Fig. 17, network interactions completely separate from the interactions of processor and/or controller 1700 are conducted by the user (whether the user is remotely located or in the enclosure). For example, mobile circuitry (e.g., embedded in a remote controller) 1703 and digital twin 1701 exchange messages that enable a user to monitor and/or alter operation of real targets 1702, e.g., by manipulating their virtual twin elements in digital twin 1701. For example, a user may activate their mobile circuitry (e.g., a remote gaming controller such as a VR headset and handheld VR controller (e.g., a point and click button)) to cause an initial message 1708 to be sent to digital twin 1701. In response, digital twin 1701 may send a starting point message 1709 to mobile circuitry 1703. The starting point message may include, e.g., data for displaying a simulated scene in the digital twin, e.g., according to a default starting position. For example, a virtual simulation may begin at an entrance to the enclosure, or at any other point of interest (e.g., chosen by a user).

[0247] In the example of Fig. 17, the user may invoke a gesture (e.g., movement) and/or button presses on their remote controller that includes the mobile circuitry 1703, e.g., to navigate through various locations in the 3D model. Corresponding navigation action messages 1710 may be transmitted from mobile circuitry 1703 to digital twin 1701, and data for updated views are returned from digital twin 1701 to mobile circuitry 1703 via updated view messages 1711. Once the user approaches a requested interactive target in the simulation, the user may initiate a control action (e.g., point and click) causing a control action message 1712 to be sent to digital twin 1701.

[0248] In some embodiments, digital twin 1701 validates control actions by mapping the control action to an indicated location in the 3D model and/or checking against a list of valid actions. When a valid control action event is detected, digital twin 1701 may send a command message 1713 to processor and/or controller 1700 to identify the corresponding target and the corresponding change of state (e.g., toggling of an identified lighting circuit, or selection of a menu item in a projected display of a laptop presentation). A command message 1714 may be transmitted from processor and/or controller 1700 to the affected target 1702. After executing the command, target 1702 may send an acknowledgement message 1715 to processor and/or controller 1700. If the change is among the dynamic elements included in the digital twin, then processor and/or controller 1700 may send an updated dynamic element message 1716 to digital twin 1701. If the current simulation being viewed by the user includes the dynamic element, then an updated view message 1717 may be sent to remote controller 1703, e.g., to provide new data adjusted for the new dynamic state.
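
To make the messaging sequence of Fig. 17 concrete, a small sketch using hypothetical message kinds and handler names is shown below; the comments map onto the reference numerals above, but the data structures and the in-memory relay are illustrative assumptions, not the disclosed protocol.

```python
# Hypothetical sketch of the Fig. 17 message flow. The relay below is a
# stand-in for controller/processor 1700 sitting between targets 1702 and
# digital twin 1701.
from dataclasses import dataclass, field

@dataclass
class Msg:
    kind: str            # "join" (1704), "new_static" (1705), "new_dynamic" (1707/1716),
                         # "control_action" (1712), "command" (1713/1714), "ack" (1715)
    body: dict = field(default_factory=dict)

class Twin:
    def __init__(self):
        self.model = {}                        # simplified stand-in for the 3D model
    def receive(self, msg: Msg):
        if msg.kind in ("new_static", "new_dynamic"):
            self.model[msg.body["target"]] = msg.body.get("state")

class ControllerRelay:
    def __init__(self, twin: Twin):
        self.twin = twin
    def target_joined(self, target):           # 1704 -> 1705
        self.twin.receive(Msg("new_static", {"target": target}))
    def command_from_twin(self, target, state):  # 1713 -> 1714 -> 1715 -> 1716
        # Forward the command, assume the target acknowledges, then update the twin.
        self.twin.receive(Msg("new_dynamic", {"target": target, "state": state}))

# Usage sketch:
# twin = Twin(); relay = ControllerRelay(twin)
# relay.target_joined("igu_3"); relay.command_from_twin("igu_3", "tint_4")
# twin.model   # -> {"igu_3": "tint_4"}
```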

[0249]

[0250] At times, it may be requested and/or advantageous to reduce (e.g., eliminate) direct contact between a user and a target apparatus (e.g., a surface of the target apparatus). For example, reducing direct interaction between the user and a target apparatus may reduce a risk of pathogen infection (e.g., by fungi, viruses, and/or bacteria), which pathogen resides on (e.g., a surface of) the device. The pathogen may be contagious and/or disease causing. The target apparatus may be an interactive target. The target apparatus may be disposed in an enclosure. The target apparatus may be a third party apparatus. The target apparatus may be a service device (e.g., a device offering service(s) to a user).

[0251] In some embodiments, the target apparatus is operatively coupled to a network. The network is operatively coupled to, or includes, a control system (e.g., one or more controllers such as a hierarchical control system). In some embodiments, a mobile circuitry of a user is paired to a target apparatus (e.g., service device). The target apparatus may receive an identification tag when operatively (e.g., communicatively) coupled to the network (e.g., and to the control system). The target apparatus may be operatively coupled to a mobile circuitry through the network (e.g., using indirect coupling). The coupling between the mobile circuitry and the target apparatus may be through an application of the vehicle and/or of the target apparatus. There may not be a requirement for physical proximity between the target apparatus and the mobile circuitry (e.g., and the user). The target apparatus may be selected using information related to a location of the user and/or the mobile circuitry of the user. The user may be located at a distance of at most 50 meters (m), 25 m, 10 m, 5 m, 2 m, 1.5 m, 0.5 m, or 0.2 m from the target apparatus. The user may be located at a distance between any of the above-mentioned distances from the target apparatus (e.g., from about 50 m to about 0.2 m, from about 50 m to about 25 m, or from about 25 m to about 0.2 m). The distance between the user and the target apparatus may be larger than the distance required for pairing between devices (e.g., Bluetooth type pairing). There may be no need for any physical proximity between the user (and/or the mobile circuitry of the user) and the target apparatus (e.g., service device). The user may select the target apparatus (e.g., service device) from a list (e.g., dropdown menu). The user may be required to operatively couple the mobile circuitry to the network to which the target apparatus is coupled. The communication between the mobile circuitry and the service device can be mono-directional (e.g., from the mobile circuitry to the target apparatus, or vice versa), or bidirectional between the target apparatus and the mobile circuitry (e.g., through the network). One user may control one or more target apparatuses (e.g., service devices). One target apparatus may be controlled by one or more users. A plurality of users may send requests to one target apparatus, which requests may be placed in a queue (e.g., based on a prioritization scheme such as time of receipt, urgency, and/or user permissions or authorizations).
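
As an illustration of the queueing behavior mentioned at the end of this paragraph, a minimal sketch is shown below; the priority fields and the tuple ordering are assumptions rather than any disclosed prioritization scheme.

```python
# Hypothetical request queue for a single target apparatus: requests from many
# users are ordered by urgency, then permission level, then time of receipt.
import heapq
import itertools
import time

class RequestQueue:
    def __init__(self):
        self._heap = []
        self._tie = itertools.count()  # preserves insertion order on equal keys

    def submit(self, user, action, urgency=0, permission_rank=0):
        # Lower sort key = served first; negate so higher urgency/permission win.
        key = (-urgency, -permission_rank, time.monotonic(), next(self._tie))
        heapq.heappush(self._heap, (key, {"user": user, "action": action}))

    def next_request(self):
        return heapq.heappop(self._heap)[1] if self._heap else None

# Usage sketch:
# q = RequestQueue()
# q.submit("passenger_rear", "open_vent", urgency=1)
# q.submit("driver", "close_vent", urgency=2, permission_rank=5)
# q.next_request()   # -> the driver's request is served first
```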

[0252] In some embodiments, the target apparatus is identified by the network upon connection to the network (which connection may be wired and/or wireless). The target apparatus may be identified via an identification code (e.g., RFID, QR-ID, barcode). In some embodiments, the identification code is not a visible (e.g., scannable) identification code. The identification code may comprise non-contact identification (e.g., electromagnetic and/or optical). The optically recognized identification may be a machine-readable code, e.g., consisting of an array of black and white squares or lines (e.g., barcode or a Quick Response (QR) code). The electromagnetic identifier may comprise radio-frequency identification (RFID). The RFID may be ultra-high frequency RFID. The identifier may comprise a transponder (e.g., RF transponder), a receiver, a transmitter, or an antenna. The identifier may be passive or active (e.g., transmit electromagnetic radiation). The identifier may comprise near field communication (NFC).

[0253] In some embodiments, a user may control the target apparatus (e.g., service device). For example, a user may control mechanical, electrical, electromechanical, and/or electromagnetic (e.g., optical and/or thermal) actions of the target apparatus. For example, the user may control a physical action of the target apparatus. For example, the user may control whether the target apparatus is turned on or off, or whether any controllable compartment thereof is open or closed; the user may direct directionality (e.g., left, right, up, down), enter and/or change settings, enable or deny access, transfer data to memory, reset data in the memory, upload and/or download software or executable code to the target apparatus, cause executable code to be run by a processor associated with and/or incorporated in the target apparatus, change channels, change volume, or cause an action to return to a default setting and/or mode. The user may change a set-point stored in a data set associated with the target apparatus, or configure or reconfigure software associated with the target apparatus. The memory can be associated with and/or be part of the target apparatus.

[0254] In some embodiments, the target apparatus is operatively (e.g., communicatively) coupled to the network (e.g., communication, power, and/or control network) of the enclosure. Once the target apparatus becomes operatively coupled to the network of the enclosure, it may be part of the targets controlled via the digital twin. The new target (e.g., third party target) may offer one or more services to a user. The service device may include media players (e.g., which media may include music, video, television, and/or internet). The target apparatus may comprise a television, a recording device (e.g., video cassette recorder (VCR), digital video recorder (DVR), or any non-volatile memory), a Digital Versatile Disc or Digital Video Disc (DVD) player, a digital audio file player (e.g., MP3 player), a cable and/or satellite converter set-top box (“STB”), an amplifier, a compact disk (CD) player, a game console, an electrically controlled drapery (e.g., blinds), a tintable window (e.g., electrochromic window), a fan, an HVAC system (e.g., air vents, air conditioning, or the like), a thermostat, or a personal computer. The command may be initiated by contacting the target, or by communicating (e.g., remotely) with the target. For example, a user may press a button on the target apparatus to initiate presentation of media content. For example, a user may interact with the target apparatus through usage of the mobile circuitry. The mobile circuitry may comprise a cellular phone, a touchpad, or a laptop computer. For example, a user may use gestures to interact with the target apparatus (e.g., pointing, snapping, tapping, etc.).

[0255] In some embodiments, the network may be a low latency network. The low latency network may comprise edge computing. For example, at least one (e.g., any) controller of the (e.g., hierarchical) control system can be a part of the edge computing system. For example, at least one (e.g., any) circuitry coupled to the network can be a part of the edge computing system. Latency (e.g., lag or delay) may refer to a time interval between a cause and its effect of some physical change in the system being observed. For example, latency may physically be a consequence of the limited velocity with which any physical interaction can propagate. For example, latency may refer to a time interval between a stimulation and a response to the stimulus. For example, the latency may refer to a delay before a transfer of data begins following an instruction for transfer of the data. The network may comprise fiber optics. The latency may be at least about 3.33 microseconds (µs), or 5.0 µs, for every kilometer of fiber optic path length. The latency of the network may be at most about 100 milliseconds (ms), 75 ms, 50 ms, 25 ms, 10 ms, 5 ms, 4 ms, 3 ms, 2 ms, 1 ms, or 0.5 ms. The latency of the network may be of any value between the aforementioned values (e.g., from about 100 ms to about 0.5 ms, from about 100 ms to about 50 ms, from about 50 ms to about 5 ms, or from about 5 ms to about 0.5 ms). The network may comprise a packet-switched network. The latency may be measured as the time from the source sending a packet to the destination receiving it (e.g., one-way latency). The latency may be measured as the one-way latency from source to destination plus the one-way latency from the destination back to the source (e.g., round trip latency).
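
For illustration, a small sketch of the propagation-delay arithmetic implied by the figures above (roughly 3.33 to 5.0 µs per kilometer of fiber) is shown below; the path lengths and constants are assumptions used only to show the one-way versus round-trip distinction.

```python
# Hypothetical estimate of fiber propagation latency. Light in silica fiber
# travels at roughly 2/3 the vacuum speed of light, i.e., about 5 µs per km;
# the 3.33 µs/km figure corresponds to propagation at the vacuum speed of light.
US_PER_KM_FIBER = 5.0     # microseconds per kilometer (typical fiber)
US_PER_KM_VACUUM = 3.33   # microseconds per kilometer (free-space lower bound)

def one_way_latency_us(path_km: float, us_per_km: float = US_PER_KM_FIBER) -> float:
    """Propagation-only latency for a single traversal of the path."""
    return path_km * us_per_km

def round_trip_latency_us(path_km: float, us_per_km: float = US_PER_KM_FIBER) -> float:
    """Source-to-destination plus destination-to-source propagation latency."""
    return 2 * one_way_latency_us(path_km, us_per_km)

# Example: a 10 km fiber link stays well under a 1 ms (1000 µs) latency budget.
if __name__ == "__main__":
    print(one_way_latency_us(10))     # 50.0 µs one way
    print(round_trip_latency_us(10))  # 100.0 µs round trip
```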

[0256] In some embodiments, the mobile circuitry includes an application related to the target apparatus (e.g., third party device). The application may depict one or more service options offered by the target apparatus. For example, if the target apparatus is a media display, the application may be a third-party application which provides various media content items (e.g., television shows, movies, audio books, music videos, podcasts, etc.) for presentation on the media display. The third-party application may be associated with a media content provisioning service, a social networking service, or the like.

[0257] In some embodiments, a position of a user within an enclosure (e.g., vehicle) is determined. The position can be determined using one or more sensors. The user may carry a tag. The tag may include radio frequency identification (e.g., RFID) technology (e.g., transceiver), Bluetooth technology, and/or Global Positioning System (GPS) technology. The radio frequency may comprise ultrawide band radio frequency. The tag may be sensed by one or more sensors disposed in the enclosure. The sensor(s) may be disposed in a device ensemble. The device ensemble may comprise a sensor or an emitter. The sensor(s) may be operatively (e.g., communicatively) coupled to the network. The network may have low latency communication, e.g., within the enclosure. The radio waves (e.g., emitted and/or sensed by the tag) may comprise wide band, or ultra-wideband radio signals. The radio waves may comprise pulse radio waves. The radio waves may comprise radio waves utilized in communication. The radio waves may be at a medium frequency of at least about 300 kilohertz (KHz), 500 KHz, 800 KHz, 1000 KHz, 1500 KHz, 2000 KHz, or 2500 KHz. The radio waves may be at a medium frequency of at most about 500 KHz, 800 KHz, 1000 KHz, 1500 KHz, 2000 KHz, 2500 KHz, or 3000 KHz. The radio waves may be at any frequency between the aforementioned frequency ranges (e.g., from about 300 KHz to about 3000 KHz). The radio waves may be at a high frequency of at least about 3 megahertz (MHz), 5 MHz, 8 MHz, 10 MHz, 15 MHz, 20 MHz, or 25 MHz. The radio waves may be at a high frequency of at most about 5 MHz, 8 MHz, 10 MHz, 15 MHz, 20 MHz, 25 MHz, or 30 MHz. The radio waves may be at any frequency between the aforementioned frequency ranges (e.g., from about 3 MHz to about 30 MHz). The radio waves may be at a very high frequency of at least about 30 megahertz (MHz), 50 MHz, 80 MHz, 100 MHz, 150 MHz, 200 MHz, or 250 MHz. The radio waves may be at a very high frequency of at most about 50 MHz, 80 MHz, 100 MHz, 150 MHz, 200 MHz, 250 MHz, or 300 MHz. The radio waves may be at any frequency between the aforementioned frequency ranges (e.g., from about 30 MHz to about 300 MHz). The radio waves may be at an ultra-high frequency of at least about 300 megahertz (MHz), 500 MHz, 800 MHz, 1000 MHz, 1500 MHz, 2000 MHz, or 2500 MHz. The radio waves may be at an ultra-high frequency of at most about 500 MHz, 800 MHz, 1000 MHz, 1500 MHz, 2000 MHz, 2500 MHz, or 3000 MHz. The radio waves may be at any frequency between the aforementioned frequency ranges (e.g., from about 300 MHz to about 3000 MHz). The radio waves may be at a super high frequency of at least about 3 gigahertz (GHz), 5 GHz, 8 GHz, 10 GHz, 15 GHz, 20 GHz, or 25 GHz. The radio waves may be at a super high frequency of at most about 5 GHz, 8 GHz, 10 GHz, 15 GHz, 20 GHz, 25 GHz, or 30 GHz. The radio waves may be at any frequency between the aforementioned frequency ranges (e.g., from about 3 GHz to about 30 GHz).

[0258] In some embodiments, the identification tag of the occupant comprises a location device. The location device (also referred to herein as “locating device”) may comprise a radio emitter and/or receiver (e.g., a wide band, or ultra-wide band radio emitter and/or receiver). The locating device may include a Global Positioning System (GPS) device. The locating device may include a Bluetooth device. The locating device may include a radio wave transmitter and/or receiver. The radio waves may comprise wide band, or ultra-wideband radio signals. The radio waves may comprise pulse radio waves. The radio waves may comprise radio waves utilized in communication. The radio waves may be at a medium frequency of at least about 300 kilohertz (KHz), 500 KHz, 800 KHz, 1000 KHz, 1500 KHz, 2000 KHz, or 2500 KHz. The radio waves may be at a medium frequency of at most about 500 KHz, 800 KHz, 1000 KHz, 1500 KHz, 2000 KHz, 2500 KHz, or 3000 KHz. The radio waves may be at any frequency between the aforementioned frequency ranges (e.g., from about 300 KHz to about 3000 KHz). The radio waves may be at a high frequency of at least about 3 megahertz (MHz), 5 MHz, 8 MHz, 10 MHz, 15 MHz, 20 MHz, or 25 MHz. The radio waves may be at a high frequency of at most about 5 MHz, 8 MHz, 10 MHz, 15 MHz, 20 MHz, 25 MHz, or 30 MHz. The radio waves may be at any frequency between the aforementioned frequency ranges (e.g., from about 3 MHz to about 30 MHz). The radio waves may be at a very high frequency of at least about 30 megahertz (MHz), 50 MHz, 80 MHz, 100 MHz, 150 MHz, 200 MHz, or 250 MHz. The radio waves may be at a very high frequency of at most about 50 MHz, 80 MHz, 100 MHz, 150 MHz, 200 MHz, 250 MHz, or 300 MHz. The radio waves may be at any frequency between the aforementioned frequency ranges (e.g., from about 30 MHz to about 300 MHz). The radio waves may be at an ultra-high frequency of at least about 300 megahertz (MHz), 500 MHz, 800 MHz, 1000 MHz, 1500 MHz, 2000 MHz, or 2500 MHz. The radio waves may be at an ultra-high frequency of at most about 500 MHz, 800 MHz, 1000 MHz, 1500 MHz, 2000 MHz, 2500 MHz, or 3000 MHz. The radio waves may be at any frequency between the aforementioned frequency ranges (e.g., from about 300 MHz to about 3000 MHz). The radio waves may be at a super high frequency of at least about 3 gigahertz (GHz), 5 GHz, 8 GHz, 10 GHz, 15 GHz, 20 GHz, or 25 GHz. The radio waves may be at a super high frequency of at most about 5 GHz, 8 GHz, 10 GHz, 15 GHz, 20 GHz, 25 GHz, or 30 GHz. The radio waves may be at any frequency between the aforementioned frequency ranges (e.g., from about 3 GHz to about 30 GHz).

[0259] In some embodiments, the locating device facilitates location within an error range. The error range of the locating device may be at most about 5 meters (m), 4m, 3m, 2m, 1m, 0.5m, 0.4m, 0.3m, 0.2m, 0.1m, or 0.05m. The error range of the locating device may be any value between the aforementioned values (e.g., from about 5m to about 0.05m, from about 5m to about 1m, from about 1m to about 0.3m, and from about 0.3m to about 0.05m). The error range may represent the accuracy of the locating device.

[0260] In some embodiments, a user seeks an action to be performed by a target apparatus (e.g., an HVAC device or component of an HVAC system, a media display, a tintable window, or the like). Upon ascertaining a position of the user, the application may identify eligible targets in a vicinity of the user. The user may select a requested target from the eligible targets presented by the application. Selection of the service device may allow opening its interface (e.g., and thus allow selection of its services). The user may select a requested service (e.g., presenting particular media content on a media display). The user selection may be transmitted to the target device through the network, and the target device may fulfill the request of the user. In this manner, the user is not required to physically contact the target device to perform service selection. The user may then obtain the fulfilled service. In some embodiments, the target device may be automatically selected based on proximity to the user. For example, in instances in which the target device comprises an air vent, one or more air vents closest to the user based on the position of the user may be identified. For example, in instances in which the target device comprises a media display, one or more media displays closest to a user based on the position of the user may be identified. The user may or may not view (e.g., in the application) a digital twin of the enclosure in which the target device is disposed. The user may employ gesture control to operate the target device. For example, the user may employ their mobile circuitry to point to a service choice visible on the target device, which service choice may be translated by the control system to a choice selection.
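
A minimal sketch of the proximity-based selection described above is given below; the device list, coordinates, and type labels are hypothetical and are used only to show picking the eligible target closest to the ascertained user position.

```python
# Hypothetical proximity-based target selection: given the user's position
# (e.g., from an ultra-wideband tag), pick the nearest device of the requested
# type that the user is permitted to control.
import math

DEVICES = [  # (device id, type, (x, y) position in meters within the vehicle)
    ("vent_front_left", "air_vent", (0.4, 0.6)),
    ("vent_rear_right", "air_vent", (1.8, -0.6)),
    ("display_rear", "media_display", (1.6, 0.0)),
]

def nearest_eligible(user_pos, requested_type, allowed_ids=None):
    candidates = [
        (math.dist(user_pos, pos), dev_id)
        for dev_id, dev_type, pos in DEVICES
        if dev_type == requested_type and (allowed_ids is None or dev_id in allowed_ids)
    ]
    return min(candidates)[1] if candidates else None

# Usage sketch: a rear-seat passenger at (1.7, -0.5) requesting an air vent.
# nearest_eligible((1.7, -0.5), "air_vent")  # -> "vent_rear_right"
```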

[0261]

[0262] In some examples, there are various target apparatuses (e.g., machines) of the same type in a vehicle, for example, several air vents, several tintable windows, several media displays, several speakers, etc. A user may send a request to a target apparatus type. The specific target apparatus of that type executing the request may be the one closest to the user. The position of the user may be ascertained via the network (e.g., using facial recognition, ID tag, one or more motion sensors, one or more infrared sensors, etc.). The control system may use the position of the user to identify a specific target apparatus of the requested type for executing the requested task. A user may override such a recommendation of the control system. A user may request a specific target apparatus to execute the task. Certain target apparatuses may be dedicated to certain groups of users (e.g., passengers of the vehicle, a driver of the vehicle, rear row passengers of the vehicle, etc.). There may be a hierarchy in the permissions provided to users to use the service apparatuses. The hierarchy may depend on the location, driver or passenger status, and/or age of the user. The hierarchy may depend on the date and time at which the request is made, and/or the requested execution time of the request. The groups of users may be identified by the control system.

[0263] In some embodiments, the user toggles between a gesture control mode and a tap control mode. In the gesture control mode, the user can utilize the mobile circuitry by pointing the mobile circuitry at the target apparatus in space. In the tap control mode, the user is not required to point the mobile circuitry at the target apparatus in space, but instead selects options related to the target apparatus, which options appear on the mobile circuitry for selection (e.g., via a dropdown menu). The selection between options presented on the mobile circuitry can be by using a touchscreen of the mobile circuitry, and/or by scrolling through the options, such as by using scroll functionality implemented in the mobile circuitry (e.g., represented by arrows).

[0264] In some embodiments, the interactive target is operatively coupled to the network via a computing interface. The computing interface may comprise an application programming interface (API). The computing interface may define interactions between multiple software and/or hardware intermediaries. The computing interface may identify requests that can be made, how to make those requests, the data formats that should be used, and/or any particular conventions to follow. The computing interface may provide extension mechanisms to allow a user extension of existing functionality. For example, an API can be specific to a target, or it can be designed using an industry standard (e.g., to ensure interoperability). When a user requests a service (e.g., via the computing interface) from a service device via the mobile circuitry and/or via gesture control, the message is sent to the server (e.g., as part of the control system); the service device may be informed, may pick the request from a server queue, process the service request, and deploy (e.g., provide) the service to be picked up by the user. Examples of communication interface, messaging, and control can be found in U.S. provisional patent application serial number 63/000,342 filed on March 26, 2020, titled “MESSAGING IN A MULTI CLIENT NETWORK,” which is incorporated herein by reference in its entirety.

[0265] Fig. 18 shows an example method in which a service device (e.g., third party device) is connected to the network of the vehicle in operation 1800, the service device is provided an identification in operation 1801, and the service device stays alert to any incoming request (e.g., checks the network for any incoming request) in operation 1802. A location and/or position of a user disposed in the enclosure is identified in operation 1803. Once a user opens the application, the user is provided with service devices in the vicinity of the user in operation 1804. The user may select a service device, and therein a service provided by the service device, in operation 1805. The selection of the service may be through an application menu, or through gesture control. The selection of the service is transmitted to the selected service device in operation 1806 through the network, and the service device then executes the request in operation 1807.
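
The flow of Fig. 18 can be summarized in a short sketch; the registry, the function names, and the request structure below are hypothetical and only mirror operations 1800 through 1807 in simplified form.

```python
# Hypothetical end-to-end sketch of the Fig. 18 flow: a service device joins the
# vehicle network (1800-1801), the user's position is used to list nearby
# devices (1803-1804), and a selected service request is routed to the device
# for execution (1805-1807).
import itertools
import math

class VehicleNetwork:
    _ids = itertools.count(1)

    def __init__(self):
        self.devices = {}  # device_id -> {"pos": (x, y), "services": [...]}

    def register(self, pos, services):                       # operations 1800-1801
        device_id = f"svc-{next(self._ids)}"
        self.devices[device_id] = {"pos": pos, "services": list(services)}
        return device_id

    def nearby(self, user_pos, radius_m=2.0):                # operations 1803-1804
        return [d for d, info in self.devices.items()
                if math.dist(user_pos, info["pos"]) <= radius_m]

    def request(self, device_id, service):                   # operations 1805-1807
        info = self.devices.get(device_id)
        if info and service in info["services"]:
            return f"{device_id} executed {service}"
        return "request rejected"

# Usage sketch:
# net = VehicleNetwork()
# vent = net.register((1.8, -0.6), ["toggle_on_off"])
# net.nearby((1.7, -0.5))            # -> [vent]
# net.request(vent, "toggle_on_off") # -> "svc-1 executed toggle_on_off"
```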

[0266] In some embodiments, states of one or more devices in a vehicle and/or associated with a vehicle are altered. In some embodiments, the one or more devices may include one or more tintable windows of the vehicle. For example, the one or more tintable windows may include windows disposed between an interior and an exterior of the vehicle, such as a front windshield window, a rear windshield window, side windows (e.g., driver’s side windows, passenger side windows, left side windows, right side windows, or the like). In some embodiments, the one or more tintable windows may include one or more windows disposed within an interior of the vehicle, such as a tintable window disposed between rows of the vehicle (e.g., between a front row and a rear row of a car, between passenger rows in an airplane, or the like). In some embodiments, the one or more tintable windows may include one or more media displays. A media display may be associated with a tintable window. For example, a media display may be adhered to, attached to, fastened to, disposed in, disposed on, or the like, to a tintable window. A media display may be associated with any tintable window of the vehicle, such as a tintable window disposed between an interior and an exterior of the vehicle (e.g., a front windshield window, a side window, or the like) and/or a tintable window disposed in an interior of the vehicle.

[0267] In some embodiments, states of the one or more devices in the vehicle and/or associated with the vehicle are altered based at least in part on data from one or more device ensembles. A device ensemble may include one or more sensors and/or one or more emitters. Examples of sensors include photosensors, temperature sensors, occupancy sensors, motion sensors, infrared sensors, humidity sensors, gas sensors (e.g., oxygen sensors, carbon dioxide sensors, VOC sensors, or the like), geolocation sensors, microphones, cameras, or the like. Examples of emitters include sound emitters (e.g., a speaker), light emitters (e.g., an LED), or the like. A device ensemble may be positioned at any location within and/or on the vehicle, such as on a door frame, on a ceiling, on a dashboard, on a floor, on a seat (e.g., on a seat back), or the like. In some embodiments, a device ensemble may be positioned on an outside portion of the vehicle (e.g., on a roof, on an outside of a door, or the like). An externally positioned device ensemble may allow for sensor measurements of external conditions, e.g., an external temperature, an external amount of ambient light, or the like. Multiple device ensembles may be used in some embodiments. In some embodiments, one or more sensors and/or one or more emitters of a device ensemble may be reversibly removeable (e.g., with an ability to remove and insert the one or more sensors and/or one or more emitters from the device ensemble in a reversible manner), for example, to allow for replacement of one or more sensors and/or one or more emitters (e.g., in case of breakage).

[0268] FIG. 19 depicts an example of a vehicle 1900. Vehicle 1900 includes tintable windows 1901, 1902, 1903, 1904, 1905, and 1906 that are disposed between an interior and an exterior of vehicle 1900. For example, tintable window 1906 is a front windshield window. As another example, tintable window 1902 is a driver-side side window. Vehicle 1900 includes a media display 1910 that is associated with tintable window 1906. Vehicle 1900 includes a device ensemble 1908. Device ensemble 1908 is mounted in a front portion of vehicle 1900.

[0269] In some embodiments, a state of at least one device in and/or associated with the vehicle is altered, for example, to achieve a target environment of the vehicle. The target environment of the vehicle may be a target interior temperature, a target glare condition associated with one or more windows of the vehicle, a target brightness state inside the vehicle, or the like. For example, in some embodiments, a state of one or more tintable windows may be altered (e.g., darkened and/or lightened) to achieve a target interior temperature of the vehicle. In one example, one or more tintable windows may be darkened to lower an interior temperature and/or cool the vehicle. In another example, one or more tintable windows may be lightened to raise an interior temperature and/or warm the interior of the vehicle by allowing more sunlight into the vehicle. As another example, in some embodiments, a state of one or more tintable windows may be altered (e.g., darkened and/or lightened) to achieve a target glare state and/or a target interior brightness. In one example, one or more tintable windows may be darkened in response to determining that one or more occupants of the vehicle are likely to prefer a darker interior (e.g., in response to determining that one or more occupants of the vehicle are reading, napping, viewing media content, etc.). In another example, one or more tintable windows may be darkened in response to determining that a driver of the vehicle or other occupant is shading their eyes due to excessive sunlight. As yet another example, in some embodiments, a state of one or more tintable windows may be altered to promote safety. In one example, one or more tintable windows may be darkened in response to determining that the vehicle is parked (e.g., to prevent theft of items in the vehicle). In another example, one or more tintable windows of a law enforcement vehicle may be darkened while the vehicle is in motion and/or operation (e.g., to prevent a surveilled person from realizing they are under surveillance, to prevent an assailant from determining a number of law enforcement officers in the vehicle, to prevent an assailant from recognizing a particular individual in the vehicle, or the like). As still another example, in some embodiments, a tintable window disposed in an interior of the vehicle may be darkened (e.g., to promote privacy). In one example, an interior tintable window may be tinted in a rideshare vehicle (e.g., in response to determining that one or more occupants of a rear row have initiated a conversation, in response to determining that one or more people have entered a rear row, or the like).
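
As a simple illustration of the control logic described in this paragraph, a sketch is shown below; the threshold values, tint scale, and rule ordering are hypothetical and do not reflect any particular disclosed control algorithm.

```python
# Hypothetical rule-based tint selection: darken to cool the cabin or cut glare,
# lighten to admit sunlight and warm the cabin, and darken fully when parked.
TINT_CLEAR, TINT_MEDIUM, TINT_DARK = 0, 2, 4   # assumed discrete tint scale

def choose_tint(interior_temp_c, target_temp_c, glare_detected, parked):
    if parked:
        return TINT_DARK                        # privacy / theft deterrence
    if glare_detected:
        return TINT_DARK                        # occupant shading eyes, squinting, etc.
    if interior_temp_c > target_temp_c + 1.0:
        return TINT_DARK                        # reduce solar gain to cool the cabin
    if interior_temp_c < target_temp_c - 1.0:
        return TINT_CLEAR                       # admit sunlight to warm the cabin
    return TINT_MEDIUM                          # near target: keep a neutral state

# Usage sketch:
# choose_tint(interior_temp_c=29.0, target_temp_c=22.0,
#             glare_detected=False, parked=False)   # -> TINT_DARK
```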

[0270] In some embodiments, a target environment of the vehicle is a target entertainment state, such as whether media content is to be presented on one or more media displays of the vehicle (e.g., to alleviate boredom of one or more occupants of the vehicle, to present health or safety information to one or more occupants of the vehicle, to present navigation information to one or more occupants of the vehicle, to present advertisements to one or more occupants of the vehicle, or the like).

[0271] In some embodiments, a target environment of a vehicle and/or a target state of occupants of the vehicle is determined based on obtained information. The obtained information may include sensor data (e.g., from one or more device ensembles in and/or on the vehicle), data from one or more external sensors, historical preferences of a user (e.g., a driver, operator, and/or owner of the vehicle), one or more gestures of an occupant of the vehicle, information associated with an upcoming route of the vehicle, and/or any combination thereof.

[0272] In some embodiments, the obtained information includes sensor data that includes an interior temperature of the vehicle and/or an exterior temperature outside the vehicle. As another example, the obtained information may include sensor data that indicates present lighting conditions (e.g., an amount of light incident on various windows or other portions of the vehicle).

[0273] In some embodiments, the obtained information includes historical preferences of one or more users or occupants of the vehicle. In some embodiments, historical preferences may include explicit user-configuration settings (e.g., that particular windows are to be tinted to particular states at particular times of day) and/or implicitly learned preferences (e.g., that a user frequently requests a particular tint state for particular windows at particular times of day). Implicitly learned preferences may be identified using one or more machine learning models. The machine learning models may be trained using data from one or more device ensembles of the vehicle, previously issued instructions to alter states of one or more devices of the vehicle, or the like.
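
A minimal sketch of learning implicit tint preferences is shown below, using scikit-learn purely as an example library; the feature choices (hour of day and exterior light level) and the training data are assumptions, and the disclosure does not prescribe any particular model.

```python
# Hypothetical learned-preference model: predict the tint level a user would
# request from the hour of day and the exterior light level, trained on past
# manual requests logged by the control system.
from sklearn.tree import DecisionTreeClassifier

# Each row: [hour_of_day, exterior_light_lux]; label: tint level the user chose.
X = [[8, 40_000], [9, 55_000], [13, 80_000], [18, 10_000], [21, 50]]
y = [3, 3, 4, 1, 0]

model = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, y)

def suggest_tint(hour_of_day, exterior_light_lux):
    """Suggest a tint level for the current conditions based on learned history."""
    return int(model.predict([[hour_of_day, exterior_light_lux]])[0])

# Usage sketch: mid-morning, bright sun -> the model suggests a darker state.
# suggest_tint(10, 60_000)
```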

[0274] In some embodiments, the obtained information includes one or more detected and/or identified gestures by an occupant of the vehicle. Gestures may include movement, e.g., of an arm, a finger, a head, an eyebrow, or the like. An identified gesture may include an identified gesture pattern. A gesture pattern may be a motion pattern, a voice pattern, and/or a facial expression pattern. Examples of motion patterns may include moving a body part in a particular direction (e.g., moving an arm up, tilting a head left, or the like), pinching fingers together, moving a hand up as if to shield the eyes, moving a hand up as if to wipe a brow or fan a face, or the like. Examples of facial expression patterns may include moving eyebrows up or down, curling a lip, squinting the eyes, etc. Examples of voice patterns may include producing a particular sequence of words and/or a particular audible sound (e.g., a groan, a shriek, or the like). Gestures and/or gesture patterns may be detected by a sensor (e.g., a camera, a motion sensor, a wireless sensing device, or the like). For example, in response to detecting a gesture that has been associated with a user being too hot (e.g., a gesture of a user waving their hand as if to fan themselves or wipe their brow), one or more tintable windows may be tinted to a darker state. As another example, in response to detecting a gesture that has been associated with excessive glare (e.g., a gesture of a user blocking their eyes with their hand), one or more tintable windows may be tinted to a darker state. As yet another example, in response to detecting a gesture that has been associated with one or more tintable windows being in an excessively lightened state (e.g., a user squinting their eyes), one or more tintable windows may be tinted to a darker state. Gestures may be associated with target states of devices based at least in part on an explicit association (e.g., an occupant of the vehicle or a manufacturer of the vehicle specifying that a particular gesture is to cause a corresponding change in state of a particular device) or based on a learned association (e.g., in response to determining that a user has previously made a particular gesture prior to manually effecting a particular change in a state of a target device). In an example, a learned association may be formed in response to determining that a user has previously shaded their eyes prior to manually darkening a tint state of a particular window (e.g., a front windshield window). Learned associations may be learned by a machine learning model. A machine learning model may be trained using a training set that includes, for example, explicit user actions (e.g., setting tint states of various tintable windows) at particular time points, detected gestures, and/or any other training data.

[0275] In some embodiments, the obtained information includes information about an upcoming route of the vehicle (e.g., based on a route that is being navigated by the vehicle, based on current geolocation information, or the like). In one example, the obtained information may indicate an upcoming shaded section of road (e.g., due to a tunnel, due to a road section with a lot of trees, etc.), and therefore, one or more tintable windows may be darkened in response to the obtained information indicative of upcoming shade. In another example, the obtained information may indicate an upcoming section of road in which sunlight is likely to be incident on particular sides of the vehicle (e.g., due to a direction the vehicle is moving relative to a current or expected position of the sun in the sky, or the like). In some embodiments, the obtained information may indicate solar positions at various times of day and/or at various geographical locations. Solar position information may be used to predict an amount of sunlight incident on different sides of the vehicle at a particular time in the future. In some embodiments, tinting of windows of the vehicle may change as the vehicle moves along one or more roads. For example, an east-facing window of the vehicle may be tinted to a relatively darkened state in the morning, and the east-facing window of the vehicle may be lightened, for example, as time passes (e.g., as morning becomes afternoon or evening). In some embodiments, different tintable windows may be tinted to different states. For example, an east-facing window may be tinted to a darkened state in the morning and a west-facing window may be in a bleached or lightened state in the morning.

In some embodiments, an upcoming location or position of the vehicle may be based on a current speed of the vehicle, a predicted speed of the vehicle on an upcoming stretch of road (e.g., based on traffic conditions on the upcoming stretch of road), an average speed of other vehicles on the road, a target destination, and/or any other suitable parameters that may be used to determine and/or predict upcoming locations of the vehicle.
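
To illustrate how predicted solar position and vehicle heading might be combined, a sketch is shown below; the geometry is simplified (azimuth only), and the function names, thresholds, and side labels are assumptions rather than any disclosed algorithm.

```python
# Hypothetical prediction of which side of the vehicle faces the sun, using the
# vehicle heading and a (precomputed) solar azimuth for the predicted time and
# location. Windows on the sun-facing side could be pre-darkened along the route.
def sun_side(vehicle_heading_deg: float, solar_azimuth_deg: float) -> str:
    """Return 'front', 'right', 'rear', or 'left' relative to the direction of travel."""
    relative = (solar_azimuth_deg - vehicle_heading_deg) % 360.0
    if relative < 45 or relative >= 315:
        return "front"
    if relative < 135:
        return "right"
    if relative < 225:
        return "rear"
    return "left"

def plan_tints(route_points):
    """route_points: iterable of (heading_deg, solar_azimuth_deg) along the route."""
    plan = []
    for heading, azimuth in route_points:
        side = sun_side(heading, azimuth)
        plan.append({"darken": side, "lighten": "other sides"})
    return plan

# Usage sketch: heading east (90 degrees) in the morning with the sun near
# 110 degrees azimuth puts the sun just off the nose of the vehicle, so the
# forward-facing glass would be darkened at that route point.
# plan_tints([(90.0, 110.0)])
```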

[0276] In some embodiments, a target environment and/or a target state of one or more occupants of the vehicle is determined and/or inferred based on sensor data, data indicating previous activities of the occupants, or the like. For example, a determination that a target environment is to be cooler than a current environment may be made based at least in part on information indicative of a current temperature of one or more occupants (e.g., whether the current temperature exceeds a comfort threshold), recent activities of the one or more occupants (e.g., whether one or more occupants were recently exercising), or the like. In some embodiments, recent activities of one or more occupants of the vehicle may be determined based on a geographical location of the vehicle and/or a user device associated with (e.g., paired with) the vehicle, calendar information associated with the one or more occupants, or the like. In some embodiments, a target environment may be inferred. For example, an ideal interior temperature of the vehicle may be determined based on a determined size of occupants of the vehicle (e.g., sizes that indicate body masses of the one or more occupants). Size information may indicate whether one or more occupants are children.

[0277] In some embodiments, a state of a device in and/or associated with a vehicle is altered and/or directed to be altered toward a target state by a controller. The controller may be a local controller (e.g., a local window controller) that controls one or more devices in the vehicle. For example, the local controller may be a window controller that controls tint of one or more tintable windows operatively coupled to the window controller. A local controller may receive instructions from an intermediate controller and/or a network controller. For example, the intermediate controller and/or the network controller may transmit instructions to the local controller to effect a particular state change of the device(s) the local controller controls. In some embodiments, an intermediate controller and/or a network controller may receive instructions from a master controller. The master controller may transmit instructions based at least in part on information obtained from various sources (e.g., sensor data from one or more device ensembles associated with the vehicle, route or navigation information of the vehicle, solar position information, historical preferences of occupants of the vehicle, or the like). In some embodiments, a master controller and/or an intermediate controller may be remote controllers (e.g., remote from the vehicle).
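
A minimal sketch of instruction flow through such a controller hierarchy is shown below; the class names and the single-method interfaces are assumptions used only to show a master controller delegating to an intermediate/network controller and finally to a local window controller.

```python
# Hypothetical controller hierarchy: a master controller issues an instruction
# that propagates through an intermediate/network controller down to the local
# (e.g., window) controller that actually drives the device.
class LocalWindowController:
    def __init__(self, window_id):
        self.window_id = window_id
        self.tint = 0
    def apply(self, instruction):
        if instruction["action"] == "set_tint":
            self.tint = instruction["level"]
        return f"{self.window_id}: tint={self.tint}"

class NetworkController:
    """Intermediate tier: routes instructions to the right local controller."""
    def __init__(self, locals_by_window):
        self.locals_by_window = locals_by_window
    def dispatch(self, instruction):
        return self.locals_by_window[instruction["window"]].apply(instruction)

class MasterController:
    """Top tier: decides what to do from gathered information, then delegates."""
    def __init__(self, network_controller):
        self.network_controller = network_controller
    def control(self, window, level):
        instruction = {"window": window, "action": "set_tint", "level": level}
        return self.network_controller.dispatch(instruction)

# Usage sketch:
# local = LocalWindowController("windshield")
# master = MasterController(NetworkController({"windshield": local}))
# master.control("windshield", 3)   # -> "windshield: tint=3"
```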

[0278] In some embodiments, alteration of one or more devices in and/or associated with a vehicle may be controlled via an application, e.g., an application executing on a mobile device that has been paired with a controller of the vehicle. Examples of such mobile devices may include a mobile phone, a tablet computer, a wearable computer (e.g., a smart watch, a VR/MR headset, or the like), and/or any other mobile device. In some embodiments, an application that controls and/or instructs alteration of one or more devices of a vehicle may be secure. For example, the application may require authentication of a user of the vehicle (e.g., based on a password, a biometric signature, etc.). As another example, communications between the application and a controller of the vehicle may be secured (e.g., encrypted). Application security may prevent unauthorized users from controlling devices of a vehicle, e.g., tinting a windshield of a vehicle while the vehicle is in motion. In some embodiments, an application may receive communications from a controller of the vehicle. For example, the application may receive communications (e.g., push notifications and/or other alerts) indicating that one or more sensors (e.g., of a device ensemble) have detected an occupant in the vehicle after the vehicle has been determined to have been parked. This may alert a driver (e.g., a parent or guardian) that a child or pet has been unintentionally left in the vehicle. [0279] FIG. 20 shows a flowchart of a method for altering states of devices in a vehicle in accordance with some embodiments. At 2000, data indicative of a present environment of a vehicle and/or a present state of occupants of the vehicle is obtained. The data may be obtained from one or more sensors (e.g., sensors of one or more device ensembles in and/or associated with the vehicle). The data may indicate a current interior or exterior temperature, a current level of various gases (e.g., oxygen, carbon dioxide, etc.) inside the vehicle, an amount of light incident on different portions of the vehicle, a current humidity inside the vehicle, a temperature of an occupant of the vehicle (e.g., as measured by an infrared sensor), whether the vehicle is in motion and/or a speed of the motion, activities of one or more occupants of the vehicle (e.g., whether one or more occupants are driving, reading, napping, watching media content, or the like), various sizes of occupants of the vehicle (e.g., whether one or more occupants are children), or the like. At 2001, information indicating a target environment of the vehicle and/or a target state of occupants of the vehicle is obtained. A target environment of the vehicle may include a target interior temperature, a target brightness level inside the vehicle, a target glare condition of one or more tintable windows, a target humidity level, a target gas concentration (e.g., a target level of oxygen, a target level of carbon dioxide, or the like), etc. A target state of occupants of the vehicle may include a requested change in temperature of the occupants (e.g., whether one or more occupants want to be hotter or colder), a requested entertainment state (e.g., whether one or more occupants want to be viewing media content for entertainment), or the like. It should be noted that a requested change or a requested state may be explicitly requested by a user or implicitly inferred by a controller associated with the vehicle (e.g., a master controller). 
The information may be obtained from explicitly indicated user preferences, historical user preferences, information related to a route the vehicle is currently navigating, present or future (e.g., predicted) solar positions, and/or any other information. In some embodiments, a target environment and/or a target state of the occupants may be based on the data obtained at 2000 (e.g., whether the target environment is to be cooler than a current environment because the data obtained at 2000 indicates that a temperature of one or more occupants exceeds a comfort threshold). At 2002, a state of at least one device in and/or associated with the vehicle is altered to achieve the target environment and/or to achieve the target state of the one or more occupants. The at least one device may include one or more tintable windows, one or more media displays (e.g., one or more media displays associated with, disposed in, adhered to, and/or attached to one or more tintable windows), one or more HVAC components associated with the vehicle (e.g., one or more air vents for blowing air and/or providing air conditioning), one or more security devices (e.g., vehicle alarms, vehicle lights, etc.), and/or any other controllable devices of the vehicle.

[0280] In some embodiments, media content is presented on one or more media displays associated with the vehicle. A media display may be associated with a tintable window. For example, a media display may be attached to, coupled to, adhered to, disposed in and/or on, a tintable window. In some embodiments, a tintable window associated with a media display that is being used to present media content may be tinted in a manner to increase contrast between the tintable window and the media display. In some embodiments, the tintable window may be tinted to a transparent state such that the media display appears to be floating. In some embodiments, the media display may be associated with a tintable window disposed in an interior of the vehicle. For example, a media display associated with a tintable window disposed in an interior of the vehicle may be used to present targeted advertisements, e.g., in a taxi and/or rideshare car, or in a commercial vehicle (e.g., an airplane, a commuter train, a ferry, etc.).
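
A small sketch of coordinating the tint of a window with the media display it carries is shown below; the tint scale, the content-brightness heuristic, and the "floating" transparent mode are illustrative assumptions.

```python
# Hypothetical coordination of a media display with its host tintable window:
# darken the glass behind bright content to increase contrast, or keep the
# glass transparent so the display appears to float.
TINT_CLEAR, TINT_MEDIUM, TINT_DARK = 0, 2, 4   # assumed discrete tint scale

def window_tint_for_display(presenting: bool, content_brightness: float,
                            floating_effect: bool = False) -> int:
    """content_brightness: 0.0 (dark content) to 1.0 (bright content)."""
    if not presenting:
        return TINT_CLEAR               # no content: leave the glass clear
    if floating_effect:
        return TINT_CLEAR               # transparent state; display seems to float
    # Brighter content benefits from a darker backdrop for higher contrast.
    return TINT_DARK if content_brightness > 0.5 else TINT_MEDIUM

# Usage sketch: presenting a bright slideshow without the floating effect.
# window_tint_for_display(True, 0.8)   # -> TINT_DARK
```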

[0281] In some embodiments, media content presented on a media display includes television shows, movies, videos, podcasts, video conferences and/or video calls, playlists of media content (e.g., an ordered list or collection of videos), electronic books, slideshows, documents, and/or any other suitable media content. In some embodiments, the media content may include advertisements, health and/or safety information, or the like. In some embodiments, media content may be explicitly selected by and/or requested by a user. For example, the media content may include a video or other program that has been explicitly selected for presentation (e.g., via a third-party media content provisioning service). As another example, the media content may include video content and/or audio content associated with a video call that has been initiated and/or joined by an occupant of the vehicle. In some embodiments, media content may be streamed to the media display from a server or other remote device. In some embodiments, media content may be identified and/or selected for particular occupants of the vehicle, for example, based on explicitly indicated and/or implicitly inferred interests of one or more occupants of the vehicle.

Explicitly indicated interests may be identified from interests specified by an occupant, e.g., in a user-configuration panel associated with an application used to control and/or interact with a media display. Implicitly indicated interests may be determined and/or inferred based at least in part on previously viewed media content (e.g., viewed using a third-party media content provisioning service), information available via a linked social network profile, an inferred demographic of an occupant (e.g., whether an occupant is a child or adult, which may be inferred based on a body size of the occupant, a fundamental frequency of a voice of the occupant, or the like), and/or any other suitable factors.
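As a rough, non-limiting illustration of such an implicit inference (the 250 Hz and 140 cm thresholds below are assumptions chosen for demonstration, not values specified in this disclosure):

    # Crude demographic inference from voice fundamental frequency and body size; thresholds are assumptions.
    def infer_is_child(voice_f0_hz: float, body_height_cm: float) -> bool:
        return voice_f0_hz > 250.0 or body_height_cm < 140.0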

[0282] In some embodiments, a media display on which to present media content is selected. For example, a media display may be selected that is closest to an occupant of the vehicle. In one example, a media display that is associated with a tintable window disposed in an interior of a vehicle may be selected in response to determining (e.g., using data from an occupancy sensor, temperature sensor, infrared sensor, motion sensor, or the like) that a back row of the vehicle is occupied. In another example, a media display that is associated with a tintable window corresponding to a front windshield of the vehicle may be selected in response to determining (e.g., using data from an occupancy sensor, temperature sensor, infrared sensor, motion sensor, or the like) that only a driver’s seat of the vehicle is currently occupied. In some embodiments, a media display proximate to a driver’s seat of the vehicle (e.g., a media display adjacent to and/or disposed on or in a front windshield window) may be inhibited from presenting media content in response to determining that the vehicle is currently in motion. In some embodiments, a controller may cause the media display to be inhibited until it is determined that the vehicle is parked.
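A minimal sketch of such display selection and inhibition logic (in Python; the display keys and occupancy fields are illustrative assumptions) might look like:

    # Illustrative display-selection sketch; display names and occupancy keys are hypothetical.
    def select_display(displays, occupancy, vehicle_in_motion):
        if occupancy.get("back_row_occupied"):
            return displays["interior_partition"]      # display on an interior tintable window
        if occupancy.get("driver_only") and not vehicle_in_motion:
            return displays["front_windshield"]        # windshield display allowed only while parked
        return None                                    # otherwise, inhibit presentation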

[0283] FIG. 21 shows a flowchart of an example method for presenting media content in a vehicle in accordance with some embodiments. At 2100, information associated with one or more occupants of the vehicle and/or a state of the vehicle is obtained. In some embodiments, the information may indicate whether the vehicle is currently in motion and/or is parked. In some embodiments, the information may indicate an identity of the one or more occupants, such as whether the only occupant is a driver or operator of the vehicle, a number of passengers, whether one or more passengers are taxi or rideshare customers, explicit and/or inferred demographic information of one or more occupants of the vehicle (e.g., an inferred age based on body size), or the like. In some embodiments, the information may indicate media content preferences of one or more occupants of the vehicle, such as previously viewed media content (e.g., as obtained via one or more third-party media content provisioning services that have been linked or paired with a controller associated with the media display). At 2101, media content to be presented by a media display of the vehicle is identified based at least in part on the information obtained. The media content may be an advertisement (e.g., a targeted advertisement based on explicit and/or inferred interests of the one or more occupants, a targeted advertisement selected based on previously viewed media content, a targeted advertisement based on a destination of the vehicle, or the like). The media content may be a television show, a movie, a playlist of media content items, a document, an electronic book, content associated with a video call or video conference, or the like. The media content may be live-streamed and/or pre-recorded. At 2102, the media content is presented by the media display. In some embodiments, the media display may be selected and/or identified based on proximity to one or more occupants of the vehicle. In some embodiments, presentation of the media content may be inhibited based on a determination that the vehicle is currently in motion, and presentation of the media content may be initiated in response to determining that the vehicle has subsequently parked. At 2103, in some embodiments, a tint setting of a tintable window associated with the media display is adjusted. For example, the tintable window may be tinted to be transparent such that the media display appears to be floating. As another example, the tintable window may be tinted to increase a contrast between the window and the media display.
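For illustration only, the FIG. 21 flow (blocks 2100-2103) could be sketched in Python roughly as follows; the controller and display methods shown are hypothetical names, not an API defined in this disclosure, and the sketch reuses the select_display helper from the earlier sketch.

    # Illustrative sketch of the FIG. 21 flow; all method names are hypothetical.
    def present_media(controller, displays, vehicle):
        info = controller.gather_occupant_info()                    # block 2100
        content = controller.select_content(info)                   # block 2101 (e.g., a targeted ad or a show)
        display = select_display(displays, info["occupancy"], vehicle.in_motion)
        if display is None:
            return                                                   # presentation inhibited (e.g., in motion)
        display.play(content)                                        # block 2102
        display.window.set_tint("clear")                             # block 2103: make the display appear to float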

[0284] In some embodiments, target apparatus(es) (e.g., service device(s)) can be discovered within a range from a user (e.g., using the network and the control system). In some embodiments, target apparatus(es) (e.g., service device(s)) can be discovered within a range from a target apparatus. The user range and the apparatus range can intersect. The range can be referred to herein as a “discovery range,” for example, a service apparatus discovery range. A target apparatus can be discovered by a user when the target apparatus discovery range intersects with the user discovery range. For example, a target apparatus can be discovered by a user when the user is in the target apparatus discovery range. The discovery can be using the network. The discovery can be displayed in a mobile circuitry (e.g., cellular phone) of the user. The range can be specific to a target apparatus, target apparatus type, or a set of target apparatus types. For example, a first range can be for manufacturing machines, a second range can be for media displays, and a third range can be for food service machines. The range can be specific to an enclosure, or to a portion of the enclosure. For example, a first discovery range can be for a lobby, a second discovery range can be for a cafeteria, and a third discovery range can be for an office or for a group of offices. The range can be fixed or adjustable (e.g., by a user, a manager, a facility owner, and/or a lessor). A first target apparatus type may have a different discovery range from a second target apparatus type. For example, a larger control range can be assigned for light switches, and a shorter control range for beverage service devices. The larger control range can be of at most about 1 meter (m), 2m, 3m, or 5m. The shorter control range can be of at most about 0.2 m, 0.3m, 0.4m, 0.5m, 0.6m, 0.7m, 0.8m, or 0.9m. A user may detect (e.g., visually and/or using a list) devices within relevant use range of the user. Visually may comprise using icons, drawings, and/or a digital twin of the enclosure (e.g., as disclosed herein). Usage of discovery ranges may facilitate focusing (e.g., shortening) a list of target apparatuses relevant for the user to control, e.g., and prevent the user from having to select from a long list of (e.g., largely irrelevant) target apparatuses (e.g., service devices). Controlling the range can be using a position of the user (e.g., using a geolocation device such as one comprising UWB technology), and target apparatus pairing (e.g., Wi-Fi pairing) to the network. The range of discovery may be unconstrained by a range dictated by direct device-user pairing technology (e.g., Bluetooth pairing range). For example, when the user is located far from the target apparatus, the user may be able to couple with the target apparatus even if the device is out of the direct device-user pairing technology range (e.g., user range). The third party target apparatus selected by the user may or may not incorporate a technology for direct device-user pairing.
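By way of a hedged example, discovery-range filtering of the kind described above could be sketched as follows; the per-type ranges, the default range, and the attribute names are illustrative assumptions.

    import math

    # Illustrative per-apparatus-type discovery ranges, in meters (assumed values).
    DISCOVERY_RANGE_M = {"light_switch": 3.0, "media_display": 2.0, "beverage_service": 0.5}

    def discover_targets(user_position, targets):
        """Return the target apparatuses whose discovery range contains the user's position."""
        visible = []
        for t in targets:
            limit = DISCOVERY_RANGE_M.get(t.apparatus_type, 1.0)   # default range is an assumption
            if math.dist(user_position, t.position) <= limit:
                visible.append(t)
        return visible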

[0285] In some embodiments, pulse-based ultra-wideband (UWB) technology (e.g., ECMA-368 or ECMA-369) is a wireless technology for transmitting large amounts of data at low power (e.g., less than about 1 milliwatt (mW), 0.75mW, 0.5mW, or 0.25mW) over short distances (e.g., of at most about 300 feet, 250’, 230’, 200’, or 150’). A UWB signal can occupy at least about 750MHz, 500 MHz, or 250MHz of bandwidth spectrum, and/or at least about 30%, 20%, or 10% of its center frequency. The UWB signal can be transmitted by one or more pulses. A component may broadcast digital signal pulses that are timed (e.g., precisely) on a carrier signal across a number of frequency channels at the same time. Information may be transmitted, e.g., by modulating the timing and/or positioning of the signal (e.g., the pulses). Signal information may be transmitted by encoding the polarity of the signal (e.g., pulse), its amplitude and/or by using orthogonal signals (e.g., pulses). The UWB signal may be a low power information transfer protocol. The UWB technology may be utilized for (e.g., indoor) location applications. The broad range of the UWB spectrum comprises low frequencies having long wavelengths, which allows UWB signals to penetrate a variety of materials, including various building fixtures (e.g., walls). The wide range of frequencies, e.g., including the low penetrating frequencies, may decrease the chance of multipath propagation errors (without wishing to be bound to theory, as some wavelengths may have a line-of-sight trajectory). UWB communication signals (e.g., pulses) may be short (e.g., of at most about 70cm, 60 cm, or 50cm for a pulse that is about 600MHz, 500 MHz, or 400MHz wide; or of at most about 20cm, 23 cm, 25cm, or 30cm for a pulse that has a bandwidth of about 1GHz, 1.2GHz, 1.3 GHz, or 1.5GHz). The short communication signals (e.g., pulses) may reduce the chance that reflecting signals (e.g., pulses) will overlap with the original signal (e.g., pulse).
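To illustrate the figures cited above, the spatial extent of a pulse can be approximated as the speed of light divided by the bandwidth (a free-space approximation used here only for demonstration):

    # Approximate spatial pulse extent as c / bandwidth (free-space approximation).
    C = 299_792_458.0  # speed of light, m/s

    def pulse_length_m(bandwidth_hz: float) -> float:
        return C / bandwidth_hz

    print(round(pulse_length_m(500e6), 2))   # ~0.6 m for a pulse about 500 MHz wide
    print(round(pulse_length_m(1.3e9), 2))   # ~0.23 m for a pulse with about 1.3 GHz of bandwidth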

[0286] In some embodiments, an identification (ID) tag of a user can include a micro-chip. The micro-chip can be a micro-location chip. The micro-chip can incorporate auto-location technology (referred to herein also as “micro-location chip”). The micro-chip may incorporate technology for automatically reporting high-resolution and/or high accuracy location information. The auto-location technology can comprise GPS, Bluetooth, or radio-wave technology. The auto-location technology can comprise electromagnetic wave (e.g., radio wave) emission and/or detection. The radio-wave technology may be any RF technology disclosed herein (e.g., high frequency, ultra-high frequency, super high frequency). The radio wave technology may comprise UWB technology. The micro-chip may facilitate determination of its location within an accuracy of at most about 25 centimeters, 20cm, 15cm, 10 cm, or 5cm. In various embodiments, the control system, sensors, and/or antennas are configured to communicate with the micro-location chip. In some embodiments, the ID tag may comprise the micro-location chip. The micro-location chip may be configured to broadcast one or more signals. The signals may be omnidirectional signals. One or more components operatively coupled to the network may (e.g., each) comprise the micro-location chip. The micro-location chips (e.g., that are disposed in stationary and/or known locations) may serve as anchors. By analyzing the time taken for a broadcast signal to reach the anchors within the transmittable distance of the ID-tag, the location of the ID tag may be determined. One or more processors (e.g., of the control system) may perform an analysis of the location related signals. For example, the relative distance between the micro-chip and one or more anchors and/or other micro-chip(s) (e.g., within the transmission range limits) may be determined. The relative distance, known location, and/or anchor information may be aggregated. At least one of the anchors may be disposed in a floor, ceiling, or wall of a facility (e.g., vehicle). There may be at least 1, 2, 3, 4, 5, 8, or 10 anchors disposed in the enclosure (e.g., in the vehicle). At least two of the anchors may have at least one of (e.g., substantially) the same X coordinate, Y coordinate, and Z coordinate (of a Cartesian coordinate system).
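The anchor-based analysis described above can be illustrated with a minimal least-squares sketch; it assumes one-way times of flight (i.e., synchronized clocks) and uses illustrative anchor coordinates, not values from this disclosure.

    # Illustrative multilateration sketch: estimate an ID-tag position from times of flight to known anchors.
    import numpy as np
    from scipy.optimize import least_squares

    C = 299_792_458.0  # m/s

    def locate_tag(anchors, tof_seconds):
        """anchors: (N, 3) array of known anchor positions; tof_seconds: one-way times of flight."""
        ranges = C * np.asarray(tof_seconds)                          # convert times of flight to distances
        residual = lambda p: np.linalg.norm(anchors - p, axis=1) - ranges
        return least_squares(residual, x0=anchors.mean(axis=0)).x     # least-squares position estimate

    anchors = np.array([[0.0, 0.0, 1.0], [2.0, 0.0, 1.2], [2.0, 3.0, 1.1], [0.0, 3.0, 0.9]])
    # tof_seconds would come from the UWB measurements; the anchor coordinates above are placeholders.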

[0287] In some embodiments, a window control system enables locating and/or tracking one or more devices (e.g., comprising auto-location technology such as the micro-location chip) and/or at least one user carrying such device. The relative location between two or more such devices can be determined from information relating to received transmissions, e.g., at one or more antennas and/or sensors. The location of the device may comprise geo positioning and/or geolocation. The location of the device may be determined by an analysis of electromagnetic signals emitted from the device and/or the micro-location chip. Information that can be used to determine location includes, e.g., the received signal strength, the time of arrival, the signal frequency, and/or the angle of arrival. When determining a location of the one or more components from these metrics, a localization (e.g., using trilateration such as triangulation) module may be implemented. The localization module may comprise a calculation and/or algorithm. The auto-location may comprise geolocation and/or geo positioning. Examples of location methods may be found in PCT Patent Application serial number PCT/US17/31106 filed on May 4, 2017 titled “WINDOW ANTENNAS,” which is incorporated herein by reference in its entirety.

[0288] In some embodiments, the position of the user may be determined using one or more positional sensors. The positional sensor(s) may be disposed in the enclosure (e.g., a vehicle). The positional sensor may be part of a device ensemble or separate from a device ensemble (e.g., a standalone positional sensor). The positional sensor may be operatively (e.g., communicatively) coupled to a network. The network may be a network of the vehicle. The network may be configured to transmit communication and power. The network may be any network disclosed herein. The network may extend to a room, a floor, several rooms, several floors, the building, or several buildings of the facility. The network may operatively (e.g., to facilitate power and/or communication) couple to a control system (e.g., as disclosed herein), to sensor(s), emitter(s), antenna, router(s), and/or power supply. The network may be coupled to personal computers of users (e.g., occupants) associated with the vehicle (e.g., a driver and/or passengers). The personal computers of the users may be disposed remote from the vehicle. The network may be operatively coupled to other devices that perform operations for, or associated with, the vehicle (e.g., communication machinery). The communication machinery may include media projectors, media displays, touch screens, speakers, and/or lighting (e.g., entry, exit, and/or security lighting).

[0289] In some embodiments, at least one device ensemble includes at least one processor and/or memory. The processor may perform computing tasks (e.g., including machine learning and/or artificial intelligence related tasks). In this manner the network can allow low latency (e.g., as disclosed herein) and faster response time for applications and/or commands. In some embodiments, the network and circuitry coupled thereto may form a distributed computing environment (e.g., comprising CPU, memory, and storage) for application and/or service hosting to store and/or process content close to the user’s mobile circuitry (e.g., cellular device, pad, or laptop).

[0290] In some embodiments, the network is coupled to device ensemble(s). The device ensemble may perform (e.g., in real time) sensing and/or tracking of occupants in an enclosure in which the device ensemble is disposed (e.g., in situ), e.g., (i) to enable seamless connectivity of the user’s mobile circuitry to the network and/or adjustment of network coupled machinery to requirements and/or preferences of the user, (ii) to identify the user (e.g., using facial recognition, speech recognition, and/or identification tag), and/or (iii) to cater the environment of the enclosure to any preferences of the user.

[0291] In some embodiments, the target apparatus is operatively coupled to the network. The network may be operatively (e.g., communicatively) coupled to one or more controllers. The network may be operatively (e.g., communicatively) coupled to one or more processors. Coupling of the target apparatus to the network may allow contactless communication of a user with the target apparatus using a mobile circuitry of the user (e.g., through a software application installed on the mobile circuitry). In this manner, a user need not directly communicatively couple and decouple from the service device (e.g., using Bluetooth technology). By coupling the target apparatus to the network to which the user is communicatively coupled (e.g., through the mobile circuitry of the user), a user may be communicatively coupled to a plurality of target apparatuses simultaneously (e.g., concurrently). The user may control at least two of the plurality of target apparatuses sequentially. The user may control at least two of the plurality of target apparatuses simultaneously (e.g., concurrently). For example, a user may have two applications of two different target apparatuses open (e.g., and running) on his mobile circuitry, e.g., available for control (e.g., manipulation).

[0292] In some examples, the discovery of a target apparatus by a user is not restricted by a range. The discovery of a target apparatus by a user can be restricted by at least one security protocol (e.g., dangerous manufacturing machinery may be available only to permitted manufacturing personnel). The security protocol can have one or more security levels. The user may override at least one (e.g., any) range restriction and select the target apparatus from all available target apparatuses.

[0293] In some embodiments, the target apparatus is communicatively coupled to the network. The target device may utilize a network authentication protocol. The network authentication protocol may open one or more ports for network access. The port(s) may be opened when an identity of a target apparatus that attempts to operatively couple (and/or physically couples) to the network is authenticated (e.g., through network authentication). Operative coupling may comprise communicatively coupling. Access of the target apparatus to the network may be authorized (e.g., using the network). The access may or may not be restricted. The restriction may comprise one or more security levels. The identity of the target apparatus can be determined based on credentials and/or a certificate. The credentials and/or certificate may be confirmed by the network (e.g., by a server operatively coupled to the network). The authentication protocol may or may not be specific for physical communication (e.g., Ethernet communication) in a local area network (LAN), e.g., that utilizes packets. The standard may be maintained by the Institute of Electrical and Electronics Engineers (IEEE). The standard may specify the physical media (e.g., target apparatus) and/or the working characteristics of the network (e.g., Ethernet). The networking standard may support virtual LANs (VLANs) on a local area (e.g., Ethernet) network. The standard may support power over local area network (e.g., Ethernet). The network may provide communication over power line (e.g., coaxial cable). The power may be direct current (DC) power. The power may be at least about 12 Watts (W), 15 W, 25W, 30W, 40W, 48W, 50W, or 100W. The standard may facilitate mesh networking. The standard may facilitate a local area network (LAN) technology and/or wide area network (WAN) applications. The standard may facilitate physical connections between target apparatuses and/or infrastructure devices (hubs, switches, routers) by various types of cables (e.g., coaxial, twisted wires, copper cables, and/or fiber cables). Examples of network authentication protocols can be 802.1X or KERBEROS. The network authentication protocol may comprise secret-key cryptography. The network can support (e.g., communication) protocols comprising 802.3, 802.3af (PoE), 802.3at (PoE+), 802.1Q, or 802.11s. The network may support a communication protocol for Building Automation and Control (BAC) networks (e.g., BACnet). The protocol may define service(s) used to communicate between building devices. The protocol services may include device and object discovery (e.g., Who-Is, I-Am, Who-Has, and/or I-Have). The protocol services may include Read-Property and Write-Property (e.g., for data sharing). The network protocol may define object types (e.g., that are acted upon by the services). The protocol may define one or more data links / physical layers (e.g., ARCNET, Ethernet, BACnet/IP, BACnet/IPv6, BACnet/MSTP, Point-To-Point over RS-232, Source-Follower/Token-Passing over RS-485, ZigBee, and/or LonTalk). The protocol may be dedicated to devices (e.g., Internet of Things (IoT) devices and/or machine to machine (M2M) communication). The protocol may be a messaging protocol. The protocol may be a publish-subscribe protocol. The protocol may be configured for messaging transport. The protocol may be configured for remote devices. The protocol may be configured for devices having a small code footprint and/or minimal network bandwidth.
The small code footprint may be configured to be handled by microcontrollers. The protocol may have a plurality of quality of service levels including (i) at most once, (ii) at least once, and/or (iii) exactly once. The plurality of quality of service levels may increase reliability of the message delivery in the network (e.g., to its target). The protocol may facilitate messaging (i) between device to cloud and/or (ii) between cloud to device. The messaging protocol may be configured for broadcasting messages to groups of targets such as target apparatuses (e.g., devices), sensors, and/or emitters. The protocol may comply with Organization for the Advancement of Structured Information Standards (OASIS). The protocol may support security schemes such as authentication (e.g., using tokens). The protocol may support an access delegation standard (e.g., OAuth). The protocol may support granting a first application (and/or website) access to information on a second application (and/or website) without providing the second application with a security code (e.g., token and/or password) relating to the first application. The protocol may be a Message Queuing Telemetry Transport (MQTT) or Advanced Message Queuing Protocol (AMQP) protocol. The protocol may be configured for a message rate of at least one (1) message per second per publisher. The protocol may be configured to facilitate a message payload size of at most 64, 86, 96, or 128 bytes. The protocol may be configured to communicate with any device (e.g., from a microcontroller to a server) that operates a protocol compliant (e.g., MQTT) library and/or connects to a compliant broker (e.g., MQTT broker) over a network. Each device (e.g., target apparatus, sensor, or emitter) can be a publisher and/or a subscriber. A broker can handle millions of concurrently connected devices, or fewer. The broker can handle at least about 100, 10000, 100000, 1000000, or 10000000 concurrently connected devices. In some embodiments, the broker is responsible for receiving (e.g., all) messages, filtering the messages, determining who is interested in each message, and/or sending the message to the subscribed devices (e.g., broker clients). The protocol may require internet connectivity to the network. The protocol may facilitate bi-directional, and/or synchronous peer-to-peer messaging. The protocol may be a binary wire protocol. Examples of such network protocol, control system, and network can be found in US provisional patent application serial no. 63/000,342 filed 03/26/2020 titled “MESSAGING IN A MULTI CLIENT NETWORK,” which is incorporated herein by reference in its entirety.
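As a non-authoritative illustration of such publish-subscribe messaging with per-message quality of service, a sketch using the paho-mqtt client library (1.x-style API; the broker host and topic names are assumptions) could look like:

    # Illustrative MQTT publish/subscribe sketch (paho-mqtt 1.x-style API); host and topics are assumed.
    import paho.mqtt.client as mqtt

    def on_message(client, userdata, msg):
        print(f"command for {msg.topic}: {msg.payload.decode()}")

    client = mqtt.Client()
    client.on_message = on_message
    client.connect("broker.example.local", 1883)
    client.subscribe("vehicle/window/front/tint", qos=1)             # QoS 1: "at least once" delivery
    client.publish("vehicle/window/front/tint", "level-3", qos=1)
    client.loop_forever()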

[0294] Examples of network security, communication standards, communication interface, messaging, coupling of devices to the network, and control can be found in U.S. provisional patent application serial number 63/000,342, and in PCT patent application serial number PCT/US20/70123 filed June 04, 2020, titled “SECURE BUILDING SERVICES NETWORK,” each of which is incorporated herein by reference in its entirety.

[0295] In some embodiments, the network allows a target apparatus to couple to the network. The network (e.g., using controller(s) and/or processor(s)) may let the target apparatus join the network, authenticate the target apparatus, monitor activity on the network (e.g., activity relating to the target apparatus), facilitate performance of maintenance and/or diagnostics, and secure the data communicated over the network. The security levels may allow bidirectional or monodirectional communication between a user and a target apparatus. For example, the network may allow only monodirectional communication of the user to the target apparatus. For example, the network may restrict availability of data communicated through the network and/or coupled to the network from being accessed by a third party owner of a target apparatus (e.g., service device). For example, the network may restrict the organization and/or facility from accessing data, communicated through the network and/or coupled to the network, that relates to a third party owner and/or manufacturer of a target apparatus (e.g., service device).

[0296] In some embodiments, the control system is operatively coupled to a learning module. The learning module may utilize a learning scheme, e.g., comprising artificial intelligence. The learning module may learn preferences of one or more users associated with the vehicle (e.g., an owner of the vehicle, a driver or operator of the vehicle, frequent passengers of the vehicle, etc.). The learning module may analyze preferences of a user or a group of users. The learning module may gather preferences of the user(s) as to one or more environmental characteristics. The learning module may use past preference(s) of the user as a learning set for the user or for the group to which the user belongs. The preferences may include environmental preferences or preferences related to a target apparatus (e.g., a service machine, a tintable window, a media display, and/or any other controllable device).

[0297] In some embodiments, a control system conditions various aspects of an enclosure. For example, the control system may condition an environment of the enclosure. The control system may project future environmental preferences of the user, and condition the environment to these preferences in advance (e.g., at a future time). The preferential environmental characteristic(s) may be allocated according to (i) user or group of users, (ii) time, (iii) date, and/or (iv) space. The preferences may comprise seasonal preferences. The environmental characteristics may comprise lighting, ventilation speed, atmospheric pressure, smell, temperature, humidity, carbon dioxide, oxygen, VOC(s), particulate matter (e.g., dust), or color. For example, a user may be a heart patient who prefers (e.g., requires) an oxygen level above the ambient oxygen level (e.g., 20% oxygen) and/or a certain humidity level (e.g., 70%). The control system may condition the atmosphere of the environment for that oxygen and humidity level when the heart patient occupant is in a certain enclosure.
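For illustration, allocating preferred environmental characteristics by user and space, and conditioning the enclosure in advance, could be sketched as follows; the preference values and the hvac.schedule helper are assumptions, not part of this disclosure.

    # Illustrative allocation of environmental preferences by (user, space); values and helpers are assumed.
    PREFERENCES = {
        ("heart_patient_user", "cabin"): {"oxygen_pct": 21.5, "humidity_pct": 70},
        ("default", "cabin"): {"oxygen_pct": 20.9, "humidity_pct": 45},
    }

    def precondition(hvac, user_id, space, lead_time_min=10):
        prefs = PREFERENCES.get((user_id, space), PREFERENCES[("default", space)])
        hvac.schedule(setpoints=prefs, lead_time_min=lead_time_min)   # condition the environment in advance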

[0298] In some embodiments, a control system may operate a target apparatus according to preference of a user or a group of users. The preferences may be according to past behavior of the user(s) in relation to the target apparatus (e.g., settings, service selection, timing related selections, and/or location related selections).

[0299] In some embodiments, the control system may adjust the environment and/or target apparatus according to hierarchical preferences. When several different users (e.g., of different groups) who have conflicting preferences are gathered in an enclosure, the control system may adjust the environment and/or target apparatus according to a pre-established hierarchy. The hierarchy may comprise jurisdictional (e.g., health and/or safety) standards, health, safety, activity taking place in the enclosure, number of occupants in the vehicle, vehicle type (e.g., automotive or non-automotive, autonomous or non-autonomous, airplane, passenger vehicle, or any other type of vehicle), time of day, date, season, and/or activity in the facility.

[0300] In some embodiments, the control system considers results (e.g., scientific and/or research based results) regarding environmental conditions that affect health, safety and/or performance of enclosure occupants. The control system may establish thresholds and/or preferred window-ranges for one or more environmental characteristics of the enclosure (e.g., of an atmosphere of the enclosure). The threshold may comprise a level of atmospheric component (e.g., VOC and/or gas), temperature, and time at a certain level. The certain level may be abnormally high, abnormally low, or average. For example, the controller may allow short instances of an abnormally high VOC level, but not a prolonged time at that VOC level. The control system may automatically override a preference of a user if it contradicts health and/or safety thresholds. Health and/or safety thresholds may be at a higher hierarchical level relative to a user’s preference. The hierarchy may utilize majority preferences. For example, if two occupants of a vehicle have one preference, and the third occupant has a conflicting preference, then the preferences of the two occupants will prevail (e.g., unless they conflict with health and/or safety considerations).
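A hedged sketch of such hierarchical resolution, in which a majority preference prevails but is clamped to health/safety limits, is shown below; the numeric limits are illustrative assumptions.

    # Illustrative hierarchy: majority preference, clamped to an assumed health/safety temperature window.
    from statistics import StatisticsError, mode

    SAFETY_LIMITS_C = (16.0, 28.0)   # hypothetical allowed cabin-temperature range

    def resolve_setpoint(requested_temps_c):
        try:
            choice = mode(requested_temps_c)                          # majority preference, if one exists
        except StatisticsError:
            choice = sum(requested_temps_c) / len(requested_temps_c)  # otherwise, average the requests
        low, high = SAFETY_LIMITS_C
        return min(max(choice, low), high)                            # safety thresholds outrank preferences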

[0301] Fig. 25 shows an example of a flow chart depicting operations of a control system that is operatively coupled to one or more devices in an enclosure (e.g., a vehicle). In block 2500, an identity of a user is identified by a control system. The identity can be identified by one or more sensors (e.g., camera) and/or by an identification tag (e.g., by scanning or otherwise sensing by one or more sensors). In block 2501, a position of the user or of body portions of the user (e.g., hands of the user, arms of the user, a head of the user, or the like) may optionally be tracked as the user spends time in the enclosure. The user may provide input as to any preference. The preference may relate to a target apparatus and/or environmental characteristics. A learning module may optionally track such preferences and provide predictions as to any future preference of the user in block 2503. Past elective preferences by the user may be recorded (e.g., in a database) and may be used as a learning set for the learning module. As the learning process progresses over time and the user provides more and more inputs, the predictions of the learning module may increase in accuracy. The learning module may comprise any learning scheme (e.g., comprising artificial intelligence and/or machine learning) disclosed herein. The user may override recommendations and/or predictions made by the learning module. The user may provide manual input into the control system. In block 2502, the user input is provided (whether directly by the user or by predictions of the learning module) to the control system. The control system may alter (or direct alteration of) one or more devices in the vehicle to materialize the user preferences (e.g., input) by using the input in block 2504. The control system may or may not use a location or position of the user. The user may express a preference for a sound of a certain level that constitutes the input. The expression of preference may be by manual input (including tactile, voice and/or gesture command). A past expression of preference may be registered in a database and linked to the user. For example, a sound level of speakers in a vehicle may be adjusted to the user preference when one or more sensors sense presence of the user in the vehicle. The sound level in the vehicle may be returned to a default level and/or adjusted to another’s preference when one or more sensors sense absence of the user in the vehicle.

[0302] In some embodiments, a user expresses at least one preference of environmental characteristic(s) and/or target apparatus, which preference constitutes an input. The input may be by manual input (including tactile, voice and/or gesture command). A past expression of preference (e.g., input) may be registered in a database and linked to the user. The user may be part of a group of users. The group of users may be any grouping disclosed herein. The preference of the user may be linked to the group to which the user belongs. The user may enter an enclosure (e.g., a vehicle) at a prescheduled time (e.g., at a time the user needs to begin driving the vehicle to arrive at a destination at a particular time). The environmental characteristic(s) of the enclosure may be adjusted to the user preference (i) when the user was scheduled to enter the enclosure and/or (ii) when one or more sensors sense presence of the user in the enclosure.
The environmental characteristic(s) of the enclosure may be returned to a default level and/or adjusted to another’s preference when one or more sensors sense absence of the user in the enclosure. The target apparatus may be adjusted to the user preference (i) when the user was scheduled to use the target apparatus and/or (ii) when one or more sensors sense presence of the user near the target apparatus (e.g., within a predetermined distance threshold). The target apparatus may return to default setting or be adjusted to another’s preference (i) when the scheduled use of the target apparatus by the user ends and/or (ii) when one or more sensors sense absence of the user near the target apparatus (e.g., within a predetermined distance threshold).
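For illustration only, the Fig. 25 loop (blocks 2500-2504) could be sketched as follows; the sensor, learning-module, and control-system method names are hypothetical and not an API defined in this disclosure.

    # Illustrative sketch of the Fig. 25 loop; component method names are hypothetical.
    def occupant_loop(sensors, learning_module, control_system):
        user = sensors.identify_user()                               # block 2500: camera and/or ID-tag identification
        while sensors.user_present(user):
            position = sensors.track_position(user)                  # block 2501: optional position tracking
            manual_input = sensors.read_manual_input(user)           # tactile, voice, and/or gesture command
            prediction = learning_module.predict(user, position)     # block 2503: predicted preference
            user_input = manual_input or prediction                  # a manual input overrides the prediction
            control_system.apply(user_input, position)               # blocks 2502 and 2504: alter device(s)
        control_system.restore_defaults()                            # e.g., return speaker volume to a default level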

[0303] In some embodiments, data is analyzed by a learning module. The data can be sensor data and/or user input. The user input may be regarding one or more preferred environmental characteristics and/or target apparatus. The learning module may comprise at least one rational decision making process, and/or learning that utilizes the data (e.g., as a learning set). The analysis of the data may be utilized to adjust an environment, e.g., by adjusting one or more components that affect the environment of the enclosure. The analysis of the data may be utilized to control a certain target apparatus, e.g., to produce a product, according to user preferences, and/or choose the certain target apparatus (e.g., based on user preference and/or user location). The data analysis may be performed by a machine-based system (e.g., comprising a circuitry). The circuitry may be of a processor.

The sensor data analysis may utilize artificial intelligence. The data analysis may rely on one or more models (e.g., mathematical models). In some embodiments, the data analysis comprises linear regression, least squares fit, Gaussian process regression, kernel regression, nonparametric multiplicative regression (NPMR), regression trees, local regression, semiparametric regression, isotonic regression, multivariate adaptive regression splines (MARS), logistic regression, robust regression, polynomial regression, stepwise regression, ridge regression, lasso regression, elasticnet regression, principal component analysis (PCA), singular value decomposition, fuzzy measure theory, Borel measure, Han measure, risk-neutral measure, Lebesgue measure, group method of data handling (GMDH), Naive Bayes classifiers, k-nearest neighbors algorithm (k-NN), support vector machines (SVMs), neural networks, classification and regression trees (CART), random forest, gradient boosting, or generalized linear model (GLM) technique. The data analysis may include a deep learning algorithm and/or artificial neural networks (ANN). The data analysis may comprise a learning scheme with a plurality of layers in the network (e.g., an ANN). The learning of the learning module may be supervised, semi-supervised, or unsupervised. The deep learning architecture may comprise deep neural networks, deep belief networks, recurrent neural networks, or convolutional neural networks. The learning schemes may be ones utilized in computer vision, machine vision, speech recognition, natural language processing, audio recognition, social network filtering, machine translation, bioinformatics, drug design, medical image analysis, material inspection programs, and/or board game programs.
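As a hedged example of fitting one of the listed model families to past user inputs (here, gradient boosting via scikit-learn; the features, values, and setpoints are synthetic and illustrative only):

    # Illustrative regression sketch: map sensed conditions to a preferred temperature setpoint (synthetic data).
    import numpy as np
    from sklearn.ensemble import GradientBoostingRegressor

    # Features: [outside_temp_c, solar_irradiance_wm2, hour_of_day]; target: user-chosen setpoint (deg C).
    X = np.array([[30.0, 800.0, 14], [5.0, 100.0, 8], [22.0, 400.0, 18], [35.0, 900.0, 13]])
    y = np.array([21.0, 23.5, 22.0, 20.5])

    model = GradientBoostingRegressor(n_estimators=50).fit(X, y)
    predicted_setpoint = model.predict([[28.0, 700.0, 15]])[0]       # predicted preference for new conditions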

[0304] In some examples, a target apparatus is a tintable window (e.g., an electrochromic window). In some embodiments, a dynamic state of an electrochromic window is controlled by altering a voltage signal to an electrochromic device (ECD) used to provide tinting or coloring. An electrochromic window can be manufactured, configured, or otherwise provided as an insulated glass unit (IGU). IGUs may serve as the fundamental constructs for holding electrochromic panes (also referred to as “lites”) when provided for installation in a building. An IGU lite or pane may be a single substrate or a multi-substrate construct, such as a laminate of two substrates. IGUs, especially those having double- or triple-pane configurations, can provide a number of advantages over single pane configurations; for example, multi-pane configurations can provide enhanced thermal insulation, noise insulation, environmental protection and/or durability when compared with single-pane configurations. A multi-pane configuration also can provide increased protection for an ECD, for example, because the electrochromic films, as well as associated layers and conductive interconnects, can be formed on an interior surface of the multi-pane IGU and be protected by an inert gas fill in the interior volume of the IGU.

[0305] In some embodiments, a tintable window exhibits a (e.g., controllable and/or reversible) change in at least one optical property of the window, e.g., when a stimulus is applied. The stimulus can include an optical, electrical and/or magnetic stimulus. For example, the stimulus can include an applied voltage. One or more tintable windows can be used to control lighting and/or glare conditions, e.g., by regulating the transmission of solar energy propagating through them. One or more tintable windows can be used to control a temperature within a building, e.g., by regulating the transmission of solar energy propagating through them. Control of the solar energy may control heat load imposed on the interior of the facility (e.g., building). The control may be manual and/or automatic. The control may be used for maintaining one or more requested (e.g., environmental) conditions, e.g., occupant comfort. The control may include reducing energy consumption of heating, ventilation, air conditioning and/or lighting systems. At least two of heating, ventilation, and air conditioning may be induced by separate systems. At least two of heating, ventilation, and air conditioning may be induced by one system. The heating, ventilation, and air conditioning may be induced by a single system (abbreviated herein as “HVAC”). In some cases, tintable windows may be responsive to (e.g., and communicatively coupled to) one or more environmental sensors and/or user control. Tintable windows may comprise (e.g., may be) electrochromic windows. The windows may be located in the range from the interior to the exterior of a structure (e.g., facility, e.g., building). However, this need not be the case. Tintable windows may operate using liquid crystal devices, suspended particle devices, microelectromechanical systems (MEMS) devices (such as micro-shutters), or any technology known now, or later developed, that is configured to control light transmission through a window. Examples of windows (e.g., with MEMS devices for tinting) are described in U.S. Patent Application Serial Number 14/443,353 filed May 15, 2015, titled “MULTI PANE WINDOWS INCLUDING ELECTROCHROMIC DEVICES AND ELECTROMECHANICAL SYSTEMS DEVICES,” that is incorporated herein by reference in its entirety. In some cases, one or more tintable windows can be located within the interior of a vehicle, e.g., between rows of seats in the vehicle. In some cases, one or more tintable windows can be used in automobiles, trains, aircraft, and other vehicles, e.g., in lieu of a passive and/or non-tinting window.

[0306] In some embodiments, the tintable window comprises an electrochromic device (referred to herein as an “EC device”, abbreviated herein as ECD or “EC”). An EC device may comprise at least one coating that includes at least one layer. The at least one layer can comprise an electrochromic material. In some embodiments, the electrochromic material exhibits a change from one optical state to another, e.g., when an electric potential is applied across the EC device. The transition of the electrochromic layer from one optical state to another optical state can be caused, e.g., by reversible, semi-reversible, or irreversible ion insertion into the electrochromic material (e.g., by way of intercalation) and a corresponding injection of charge-balancing electrons. For example, the transition of the electrochromic layer from one optical state to another optical state can be caused, e.g., by a reversible ion insertion into the electrochromic material (e.g., by way of intercalation) and a corresponding injection of charge-balancing electrons. Reversible may be for the expected lifetime of the ECD. Semi-reversible refers to a measurable (e.g., noticeable) degradation in the reversibility of the tint of the window over one or more tinting cycles. In some instances, a fraction of the ions responsible for the optical transition is irreversibly bound up in the electrochromic material (e.g., and thus the induced (altered) tint state of the window is not reversible to its original tinting state). In various EC devices, at least some (e.g., all) of the irreversibly bound ions can be used to compensate for “blind charge” in the material (e.g., ECD).

[0307] In some implementations, suitable ions include cations. The cations may include lithium ions (Li+) and/or hydrogen ions (H+) (i.e., protons). In some implementations, other ions can be suitable. Intercalation of the cations may be into an (e.g., metal) oxide. A change in the intercalation state of the ions (e.g., cations) into the oxide may induce a visible change in a tint (e.g., color) of the oxide. For example, the oxide may transition from a colorless to a colored state. For example, intercalation of lithium ions into tungsten oxide (WO3-y (0 < y ≤ ~0.3)) may cause the tungsten oxide to change from a transparent state to a colored (e.g., blue) state. EC device coatings as described herein are located within the viewable portion of the tintable window such that the tinting of the EC device coating can be used to control the optical state of the tintable window.

[0308] Fig. 22 shows an example of a schematic cross-section of an electrochromic device 2200 in accordance with some embodiments. The EC device coating is attached to a substrate 2202 and includes a transparent conductive layer (TCL) 2204, an electrochromic layer (EC) 2206 (sometimes also referred to as a cathodically coloring layer or a cathodically tinting layer), an ion conducting layer or region (IC) 2208, a counter electrode layer (CE) 2210 (sometimes also referred to as an anodically coloring layer or anodically tinting layer), and a second TCL 2214. Elements 2204, 2206, 2208, 2210, and 2214 are collectively referred to as an electrochromic stack 2220. A voltage source 2216 operable to apply an electric potential across the electrochromic stack 2220 effects the transition of the electrochromic coating from, e.g., a clear state to a tinted state. In other embodiments, the order of layers is reversed with respect to the substrate. That is, the layers are in the following order: substrate, TCL, counter electrode layer, ion conducting layer, electrochromic material layer, TCL. In various embodiments, the ion conductor region (e.g., 2208) may form from a portion of the EC layer (e.g., 2206) and/or from a portion of the CE layer (e.g., 2210). In such embodiments, the electrochromic stack (e.g., 2220) may be deposited to include cathodically coloring electrochromic material (the EC layer) in direct physical contact with an anodically coloring counter electrode material (the CE layer). The ion conductor region (sometimes referred to as an interfacial region, or as an ion conducting substantially electronically insulating layer or region) may form where the EC layer and the CE layer meet, for example through heating and/or other processing steps. Examples of electrochromic devices (e.g., including those fabricated without depositing a distinct ion conductor material) can be found in U.S. Patent Application No. 13/462,725 filed May 2, 2012, titled “ELECTROCHROMIC DEVICES,” that is incorporated herein by reference in its entirety. In some embodiments, an EC device coating may include one or more additional layers such as one or more passive layers. Passive layers can be used to improve certain optical properties, to provide moisture resistance, and/or to provide scratch resistance. These and/or other passive layers can serve to hermetically seal the EC stack 2220. Various layers, including transparent conducting layers (such as 2204 and 2214), can be treated with anti-reflective and/or protective layers (e.g., oxide and/or nitride layers).

[0309] In some embodiments, an IGU includes two (or more) substantially transparent substrates. For example, the IGU may include two panes of glass. At least one substrate of the IGU can include an electrochromic device disposed thereon. The one or more panes of the IGU may have a separator disposed between them. An IGU can be a hermetically sealed construct, e.g., having an interior region that is isolated from the ambient environment. A “window assembly” may include an IGU. A “window assembly” may include a (e.g., stand-alone) laminate. A “window assembly” may include one or more electrical leads, e.g., for connecting the IGUs and/or laminates. The electrical leads may operatively couple (e.g., connect) one or more electrochromic devices to a voltage source, switches, and the like. A window assembly may include a frame that supports the IGU or laminate. A window assembly may include a window controller, and/or components of a window controller (e.g., a dock).

[0310] Fig. 23 shows an example implementation of an IGU 2300 that includes a first pane 2304 having a first surface S1 and a second surface S2. In some implementations, the first surface S1 of the first pane 2304 faces an exterior environment, such as an outdoors or outside environment. The IGU 2300 also includes a second pane 2306 having a first surface S3 and a second surface S4. In some implementations, the second surface S4 of the second pane 2306 faces an interior environment, such as an inside environment of a home, building or vehicle, or a room or compartment within a home, building or vehicle.

[0311] In some embodiments, (e.g., each of the) first and/or the second panes 2304 and 2306 are transparent and/or translucent to light, e.g., in the visible spectrum. For example, (e.g., each of the) first and/or second panes 2304 and 2306 can be formed of a glass material (e.g., an architectural glass or other shatter-resistant glass material such as, for example, a silicon oxide (SOx)-based glass material). The (e.g., each of the) first and/or second panes 2304 and 2306 may be a soda-lime glass substrate or float glass substrate. Such glass substrates can be composed of, for example, approximately 75% silica (SiO2) as well as Na2O, CaO, and several minor additives. However, the (e.g., each of the) first and/or the second panes 2304 and 2306 can be formed of any material having suitable optical, electrical, thermal, and mechanical properties. For example, other suitable substrates that can be used as one or both of the first and the second panes 2304 and 2306 can include other glass materials as well as plastic, semi-plastic and thermoplastic materials (for example, poly(methyl methacrylate), polystyrene, polycarbonate, allyl diglycol carbonate, SAN (styrene acrylonitrile copolymer), poly(4-methyl-1-pentene), polyester, polyamide), and/or mirror materials. In some embodiments, (e.g., each of the) first and/or the second panes 2304 and 2306 can be strengthened, for example, by tempering, heating, or chemically strengthening.

[0312] In Fig. 23, first and second panes 2304 and 2306 are spaced apart from one another by a spacer 2318, which is typically a frame structure, to form an interior volume 2308. In some embodiments, the interior volume is filled with Argon (Ar) or another gas, such as another noble gas (for example, krypton (Kr) or xenon (Xe)), another (non-noble) gas, or a mixture of gases (for example, air). Filling the interior volume 2308 with a gas such as Ar, Kr, or Xe can reduce conductive heat transfer through the IGU 2300. Without wishing to be bound to theory, this may be because of the low thermal conductivity of these gases; these gases may also improve acoustic insulation, e.g., due to their increased atomic weights. In some embodiments, the interior volume 2308 can be evacuated of air or other gas. Spacer 2318 generally determines the height “C” of the interior volume 2308 (e.g., the spacing between the first and the second panes 2304 and 2306). In Fig. 23, the thickness (and/or relative thickness) of the ECD, sealant 2320/2322 and bus bars 2326/2328 may not be to scale. These components are generally thin and are exaggerated here, e.g., for ease of illustration only. In some embodiments, the spacing “C” between the first and the second panes 2304 and 2306 is in the range of approximately 6 mm to approximately 30 mm. The width “D” of spacer 2318 can be in the range of approximately 5 mm to approximately 15 mm (although other widths are possible and may be desirable). Spacer 2318 may be a frame structure formed around all sides of the IGU 2300 (for example, top, bottom, left and right sides of the IGU 2300). For example, spacer 2318 can be formed of a foam or plastic material. In some embodiments, spacer 2318 can be formed of metal or other conductive material, for example, a metal tube or channel structure having at least 3 sides, two sides for sealing to each of the substrates and one side to support and separate the lites and to serve as a surface on which to apply a sealant 2324. A first primary seal 2320 adheres and hermetically seals spacer 2318 and the second surface S2 of the first pane 2304. A second primary seal 2322 adheres and hermetically seals spacer 2318 and the first surface S3 of the second pane 2306. In some implementations, each of the primary seals 2320 and 2322 can be formed of an adhesive sealant such as, for example, polyisobutylene (PIB). In some implementations, IGU 2300 further includes secondary seal 2324 that hermetically seals a border around the entire IGU 2300 outside of spacer 2318. To this end, spacer 2318 can be inset from the edges of the first and the second panes 2304 and 2306 by a distance “E.” The distance “E” can be in the range of approximately four (4) millimeters (mm) to approximately eight (8) mm (although other distances are possible and may be desirable). In some implementations, secondary seal 2324 can be formed of an adhesive sealant such as, for example, a polymeric material that resists water and that adds structural support to the assembly, such as silicone, polyurethane and similar structural sealants that form a water-tight seal.

[0313] In the example of Fig. 23, the ECD coating on surface S2 of substrate 2304 extends about its entire perimeter to and under spacer 2318. This configuration is functionally desirable as it protects the edge of the ECD within the primary sealant 2320 and aesthetically desirable because within the inner perimeter of spacer 2318 there is a monolithic ECD without any bus bars or scribe lines.

[0314] Configuration examples of IGUs are described in U.S. Patent No. 8,164,818, issued April 24, 2012 and titled ELECTROCHROMIC WINDOW FABRICATION METHODS (Attorney Docket No. VIEWP006), U.S. Patent Application No. 13/456,056 filed April 25, 2012 and titled ELECTROCHROMIC WINDOW FABRICATION METHODS (Attorney Docket No. VIEWP006X1), PCT Patent Application No. PCT/US2012/068817 filed December 10, 2012 and titled THIN-FILM DEVICES AND FABRICATION (Attorney Docket No. VIEWP036WO), U.S. Patent No. 9,454,053, issued September 27, 2016 and titled THIN-FILM DEVICES AND FABRICATION (Attorney Docket No. VIEWP036US), and PCT Patent Application No. PCT/US2014/073081, filed December 13, 2014 and titled THIN-FILM DEVICES AND FABRICATION (Attorney Docket No. VIEWP036X1WO), each of which is hereby incorporated by reference in its entirety.

[0315] In the example shown in Fig. 23, an ECD 2310 is formed on the second surface S2 of the first pane 2304. The ECD 2310 includes an electrochromic (“EC”) stack 2312, which itself may include one or more layers. For example, the EC stack 2312 can include an electrochromic layer, an ion-conducting layer, and a counter electrode layer. The electrochromic layer may be formed of one or more inorganic solid materials. The electrochromic layer can include or be formed of one or more of a number of electrochromic materials, including electrochemically-cathodic or electrochemically-anodic materials. EC stack 2312 may be between first and second conducting (or “conductive”) layers. For example, the ECD 2310 can include a first transparent conductive oxide (TCO) layer 2314 adjacent a first surface of the EC stack 2312 and a second TCO layer 2316 adjacent a second surface of the EC stack 2312. Examples of similar EC devices and smart windows can be found in U.S. Patent No. 8,764,950, titled ELECTROCHROMIC DEVICES, by Wang et al., issued July 1, 2014 and U.S. Patent No. 9,261,751, titled ELECTROCHROMIC DEVICES, by Pradhan et al., issued February 16, 2016, each of which is incorporated herein by reference in its entirety. In some implementations, the EC stack 2312 also can include one or more additional layers such as one or more passive layers. For example, passive layers can be used to improve certain optical properties, to provide moisture resistance or to provide scratch resistance. These or other passive layers also can serve to hermetically seal the EC stack 2312.

[0316] In some embodiments, the selection or design of the electrochromic and counter electrode materials generally governs the possible optical transitions. During operation, in response to a voltage generated across the thickness of the EC stack (for example, between the first and the second TCO layers), the electrochromic layer transfers or exchanges ions to or from the counter electrode layer to drive the electrochromic layer to the desired optical state. To cause the EC stack to transition to a transparent state, a positive voltage may be applied across the EC stack (for example, such that the electrochromic layer is more positive than the counter electrode layer). In some embodiments, in response to the application of the positive voltage, the available ions in the stack reside primarily in the counter electrode layer. When the magnitude of the potential across the EC stack is reduced or when the polarity of the potential is reversed, ions may be transported back across the ion conducting layer to the electrochromic layer causing the electrochromic material to transition to an opaque state (or to a “more tinted,” “darker” or “less transparent” state). Conversely, in some embodiments using electrochromic layers having different properties, to cause the EC stack to transition to an opaque state, a negative voltage is applied to the electrochromic layer relative to the counter electrode layer. For example, when the magnitude of the potential across the EC stack is reduced or its polarity reversed, the ions may be transported back across the ion conducting layer to the electrochromic layer causing the electrochromic material to transition to a clear or “bleached” state (or to a “less tinted”, “lighter” or “more transparent” state).

[0317] In some implementations, the transfer or exchange of ions to or from the counter electrode layer also results in an optical transition in the counter electrode layer. For example, in some implementations the electrochromic and counter electrode layers are complementary coloring layers. More specifically, in some such implementations, when or after ions are transferred into the counter electrode layer, the counter electrode layer becomes more transparent, and similarly, when or after the ions are transferred out of the electrochromic layer, the electrochromic layer becomes more transparent. Conversely, when the polarity is switched, or the potential is reduced, and the ions are transferred from the counter electrode layer into the electrochromic layer, both the counter electrode layer and the electrochromic layer become less transparent.

[0318] In some embodiments, the transition of the electrochromic layer from one optical state to another optical state is caused by reversible ion insertion into the electrochromic material (for example, by way of intercalation) and a corresponding injection of charge-balancing electrons. In some instances, some fraction of the ions responsible for the optical transition may be irreversibly bound up in the electrochromic material. In some embodiments, suitable ions include lithium ions (Li+) and hydrogen ions (H+) (i.e., protons). In some other implementations, other ions can be suitable. Intercalation of lithium ions, for example, into tungsten oxide (WO3-y, 0 < y ≤ ~0.3) causes the tungsten oxide to change from a transparent state to a blue state.
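For context only (this is an editorial illustration, not language from the disclosure), the lithium intercalation described in paragraph [0318] is commonly written as a reversible insertion reaction in which transparent tungsten oxide is converted to a blue lithium tungsten bronze; the range given for x is the approximate bound commonly cited for such films:

$$\mathrm{WO_3\,(transparent)} \;+\; x\,\mathrm{Li^+} \;+\; x\,e^- \;\rightleftharpoons\; \mathrm{Li}_x\mathrm{WO_3}\,(\mathrm{blue}), \qquad 0 < x \lesssim 0.3$$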

[0319] In some embodiments, a tinting transition is a transition from a transparent (or “translucent,” “bleached” or “least tinted”) state to an opaque (or “fully darkened” or “fully tinted”) state. Another example of a tinting transition is the reverse (e.g., a transition from an opaque state to a transparent state). Other examples of tinting transitions include transitions to and from various intermediate tint states, for example, a transition from a less tinted, lighter or more transparent state to a more tinted, darker or less transparent state, and vice versa. Each of such tint states, and the tinting transitions between them, may be characterized or described in terms of percent transmission. For example, a tinting transition can be described as being from a current percent transmission (% T) to a target % T. Conversely, in some other instances, each of the tint states and the tinting transitions between them may be characterized or described in terms of percent tinting; for example, a transition from a current percent tinting to a target percent tinting.

[0320] In some embodiments, a voltage applied to the transparent electrode layers (e.g., across the EC stack) follows a control profile used to drive a transition in an optically switchable device. For example, a window controller can be used to generate and apply the control profile to drive an ECD from a first optical state (for example, a transparent state or a first intermediate state) to a second optical state (for example, a fully tinted state or a more tinted intermediate state). To drive the ECD in the reverse direction (from a more tinted state to a less tinted state), the window controller can apply a similar but inverted profile. In some embodiments, the control profiles for tinting and lightening can be asymmetric. For example, transitioning from a first more tinted state to a second less tinted state can in some instances require more time than the reverse; that is, transitioning from the second less tinted state to the first more tinted state. In some embodiments, the reverse may be true, and transitioning from the second less tinted state to the first more tinted state can require more time. By virtue of the device architecture and materials, bleaching or lightening may not necessarily be (e.g., simply) the reverse of coloring or tinting. Indeed, ECDs often behave differently for each transition due to differences in driving forces for ion intercalation and deintercalation to and from the electrochromic materials.
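As a minimal illustrative sketch (not part of the disclosure), the tinting/lightening asymmetry described above can be captured by selecting different profile parameters depending on the transition direction; the names, voltages, and durations below are placeholder assumptions:

```python
# Hypothetical sketch: choose asymmetric control-profile parameters by direction.
# All names, voltages, and durations are illustrative assumptions, not values
# taken from the disclosure.
from dataclasses import dataclass

@dataclass
class ProfileParams:
    v_drive: float   # drive voltage (V), sign indicates polarity
    v_hold: float    # hold voltage (V)
    ramp_s: float    # ramp-to-drive duration (s)
    drive_s: float   # drive duration (s)

def select_profile(current_tint: float, target_tint: float) -> ProfileParams:
    """Pick profile parameters; tinting and lightening use different timings."""
    if target_tint > current_tint:   # tinting (darkening) transition
        return ProfileParams(v_drive=+2.5, v_hold=+1.0, ramp_s=30.0, drive_s=180.0)
    else:                            # lightening (bleaching) transition
        return ProfileParams(v_drive=-2.5, v_hold=-1.0, ramp_s=45.0, drive_s=240.0)
```

In practice, a controller would derive such values from the device architecture, size, and operating temperature rather than hard-coding them.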

[0321] Fig. 24 shows an example control profile 2400 as a voltage control profile implemented by varying a voltage provided to the ECD. For example, the solid line in Fig. 24 represents an effective voltage VEff applied across the ECD over the course of a tinting transition and a subsequent maintenance period. For example, the solid line can represent the relative difference in the electrical voltages VAppl and VApp2 applied to the two conducting layers of the ECD. The dotted line in Fig. 24 represents a corresponding current (I) through the device. In the illustrated example, the voltage control profile 2400 includes four stages: a ramp-to-drive stage 2402 that initiates the transition, a drive stage 2404 that continues to drive the transition, a ramp-to-hold stage 2406, and a subsequent hold stage 2408.

[0322] In Fig. 24, the ramp-to-drive stage 2402 is characterized by the application of a voltage ramp that increases in magnitude from an initial value at time t0 to a maximum driving value of VDrive at time t1. For example, the ramp-to-drive stage 2402 can be defined by three drive parameters known or set by the window controller: the initial voltage at t0 (the current voltage across the ECD at the start of the transition), the magnitude of VDrive (governing the ending optical state), and the time duration during which the ramp is applied (dictating the speed of the transition). The window controller may also set a target ramp rate, a maximum ramp rate, or a type of ramp (for example, a linear ramp, a second-degree ramp, or an nth-degree ramp). In some embodiments, the ramp rate can be limited to avoid damaging the ECD.
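A hedged sketch of the ramp-to-drive stage defined above, assuming a simple polynomial ramp shape; the function signature and parameter names are illustrative, not taken from the disclosure:

```python
def ramp_to_drive(t: float, t0: float, t1: float,
                  v_initial: float, v_drive: float, degree: int = 1) -> float:
    """Voltage during the ramp-to-drive stage.

    t0, t1    : start and end times of the ramp
    v_initial : voltage across the ECD at the start of the transition
    v_drive   : drive voltage reached at t1 (governs the ending optical state)
    degree    : 1 for a linear ramp, 2 for a second-degree ramp, etc.
    """
    if t <= t0:
        return v_initial
    if t >= t1:
        return v_drive
    frac = ((t - t0) / (t1 - t0)) ** degree   # normalized ramp progress in [0, 1]
    return v_initial + frac * (v_drive - v_initial)
```

Setting degree=1 yields the linear ramp; higher degrees give the second-degree or nth-degree ramps mentioned above, and the ramp duration (t1 - t0) fixes an effective ramp rate that could be capped to avoid damaging the ECD.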

[0323] In Fig. 24, the drive stage 2404 includes application of a constant voltage VDrive starting at time t1 and ending at time t2, at which point the ending optical state is reached (or approximately reached). The ramp-to-hold stage 2406 is characterized by the application of a voltage ramp that decreases in magnitude from the drive value VDrive at time t2 to a minimum holding value of VHold at time t3. In some embodiments, the ramp-to-hold stage 2406 can be defined by three drive parameters known or set by the window controller: the drive voltage VDrive, the holding voltage VHold, and the time duration during which the ramp is applied. The window controller may also set a ramp rate or a type of ramp (for example, a linear ramp, a second-degree ramp or an nth-degree ramp).

[0324] In Fig. 24, the hold stage 2408 is characterized by the application of a constant voltage VHold starting at time t3. The holding voltage VHold may be used to maintain the ECD at the ending optical state. As such, the duration of the application of the holding voltage VHold may be concomitant with the duration of time that the ECD is to be held in the ending optical state. For example, because of non-idealities associated with the ECD, a leakage current ILeak can result in the slow drainage of electrical charge from the ECD. Such a drainage of electrical charge can result in a corresponding reversal of ions across the ECD, and consequently, a slow reversal of the optical transition. The holding voltage VHold can be continuously applied to counter or prevent the leakage current. In some embodiments, the holding voltage VHold is applied periodically to “refresh” the desired optical state, or in other words, to bring the ECD back to the desired optical state.
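Combining the four stages described in paragraphs [0322]-[0324], a hypothetical piecewise evaluation of the effective voltage of Fig. 24 could look like the sketch below; linear ramps and the stage-boundary names t0-t3 are assumptions made for illustration:

```python
def control_profile(t, t0, t1, t2, t3, v_initial, v_drive, v_hold):
    """Effective voltage V_Eff(t) for the four-stage profile of Fig. 24.

    [t0, t1) ramp-to-drive, [t1, t2) drive, [t2, t3) ramp-to-hold, [t3, inf) hold.
    Linear ramps are assumed for simplicity.
    """
    if t < t1:                                    # ramp-to-drive stage
        frac = max(0.0, (t - t0) / (t1 - t0))
        return v_initial + frac * (v_drive - v_initial)
    if t < t2:                                    # drive stage
        return v_drive
    if t < t3:                                    # ramp-to-hold stage
        frac = (t - t2) / (t3 - t2)
        return v_drive + frac * (v_hold - v_drive)
    return v_hold                                 # hold stage (counters leakage current)
```

Sampling this function over time reproduces the general shape of the solid VEff trace of Fig. 24; a refresh-style hold, as described in paragraph [0324], could instead apply zero volts during the hold stage and reapply v_hold periodically.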

[0325] The voltage control profile 2400 illustrated and described with reference to Fig. 24 is only one example of a voltage control profile suitable for some implementations. However, many other profiles may be desirable or suitable in such implementations or in various other implementations or applications. These other profiles also can readily be achieved using the controllers and optically switchable devices disclosed herein. For example, a current profile can be applied instead of a voltage profile. In some embodiments, a current control profile similar to that of the current density shown in Fig. 24 can be applied. In some embodiments, a control profile can have more than four stages. For example, a voltage control profile can include one or more overdrive stages. For example, the voltage ramp applied during the first stage 2402 can increase in magnitude beyond the drive voltage VDrive to an overdrive voltage VOD. The first stage 2402 may be followed by a ramp stage 2403 during which the applied voltage decreases from the overdrive voltage VOD to the drive voltage VDrive. In some embodiments, the overdrive voltage VOD can be applied for a relatively short time duration before the ramp back down to the drive voltage VDrive.
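The overdrive variant mentioned in paragraph [0325] could modify the first stage as in the hypothetical sketch below; the overdrive magnitude, dwell time, and names are illustrative assumptions, not values from the disclosure:

```python
def ramp_with_overdrive(t, t0, t1, t_od_end, t_back,
                        v_initial, v_drive, v_overdrive):
    """Ramp past v_drive to v_overdrive, dwell briefly, then ramp back to v_drive.

    [t0, t1)            ramp from v_initial up to v_overdrive
    [t1, t_od_end)      short dwell at v_overdrive
    [t_od_end, t_back)  ramp back down from v_overdrive to v_drive
    """
    if t < t1:
        frac = max(0.0, (t - t0) / (t1 - t0))
        return v_initial + frac * (v_overdrive - v_initial)
    if t < t_od_end:
        return v_overdrive
    if t < t_back:
        frac = (t - t_od_end) / (t_back - t_od_end)
        return v_overdrive + frac * (v_drive - v_overdrive)
    return v_drive
```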

[0326] In some embodiments, the applied voltage or current profiles are interrupted for relatively short durations of time to provide open circuit conditions across the device. While such open circuit conditions are in effect, an actual voltage or other electrical characteristics can be measured, detected, or otherwise determined to monitor how far along an optical transition has progressed, and in some instances, to determine whether changes in the profile are desirable. Such open circuit conditions also can be provided during a hold stage to determine whether a holding voltage VHold should be applied or whether a magnitude of the holding voltage VHold should be changed. Examples related to controlling optical transitions are provided in PCT Patent Application No. PCT/US14/43514, filed June 20, 2014, and titled CONTROLLING TRANSITIONS IN OPTICALLY SWITCHABLE DEVICES, which is hereby incorporated by reference in its entirety.

[0327] In one or more aspects, one or more of the functions described herein may be implemented in hardware, digital electronic circuitry, analog electronic circuitry, computer software, firmware, including the structures disclosed in this specification and their structural equivalents, or in any combination thereof. Certain implementations of the subject matter described in this document also can be implemented as one or more controllers, computer programs, or physical structures, for example, one or more modules of computer program instructions, encoded on computer storage media for execution by, or to control the operation of, window controllers, network controllers, and/or antenna controllers. Any disclosed implementations presented as or for electrochromic windows can be more generally implemented as or for switchable optical devices (including windows, mirrors, etc.).
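As a hedged sketch of the open-circuit monitoring described in paragraph [0326], the drive can be periodically interrupted, the open-circuit voltage read, and the controller can then decide whether to keep driving or drop to the hold voltage. The apply_voltage and read_open_circuit_voltage callables are hypothetical abstractions of controller hardware, not APIs from this disclosure:

```python
import time

def drive_with_feedback(apply_voltage, read_open_circuit_voltage,
                        v_drive, v_hold, v_target_oc,
                        check_period_s=10.0, timeout_s=600.0):
    """Drive the ECD, periodically checking the open-circuit voltage (V_oc).

    apply_voltage(v)            : hypothetical callable that sets the applied voltage
    read_open_circuit_voltage() : hypothetical callable that briefly opens the
                                  circuit, measures V_oc, and returns it
    v_target_oc                 : V_oc indicating the transition is essentially complete
    """
    start = time.monotonic()
    while time.monotonic() - start < timeout_s:
        apply_voltage(v_drive)
        time.sleep(check_period_s)
        v_oc = read_open_circuit_voltage()    # short open-circuit measurement
        if abs(v_oc) >= abs(v_target_oc):     # transition has progressed far enough
            break
    apply_voltage(v_hold)                     # switch to the hold stage
```

The same pattern could run during the hold stage to decide whether VHold needs to be reapplied or its magnitude adjusted.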

[0328] Various modifications to the embodiments described in this disclosure may be readily apparent to those skilled in the art, and the generic principles defined herein may be applied to other implementations without departing from the spirit or scope of this disclosure. Thus, the claims are not intended to be limited to the implementations shown herein but are to be accorded the widest scope consistent with this disclosure, the principles, and the novel features disclosed herein. Additionally, a person having ordinary skill in the art will readily appreciate that the terms “upper” and “lower” are sometimes used for ease of describing the figures, indicate relative positions corresponding to the orientation of the figure on a properly oriented page, and may not reflect the proper orientation of the devices as implemented.

[0329] Certain features that are described in this specification in the context of separate implementations also can be implemented in combination in a single implementation. Conversely, various features that are described in the context of a single implementation also can be implemented in multiple implementations separately or in any suitable subcombination. Moreover, although features may be described above as acting in certain combinations and even initially claimed as such, one or more features from a claimed combination can in some cases be excised from the combination, and the claimed combination may be directed to a subcombination or variation of a subcombination.

[0330] Similarly, while operations are depicted in the drawings in a particular order, this does not necessarily mean that the operations are required to be performed in the particular order shown or in sequential order, or that all illustrated operations be performed, to achieve desirable results. Further, the drawings may schematically depict one or more example processes in the form of a flow diagram. However, other operations that are not depicted can be incorporated in the example processes that are schematically illustrated. For example, one or more additional operations can be performed before, after, simultaneously, or between any of the illustrated operations. In certain circumstances, multitasking and parallel processing may be advantageous. Moreover, the separation of various system components in the implementations described above should not be understood as requiring such separation in all implementations, and it should be understood that the described program components and systems can generally be integrated together in a single software product or packaged into multiple software products. Additionally, other implementations are within the scope of the following claims. In some cases, the actions recited in the claims can be performed in a different order and still achieve desirable results.

[0331] While preferred embodiments of the present invention have been shown and described herein, it will be obvious to those skilled in the art that such embodiments are provided by way of example only. It is not intended that the invention be limited by the specific examples provided within the specification. While the invention has been described with reference to the aforementioned specification, the descriptions and illustrations of the embodiments herein are not meant to be construed in a limiting sense. Numerous variations, changes, and substitutions will now occur to those skilled in the art without departing from the invention. Furthermore, it shall be understood that all aspects of the invention are not limited to the specific depictions, configurations, or relative proportions set forth herein, which depend upon a variety of conditions and variables. It should be understood that various alternatives to the embodiments of the invention described herein might be employed in practicing the invention. It is therefore contemplated that the invention shall also cover any such alternatives, modifications, variations, or equivalents. It is intended that the following claims define the scope of the invention and that methods and structures within the scope of these claims and their equivalents be covered thereby.

[0332] Example embodiments:

[0333] Clause 1: An apparatus for controlling at least one device in a vehicle, the apparatus comprising at least one controller that is configured to: operatively couple to a device ensemble disposed in or on the vehicle, receive, or direct receipt of, data from the device ensemble, wherein the device ensemble comprises (A) a plurality of sensors or (B) a sensor and an emitter; and alter, or direct alteration of, at least one of: (i) a present state of the at least one device in the vehicle based at least in part on the data from the device ensemble; and/or (ii) a present environment of the vehicle based at least in part on the data from the device ensemble, to constitute an alteration.

[0334] Clause 2: The apparatus of clause 1, wherein the plurality of sensors comprises two or more of: a geolocation sensor, an occupancy sensor, a motion sensor, a temperature sensor, a photosensor, an airflow sensor, an oxygen sensor, and a carbon dioxide sensor.

[0335] Clause 3: The apparatus of any one of clauses 1 or 2, wherein the device ensemble comprises a transceiver, a processor, a memory device, a network adaptor, or a controller.

[0336] Clause 4: The apparatus of any one of clauses 1-3, wherein at least one component of the device ensemble is reversibly removable, wherein reversibly removable includes an ability to remove and insert the at least one component from the device ensemble in a reversible manner.

[0337] Clause 5: The apparatus of any one of clauses 1-4, wherein the at least one device comprises (i) a tintable window, (ii) a heating ventilation and air conditioning (HVAC) component, (iii) a media display, or (iv) a safety component, of the vehicle.

[0338] Clause 6: The apparatus of any one of clauses 1-5, wherein the at least one device comprises two or more devices of the vehicle, and wherein altering, or directing alteration of, the present state of the at least one device comprises altering the present states of each of the two or more devices of the vehicle differently.

[0339] Clause 7: The apparatus of any one of clauses 1-6, wherein the two or more devices of the vehicle are on opposite sides of the vehicle.

[0340] Clause 8: The apparatus of any one of clauses 1-7, wherein the alteration is based at least in part on an output of a machine learning model.

[0341] Clause 9: The apparatus of any one of clauses 1-8, wherein the alteration is based at least in part on a requested state of an occupant of the vehicle.

[0342] Clause 10: The apparatus of any one of clauses 1-9, wherein the requested state is determined based at least in part on an output of a machine learning model.

[0343] Clause 11: The apparatus of any one of clauses 1-10, wherein the requested state is determined based at least in part on historical data associated with the occupant of the vehicle.

[0344] Clause 12: The apparatus of any one of clauses 1-11, wherein the at least one device comprises a tintable window, and wherein a tint state of the tintable window is based at least in part on data received from the device ensemble indicating that the vehicle is unoccupied.

[0345] Clause 13: The apparatus of any one of clauses 1-12, wherein the at least one device comprises a tintable window, and wherein a tint state of the tintable window is based at least in part on a projected route of the vehicle.

[0346] Clause 14: A method for controlling at least one device in a vehicle, the method comprising: receiving data from a device ensemble, wherein the device ensemble comprises (A) a plurality of sensors or (B) a sensor and an emitter; and altering at least one of: (i) a present state of the at least one device in the vehicle based at least in part on the data from the device ensemble; and/or (ii) a present environment of the vehicle based at least in part on the data from the device ensemble, to constitute an alteration.

[0347] Clause 15: Non-transitory computer-readable program instructions for controlling at least one device in a vehicle, the non-transitory computer-readable program instructions, when read by one or more processors, cause the one or more processors to execute operations comprising: receiving, or directing receipt of, data from a device ensemble, wherein the device ensemble comprises (A) a plurality of sensors or (B) a sensor and an emitter; and altering, or directing alteration of, at least one of: (i) a present state of the at least one device in the vehicle based at least in part on the data from the device ensemble; and/or (ii) a present environment of the vehicle based at least in part on the data from the device ensemble, to constitute an alteration.
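Purely as an illustrative reading of Clauses 1, 14, and 15 (a sketch under stated assumptions, not a definitive implementation), the controller's receive-then-alter loop might look like the following; the sensor keys, thresholds, and device interfaces are hypothetical:

```python
# Hypothetical sketch of one control iteration: a controller receives a snapshot
# of device-ensemble data and alters device states and/or the cabin environment.
# Sensor names, thresholds, and the window/hvac interfaces are assumptions.
def control_step(ensemble_reading: dict, window, hvac):
    """Alter device states based on one snapshot of ensemble sensor data."""
    if ensemble_reading.get("occupancy", 0) == 0:
        window.set_tint(level=4)          # darken fully when the vehicle is unoccupied
        hvac.set_mode("eco")
        return
    if ensemble_reading.get("photosensor_lux", 0.0) > 50_000:
        window.set_tint(level=3)          # reduce glare in bright sun
    if ensemble_reading.get("co2_ppm", 0.0) > 1200:
        hvac.set_fresh_air(True)          # flush cabin air when CO2 is elevated
```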

[0348] Clause 16: An apparatus for media viewing in a vehicle, the apparatus comprising at least one controller that is configured to: (a) operatively couple to a tintable window of the vehicle and to a media display; and (b) control, or direct control of, (i) an optical state of the tintable window; and/or (ii) presentation of media content on the media display of the vehicle.

[0349] Clause 17: The apparatus of clause 16, wherein the media display is an organic light emitting diode (OLED) device.

[0350] Clause 18: The apparatus of any one of clauses 16 or 17, wherein the at least one controller is configured to control, or direct control of, a transparency of the media display and/or media projected by the media display.

[0351] Clause 19: The apparatus of any one of clauses 16-18, wherein the at least one controller is configured to control, or direct control of, the optical state of the tintable window at least in part by increasing a contrast between the tintable window and the media display.

[0352] Clause 20: The apparatus of clause 19, wherein the at least one controller is configured to increase the contrast between the tintable window and the media display synergistically with the presentation of the media content on the media display.

[0353] Clause 21: The apparatus of any one of clauses 19 or 20, wherein the at least one controller is configured to increase the contrast between the tintable window and the media display in response to controlling, or directing control of, the presentation of the media content on the media display.

[0354] Clause 22: The apparatus of any one of clauses 16-21, wherein the media content comprises video content from a video, a television show, a movie, a video conference, and/or an advertisement.

[0355] Clause 23: The apparatus of any one of clauses 16-22, wherein the media content comprises image content and/or text content from one or more documents.

[0356] Clause 24: The apparatus of any one of clauses 16-23, wherein the media content is streamed from a remote server.

[0357] Clause 25: The apparatus of any one of clauses 16-24, wherein the media display comprises one or more speaker devices, and wherein the at least one controller is configured to control, or direct control of, presentation of audio content by the one or more speaker devices.

[0358] Clause 26: The apparatus of any one of clauses 16-25, wherein the media display is connected to the tintable window, or is incorporated in an integrated unit forming the tintable window.

[0359] Clause 27: A method for media viewing in a vehicle, the method comprising: controlling (i) an optical state of a tintable window of the vehicle; and/or (ii) presentation of media content on a media display of the vehicle.

[0360] Clause 28: Non-transitory computer-readable program instructions for media viewing in a vehicle, the non-transitory computer-readable program instructions, when read by one or more processors, cause the one or more processors to execute operations comprising: controlling, or directing control of, (i) an optical state of a tintable window of the vehicle; and/or (ii) presentation of media content on a media display of the vehicle.
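As an illustrative sketch of Clauses 19-21 (not an implementation from the disclosure), a controller could darken the tintable window synergistically with starting playback on the media display; the object interfaces and tint scale below are assumptions:

```python
# Hypothetical sketch: increase window/display contrast when playback begins.
# The display/window interfaces and the integer tint scale are assumptions.
def present_media(display, window, content, contrast_tint_level=4):
    window.set_tint(contrast_tint_level)   # darken window to increase contrast
    display.play(content)                   # then present the media content

def stop_media(display, window, ambient_tint_level=1):
    display.stop()
    window.set_tint(ambient_tint_level)     # restore a lighter state after playback
```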

[0361] Clause 29: An apparatus for controlling at least one device in a vehicle, the apparatus comprising at least one controller that is configured to: identify, or direct identification of, at least one gesture by an occupant of the vehicle; identify, or direct identification of, a state of the at least one device in the vehicle based at least in part on the gesture identified, wherein the at least one device comprises a tintable window, a media display associated with the tintable window, or an environmental control device; and cause the at least one device to transition to the state identified.

[0362] Clause 30: The apparatus of clause 29, wherein the at least one controller is part of a hierarchical control system.

[0363] Clause 31: The apparatus of any one of clauses 29 or 30, wherein at least one controller of the hierarchical control system is remote from the vehicle.

[0364] Clause 32: The apparatus of any one of clauses 29-31, wherein the at least one device comprises the tintable window, and wherein the state identified comprises a tint state of the tintable window.

[0365] Clause 33: The apparatus of any one of clauses 29-32, wherein the at least one device comprises the media display, and wherein the state identified comprises initiating presentation of a media content item by the media display.

[0366] Clause 34: The apparatus of any one of clauses 29-33, wherein the at least one device comprises the media display, and wherein the state identified comprises stopping presentation of a media content item by the media display.

[0367] Clause 35: The apparatus of any one of clauses 29-34, wherein the at least one controller is configured to identify, or direct identification of, the at least one gesture by the occupant of the vehicle at least in part by identifying, or directing identification of, a gesture pattern of the occupant of the vehicle.

[0368] Clause 36: The apparatus of clause 35, wherein the gesture pattern comprises (i) facial expression pattern, (ii) voice pattern, or (iii) motion pattern.

[0369] Clause 37: The apparatus of any one of clauses 29-36, wherein the at least one controller is configured to receive, or direct receipt of, data from a device ensemble disposed in or on the vehicle, wherein the device ensemble comprises (A) a plurality of sensors or (B) a sensor and an emitter.

[0370] Clause 38: The apparatus of any one of clauses 29-37, wherein the plurality of sensors comprises two or more of: a geolocation sensor, an occupancy sensor, a motion sensor, a temperature sensor, a visible light photosensor, an airflow sensor, an oxygen sensor, a sound sensor, a pressure sensor, a volatile organic compound (VOC) sensor, a particulate matter sensor, a humidity sensor, and a carbon dioxide sensor.

[0371] Clause 39: The apparatus of any one of clauses 29-38, wherein the environmental control device comprises: a heating ventilation and air conditioning (HVAC) system, lighting, a tintable window, an olfactory compound dispenser, a humidity adjuster, or a music player.

[0372] Clause 40: The apparatus of clause 39, wherein the lighting comprises a media display.

[0373] Clause 41: A method for controlling at least one device in a vehicle, the method comprising: identifying at least one gesture by an occupant of the vehicle; identifying a state of the at least one device in the vehicle based at least in part on the gesture identified, wherein the at least one device comprises a tintable window, a media display associated with the tintable window, or an environmental control device; and causing the at least one device to transition to the state identified.

[0374] Clause 42: Non-transitory computer-readable program instructions for controlling at least one device in a vehicle, the non-transitory computer-readable program instructions, when read by one or more processors, cause the one or more processors to execute operations comprising: identifying, or directing identification of, at least one gesture by an occupant of the vehicle; identifying, or directing identification of, a state of the at least one device in the vehicle based at least in part on the gesture identified, wherein the at least one device comprises a tintable window, a media display associated with the tintable window, or an environmental control device; and causing, or directing causation of, the at least one device to transition to the state identified.
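As a hypothetical sketch of the gesture-driven control in Clauses 29-42, a controller could map an identified gesture to a target device state and apply it; the gesture labels and device interfaces are illustrative assumptions, not part of the disclosure:

```python
# Hypothetical gesture-to-state mapping; labels and interfaces are assumptions.
GESTURE_ACTIONS = {
    "swipe_down": ("window", "tint_darker"),
    "swipe_up": ("window", "tint_lighter"),
    "palm_open": ("media", "pause"),
    "point_at_display": ("media", "play"),
}

def handle_gesture(gesture: str, window, media):
    """Identify the target state for a recognized gesture and apply it."""
    target, action = GESTURE_ACTIONS.get(gesture, (None, None))
    if target == "window":
        window.adjust_tint(darker=(action == "tint_darker"))
    elif target == "media":
        if action == "pause":
            media.pause()
        else:
            media.play()
```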

[0375] Clause 43: An apparatus for media viewing in a vehicle, the apparatus comprising at least one controller that is configured to: operatively couple to a media display of the vehicle and/or to a tintable window of the vehicle; obtain, or direct obtainment of, information associated with one or more occupants of the vehicle; identify, or direct identification of: (i) media content to be presented by the media display, and/or (ii) a tintable window setting for the tintable window based at least in part on the information obtained; and cause, or direct causing of, (i) the media display to present the media content identified and/or (ii) the tintable window setting to be applied to the tintable window.

[0376] Clause 44: The apparatus of clause 43, wherein the information associated with the one or more occupants of the vehicle comprises a stored and/or a historical preference.

[0377] Clause 45: The apparatus of clause 44, wherein the at least one controller is configured to (i) use, or direct usage of, the stored and/or historical preferences as a learning set for a machine learning scheme utilized to predict a preference of the user at a future time and/or (ii) operatively couple to a processor performing the machine learning scheme.

[0378] Clause 46: The apparatus of any one of clauses 43-45, wherein the information associated with the one or more occupants of the vehicle comprises at least one user input.

[0379] Clause 47: The apparatus of clause 46, wherein the at least one user input comprises a gesture by at least one occupant of the one or more occupants of the vehicle.

[0380] Clause 48: The apparatus of any one of clauses 46 or 47, wherein the at least one user input comprises input received via a graphical user interface.

[0381] Clause 49: The apparatus of clause 48, wherein the graphical user interface is presented via the media display.

[0382] Clause 50: The apparatus of any one of clauses 43-49, wherein the media content identified comprises: a television show, a movie, an advertisement, a video call, or safety information.

[0383] Clause 51: The apparatus of any one of clauses 43-50, wherein the media content is identified based at least in part on a third party media content application.

[0384] Clause 52: The apparatus of any one of clauses 43-51, wherein the at least one controller is configured to determine, or direct determination of, whether the vehicle is in motion, wherein causing, or directing causing of, the media display to present the media content identified is based at least in part (i) on a determination of whether the vehicle is in motion and/or (ii) on location of the media display in the vehicle.

[0385] Clause 53: The apparatus of clause 52, wherein the at least one controller is configured to inhibit, or direct inhibition of, presentation of media content in response to a determination (i) that the vehicle is in motion and/or (ii) of location of the media display in the vehicle.

[0386] Clause 54: A method for media viewing in a vehicle, the method comprising: obtaining information associated with one or more occupants of the vehicle; identifying (i) media content to be presented by a media display of the vehicle, and/or (ii) a tintable window setting for a tintable window of the vehicle based at least in part on the information obtained; and causing (i) the media display to present the media content identified and/or (ii) the tintable window setting to be applied to the tintable window.

[0387] Clause 55: Non-transitory computer-readable program instructions for media viewing in a vehicle, the non-transitory computer-readable program instructions, when read by one or more processors, cause the one or more processors to execute operations comprising: obtaining, or directing obtainment of, information associated with one or more occupants of the vehicle; identifying, or directing identification of, (i) media content to be presented by a media display of the vehicle, and/or (ii) a tintable window setting for a tintable window of the vehicle based at least in part on the information obtained; and causing, or directing causation of, (i) the media display to present the media content identified and/or (ii) the tintable window setting to be applied to the tintable window.
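Finally, as an illustrative sketch of Clause 45's use of stored and/or historical preferences as a learning set (a sketch under assumptions, not a prescribed model), a controller could train a small classifier on past occupant choices and query it for a predicted tint preference; the feature encoding, model choice, and data below are assumptions:

```python
# Hypothetical sketch: learn a tint-level preference from historical occupant data.
# Feature encoding, model, and example values are assumptions, not from the disclosure.
from sklearn.tree import DecisionTreeClassifier

# Each row: [hour_of_day, exterior_illuminance_lux, occupant_count];
# each label: the tint level the occupant previously chose under those conditions.
history_X = [[8, 20_000, 1], [13, 80_000, 2], [21, 50, 1], [12, 90_000, 3]]
history_y = [2, 4, 1, 4]

model = DecisionTreeClassifier(max_depth=3).fit(history_X, history_y)
predicted_tint = model.predict([[14, 75_000, 2]])[0]   # predicted preferred tint level
```

Clauses 52 and 53 could be layered on top of such a prediction, for example by inhibiting presentation of identified media content whenever the vehicle is determined to be in motion and the display is located in a driver-facing position.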