

Title:
AUGMENTED REALITY BASED DESIGN
Document Type and Number:
WIPO Patent Application WO/2021/224262
Kind Code:
A2
Abstract:
An augmented reality-based design system includes an augmented reality device, where the augmented reality device has a display and camera. The system further includes a measurement device, in communication with the augmented reality device, where measured values of a parameter measured by the measurement device are provided to the augmented reality device. The augmented reality device is capable of determining locations of the augmented reality device each associated with a measured value of the parameter taken by the measurement device at each location. The augmented reality device is further capable of providing, via the display, a spatial mapping of the locations within a target physical area, and displaying the measured values overlaid on a real-time image of the target physical area on the display, each measured value displayed near the corresponding location where it was measured.

Inventors:
MATHUR TARUN (NL)
GOLE AJAY (NL)
CHO NAM (NL)
JOSHI PARTH (NL)
Application Number:
PCT/EP2021/061732
Publication Date:
November 11, 2021
Filing Date:
May 04, 2021
Assignee:
SIGNIFY HOLDING BV (NL)
International Classes:
G06T19/00; G01J1/02; G01J3/02; G06Q30/06; G06T15/50; H04W4/02; H04W4/38; H05B47/105; H05B47/11
Foreign References:
US 16/195,581, filed November 19, 2018
Attorney, Agent or Firm:
VAN EEUWIJK, Alexander, Henricus, Walterus et al. (NL)
Claims:
CLAIMS:

1. An augmented reality-based design system, comprising: an augmented reality device, wherein the augmented reality device has a display and a camera; a measurement device, in communication with the augmented reality device, wherein a plurality of measured values of a parameter measured by the measurement device are provided to the augmented reality device; wherein the augmented reality device is capable of: determining a plurality of locations of the augmented reality device each associated with a measured value of the parameter taken by the measurement device at each location resulting in the plurality of measured values, providing, via the display, a spatial mapping of the plurality of locations within a target physical area, and displaying, on the display, the plurality of measured values overlaid on a real-time image of the target physical area, each measured value displayed near a corresponding location where each measured value was measured.

2. The system of Claim 1, wherein the measurement device is attached to the augmented reality device, wherein the measurement device is moved along with the augmented reality device during the measuring of the measured values.

3. The system of Claim 1, wherein the measurement device is a lux measurement device and the parameter is the illuminance value detected by the lux measurement device.

4. The system of Claim 3, wherein the lux measurement device is capable of measuring ultraviolet light.

5. The system of Claim 1, wherein the augmented reality device is further capable of: determining a recommended device for installation within the target physical area based on the plurality of measured values, and displaying a three-dimensional model of the recommended device.

6. The system of Claim 5, wherein the augmented reality device is further capable of: determining an updated parameter value at a location of the plurality of locations based on parameter data associated with the recommended device; and displaying the recommended device on the display with the updated parameter value displayed at the location.

7. The system of Claim 5, wherein the augmented reality device displays a plurality of recommended devices for selection by a user.

8. The system of Claim 5, wherein the recommended device is a luminaire.

9. The system of Claim 1, wherein displaying, on the display, the plurality of measured values overlaid on a real-time image of a target physical area includes color coding the measured values.

10. The system of Claim 1, wherein the measurement device is one of a plurality of measurement devices, each remotely located from the augmented reality device and located in or mounted to a ceiling.

11. The system of Claim 10, wherein each location of the plurality of locations is associated with at least one measurement device of the plurality of measurement devices.

12. The system of Claim 11, wherein the plurality of measurement devices are a plurality of sensors, wherein each sensor is included in or on a luminaire, and wherein the measured values displayed are the measured values measured by each sensor.

13. The system of Claim 11, wherein the plurality of measurement devices are air quality sensors, and wherein the measured values displayed are associated with an air quality parameter.

14. The system of Claim 11, wherein the plurality of measurement devices measure air flow.

15. The system of Claim 14, wherein the augmented reality device is further capable of: generating an air flow image indicating air flow values and direction of air flow based on the measurement devices, and displaying the air flow image overlaid on the real-time image of the target display area.

Description:
Augmented reality based design

RELATED APPLICATIONS AND CLAIM OF PRIORITY

The present application claims priority to U.S. Provisional Patent Application No. 63/021,217 filed May 7, 2020 and titled “Augmented Reality Based Lighting Design.” The entire content of the foregoing application is hereby incorporated herein by reference.

TECHNICAL FIELD

The present disclosure relates generally to lighting, and more particularly to augmented reality-based lighting design.

BACKGROUND

During the commissioning of luminaires, one of the factors that may be considered is the light intensity levels of the lights provided by the luminaires. In some cases, after the initial commissioning of a luminaire, the light intensity of a light provided by the luminaire decreases over time. The light intensity may be reduced to a level such that the replacement of the luminaire or a light engine of the luminaire may be required. In some cases, a new luminaire may be added in the space. To accurately determine whether to replace or add a luminaire in a space, the light intensity level at different locations along a work plane (e.g., at the height of desks in the space) may first be determined. However, reliably determining the light intensity level at different locations along a work plane to decide whether to replace or add a luminaire may be challenging.

SUMMARY OF THE INVENTION

In one aspect, the present disclosure relates generally to augmented reality and more particularly to the operation of an augmented reality-based design system. The augmented reality-based design system includes an augmented reality device, where the augmented reality device has a display and camera. The system further includes a measurement device, in communication with the augmented reality device, where measured values of a parameter measured by the measurement device are provided to the augmented reality device. The augmented reality device is capable of determining locations of the augmented reality device each associated with a measured value of the parameter taken by the measurement device at each location. The augmented reality device is further capable of providing, via the display, a spatial mapping of the locations within a target physical area, and displaying the measured values overlaid on a real-time image of the target physical area on the display, each measured value displayed near the corresponding location where it was measured.

In some example embodiments, the measurement device is attached to the augmented reality device, where the measurement device is moved along with the augmented reality device during the measuring of the measured values. The measurement device may be a lux measurement device and the parameter is the illuminance value detected by the lux measurement device. Further, in some embodiments, the lux measurement device is capable of measuring ultraviolet light.

In another example embodiment, the augmented reality device may be further capable of determining a recommended device for installation within the target physical area based on the measured values and displaying a three-dimensional model of the recommended device. In another example embodiment, the augmented reality device may be further capable of determining an updated parameter value at a location based on parameter data associated with the recommended device and displaying the recommended device on the display with the updated parameter value displayed at the location. In some example embodiments, the augmented reality device displays multiple recommended devices for selection by a user. In some embodiments, the recommended device may be a luminaire. In some example embodiments, displaying measured values overlaid on a real-time image of a target physical area may include color coding the measured values on the display.

In other example embodiments, the measurement device is one of several measurement devices, each being remotely located from the augmented reality device and located in or mounted to a ceiling. In some example embodiments, each location may be associated with at least one measurement device. In some example embodiments, the measurement devices are sensors, where each sensor is included in or on a luminaire and the measured values displayed on the display are the measured values measured by each sensor. In other example embodiments, the measurement devices are air quality sensors and the measured values displayed are associated with an air quality parameter. In other example embodiments, the measurement devices measure air flow. In some example embodiments, the augmented reality device is further capable of generating an air flow image indicating air flow values and direction of air flow based on the measurement devices and displaying the air flow image overlaid on the real-time image of the target display area.

These and other aspects, objects, features, and embodiments will be apparent from the following description and the appended claims.

BRIEF DESCRIPTION OF THE FIGURES

Reference will now be made to the accompanying drawings, which are not necessarily drawn to scale, and wherein:

FIGS. 1A and 1B illustrate an augmented reality device for lighting design and internet of things (IoT) design according to an example embodiment;

FIG. 2 illustrates a block diagram of the augmented reality device of FIG. 1A according to an example embodiment;

FIG. 3 illustrates a lighting design system including the augmented reality device of FIG. 1A for improving the lighting of an area according to an example embodiment;

FIG. 4 illustrates the lighting design system of FIG. 3 showing a 3-D model of a lighting fixture overlaid on a real-time image of a target area according to an example embodiment;

FIG. 5 illustrates an air flow measurement and display system including the augmented reality device of FIG. 1A according to an example embodiment;

FIG. 6 illustrates an air humidity measurement and display system including the augmented reality device of FIG. 1A according to an example embodiment;

FIG. 7 illustrates an air quality measurement and display system including the augmented reality device of FIG. 1A according to an example embodiment;

FIG. 8 illustrates the augmented reality device of FIG. 1A being used for ultraviolet lighting design according to an example embodiment;

FIG. 9 illustrates a 3-D model of a lighting fixture that emits an ultraviolet light and ultraviolet light intensity values determined based on parameter data associated with the 3-D model according to an example embodiment;

FIG. 10 illustrates a 3-D model of a lighting fixture that emits an ultraviolet light and ultraviolet light intensity values overlaid on a real-time image of a target area according to an example embodiment;

FIG. 11 illustrates a method of augmented reality-based lighting design to improve the lighting of an area according to an example embodiment;

FIG. 12 illustrates a method of augmented reality-based lighting design for ultraviolet light lighting fixtures according to an example embodiment; and

FIG. 13 illustrates a method of augmented reality-based lighting design for ultraviolet light lighting fixtures according to an example embodiment.

The drawings illustrate only example embodiments and are therefore not to be considered limiting in scope. The elements and features shown in the drawings are not necessarily to scale, emphasis instead being placed upon clearly illustrating the principles of the example embodiments. Additionally, certain dimensions or placements may be exaggerated to help visually convey such principles. In the drawings, the same reference numerals used in different drawings may designate like or corresponding but not necessarily identical elements.

DETAILED DESCRIPTION OF THE EXAMPLE EMBODIMENTS

In the following paragraphs, example embodiments will be described in further detail with reference to the figures. In the description, well-known components, methods, and/or processing techniques are omitted or briefly described. Furthermore, reference to various feature(s) of the embodiments is not to suggest that all embodiments must include the referenced feature(s).

In some example embodiments, a measurement device is attached to an augmented reality device, and measurements are made in an area by the measurement device. Parameters that are measured by the measurement device may include, for example, illuminance (i.e., light intensity). The augmented reality device associates measured values with locations of measurements and displays the measured values overlaid on the real-time image of the area captured by the camera of the augmented reality device. The augmented reality device may display the measured values in one or more formats, including displaying the values and using different colors that represent ranges of values. The augmented reality device may also process the measured values and associated location information to determine if changes are needed at some locations of the area. For example, the augmented reality device may determine that the illuminance values measured by a light measurement device attached to the augmented reality device are too low at some locations of the area and may indicate such locations. In some example embodiments, based on the displayed measured values, a user may indicate to the augmented reality device locations that have illuminance below a desired level. The augmented reality device may recommend products, such as lighting fixtures, that can be used to improve measured values at some locations indicated by a user or determined by the augmented reality device as having low illuminance.

FIGS. 1A and 1B illustrate an augmented reality device 100 for lighting design according to an example embodiment. In some example embodiments, FIG. 1A illustrates a back side of the augmented reality device 100, and FIG. 1B illustrates the front side of the augmented reality device 100. For example, the augmented reality device 100 may be a tablet, a smartphone, etc.

Referring to FIGS. 1A and 1B, in some example embodiments, the augmented reality (AR) device 100 may include a back-facing camera 102 on a back side of the augmented reality device 100. The AR device 100 may also include a viewport/display screen 106 on a front side of the augmented reality device 100. In some example embodiments, the AR device 100 may also include a front-facing camera 104, a user input area 108, an ambient light sensor 110, accelerometers, or other sensors useful in determining orientation or real-time feedback from the physical space in which the AR device 100 is located, for use in interpreting and displaying the AR content on the display 106 of the AR device 100.

In some example embodiments, the viewport 106 may be used to display images as seen by the cameras 102, 104 as well as to display objects (e.g., icons, text, etc.) stored, received, and/or generated by the AR device 100. The viewport 106 may also be used as a user input interface for the AR device 100. For example, the viewport 106 may be a touch sensitive display screen. The viewport 106 may contain a number of pixels in the vertical and horizontal directions (known as display resolution). For example, the viewport 106 may have a display resolution of 2048 x 1536. Each pixel may contain subpixels, where each subpixel typically represents red, green, and blue colors.

In some example embodiments, an image of a physical/real area in front of the AR device 100 may be displayed on the viewport 106 in real time as viewed by the camera 102. For example, the AR device 100 may include a lighting design AR application that activates the camera 102 such that a real-time image of the physical space viewed by the camera 102 is displayed on the viewport 106. Alternatively, the camera 102 may be enabled/activated to display a real-time image of the physical space before or after the lighting design AR application is started. In some example embodiments, the real-time image of the physical space may be displayed with a slight delay.

In some example embodiments, the AR device 100 may include an artificial intelligence application and/or components that can automatically suggest/provide recommended types of lighting fixtures. The AR device 100 may also suggest location, orientation, and/or an appropriate number of lighting fixtures based on characteristics associated with the light fixtures (e.g., glare, intensity, available color temperatures or colors, available optics or accessories that change the beam angle or distribution produced by the light fixture, etc.), measured illuminance, etc. For example, the artificial intelligence software application and/or component may identify or suggest the right location for a certain fixture in the observed space, which results in requiring minimal input, interaction, and decision making by a user in achieving lighting design of a physical space/area.

FIG. 2 illustrates a block diagram of the augmented reality device 100 of FIG. 1A according to an example embodiment. Referring to FIGS. 1A, 1B, and 2, in some example embodiments, the AR device 100 includes a controller 202, a camera component 204, a display component 206, an input interface 208, a memory device 212, and a communication interface 214. For example, the camera component 204 may correspond to or may be part of the cameras 102, 104. The display component 206 may correspond to or may be part of the viewport/display screen 106 and may include circuitry that enables or performs displaying of information (e.g., images, text, etc.) on the viewport 106. For example, the pixels of the viewport may be set/adjusted to display the image as viewed by the camera 102 or 104. The input interface 208 may include the user input area 108 and/or the user input capability of viewport 106. For example, the display component 206 and the input interface 208 may make up or may be part of the viewport 106, where the viewport 106 is, for example, a touch-sensitive display screen. The communication interface 214 may be used for communication, wirelessly or via a wired connection, by the AR device 100. For example, the communication interface 214 may include a USB port that can be used to connect an external device (e.g., a measurement device) to the AR device 100.

The controller 202 may include one or more microprocessors and/or microcontrollers that can execute software code stored in the memory device 212. For example, the software code of the lighting design AR application may be stored in the memory device 212 or retrievable from a remote storage location (e.g., cloud service or remotely located server or database) via the communication interface 214 or other communication means. Other executable software codes used in the operation of the AR device 100 may also be stored in the memory device 212 or in another memory device of the AR device 100. For example, artificial intelligence lighting and/or other software may be stored in the memory device 212 as part of the AR application or along with the AR application and may be executed by the controller 202. To illustrate, the controller 202 may execute the artificial intelligence application or another software code to identify locations that have low illuminance and automatically suggest/provide recommended type(s) of lighting fixtures along with additional information such as suggested location, orientation, and/or an appropriate number of lighting fixtures. In general, the one or more microprocessors and/or microcontrollers of the controller 202 execute software code stored in the memory device 212 or in another device to implement the operations of the AR device 100 described herein. In some example embodiments, the memory device 212 may include a non-volatile memory device and volatile memory device.

In some example embodiments, data that is used in or generated during the execution of the lighting design AR application and other code may also be retrieved and/or stored in the memory device 212 or in another memory device of the AR device 100 or retrieved from a remote storage location (e.g., cloud service or remotely located server or database) via the communication interface 214 or other communication means. For example, 3-D models of lighting fixtures and photometric data files (e.g., IES files, ultraviolet light parameter files) associated with the lighting fixture models may be stored in the memory device 212, or retrieved from storage on a remote cloud-based service, and may be retrieved during execution of the lighting design AR application. 3-D models of other devices such as sensors, cameras, microphones, speakers, emitters/detectors, wireless devices such as Bluetooth or Wi-Fi repeaters, etc. and parameter data associated with the devices may be stored in the memory device 212, or stored in and retrieved from storage on a remote cloud-based service, and may be retrieved during execution of AR applications on the AR device 100.

The data stored and/or retrieved may include information such as range, viewing angle, resolution, or similar operational information that may be visualized through the AR device. For example, the data may contain the information necessary to estimate one or more view angles and the range produced by a sensor (e.g., a motion, light, temperature, humidity, sound, or other type of sensor) or an accessory device, such as a camera, microphone, speaker, emitter/detector, wireless device like a Bluetooth or Wi-Fi repeater, etc., within a three-dimensional space. The files may also include other information about the light emitted by the sensor or the accessory device.

In some example embodiments, the lighting design AR application stored in the memory device 212 may incorporate or interface with an augmented reality application/software, such as ARKit, ARCore, etc., that may also be stored in the memory device 212 or called upon from or provided via a remote storage location (e.g., cloud service or remotely located server or database) via the communication interface 214 or other communication means.

The controller 202 may communicate with the different components of the AR device 100, such as the camera component 204, etc., and may execute relevant code, for example, to display a real-time image as viewed by the camera 102 and/or 104 as well as other image objects on the viewport 106.

FIG. 3 illustrates a lighting design system 300 including the augmented reality device 100 of FIG. 1A for improving the lighting of a target area 304 according to an example embodiment. Referring to FIGS. 1A-3, in some example embodiments, the system 300 includes the AR device 100 and a lux measurement device 302. The lux measurement device 302 is attached to the AR device 100. For example, the lux measurement device 302 is attached to the AR device 100 via a cable as shown in FIG. 3 or may be directly plugged into a port of the AR device 100. The lux measurement device 302 measures light intensity levels. As the light intensity level changes, the light intensity level measured and indicated by the lux measurement device 302 also changes. For example, because light intensity levels may be different at different parts of the area 304 (e.g., a room), when a user holding the lux measurement device 302 moves around the area 304, the lux measurement device 302 may measure and indicate different light intensity values. The lux measurement device 302 may be moved at a particular elevation above the floor of the area 304. For example, the lux measurement device 302 may be moved around to make measurements at a work plane level, for example, to determine light intensity levels at an elevation where people may sit to work. As another example, the lux measurement device 302 may be moved to make measurements at the floor or close to the floor level.

In some example embodiments, the lux measurement device 302 may provide the light intensity values to the AR device 100 on a real-time basis as the light intensity is being measured by the lux measurement device 302. As the lux measurement device 302 is moved around the area while attached to the AR device 100, the AR device 100 may determine the location of the AR device 100 (thus, effectively the location of the lux measurement device 302) and associate particular locations with the measured light intensity level values provided by the lux measurement device 302. For example, the AR device 100 may determine the location of the AR device 100 in the area 304 based on an indoor positioning system (IPS), the global positioning system (GPS), or other means. As a non-limiting particular example, the AR device 100 may execute one or more ARKit 3.0 modules to perform position tracking. In some example embodiments, the AR device 100 may display a real-time image 312 of the area 304 on the viewport 106 of the AR device 100. For example, the real-time image 312 may be captured by the camera 102 of the AR device 100. The AR device 100 may augment the real-time image 312 of the area 304 with light intensity (illuminance) information 306. For example, the AR device 100 may display the light intensity information 306 overlaid on the real-time image 312. To illustrate, the light intensity information 306 may include light intensity (i.e., illuminance) values as measured by the lux measurement device 302 and provided to the AR device 100. Because the measured light intensity values are associated by the AR device 100 with locations in the area 304, the AR device 100 may display the illuminance values in association with respective locations in the area 304.
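The pairing of measured values with tracked device positions described above can be sketched as follows. This is a minimal illustration only; the class and method names are hypothetical, and in practice the position would come from the AR framework's pose tracking rather than being passed in directly:

```python
from dataclasses import dataclass, field

@dataclass
class LuxSample:
    """One illuminance reading tagged with the tracked device position."""
    x: float  # metres, in the AR session's world frame
    y: float
    lux: float

@dataclass
class MeasurementMap:
    """Collects (position, value) pairs as the device moves through the area."""
    samples: list = field(default_factory=list)

    def record(self, position, lux):
        # position: tracked device location; lux: reading from the
        # attached measurement device, provided in real time.
        self.samples.append(LuxSample(position[0], position[1], lux))

    def values_near(self, position, radius=0.5):
        """Return readings taken within `radius` metres of a query point."""
        qx, qy = position
        return [s.lux for s in self.samples
                if (s.x - qx) ** 2 + (s.y - qy) ** 2 <= radius ** 2]

# Example: two readings near the origin, one farther away.
m = MeasurementMap()
m.record((0.0, 0.0), 320.0)
m.record((0.2, 0.1), 310.0)
m.record((3.0, 3.0), 95.0)
print(m.values_near((0.0, 0.0)))  # [320.0, 310.0]
```

Querying by proximity rather than exact position reflects that the device samples continuously while moving, so each displayed value stands in for a small neighborhood of the area.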

In some example embodiments, the AR device 100 may display the light intensity information 306 using color coding instead of or in addition to illuminance values (e.g., in footcandles). To illustrate, the AR device 100 may display the lighting intensity information as a "heat map," where locations at the floor level or a work plane that are associated with different illuminance values are shown in different colors. As a non-limiting example, locations that are associated with higher illuminance values may be shown with more reddish colors, locations associated with lower illuminance values may be shown with more bluish colors, and locations associated with mid-range illuminance values may be shown with greenish colors.
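The blue-to-green-to-red heat-map coding can be illustrated with a simple value-to-color mapping. The lux thresholds below are illustrative assumptions, not values from the disclosure:

```python
def lux_to_color(lux, low=100.0, high=500.0):
    """Map an illuminance value to an (R, G, B) color for a heat-map overlay:
    bluish at/below `low`, greenish mid-range, reddish at/above `high`."""
    t = max(0.0, min(1.0, (lux - low) / (high - low)))  # normalise to [0, 1]
    if t < 0.5:
        # blue -> green over the lower half of the range
        f = t / 0.5
        return (0, round(255 * f), round(255 * (1 - f)))
    # green -> red over the upper half of the range
    f = (t - 0.5) / 0.5
    return (round(255 * f), round(255 * (1 - f)), 0)

print(lux_to_color(100))  # (0, 0, 255)  low illuminance shown bluish
print(lux_to_color(300))  # (0, 255, 0)  mid-range shown greenish
print(lux_to_color(500))  # (255, 0, 0)  high illuminance shown reddish
```

Each overlaid pixel or marker would then be tinted with the color returned for the illuminance measured at the corresponding location.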

In some example embodiments, a user may provide an input to the AR device 100 to identify one or more sections of the area 304 that have undesirably low illuminance. For example, the user may identify a section 308 as a dark spot location (i.e., an area with a lower illuminance than desired). Alternatively, the AR device 100 may execute code to process the measured light intensity level values provided by the lux measurement device 302 to identify one or more dark spots such as the section 308. For example, the AR device 100 may identify locations that have illuminance values below a threshold level as dark spots. The AR device 100 may also display indicators (e.g., a particular shape icon) on the viewport 106 to indicate such dark spots.
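The threshold-based dark-spot identification can be sketched in a few lines. The threshold value is illustrative; the disclosure leaves the level unspecified:

```python
def find_dark_spots(samples, threshold=300.0):
    """Return the locations whose measured illuminance falls below `threshold`.

    `samples` is a list of ((x, y), lux) pairs, i.e. measured values already
    associated with locations; the returned locations would be flagged with
    dark-spot indicators on the viewport.
    """
    return [pos for pos, lux in samples if lux < threshold]

samples = [((0, 0), 450.0), ((2, 0), 180.0), ((4, 0), 520.0), ((2, 2), 240.0)]
print(find_dark_spots(samples))  # [(2, 0), (2, 2)]
```

The user-driven alternative described above would simply append a user-tapped location to the same list of flagged positions instead of deriving it from the measurements.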

In some example embodiments, the AR device 100 may execute code to recommend a particular lighting fixture that can be used to remedy one or more dark spots such as the dark spot 308. For example, the AR device 100 may automatically display a 3-D model of a recommended lighting fixture overlaid on the real-time image 312 as shown in FIG. 4. In some alternative embodiments, the AR device 100 may display a 3-D model of a recommended lighting fixture. In some example embodiments, the AR device 100 may display identification information (e.g., model number, serial number, etc.) of a recommended lighting fixture to remedy a dark spot, and a user may select a 3-D model of the recommended lighting fixture from a menu 304 and place the 3-D model at a desired location in the real-time image 312 such that the 3-D model is overlaid on the real-time image 312. In some example embodiments, the AR device 100 may not recommend a particular lighting fixture. Instead, a user may select a 3-D model of a desired lighting fixture from the menu 304 and place the 3-D model at a desired location in the real-time image 312 such that the 3-D model is overlaid on the real-time image 312.

FIG. 4 illustrates the lighting design system 300 of FIG. 3 showing a 3-D model 402 of a lighting fixture overlaid on a real-time image 312 of the target area 304 according to an example embodiment. Referring to FIGS. 1-4, in some example embodiments, the 3-D model 402 may correspond to a lighting fixture recommended by the AR device 100 based on the analysis of the illuminance values measured by the lux measurement device 302. In some example embodiments, the 3-D model 402 may be displayed automatically by the AR device 100 as a recommendation of the lighting fixture represented by the 3-D model 402 to remedy a dark spot such as the dark spot 308. In some alternative embodiments, the 3-D model 402 may be selected and placed at a particular location by a user by providing input (e.g., selecting and moving using a finger) to the AR device 100.

In some example embodiments, after the 3-D model 402 is selected or displayed, the AR device 100 may calculate/determine updated illuminance values for the area 304, for example, at a work plane or floor level, and display the light intensity information 404 that is based on the measured light intensity values and calculated illuminance values associated with the 3-D model 402. For example, the illuminance values associated with the 3-D model 402 and, thus, with the lighting fixture represented by the 3-D model 402, may be determined using parameter data (e.g., parameter data in an IES file) for various locations in the area 304 and for a particular installation height of the lighting fixture as represented by the location of the 3-D model 402 in the real time image 312 in a similar manner as described in U.S. Patent Application No. 16/195,581, filed November 19, 2018 (“’581”), which is incorporated herein by reference in its entirety. The illuminance values determined with respect to the 3-D model 402 may be combined with the measured light intensity values to produce the light intensity information 404 that may include illuminance values and/or color coded heat map as described with respect to FIG. 3. The light intensity information 404 may be updated if additional 3-D models of lighting fixtures are added to the real-time image 312.
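The combination of measured values with values computed from a fixture's photometric data can be illustrated with basic point-source photometry. A real implementation would interpolate candela values per angle from the IES file rather than use a single intensity; all numbers here are illustrative:

```python
import math

def fixture_illuminance(candela, mount_height, dx, dy):
    """Illuminance (lux) on a horizontal work plane from a point source,
    using E = I * cos(theta) / d^2, where d is the distance from the fixture
    to the point and theta the angle from the fixture's vertical axis."""
    d2 = mount_height ** 2 + dx ** 2 + dy ** 2
    cos_theta = mount_height / math.sqrt(d2)
    return candela * cos_theta / d2

def updated_values(measured, candela, mount_height, fixture_xy):
    """Add the proposed fixture's computed contribution to each measured sample,
    yielding the updated illuminance values displayed with the 3-D model."""
    fx, fy = fixture_xy
    return [((x, y), lux + fixture_illuminance(candela, mount_height, x - fx, y - fy))
            for (x, y), lux in measured]

measured = [((0.0, 0.0), 180.0), ((2.0, 0.0), 240.0)]
# Hypothetical fixture: 1000 cd straight down, mounted 2.5 m above the plane, at (0, 0).
for pos, lux in updated_values(measured, 1000.0, 2.5, (0.0, 0.0)):
    print(pos, round(lux, 1))
# (0.0, 0.0) 340.0
# (2.0, 0.0) 316.2
```

Adding further 3-D models would repeat the same superposition, which is why the displayed light intensity information can be updated as fixtures are added to the image.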

FIG. 5 illustrates an air flow measurement and display system 500 including the augmented reality device 100 of FIG. 1A according to an example embodiment.

Referring to FIGS. 1A, 1B, 2, and 5, in some example embodiments, the system 500 includes the AR device 100 and lighting fixtures including lighting fixtures 504, 506. The AR device 100 may display a real time image 502 of an area 516. For example, the lighting fixtures 504, 506 and other lighting fixtures may be installed in the area 516. The lighting fixtures 504, 506 may each include one or more sensors. For example, the lighting fixture 504 may include an air flow sensor 508 that may be integrated with or attached to the lighting fixture 504, and the lighting fixture 506 may include an air flow sensor 510 that may be integrated with or attached to the lighting fixture 506. The air flow sensors 508, 510 may measure air flows in the area 516 and wirelessly (e.g., using Wi-Fi, BLE, ZigBee, etc. connections) transmit the measured air flow values (e.g., in cfm or other units) to the AR device 100. The sensors 508, 510 may transmit the air flow values to the AR device 100 using the wireless communication components of the lighting fixtures 504, 506.

In some example embodiments, the AR device 100 may display the air flow values (e.g., an air flow value 514) overlaid on the real time image 502 of the area 516. For example, the AR device 100 may display the air flow values in association with particular locations of the area 516. To illustrate, each sensor 508, 510 or respective lighting fixture 504, 506 may transmit the location of the sensor 508, 510 or the lighting fixture 504, 506 to the AR device 100. Based on the association of the air flow values with the respective sensors 508, 510 and their locations, the AR device 100 may display the air flow values overlaid on the real time image 502 at positions corresponding to the locations of the sensors 508, 510. The AR device 100 may display air flow values received from a sensor regardless of whether the sensor is shown in the real time image 502.

In some example embodiments, the AR device 100 may generate an image 512 from the air flow values received from sensors (e.g., the sensor 508, 510) of lighting fixtures (e.g., the sensor 504, 506) in the area 516. The AR device 100 may display the image 512 overlaid on the real time image 502 of the area 516.
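One way to generate such an image from a handful of point readings is to interpolate between sensor locations. The sketch below uses inverse-distance weighting, which is one common choice for sparse sensor data; the disclosure does not specify an interpolation scheme, so the function and its parameters are assumptions for illustration:

```python
def idw_airflow(sensors, point, power=2):
    """Inverse-distance-weighted estimate of air flow at `point`
    from sparse sensor readings, suitable for filling in a heat-map
    style overlay image pixel by pixel.

    sensors: list of ((x, y), cfm_value) pairs
    point: (x, y) location at which to estimate air flow
    """
    num = 0.0
    den = 0.0
    for (sx, sy), value in sensors:
        d2 = (point[0] - sx) ** 2 + (point[1] - sy) ** 2
        if d2 == 0:
            return value  # query point sits exactly on a sensor
        w = 1.0 / d2 ** (power / 2)  # weight falls off with distance
        num += w * value
        den += w
    return num / den
```

Evaluating this estimator over a grid of points in the area and mapping each estimate to a color would yield an image like the image 512.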

FIG. 6 illustrates an air humidity measurement and display system 600 including the augmented reality device 100 of FIG. 1A according to an example embodiment. Referring to FIGS. 1A, 1B, 2, and 6, in some example embodiments, the system 600 includes the AR device 100 and lighting fixtures including lighting fixtures 604, 606. The AR device 100 may display a real time image 602 of an area 616. For example, the lighting fixtures 604, 606 and other lighting fixtures may be installed in the area 616. The lighting fixtures 604, 606 may each include one or more sensors. For example, the lighting fixture 604 may include a humidity sensor 608 that may be integrated with or attached to the lighting fixture 604, and the lighting fixture 606 may include a humidity sensor 610 that may be integrated with or attached to the lighting fixture 606. The air humidity sensors 608, 610 may measure humidity in the area 616 and wirelessly (e.g., using Wi-Fi, BLE, ZigBee, etc. connections) transmit the measured humidity values to the AR device 100. The sensors 608, 610 may transmit the humidity values to the AR device 100 using the wireless communication components of the lighting fixtures 604, 606.

In some example embodiments, the AR device 100 may receive humidity values from different sensors, including the sensors 608, 610, and calculate an average humidity value 612. The AR device 100 may display the calculated average humidity value 612 overlaid on the real time image 602 of the area 616. The AR device 100 may also display an icon 614 overlaid on the real time image 602 of the area 616. The icon 614 graphically illustrates the calculated average humidity value 612. The AR device 100 may calculate the average humidity value 612 based on humidity values received from sensors that may not be shown in the real time image 602. For example, when the AR device 100 is turned to view a different section of the area 616, the sensor 610 may not be displayed in the viewport 106, but the humidity information overlaid on the displayed real time image may still include the humidity value provided by the sensor 610.
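The averaging step is straightforward; a minimal sketch follows, assuming readings arrive keyed by a sensor identifier (the identifiers and function name are illustrative, not from the disclosure):

```python
def average_humidity(readings):
    """Average relative-humidity readings (in %) received from all
    reporting sensors, including sensors outside the current camera
    view, as described for the system 600.

    readings: dict mapping a sensor identifier to its humidity value
    """
    if not readings:
        raise ValueError("no humidity readings received")
    return sum(readings.values()) / len(readings)
```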

FIG. 7 illustrates an air quality measurement and display system 700 including the augmented reality device 100 of FIG. 1A according to an example embodiment.

Referring to FIGS. 1A, 1B, 2, and 7, in some example embodiments, the system 700 includes the AR device 100 and lighting fixtures including lighting fixtures 704, 706. The AR device 100 may display a real time image 702 of an area 716. For example, the lighting fixtures 704, 706 and other lighting fixtures may be installed in the area 716. The lighting fixtures 704, 706 may each include one or more sensors. For example, the lighting fixture 704 may include an air quality sensor 708 that may be integrated with or attached to the lighting fixture 704, and the lighting fixture 706 may include an air quality sensor 710 that may be integrated with or attached to the lighting fixture 706. The air quality sensors 708, 710 may measure air quality in the area 716 and wirelessly (e.g., using Wi-Fi, BLE, ZigBee, etc. connections) transmit the measured air quality values to the AR device 100. For example, the sensors 708, 710 may transmit air quality values for different matters. The sensors 708, 710 may transmit the air quality values to the AR device 100 using the wireless communication components of the lighting fixtures 704, 706.

In some example embodiments, the AR device 100 may receive air quality values from different sensors, including the sensors 708, 710, and calculate an average air quality value 712, 718 for each of the different matters. The AR device 100 may display the calculated average air quality values 712, 718 (e.g., as percentages of a maximum acceptable level) overlaid on the real time image 702 of the area 716. The AR device 100 may also display an icon 714 overlaid on the real time image 702 of the area 716 and associated with the air quality value 712. The icon 714 may be a graphical illustration of a particular matter that is identified in a legend 722. The AR device 100 may also display an icon 720 overlaid on the real time image 702 of the area 716 and associated with the air quality value 718. The icon 720 may be a graphical illustration of another matter that is identified in the legend 722. For each matter, the AR device 100 may calculate the average air quality value 712, 718 based on air quality values received from sensors that may not be shown in the real time image 702. In some alternative embodiments, instead of an average air quality value, the highest or lowest air quality value with respect to a particular matter may be shown without departing from the scope of this disclosure.
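The per-matter aggregation, including the average/highest/lowest alternatives mentioned above, can be sketched as follows; the matter labels and function name are illustrative assumptions:

```python
def aggregate_air_quality(samples, mode="average"):
    """Aggregate air quality values per matter, mirroring the
    alternatives in the disclosure: an average per matter, or the
    highest/lowest value per matter.

    samples: list of (matter, value) pairs, e.g. ("PM2.5", 30.0)
    mode: "average", "max", or "min"
    """
    by_matter = {}
    for matter, value in samples:
        by_matter.setdefault(matter, []).append(value)
    agg = {
        "average": lambda v: sum(v) / len(v),
        "max": max,
        "min": min,
    }[mode]
    return {matter: agg(values) for matter, values in by_matter.items()}
```

The resulting per-matter values could then be rendered next to the corresponding icons 714, 720 on the overlay.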

FIG. 8 illustrates the augmented reality device 100 of FIG. 1A used for ultraviolet lighting design according to an example embodiment. Referring to FIGS. 1A, 1B, 2, and 8, in some example embodiments, a real time image 802 of a target area 804 may be displayed on the viewport 106 of the AR device 100. For example, the real time image 802 may be captured by the camera 102 of the AR device 100. The menu 310 of lighting fixtures may also be displayed in the viewport 106. The target area 804 may include surfaces such as the floor surface 806 and the table surface 808. The AR device 100 may perform a spatial mapping of at least a portion of the target physical area 804. For example, the AR device 100 may execute software code, such as modules of ARKit 3.0 or modules of HoloToolkit, to identify surfaces. In some example embodiments, a user may place a 3-D model of a lighting fixture in the viewport 106 overlaid on the real time image 802 as shown in FIG. 10, and the AR device 100 may determine intensity levels of an ultraviolet light that the lighting fixture emits as described with respect to FIG. 9.

FIG. 9 illustrates a 3-D model 902 of a lighting fixture that emits an ultraviolet light, and ultraviolet light intensity values determined based on parameter data associated with the 3-D model 902 according to an example embodiment. Referring to FIGS. 1A, 1B, 2, 8, and 9, in some example embodiments, the 3-D model 902 may correspond to the 3-D model 1004 shown in FIG. 10. The photometric data associated with the 3-D model 902 may include or may be used to determine a light distribution shape and intensity levels of an ultraviolet light that the lighting fixture represented by the 3-D model 902 emits. For example, based on an installation height of the lighting fixture with respect to a particular surface (e.g., a table, a shelf, a desk, a cart, a floor, etc.), the intensity levels of the ultraviolet light may be determined at different areas of the particular surface. The AR device 100 may determine the distribution shape and the intensity levels of the ultraviolet light in a similar manner as described in U.S. Patent Application No. 16/195,581.
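A simplified version of the height-dependent intensity calculation can be sketched with a point-source model, using the inverse-square law with a cosine term for a horizontal surface. This is a deliberate simplification of full IES photometry (which tabulates intensity by angle), and the function name and coordinate convention are assumptions for illustration:

```python
import math

def uv_intensity_at(surface_point, fixture_pos, candela):
    """Point-source estimate of UV intensity reaching a point on a
    horizontal surface, given the fixture's tabulated intensity
    toward that point.

    surface_point, fixture_pos: (x, y, z) coordinates in meters
    candela: intensity of the fixture in the direction of the point
    """
    dx = surface_point[0] - fixture_pos[0]
    dy = surface_point[1] - fixture_pos[1]
    dz = surface_point[2] - fixture_pos[2]
    d2 = dx * dx + dy * dy + dz * dz  # squared distance
    d = math.sqrt(d2)
    cos_incidence = abs(dz) / d  # angle against the surface normal
    return candela * cos_incidence / d2  # inverse-square falloff
```

Directly below a fixture mounted 3 m above the surface, a 900 cd beam yields 900 / 3² = 100 units at the surface; points farther off-axis receive less due to both distance and incidence angle.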

FIG. 10 illustrates a 3-D model 1004 of a lighting fixture that emits an ultraviolet light, and ultraviolet light intensity values overlaid on a real-time image of a target area according to an example embodiment. Referring to FIGS. 1A, 1B, 2, and 8-10, in some example embodiments, the 3-D model 1004 is displayed at a particular location in the area 802. For example, a user may select the 3-D model 1004 from the menu 310 and place the 3-D model 1004 at a desired location in the real time image 802. By executing AR software code, such as ARKit 3.0 modules, the AR device 100 associates the 3-D model 1004 with a particular location (including a particular installation height) in the area 802, as can be readily understood by those of ordinary skill in the art with the benefit of this disclosure.

In some example embodiments, the AR device 100 determines the intensity levels of the ultraviolet light emitted by the lighting fixture represented by the 3-D model 1004 in a similar manner as described with respect to FIG. 9 and U.S. Patent Application No. 16/195,581. The AR device 100 may display the calculated intensity level values of the ultraviolet light at different locations of the surfaces 806, 808. The AR device 100 may recalculate the intensity levels of the ultraviolet light if the user moves the 3-D model 1004 to a different location in the real time image 802 as displayed in the viewport 106, if the user selects a different 3-D model from the menu 310, if the user places another 3-D model of a lighting fixture in the real time image 802, etc. In some example embodiments, the AR device 100 may display the intensity levels of the ultraviolet light as color coded values (e.g., as a heat map).

FIG. 11 illustrates a method 1100 of augmented reality-based lighting design to improve the lighting of an area according to an example embodiment. Referring to FIGS. 1A-4 and 11, in some example embodiments, at step 1102, the method 1100 includes attaching a measurement device to an augmented reality device. At step 1104, the method 1100 includes measuring illuminance values in an area using the measurement device, wherein the illuminance values are provided to the augmented reality device. At step 1106, the method 1100 includes determining locations of the augmented reality device while the illuminance values are being measured by the measurement device. At step 1108, the method 1100 includes displaying the illuminance values overlaid on a real-time image of the area at the locations where the values were measured. At step 1110, the method 1100 includes displaying a lighting fixture model overlaid on the real-time image of the area, wherein the lighting fixture model is selected based on the illuminance values at a portion of the area.
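The flow of the method 1100 can be sketched as a driver loop. The callables, the 300 lux threshold, and the sample layout are hypothetical stand-ins for device APIs and design criteria not specified in the disclosure:

```python
def ar_lighting_design_workflow(measure, locate, display, recommend):
    """Illustrative driver for the method-1100 flow: collect illuminance
    samples paired with AR device locations, display them overlaid on
    the real-time image, then recommend a fixture for dark spots.

    measure: returns the next lux reading, or None when done
    locate: returns the AR device's current location
    display: callback taking (location, lux) to draw an overlay value
    recommend: callback taking the list of dark-spot locations
    """
    samples = []
    while True:
        lux = measure()
        if lux is None:
            break
        samples.append((locate(), lux))  # pair each reading with a location
    for loc, lux in samples:
        display(loc, lux)
    # Hypothetical dark-spot criterion: below 300 lux
    dark_spots = [loc for loc, lux in samples if lux < 300]
    if dark_spots:
        recommend(dark_spots)
    return samples
```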

In some alternative embodiments, the method 1100 may include more or fewer steps than described without departing from the scope of this disclosure. In some alternative embodiments, the steps of the method 1100 may be performed in a different order than shown without departing from the scope of this disclosure.

FIG. 12 illustrates a method 1200 of augmented reality-based lighting design to improve the lighting of an area according to an example embodiment. Referring to FIGS. 1A-2, 8-10, and 12, in some example embodiments, at step 1202, the method 1200 includes displaying, by the augmented reality device 100, a real-time image of a target physical area on a display screen of the augmented reality device. At step 1204, the method 1200 includes displaying, by the augmented reality device 100, a 3-D model 1004 of a lighting fixture in response to a user input, where the 3-D model 1004 is overlaid on the real-time image of the target physical area. The lighting fixture provides an ultraviolet light. At step 1206, the method 1200 includes displaying on the display screen, by the augmented reality device 100, ultraviolet light intensity level values overlaid on the real-time image of the target physical area.

In some alternative embodiments, the method 1200 may include more or fewer steps than described without departing from the scope of this disclosure. In some alternative embodiments, the steps of the method 1200 may be performed in a different order than shown without departing from the scope of this disclosure.

FIG. 13 illustrates a method 1300 of augmented reality-based lighting design for ultraviolet light lighting fixtures according to an example embodiment. Referring to FIGS. 1A-2, 8-10, 12 and 13, in some example embodiments, at step 1302, the method 1300 includes displaying, by the augmented reality device 100, a real-time image of a target physical area on a display screen of the augmented reality device. At step 1304, the method 1300 includes displaying, by the augmented reality device, a lighting fixture 3-D model on the display screen in response to a user input, where the lighting fixture 3-D model is overlaid on the real-time image of the target physical area. At step 1306, the method 1300 includes performing, by the augmented reality device, a spatial mapping of at least a portion of the target physical area. At step 1308, the method 1300 includes determining, by the augmented reality device, ultraviolet light intensity level values of an ultraviolet light on one or more surfaces of the target physical area based on at least parameter data associated with the lighting fixture 3-D model. At step 1310, the method 1300 includes displaying, by the augmented reality device, ultraviolet light intensity level values on the display screen overlaid on the real-time image of the target physical area.

In some alternative embodiments, the method 1300 may include more or fewer steps than described without departing from the scope of this disclosure. In some alternative embodiments, the steps of the method 1300 may be performed in a different order than shown without departing from the scope of this disclosure.

Although particular embodiments have been described herein in detail, the descriptions are by way of example. The features of the example embodiments described herein are representative and, in alternative embodiments, certain features, elements, and/or steps may be added or omitted. Additionally, modifications to aspects of the example embodiments described herein may be made by those skilled in the art without departing from the spirit and scope of the following claims, the scope of which is to be accorded the broadest interpretation so as to encompass modifications and equivalent structures.