

Title:
INFORMATION GRID
Document Type and Number:
WIPO Patent Application WO/2012/125308
Kind Code:
A1
Abstract:
A lighting system and method is provided for tracking movement within a predetermined area. This system includes a plurality of lights installed at predetermined locations throughout the predetermined area, each having at least one sensor with a field of view. The lights include a computing module operatively associated with each sensor, and a communication module operatively associated with the computing module. The lights are configured to communicate with one another and capture sensor output to identify and record points at which the fields of view overlap with one another, to form a unified sensor network having a composite field of view. The system and method is configured to track changes associated with individual targets within the composite field of view.

Inventors:
HOPPER MICHAEL BLAIR (US)
Application Number:
PCT/US2012/027519
Publication Date:
September 20, 2012
Filing Date:
March 02, 2012
Assignee:
REYES HECTOR (US)
International Classes:
G08B23/00
Foreign References:
US 2009/0196206 A1 (2009-08-06)
US 2007/0003146 A1 (2007-01-04)
US 7,488,941 B2 (2009-02-10)
US 2010/0148672 A1 (2010-06-17)
Other References:
ARORA ET AL.: "A line in the sand: a wireless sensor network for target detection, classification, and tracking", COMPUTER NETWORKS, vol. 46, 2004, pages 605-634, Retrieved from the Internet [retrieved on 2012-06-10]
Attorney, Agent or Firm:
SAMPSON, Richard L. (P.C., 50 Congress Street, Suite 51, Boston MA, US)
Claims:
CLAIMS

1. A lighting system for tracking movement within a predetermined area, the system comprising:

a plurality of lights installed at predetermined locations throughout the predetermined area;

each of the plurality of lights including at least one sensor having a field of view; each of the plurality of lights further including a computing module operatively associated with each sensor and a communication module operatively associated with the computing module;

the communication modules of each of said plurality of lights being configured to communicate with one another to form a network;

the lights being configured to communicate via the network, and using the computing modules, capture sensor output to identify and record points at which the fields of view overlap with one another, wherein the networked lights form a unified sensor network having a composite field of view;

the system being configured to track changes associated with individual targets within the composite field of view.

2. The system of claim 1, wherein the tracked changes comprise movement of the individual targets through the composite field of view.

3. The system of claim 1, wherein the tracked changes comprise changes in heat signature of the individual targets in the composite field of view.

4. The system of claim 1, wherein at least one of said computing modules is configured to record the position of said points relative to one another in three-dimensional coordinates.

5. The system of claim 2, wherein the tracked changes comprise movement of individuals through the composite field of view.

6. The system of claim 5, wherein the computing modules are configured to capture and time-stamp sensor output sequentially, to track speed and direction of the individuals moving through the composite field of view.

7. A method for operating a system of sensor-equipped lights, the method comprising:

(a) moving a target sequentially through the fields of view of each of the lights of claim 1;

(b) generating, with a sensor, a signal when the target is detected within the sensor's field of view;

(c) capturing and recording the signal with at least one computing module;

(d) identifying points at which the target is simultaneously located within the fields of view of two or more sensors;

(e) recording the position of each of said identified points relative to one another in at least two-dimensional coordinates to form a unified sensor network having a composite field of view which incorporates the fields of view of each of the sensors; and

(f) tracking changes associated with individual targets within the composite field of view.

8. The method of claim 7, wherein said tracking (f) comprises tracking movement of the individual targets through the composite field of view.

9. The method of claim 7, wherein said tracking (f) comprises tracking changes in heat signature of the individual targets in the composite field of view.

10. The method of claim 7, wherein said recording (e) further comprises recording the position of each of said identified points relative to one another in three-dimensional coordinates.

11. The method of claim 8, wherein said tracking (f) comprises tracking the movement of individuals through the composite field of view.

12. The method of claim 11, wherein said tracking (f) is effected sequentially and is time-stamped, to track speed and direction of the individuals moving through the composite field of view.

13. An article of manufacture for operating a plurality of sensor-equipped lights, said article of manufacture comprising a computer usable medium having a computer readable program code embodied therein, for:

capturing and recording a signal generated by the sensors;

identifying points at which a target is simultaneously located within the fields of view of two or more of the sensors;

recording the position of each of said identified points relative to one another in at least two-dimensional coordinates to form a unified sensor network having a composite field of view which incorporates the fields of view of each of the sensors; and

tracking changes associated with individual targets within the composite field of view.

Description:
Information Grid

BACKGROUND

Related Application

This application claims priority to U.S. Non-Provisional Application Ser. No. 13/049,175, entitled Information Grid, filed on March 16, 2011, which claims the benefit of U.S. Provisional Application Ser. No. 61/314,351, entitled Information Grid, filed on March 16, 2010, the contents of which are incorporated herein by reference in their entirety for all purposes.

This application is also a Continuation-In-Part of U.S. Patent Application Serial Number 13/019,871 entitled Tracking Method and System, filed on February 2, 2011, which is fully incorporated herein for all purposes. This application is also a Continuation-In-Part of U.S. Patent Application Serial Number 12/630,102 entitled Energy Efficient Lighting System and Method, filed on December 3, 2009, which is fully incorporated herein for all purposes.

This application is also related to U.S. Patent Application Serial No. 12/630,074, filed on December 3, 2009, entitled Electrical Panel, which is incorporated herein by reference in its entirety for all purposes.

Technical Field

This invention relates to a system of networked, sensor equipped light fixtures.

Background Information

There is a need to track movements of people in various types of buildings. Tracking such movements may be advantageous from a safety standpoint, such as to identify the presence of building occupants in the event of a fire. Other applications may include tracking shoppers' movements in a store in order to analyze traffic patterns for the purpose of product placement. One large retail chain recently found, for instance, that sales of breath strips were particularly sensitive to their placement relative to customer traffic patterns in their stores, with up to 80% higher sales depending on location within the store. After moving the breath strips to the same, optimal location in all of its stores, sales increased by several millions of dollars a year.

A need exists for a method and system that facilitates tracking the movement of individuals within buildings or other structures.

SUMMARY

According to one aspect of the invention, a lighting system is provided for tracking movement within a predetermined area. This system includes a plurality of lights installed at predetermined locations throughout the predetermined area, each having at least one sensor with a field of view. The lights include a computing module operatively associated with each sensor, and a communication module operatively associated with the computing module. The lights are configured to communicate with one another and capture sensor output to identify and record points at which the fields of view overlap with one another, to form a unified sensor network having a composite field of view. The system is configured to track changes associated with individual targets within the composite field of view.

In another aspect of the invention, a method for operating the above-described lighting system includes moving a target sequentially through the fields of view of each of the lights, and generating a signal when the target is detected within the sensor's field of view. The signal is captured and recorded with at least one computing module. The method also includes identifying points at which the target is simultaneously located within the fields of view of two or more sensors. The position of each of the identified points relative to one another is recorded in at least two-dimensional coordinates to form a unified sensor network having a composite field of view which incorporates the fields of view of each of the sensors. The method is configured to track changes associated with individual targets within the composite field of view.

In yet another aspect of the present invention, an article of manufacture for operating a plurality of sensor-equipped lights includes a computer usable medium having a computer readable program code embodied therein, for capturing and recording a signal generated by the sensors, identifying points at which a target is simultaneously located within the fields of view of two or more of the sensors, recording the position of each of said identified points relative to one another in at least two-dimensional coordinates to form a unified sensor network having a composite field of view which incorporates the fields of view of each of the sensors, and tracking changes associated with individual targets within the composite field of view.

The features and advantages described herein are not all-inclusive and, in particular, many additional features and advantages will be apparent to one of ordinary skill in the art in view of the drawings, specification, and claims. Moreover, it should be noted that the language used in the specification has been principally selected for readability and instructional purposes, and not to limit the scope of the inventive subject matter.

BRIEF DESCRIPTION OF THE DRAWINGS

Fig. 1 is a perspective view of a smart light useful in embodiments of the present invention;

Fig. 2 is a perspective view of the light of Fig. 1, with notional grids depicting the field of view thereof, at representative elevations;

Fig. 3 is a view similar to that of Fig. 2, for a plurality of lights with overlapping fields of view;

Fig. 4 is a plan view of one of the grids of Figs. 2 and 3, with a tracked individual shown thereon;

Fig. 5 is a view similar to that of Fig. 4, with overlapping grids;

Fig. 6 is a view similar to that of Fig. 5, showing movement of the tracked individual;

Fig. 7 is a view similar to that of Fig. 2, for an alternate embodiment of the present invention;

Fig. 8 is a view similar to that of Fig. 3, for the embodiment of Fig. 7;

Fig. 9 is a view similar to that of Fig. 5, for the embodiment of Figs. 7 and 8;

Fig. 10 is a schematic representation of an embodiment of the present invention;

Fig. 11 is a schematic representation of a graphical user interface generated by embodiments of the present invention; and

Figs. 12 and 13 are perspective views of representative targets identified by embodiments of the present invention.

DETAILED DESCRIPTION

In the following detailed description, reference is made to the accompanying drawings that form a part hereof, and in which is shown by way of illustration, specific embodiments in which the invention may be practiced. These embodiments are described in sufficient detail to enable those skilled in the art to practice the invention, and it is to be understood that other embodiments may be utilized. It is also to be understood that structural, procedural and system changes may be made without departing from the spirit and scope of the present invention. In addition, well-known structures, circuits and techniques have not been shown in detail in order not to obscure the understanding of this description. The following detailed description is, therefore, not to be taken in a limiting sense, and the scope of the present invention is defined by the appended claims and their equivalents. For clarity of exposition, like features shown in the accompanying drawings are indicated with like reference numerals and similar features as shown in alternate embodiments in the drawings are indicated with similar reference numerals.

Where used in this disclosure, the term "axial", when used in connection with an element described herein, refers to a direction relative to the element which is substantially parallel to its longitudinal axis and/or line of sight. Similarly, the term "transverse" refers to a direction other than substantially parallel to the axial direction. The term "computer" or "computing module" is meant to encompass any suitable computing device including a processor and a computer readable medium upon which computer readable program code (including instructions and/or data) may be disposed, with or without a user interface. Terms such as "module" and the like are intended to refer to a computer-related component, including hardware, software, and/or software in execution. For example, a module may be, but is not limited to being, a process running on a processor, a processor including an object, an executable, a thread of execution, a program, and a computer. Moreover, the various components may be localized on one computer and/or distributed between two or more computers. The term "real-time" refers to sensing and responding to external events nearly simultaneously (e.g., within milliseconds or microseconds) with their occurrence, or without intentional delay, given the processing limitations of the system and the time required to accurately respond to the inputs.

Embodiments of the system and method of the present invention, including various modules thereof, may be programmed in any suitable language and technology, such as, but not limited to: C++; Visual Basic; Java; VBScript; JScript; ECMAScript; DHTML; XML and CGI. Alternative versions may be developed using other programming languages including Hypertext Markup Language (HTML), Active Server Pages (ASP) and JavaScript. Any suitable database technology may be employed, including, but not limited to: Microsoft Access, Microsoft SQL Server, and IBM AS/400.

Referring now to the appended Figures, embodiments of the present invention will be described. Various embodiments provide for the unification of image sensors placed throughout a building or other predetermined area to create in effect one large unified sensor. For ease of explanation, these embodiments will be shown and described as applied to an array of networked light systems (fixtures) such as disclosed in U.S. Patent Applications Serial No. 12/630,102, filed on December 3, 2009, entitled Energy Efficient Lighting System and Method (the '102 application), and Serial No. 12/630,074, filed on December 3, 2009, entitled Electrical Panel (the '074 application), both of which are incorporated herein by reference in their entireties for all purposes. As disclosed therein, these light systems include integral electromagnetic sensors such as CCD (charge-coupled device), CMOS (complementary metal-oxide semiconductor), APS (active-pixel sensor), PIR (passive infrared), and/or EM (electro-magnetic) sensors. These light systems also include computing modules and communication modules with IP addresses or the like, so that they may communicate with one another over a network.

Turning now to Fig. 1, an example of such a light system 1 includes one or more sensors 2 (e.g., CCD, CMOS, APS), PIRs 3 and LED lighting elements 4. As shown in Fig. 2, each light fixture 1 is provided, via its sensor 2, with a field of view which extends transversely from a line of sight of sensor 2. As shown schematically by divergent lines 8, the field of view of sensor 2 tends to increase with distance (e.g., along the line of sight) from the light. As shown, divergent lines 8 represent the outermost boundary of the field of view at any particular point along the line of sight. In the embodiment shown, lines 8 are disposed at an angle of approximately 60 degrees to one another, though this angle of divergence may be sensor-specific, and/or otherwise range anywhere from about 30 to about 75 degrees. The increasing field of view is shown schematically by substantially planar grids 5, 6 and 7 which successively increase in size in proportion to their distances from the light 1 along the line of sight of sensor 2. It is noted that in the embodiments shown and described, sensors 2 are provided with fixed lenses, so that the angle by which the lines 8 diverge is fixed. It should be recognized, however, that adjustable lenses may be used, so that the angle of divergence may be adjusted as desired. Regardless of whether or not the divergence angle is fixed or adjustable, the light systems may identify and keep track of the angle currently applicable to the sensors to facilitate accurate tracking in 3-D space of objects within the fields of view, as discussed hereinbelow.
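By way of illustration only (this computation does not appear in the application), the transverse extent of such a field of view at a given distance along the line of sight follows directly from the divergence angle of lines 8. The sketch below assumes a symmetric cone and the approximately 60 degree angle mentioned above; the fov_width helper and the sample distances are hypothetical.

```python
import math

def fov_width(distance_m: float, divergence_deg: float = 60.0) -> float:
    """Transverse extent of the field of view at a given distance along
    the line of sight, assuming the outer boundaries (lines 8) form a
    symmetric cone diverging at the stated angle."""
    return 2.0 * distance_m * math.tan(math.radians(divergence_deg / 2.0))

# Grids 5, 6 and 7 grow in proportion to their distance from the light:
for d in (1.0, 2.0, 3.0):
    print(f"{d:.0f} m along the line of sight -> ~{fov_width(d):.2f} m across")
```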

Referring now to Fig. 3, an array of lights 1, such as may be installed in the ceilings of rooms within a building, are shown as lights 1A, 1B and 1C, each having their own field of view grids shown respectively at 5A, 6A, 7A; 5B, 6B, 7B; and 5C, 6C, 7C. As also shown, the lights are spaced so that their field of view grids overlap at predetermined distances from the lights. In the example shown, the grids closest to the lights, 5A, 5B and 5C, do not overlap, while mutually adjacent grids disposed further from the lights may intersect one another, such as at areas shown generally at 9. These mutually intersecting view grids enable an array of lights 1 to be linked to one another to provide a substantially continuous, composite (linked) field of view. Lights 1 may thus provide a composite field of view that extends substantially continuously throughout any area in which lights 1 are installed.

The lights may be calibrated and communicably coupled to one another using any number of suitable methods. One exemplary method includes placing the lights in calibrate mode and moving a target, such as a pinpoint heat and/or light source (or virtually anything that can be seen by the sensors 2), from the field of view of one light to the next, etc. In particular embodiments, this target is maintained within a predetermined transverse (e.g., horizontal) plane, e.g., height, relative to the lights as it is moved, to facilitate accurate calibration. It may be desirable to complete this process at least twice, at two different heights, such as at the level of field of view 6, 6A, 6B, 6C, etc., and at the level of field of view 7, 7A, 7B, 7C, etc., as shown in Fig. 3.

Turning now to Fig. 4, a target 12A is shown within field of view grid 6B of light 1B (Fig. 3). It will be understood that this grid 6B is at a known distance from the light 1B and/or from the floor of the room in which light 1B is installed.

As mentioned above, the field of view of the sensor expands as the distance from the light increases. The number of pixels 11 in the grid is based on the resolution of the CCD or other type of sensor. A 506-pixel resolution is shown for the sake of illustration, but megapixel-range sensors may be used for increased resolution.

Referring now to Fig. 5, overlapping grids 6A, 6B, 6C of a representative installation of lights 1A, 1B, 1C (Fig. 2) are shown with a calibrating target 12B moving from grid 6B to grid 6A. While in calibrate mode the lights are configured to communicate with one another. The lights are configured to broadcast a signal, e.g., in real-time, in the event the target is within its field of view. When adjacent lights simultaneously capture the presence of the target, the particular pixels 11 triggered by the target are identified (e.g., using X and Y coordinates) and recorded as overlapping with one another. The predetermined height (Z coordinate) of the target may also be recorded.
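A minimal sketch of this simultaneous-detection bookkeeping, assuming each light broadcasts a time-stamped (light, pixel) message whenever the calibration target is in its field of view; the record_overlaps helper and message format are illustrative assumptions, not the application's actual protocol.

```python
from collections import defaultdict
from itertools import combinations

def record_overlaps(detections, target_height_m):
    """detections: (timestamp, light_id, (px, py)) messages broadcast
    during calibration. Two lights reporting at the same timestamp saw
    the target simultaneously, so their pixels are recorded as
    overlapping, together with the known target height (Z)."""
    by_time = defaultdict(list)
    for t, light, pixel in detections:
        by_time[t].append((light, pixel))

    overlaps = []  # ((light_a, pixel_a), (light_b, pixel_b), z)
    for hits in by_time.values():
        for (la, pa), (lb, pb) in combinations(hits, 2):
            if la != lb:
                overlaps.append(((la, pa), (lb, pb), target_height_m))
    return overlaps

msgs = [(5.0, "1A", (3, 7)), (5.0, "1B", (9, 2))]  # simultaneous hits
print(record_overlaps(msgs, target_height_m=2.0))
```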

Referring now to Fig. 6, the target, shown at 12C, has moved to another location which overlaps two of the grids (6A and 6B). The overlapping pixels are again recorded in X, Y and Z coordinates. With the identification of at least two overlapping pixels, e.g., as shown in Figs. 5 and 6, the grids (e.g., 6A and 6B) from two lights (1A, 1B) can now be mathematically oriented to one another based on the predetermined height of the target. The greater the number of overlapping pixels, the more precise the calibration and orientation of the grids.

It should be recognized that in particular embodiments, the times at which the individual sensors are triggered by the target may be captured and stored to a database or other memory device associated with the computing modules of the lights. These time stamps may be used to determine the direction of movement of the target through the various fields of view. This time and direction information may thus be used with or without height information to determine positions of the lights relative to one another. This relative position information may enable the fields of view to be linked to form the composite field of view without the need to repeat the calibration process at multiple elevations.

It is recognized that in some applications it may be desirable to generate a composite field of view that has higher accuracy and/or resolution than that provided by a single pass of a unitary target through the fields of view. In these instances, the target may be passed through the fields of view again at a different height (Z) to provide another set of overlapping pixel locations, such as along the grids of level 7 in Fig. 3 (e.g., grids 7A, 7B, 7C). Completion of these operations using targets moved through the fields of view at at least two distinct heights provides the networked lights with X, Y, and Z axis coordinates for each point of overlap. This data may then be used to calculate the overlap of the fields of view in three-dimensional space.
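To picture how overlap points gathered at two known heights could register one grid against another, here is a minimal sketch under the simplifying assumption that the two grids differ only by a translation (the general case would also solve for rotation); estimate_offset and the sample coordinates are invented for illustration.

```python
import numpy as np

def estimate_offset(points_a, points_b):
    """points_a and points_b: matched overlap points, expressed in each
    light's local (x, y, z) frame and gathered at two or more known
    target heights. Returns the least-squares translation mapping light
    B's frame onto light A's; more overlap points give a more precise
    registration, as noted above."""
    a = np.asarray(points_a, dtype=float)
    b = np.asarray(points_b, dtype=float)
    return (a - b).mean(axis=0)

# Matched overlap points seen by lights 1A and 1B at two heights:
pts_a = [(0.9, 0.1, 1.0), (1.0, 0.2, 2.0)]
pts_b = [(-0.6, 0.1, 1.0), (-0.5, 0.2, 2.0)]
print(estimate_offset(pts_a, pts_b))  # offset of 1B relative to 1A
```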

It should be understood that the aforementioned calibration is not limited to moving a single target through the various fields of vision at one height and then optionally again at another height. Many alternate approaches may be used, such as for example, using a bar or other tool having two or more targets spaced a predetermined distance from one another thereon. A user may then orient the bar/tool so that the targets are disposed at different elevations (heights), and then while maintaining this orientation, move the tool through the various fields of view as discussed above. In this manner, a user need only make a single pass through the fields of view.

In a variation of the above approach, targets may be spun on a disk and passed through the fields of view. As the disk passes through overlapping portions of the fields of view, it may pass through locations in which adjacent light systems 1, 1A, etc., identify a pair of overlapping pixels at a first undetermined height, and another pair of overlapping pixels at a second undetermined height. (It is expected that additional pairs of overlapping pixels may also be identified as the disk rotates about its axis, e.g., due to the well-known persistence of vision effect.) As the two targets rotate, the light systems may calculate the three-dimensional position (e.g., along X, Y and Z axes) of the targets, based on the apparent maximum distance between the targets, and/or in combination with the known angle of divergence (e.g., of lines 8, Fig. 2) of the individual fields of view.

In particular embodiments, the calibration process continues until the target(s) has moved through all the fields of view of the installed light systems 1, 1A, etc., to effectively connect all of the light systems together into one large 2D or 3D composite field of view/grid. Once this process is completed, the lights may continue to self-calibrate when two or more light systems see the same heat signature. In this manner, for example, the individual light systems/fixtures may be configured to re-calibrate in the event a single person passing through the field of view of one fixture appears simultaneously at an unexpected pixel location of an overlapping field of view of an adjacent fixture.

Still another approach for calibration is to program each light 1, 1A, etc., with its approximate location relative to surrounding light systems. With this location information, overlapping pixels may be identified automatically when an individual or other target is viewed simultaneously by adjacent lights.

Once the installed lights 1, 1A, etc., have been calibrated as discussed above, the fields of view of the various lights will be effectively linked to one another to form a unified sensor network having a composite field of view made up of the fields of view of the various lights. This unified sensor network may then be used to track movement of people or other objects therethrough.

Turning now to Fig. 7, instead of image-based sensors, other embodiments of the present invention may use non-image-based sensors to track targets. For example, lights 1, 1A, etc., may use passive infrared sensors 3 (Fig. 1) to track movement of people or objects. The field of view of such sensors expands in a manner similar to that shown and described hereinabove with respect to sensors 2, e.g., along divergent lines 8 as shown. This field of view of a single sensor 3 is shown schematically at three different elevations 13, 14 and 15. The fields of view of an array of lights 1A, 1B, 1C, etc., including any overlap, are shown at 13A-C, 14A-C and 15A-C of Figs. 8 and 9.

In this approach, the lights may be initially programmed with their approximate locations relative to surrounding light systems as discussed above. As a target moves into the field of view, the location of the target may be calculated by tracking which sensors 3 are activated, in combination with the known location of the lights 1, 1A, etc.

Moreover, and/or in the alternative, in any of the embodiments discussed herein, the speed and location of the target may be determined by successive sampling of the output of the sensors 2, 3. Each sample may be time-stamped, to effectively track the time the target takes to move from one location to the next, e.g., from the field of view of one light system, to the field of view of an adjacent light system. Those skilled in the art will recognize that the accuracy and/or resolution of such tracking may be dependent on the resolution of the particular sensors used. Thus, in some embodiments, the image-based CCD sensors of Figs. 2-6 may be expected to provide greater tracking accuracy and/or resolution than the non-image-based PIR sensors shown and described with respect to Figs. 7-9. Depending on the resolution of the particular sensors used, and the density of light systems, i.e., the number of lights 1, 1A, etc., deployed within a particular area, the presence and direction of movement of multiple room occupants may be tracked.
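As a concrete, purely hypothetical example of deriving speed and direction from two successive time-stamped samples (the sample format and the speed_and_direction name are not from the application):

```python
import math

def speed_and_direction(sample_a, sample_b):
    """Each sample is (timestamp_s, x_m, y_m): a time-stamped position
    within the composite field of view. Returns (speed in m/s, heading
    in degrees counterclockwise from the +X axis)."""
    t0, x0, y0 = sample_a
    t1, x1, y1 = sample_b
    dt = t1 - t0
    if dt <= 0:
        raise ValueError("samples must be in chronological order")
    dx, dy = x1 - x0, y1 - y0
    return math.hypot(dx, dy) / dt, math.degrees(math.atan2(dy, dx))

# A target crossing from one light's grid to an adjacent grid in 2 s:
print(speed_and_direction((0.0, 1.0, 1.0), (2.0, 3.0, 2.0)))
```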

Exemplary methods in accordance with the present invention are shown and described with respect to Tables I and II hereinbelow. As shown, the method includes moving 100 a target sequentially through the fields of view of each of the lights, and generating 102, with a sensor, a signal when the target is detected within the sensor's field of view. At 104, the signal is captured and recorded. Points are identified 106 at which the target is simultaneously located within the fields of view of two or more sensors. The position relative to one another of each of the identified points is recorded at 108 to form a unified sensor network having a composite field of view which incorporates the fields of view of each of the sensors.

Turning now to Table II, various optional aspects of the foregoing method are shown and described. At 110, the recording may include recording the position of each of said identified points relative to one another in three-dimensional coordinates. At 112, moving may include maintaining the target at a first elevation, and then repeating with the target disposed at a second elevation. At 114, the moving may include simultaneously moving two or more targets, each disposed at mutually distinct elevations, through the fields of view. At 116, the identifying may include using the predetermined viewing angle of the sensors. At 118, the identifying may include broadcasting a signal to the network in real-time, with the identity of any sensor as it is triggered by the target. At 120, the identifying may include recording as overlapping, the fields of view of any sensors broadcasting said signal simultaneously. At 120, the identifying may include identifying overlapping pixels within the overlapping fields of view. At 122, calibration of the lights may be updated by repeating steps 102-108 when sensors of at least two lights are simultaneously triggered by an object passing through the composite field of view. At 124, the lights may be programmed with their installed locations relative to one another. At 126, individuals may be tracked as they move through the composite field of view. At 128, movement of individuals through the composite field of view may be captured sequentially and time-stamped, to track speed and direction.

Table I

Table II

It is noted that light systems 1, 1A, 1B, etc., may be provided with more than one sensor, such as the multiple sensors 3 shown in Fig. 1. While a single sensor may prove sufficient to implement aspects of the embodiments shown and described herein, in some applications more than one sensor may be used in each light system to relatively improve the accuracy/resolution of tracking.

Referring now to Figs. 10-13, alternate embodiments of the present invention will be shown and described. There may be a need for a system and method configured to gather, assimilate and disperse information captured by the sensors of the embodiments shown and described hereinabove. Using the system and method described below, it may be demonstrated that by connecting these lights, a virtual information grid encompassing an entire room, building, or multiple buildings may be created. (Although these embodiments will be shown and described with reference to sensor-equipped light fixtures, it should be recognized that similar sensors may be used which are not necessarily associated with lights or light fixtures.) For example, companies with multiple sites may use this system and method to connect all of their physical spaces together as if under one sensor. The above-referenced U.S. Patent Application Serial Number 12/630,102 (the '102 application) discloses a light fixture, also described hereinabove, that has a microprocessor used to disseminate information in a controlled system and method to surrounding light fixtures of like kind and to a network. The network may connect to wireless or wired devices such as Apple™ computers, iPads, iPhones and Microsoft® Windows™ based machines. The network may be accessed by virtually any device configured and permitted to connect to it. This approach provides a convenient and simple means for tracking movements and locations of people in a room, building or multiple buildings. Tracking people's locations in a building may be very useful in the event of a fire. Using this system and method, the building may be self-aware of all occupants and, using preprogrammed evacuation scenarios, may instruct the occupants via built-in speakers how to properly evacuate the building. The lights also may be configured to lead people to exit doors with strobe and colored LED lights. This system and method allows authorized personnel such as the fire department to access the network and determine and control the evacuation from virtually anywhere in the world. Access to the information gives the firefighter real-time information as to the whereabouts of occupants. This information may be displayed on any device allowed access to the data. Central firefighting stations may be located anywhere and watch and control the evacuation on their computers. Commands to the firefighters may be sent directly to the firefighters' headsets through the network's connections to the light fixtures. It should be noted that these lights may be equipped with supercapacitors and/or batteries to augment electrical power in the event of power being disconnected or a power failure. These lights also may be autonomous and may disconnect from a wired connection and still function wirelessly for a limited amount of time. The firefighter having access to the network may be able to see a floor plan (created by the lights themselves) and any occupant in real time. The mobile device access allows the firefighter on the move to see his current location as well as occupants left in the building, in a floor plan view or whatever type of view the user prefers. The firefighter may use a heads-up display built into his protective mask. Using the RFID feature, the firefighters may be identified by name or other identity markers. For descriptive purposes, it may be like having Google Maps with GPS, except zoomed into a building in real-time, using infrared, CMOS, CCD, RFID, microphones, speakers, electromagnetic sensors and LEDs as tools to see, hear and communicate.

Another use for this system and method may be security in buildings. As the features presented above show, this system and method may be useful not only in a fire, but also for knowing the locations of everyone in the building. Using the RFID feature or the precise tracking that these lights provide, high security buildings may see where everyone is and track them as deemed necessary. It should be noted that the lights used in this system and method may include three-dimensional (3D) capabilities with substantially any resolution and bandwidth currently available or as may become available in the future. A security benefit of using 3D imaging may be to watch what an occupant touches or picks up. The infrared and electromagnetic "vision" capabilities of the fixture allow facial recognition software to be utilized, and/or allow observation of whether someone may be distressed, simply by tracking changes in their infrared signature. The lights may also be used to detect weaponry.

Stores may use this system and method to observe and track all occupants in their store. This allows product placement to be optimized. It also has the capability of watching and recording every product going into a shopping cart, as well as those that may be hidden under clothing. The lights may also be capable of determining gender at an acceptable percentage level, as well as the size and height of the shopper. The lights may watch employees using the RFID feature and/or precise infrared and/or CMOS tracking, such as to determine the productivity of the worker.

Moreover, building and manufacturing facilities may use these embodiments to monitor their equipment to watch for unusual heat signatures which may signify device failures. Substantially any organization may use this system and method to monitor employee productivity and to help ensure safety, such as by locating someone who may be injured. It may also be used as a deterrent in an assault situation by using the LED lights as a strobe light, such as to disorient an assailant.

Connecting 3 or more of these sensors together helps to provide an accurate 3D view of what may be in the field of view of the sensors. This information may be used for a wide range of applications, such as, for example, controlling production robots, precision scanning at high security locations, monitoring and controlling access at shipyards, loading and unloading of containers, robotic stocking of cargo ships, and automated stocking of store shelves.

Referring to Fig. 1, as discussed hereinabove, one NAAL (network addressable autonomous light) 1 equipped with one or more sensors (such as disclosed in the aforementioned '102 application) is shown. Turning to Fig. 2, a representative field of view of this particular embodiment is shown at 8, while matrices of what the CMOS or CCD, etc., sensors see at different distances from the sensors are shown at 5, 6 and 7. And as also discussed hereinabove, fields of view of alternate types of sensors, such as a PIR sensor(s), are shown at 13, 14, and 15 of Fig. 7. An array of NAALs may be calibrated and communicably coupled to one another as shown and described hereinabove with respect to Figs. 3-9.

Referring now to Fig. 10, the NAALs 1 may communicate wirelessly, or may be hardwired from one to another, to communicate via a suitable network 24, such as an Ethernet or WiFi network. Moreover, NAALs 1 may be connected in a peer-to-peer manner, such as shown at 20. Depending on the bandwidth, available power and information needed, the NAALs may communicate in a packet-based communication mode, though the embodiments hereof are not limited to packet-based communication. After the NAALs have been calibrated they become part of a unified matrix or collective. The hierarchy of the data that may be communicated from one NAAL to another may be based on the requirements of the user. For example, if tracking people is the priority, then only small amounts of infrared data may be required. Each NAAL may take a snapshot at predetermined intervals of the infrared signatures within its field of view.
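As a loose illustration of such a periodic, low-bandwidth snapshot, the sketch below serializes a minimal infrared summary for broadcast over the network; the IRSnapshot structure and its field names are invented for illustration and are not part of the application.

```python
import json
import time
from dataclasses import dataclass, asdict, field

@dataclass
class IRSnapshot:
    """A per-interval infrared summary one NAAL might broadcast to its
    peers: just enough data to track people, per the 'small amounts of
    infrared data' note above."""
    light_id: str
    timestamp: float
    targets: list = field(default_factory=list)  # (id, x_px, y_px, temp_c)

def encode(snapshot: IRSnapshot) -> bytes:
    return json.dumps(asdict(snapshot)).encode("utf-8")

snap = IRSnapshot("NAAL-1B", time.time(), [(36, 112, 48, 36.5)])
packet = encode(snap)  # ready to send over the Ethernet/WiFi network 24
print(packet)
```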

Turning now to Fig. 11, the system may generate a graphical user interface (GUI) such as shown at 30. The NAAL system may compress the data for presentation as a single icon for each individual (target), such as shown at 36 and 38. Each target 36, 38 may be given an ID number as it enters the matrix, and this target may be handed off to the adjacent NAAL in a manner described in the '102 application. The level of detail captured and transmitted to the network may be limited by the NAAL's software protocols, network bandwidth and/or power constraints. It should be recognized that the network 20, 24 may include a central computer that gathers and disseminates information, e.g., in a client-server configuration. Alternatively, the network may simply include the NAALs themselves, e.g., in a peer-to-peer configuration. The network may be communicably coupled to (e.g., accessed by) substantially any networkable device, such as hardwired or wireless computers 22 and/or hand held devices 26 such as an Apple™ iPhone™ or a PDA, on which the GUI 30 may be displayed.

In the example shown, GUI 30 may be configured for tracking people for security and/or fire safety purposes. In this example, the NAALs 1 (Fig. 10) may be programmed to detect walls and doorways and store this information as a floor plan. This may be accomplished, for example, by the aforementioned calibration, using a pinpoint thermal tip on a device that transmits to the NAALs when it is touching points that connect the wall to the floor. For example, the user may hold the device against the baseboard of a building at the floor, and press a button to transmit a signal representing the location of an inside corner. The same device may include another button that transmits doorways. In this manner the NAALs may see where the walls meet the floors and may be able to extrapolate the corresponding points that make up a room or rooms. It should be noted that walls and floors may be calculated without using a thermal device; the lights equipped with CMOS or infrared types of sensors may be programmed with logic that may see where the walls connect to the floors. For example, the NAALs may be programmed to analyze traffic patterns to determine the location of walls and doorways. The NAALs may then only need to be calibrated together to draw an effective floor plan. Alternatively, pre-existing floor plans may simply be uploaded to the networked NAALs.

As shown, walls 34 may be clearly delineated on GUI 30, with targets 36, 38 identified as they walk or otherwise move through the fields of view of the NAALs. It should be noted that a target 36, 38 may be equipped with an RFID or similar tag, to permit the NAALs to uniquely identify and display the target's location. Another approach for maintaining the identity of a person not equipped with RFID may be to have the person swipe a card or otherwise securely confirm their identity when they enter the matrix. The NAALs may now track that person throughout a properly equipped facility. Using this system and method of tracking people's identity allows users interfacing with the network to watch a particular person, such as via GUI 30 as shown. This approach may also be useful for communicating with individuals, e.g., from user computers 22, 26, via NAALs equipped with speakers and microphones.
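A toy sketch of the ID-assignment and handoff bookkeeping described above; the TargetRegistry class and hand_off helper are assumptions for illustration and are not the '102 application's actual mechanism.

```python
import itertools

class TargetRegistry:
    """Assigns an ID to each target entering the matrix and records
    which NAAL currently tracks it, so adjacent fixtures can hand a
    target off as it crosses between fields of view."""
    def __init__(self):
        self._ids = itertools.count(1)
        self.owner = {}  # target_id -> light_id

    def enter(self, light_id):
        tid = next(self._ids)
        self.owner[tid] = light_id
        return tid

    def hand_off(self, target_id, new_light_id):
        self.owner[target_id] = new_light_id

reg = TargetRegistry()
tid = reg.enter("NAAL-1A")    # person enters under light 1A
reg.hand_off(tid, "NAAL-1B")  # adjacent light takes over tracking
print(tid, reg.owner[tid])
```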

It should be noted that these embodiments are not limited to one-way communication from the NAALs to the user on a device 22, 26. Rather, two-way communication may take place, e.g., using audible voice commands. For example, a user at a remote location may log into the system via a user device 22, 26, and use GUI 30 to determine whether a particular target is within the matrix. If so, then a link may be opened to permit audible communication from the user to the target, via NAALs equipped, for example, with speakers. Moreover, the communication may be two-way, e.g., via NAALs equipped with microphones, so that the target's verbal responses may be captured and relayed back to the user via the user's device 22, 26, etc. This approach may be used, for example, to speak with a target as the target walks down a hallway. The audible communication may continue as the person is walking because the NAALs hand off audible and visual communication to the next NAAL.

Another useful application may be during fires. A fire department may use the network to see how many people are located in a building with a fire. The NAALs may have preprogrammed evacuation routes that calculate the best scenario of evacuation based on the occupancy of the building and the size of the fire. The size of the fire and smoke density may be determined using the sensors built into the NAALs. For instance, the NAAL's infrared sensors may detect a fire by its growing heat signature 40, while the CCD or CMOS sensor sees its field of view being obscured by smoke. An advantage of using both types of sensor in this manner is that the flow of smoke may be tracked. This may aid in personnel evacuation by determining the most effective evacuation routes. Because the NAALs may be configured to act autonomously, they may direct some people out to one exit as they direct others to different exits. In particular embodiments, fire department personnel or any other authorized users may override the evacuation and direct groups or individuals to any exit. They also may communicate directly with individual targets, such as for directing them to persons located nearby who may need help with the evacuation. An authorized user may then direct firefighters to the location of people left in the building. The NAALs may also be used to confirm to the firefighter that particular rooms may be empty of occupants, to help them work as efficiently as possible. The NAALs also may interface directly with hand held devices used by the firefighters so they may see for themselves where they are and where to go.

It should be noted that the interface between the firefighter and the NAALs may not be limited to hand held devices but may be built right into their mask as a heads-up display. SWAT teams may also use NAALs to control hostage situations. Airports may watch and follow everyone in an airport. Using sophisticated sensors built into the NAALs, weaponry may be detected under someone's garments. For example, a gun hidden under someone's jacket may be detected with the built-in infrared sensors, because the NAALs may see in 3D as described in the '102 application.

Retail establishments may use these embodiments to follow traffic patterns of shoppers and monitor their employees' behavior. Using the NAALs' 3D imaging capabilities, product definition may be determined. Fig. 12 shows a typical store wall/shelf 40. Product 42 on the shelves may be imaged with the NAALs, and when one of the products moves off of the shelf, the NAALs may use this information to adjust inventory stock on the shelf and follow the product out of the door.

Turning to Fig. 13, control rooms and factories may use these embodiments, e.g., the infrared sensors of the NAALs 1, to watch equipment and machinery for temperature changes. If equipment starts to show signs of temperature changes outside of the predetermined safety ranges, an alarm may be tripped. In the example shown, the system has identified an exhaust tank 46 with an overheating connection 48. The system may then activate a failsafe electronic valve to reduce or shut off this tank before the tank fails.
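A minimal sketch of the kind of threshold check such monitoring implies; the safety range, check_equipment helper, and alarm callback are illustrative assumptions.

```python
def check_equipment(readings_c, safe_range=(10.0, 80.0), on_alarm=print):
    """readings_c: {equipment_id: temperature_c}, e.g., from the NAALs'
    infrared sensors. Invokes the alarm callback for any reading
    outside the predetermined safety range, which could in turn close
    a failsafe electronic valve."""
    low, high = safe_range
    for item, temp in readings_c.items():
        if not (low <= temp <= high):
            on_alarm(f"ALARM: {item} at {temp:.1f} C, outside {low}-{high} C")

# Exhaust tank 46 with an overheating connection 48 (Fig. 13):
check_equipment({"exhaust-tank-46": 64.0, "connection-48": 131.5})
```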

These embodiments may thus be used to create situational awareness of things in the light environment and put that awareness into an analysis and control network.

This situational awareness can reveal to the higher control system the location of people or aberrant environmental conditions. That information can be used on a grid-wide basis for analysis and control.

It should be noted that the various modules and other components of the embodiments discussed hereinabove may be configured as hardware, as computer readable code stored in any suitable computer usable medium, such as ROM, RAM, flash memory, phase-change memory, magnetic disks, etc., and/or as combinations thereof, without departing from the scope of the present invention. Additional examples of a suitable computer storage medium include any of, but not limited to, the following: CD-ROM, DVD, magnetic tape, optical disc, hard drive, floppy disk, ferroelectric memory, flash memory, ferromagnetic memory, optical storage, charge coupled devices, magnetic or optical cards, smart cards, EEPROM, EPROM, RAM, ROM, DRAM, SRAM, SDRAM, and/or any other appropriate static or dynamic memory or data storage devices.

The above systems, modules, etc., may be implemented in various computing environments. For example, embodiments of the present invention may be implemented on a conventional IBM PC or equivalent, multi-nodal system (e.g., LAN) or networking system (e.g., Internet, WWW, wireless web). All programming and data related thereto are stored in computer memory, static or dynamic or non-volatile, and may be retrieved by the user in any of: conventional computer storage, display (e.g., CRT, flat panel LCD, plasma, etc.) and/or hardcopy (i.e., printed) formats. The programming of these embodiments may be implemented by one skilled in the art of computer systems and/or software design based on the teachings herein.

It should be understood that any of the features described with respect to one of the embodiments described herein may be similarly applied to any of the other embodiments described herein without departing from the scope of the present invention.

In the preceding specification, the invention has been described with reference to specific exemplary embodiments for the purposes of illustration and description. It is not intended to be exhaustive or to limit the invention to the precise form disclosed. Many modifications and variations are possible in light of this disclosure. It is intended that the scope of the invention be limited not by this detailed description, but rather by the claims appended hereto.

Having thus described the invention, what is claimed is: