

Title:
EMERGENCY GUIDANCE SYSTEM AND METHOD
Document Type and Number:
WIPO Patent Application WO/2016/135448
Kind Code:
A1
Abstract:
A mixed reality emergency guidance system, and method of providing same, for use within a building or other structure, the system comprising a headset (100) for placing over a user's eyes, in use, the headset including a screen, image capture means for capturing images of the real world in the vicinity of the user, and a processor configured to generate a three-dimensional virtual reality environment and blend images of the real world environment into the three-dimensional virtual reality environment to create a mixed reality environment representative of a user's field of view and display the mixed reality environment on said screen. A storage module (112) is provided, having stored therein a three-dimensional virtual model of the interior layout of the building or other structure. A positioning module determines the current location of the user within the building or other structure, and a processing module (104) is configured to calculate a recommended escape, or other, route from the current location of said user to a second location relative to the building or structure and generate navigation data representative of the recommended route. An image processing module generates image data representative of the navigation data and displays the image data within the mixed reality environment on the screen.

Inventors:
WHITEFORD CHRISTOPHER JAMES (GB)
COLOSIMO NICHOLAS GIACOMO ROBERT (GB)
WRIGHT JULIAN DAVID (GB)
Application Number:
PCT/GB2016/050366
Publication Date:
September 01, 2016
Filing Date:
February 15, 2016
Assignee:
BAE SYSTEMS PLC (GB)
International Classes:
G08B7/06
Foreign References:
US20140198017A1 (2014-07-17)
JP2005037181A (2005-02-10)
US20030234725A1 (2003-12-25)
US20080243385A1 (2008-10-02)
Attorney, Agent or Firm:
BAE SYSTEMS PLC, GROUP IP DEPT (Farnborough Hampshire GU14 6YU, GB)
Claims:
CLAIMS

1. A mixed reality guidance system for use within a building or other structure, the system comprising a headset for placing over a user's eyes, in use, said headset including a screen, image capture means for capturing images of the real world in the vicinity of the user, and a processor configured to generate a three-dimensional virtual reality environment and blend images of said real world environment into said three-dimensional virtual reality environment to create a mixed reality environment representative of a user's field of view and display said mixed reality environment on said screen, the system further comprising a storage module having stored therein a three-dimensional virtual model of the interior layout of said building or other structure, a positioning module for determining the current location of said user within said building or other structure, a processing module configured to calculate a recommended route from said current location of said user to a second location relative to said building or structure and generate navigation data representative of said recommended route, and an image processing module configured to generate image data representative of said navigation data and display said image data within said mixed reality environment on said screen.

2. A system according to claim 1, wherein said image data comprises navigational symbols, overlayed or blended into said mixed reality environment so as to be representative of said recommended route.

3. A system according to claim 2, wherein said navigational symbols are updated within said mixed reality environment using updated location data from said positioning module as said user moves through the interior of the building or other structure.

4. A system according to any of the preceding claims, wherein said image processing module is further configured to obtain, from said three-dimensional virtual model, image data representative of selected fixed features of the interior of the building or other structure within said real world environment in the vicinity of the user, and overlay or blend said image data into said mixed reality environment in respect of corresponding features therein.

5. A system according to any of the preceding claims, configured to receive data from at least one external sensor indicative of a hazard or obstacle in or on said recommended route, and re-calculate said recommended route to circumnavigate said hazard or obstacle.

6. A system according to claim 5, comprising an image processing module for generating image data representative of said hazard or obstacle, and overlaying or blending said image data into said mixed reality environment displayed on said screen.

7. A system according to any of the preceding claims, wherein said positioning module is mounted in or on said headset.

8. A system according to any of the preceding claims, wherein said image capture means comprises at least one image capture device mounted on said headset so as to be substantially aligned with a user's eyes, in use.

9. A system according to any of the preceding claims, wherein said processing module is configured to receive, from remote sensors, data representative of the health or structural or environmental status of the building or other structure and/or equipment located therein.

10. A system according to any of the preceding claims, wherein the headset comprises a face mask, configured to be worn over a user's nose and/or mouth, in use, and including a respirator.

11. A system according to claim 10, wherein said face mask is provided with a fume seal configured to form an air tight seal between said face mask and a user's face, in use.

12. Control apparatus for a mixed reality guidance system according to any of the preceding claims, said control apparatus comprising a storage module having stored therein a three-dimensional virtual model of a building or other structure, a processing module configured to receive, from a positioning module, location data representative of the current location of a user, determine a required location for said user relative to said building or structure and calculate a recommended route for said user from their current location to said required location and generate navigation data representative of said recommended route, the processing module being further configured to receive, from said positioning module, updated location data representative of the current location of the user as they move through said building or structure and generate updated navigation data representative of said recommended route accordingly.

13. Apparatus according to claim 12, wherein said processing module is configured to receive, from a plurality of positioning modules, location data representative of the respective current locations of a plurality of users, generate a required location for each said user, calculate a respective recommended route for each user from their current location to their required location, and generate respective navigation data representative of each recommended route, the processor being further configured to receive, from each said positioning module, updated location data representative of the current location of each respective user as they move through said building or structure and generate updated navigation data representative of their respective recommended route accordingly.

14. Apparatus according to claim 13, wherein said processor is further configured to receive sensor data from the current location of at least one of said users and use said sensor data in said calculation of one or more of said recommended routes.

15. Apparatus according to claim 13 or claim 14, including a storage module for storing data representative of the current occupants of said building or structure.

16. A mixed reality emergency guidance system, for use within a building or other structure, the system comprising at least one headset for placing over a user's eyes, in use, the or each headset including a screen, image capture means for capturing images of the real world in the vicinity of the user, and a processor configured to generate a three-dimensional virtual reality environment and blend images of said real world environment into said three-dimensional virtual reality environment to create a mixed reality environment representative of a respective user's field of view and display said mixed reality environment on said screen, the system further comprising control apparatus according to any of claims 12 to 15.

17. A method of providing a guidance system for a building or other structure, the method comprising providing a mixed reality system comprising a headset for placing over a user's eyes, in use, said headset including a screen, image capture means for capturing images of the real world in the vicinity of the user, and a processor configured to generate a three-dimensional virtual reality environment and blend images of said real world environment into said three-dimensional virtual reality environment to create a mixed reality environment representative of a user's field of view and display said mixed reality environment on said screen, the method further comprising providing a storage module having stored therein a three-dimensional virtual model of the interior layout of said building or other structure, providing a positioning module for determining the current location of said user within said building or other structure, providing a processing module and configuring said processing module to calculate a recommended route from said current location of said user to a second location relative to said building or structure and generate navigation data representative of said recommended route, and providing an image processing module configured to generate image data representative of said navigation data and display said image data within said mixed reality environment on said screen.

Description:
EMERGENCY GUIDANCE SYSTEM AND METHOD

This invention relates generally to an emergency guidance system and method and, more particularly but not necessarily exclusively, to a visual guidance system and method for use in assisting users to escape from, or evacuate, a building or other structure in an emergency situation.

There are many potential emergency situations in which occupants of a building or other structure would be required to escape therefrom, as quickly as possible and by means of the quickest, but also the safest, route. Statutory health and safety regulations require that signs illustrating and explaining emergency procedures and escape routes be clearly displayed within all public and corporate buildings and structures. Such signs are intended to inform occupants as to the emergency and evacuation procedures for a specific building or structure, and provide guidance and/or directions as to the quickest escape route from their current location (i.e. near the sign). However, there are a number of issues associated with this type of passive information and guidance facility. Firstly, a user may not be familiar with their environment, and may have difficulty, especially under pressure, in determining the correct escape route by reference to a two-dimensional floor plan or map. Furthermore, once they have moved away from the sign, they have no ongoing reference. Still further, unknown hazards may exist or occur along the signposted exit route, of which a person may be unaware until they actually reach them, possibly causing injury and/or forcing them to take an alternative route with which they may be unfamiliar. Finally, during some types of emergency, smoke or other noxious substances may severely obscure a person's vision and/or affect their ability to safely navigate the exit route.

It would therefore be desirable to provide an emergency guidance system and method which provides more effective and intuitive emergency guidance and addresses at least some of the issues outlined above.

In accordance with a first aspect of the present invention, there is provided a mixed reality guidance system for use within a building or other structure, the system comprising a headset for placing over a user's eyes, in use, said headset including a screen, image capture means for capturing images of the real world in the vicinity of the user, and a processor configured to generate a three-dimensional virtual reality environment and blend images of said real world environment into said three-dimensional virtual reality environment to create a mixed reality environment representative of a user's field of view and display said mixed reality environment on said screen, the system further comprising a storage module having stored therein a three-dimensional virtual model of the interior layout of said building or other structure, a positioning module for determining the current location of said user within said building or other structure, a processing module configured to calculate a recommended route from said current location of said user to a second location relative to said building or structure and generate navigation data representative of said recommended route, and an image processing module configured to generate image data representative of said navigation data and display said image data within said mixed reality environment on said screen.

The image data may comprise navigational symbols, overlayed or blended into said mixed reality environment so as to be representative of said recommended route. Such navigational symbols may be updated within said mixed reality environment using updated location data from said positioning module as said user moves through the interior of the building or other structure.

The image processing module may be further configured to obtain, from said three-dimensional virtual model, image data representative of selected fixed features of the interior of the building or other structure within said real world environment in the vicinity of the user, and overlay or blend said image data into said mixed reality environment in respect of corresponding features therein.

In an exemplary embodiment of the present invention, the system may be configured to receive data from at least one external sensor indicative of a hazard or obstacle in or on said recommended route, and re-calculate said recommended route to circumnavigate said hazard or obstacle. In this case, the system may comprise an image processing module for generating image data representative of said hazard or obstacle, and overlaying or blending said image data into said mixed reality environment displayed on said screen.

The positioning module may be mounted in or on said headset.

The image capture means may comprise at least one image capture device, and more probably two image capture devices, mounted on said headset so as to be substantially aligned with a user's eyes, in use.

The processing module may be configured to receive, from remote sensors, data representative of the health or structural or environmental status of the building or other structure and/or equipment located therein. The headset may comprise a face mask, configured to be worn over a user's nose and/or mouth, in use, and include a respirator. The face mask may be provided with a fume seal configured to form an air tight seal between said face mask and a user's face, in use.

In accordance with another aspect of the present invention, there is provided control apparatus for a mixed reality guidance system as described above, said control apparatus comprising a storage module having stored therein a three-dimensional virtual model of a building or other structure, a processing module configured to receive, from a positioning module, location data representative of the current location of a user, determine a required location for said user relative to said building or structure and calculate a recommended route for said user from their current location to said required location and generate navigation data representative of said recommended route, the processing module being further configured to receive, from said positioning module, updated location data representative of the current location of the user as they move through said building or structure and generate updated navigation data representative of said recommended route accordingly.

The processing module may be configured to receive, from a plurality of positioning modules, location data representative of the respective current locations of a plurality of users, generate a required location for each said user, calculate a respective recommended route for each user from their current location to their required location, and generate respective navigation data representative of each recommended route, the processor being further configured to receive, from each said positioning module, updated location data representative of the current location of each respective user as they move through said building or structure and generate updated navigation data representative of their respective recommended route accordingly. The processor may be further configured to receive sensor data from the current location of at least one of said users and use said sensor data in said calculation of one or more of said recommended routes.

The control apparatus may further include a storage module for storing data representative of the current occupants of said building or structure.

Another aspect of the present invention extends to a mixed reality emergency guidance system, for use within a building or other structure, the system comprising at least one headset for placing over a user's eyes, in use, the or each headset including a screen, image capture means for capturing images of the real world in the vicinity of the user, and a processor configured to generate a three-dimensional virtual reality environment and blend images of said real world environment into said three-dimensional virtual reality environment to create a mixed reality environment representative of a respective user's field of view and display said mixed reality environment on said screen, the system further comprising control apparatus as described above.

In accordance with yet another aspect of the present invention, there is provided a method of providing a guidance system for a building or other structure, the method comprising providing a mixed reality system comprising a headset for placing over a user's eyes, in use, said headset including a screen, image capture means for capturing images of the real world in the vicinity of the user, and a processor configured to generate a three-dimensional virtual reality environment and blend images of said real world environment into said three-dimensional virtual reality environment to create a mixed reality environment representative of a user's field of view and display said mixed reality environment on said screen, the method further comprising providing a storage module having stored therein a three-dimensional virtual model of the interior layout of said building or other structure, providing a positioning module for determining the current location of said user within said building or other structure, providing a processing module and configuring said processing module to calculate a recommended route from said current location of said user to a second location relative to said building or structure and generate navigation data representative of said recommended route, and providing an image processing module configured to generate image data representative of said navigation data and display said image data within said mixed reality environment on said screen.

These and other aspects of the present invention will be apparent from the following specific description in which embodiments of the present invention are described, by way of examples only, and with reference to the accompanying drawings, in which:

Figure 1 is a front perspective view of a headset for use in a mixed reality system in respect of which a method and apparatus according to an exemplary embodiment of the present invention can be provided;

Figure 2 is a schematic block diagram illustrating the configuration of some principal elements of a mixed reality system for use in an exemplary embodiment of the present invention;

Figure 3 is a schematic illustration of a single image frame displayed on the screen of a mixed reality system according to an exemplary embodiment of the present invention;

Figure 4 is a schematic diagram illustrative of the configuration of a mixed reality emergency guidance system according to an exemplary embodiment of the present invention; and

Figure 5 is a schematic diagram illustrative of the configuration of a mixed reality emergency guidance system according to another exemplary embodiment of the present invention.

Virtual reality systems are known, comprising a headset which, when placed over a user's eyes, creates and displays a three dimensional virtual environment in which a user feels immersed and with which the user can interact in a manner dependent on the application. For example, the virtual environment may comprise a game zone, within which the user can play a game.

More recently, augmented and mixed reality systems have been developed, wherein image data captured in respect of a user's real world environment can be rendered and placed within a 3D virtual reality environment. Thus, the user views their real world environment as a three-dimensional virtual world generated using images captured from their surroundings.

Referring to Figure 1 of the drawings, a mixed reality display system may include a headset 100 comprising a visor 10 having a pair of arms 12 hingedly attached at opposing sides thereof in order to allow the visor to be secured onto a user's head, over their eyes, in use, by placing the curved ends of the arms 12 over and behind the user's ears, in a manner similar to conventional spectacles. It will be appreciated that, whilst the headset is illustrated herein in the form of a visor, it may alternatively comprise a helmet for placing over a user's head, or even a pair of contact lenses or the like, and the present invention is not intended to be in any way limited in this regard. Also provided on the headset is a pair of image capture devices 14 for capturing images of the environment, such image capture devices being mounted so as to be aligned as closely as possible with the user's eyes, in use.

A typical mixed reality system further comprises a processor, which is communicably connected in some way to a screen which is provided inside the visor 10. Such communicable connection may be a hard-wired electrical connection, in which case the processor and associated circuitry will be mounted on the headset. Alternatively, however, the processor may be configured to communicate wirelessly with the visor, for example, by means of Bluetooth or a similar wireless communication protocol, in which case the processor need not be mounted on the headset but can instead be located remotely from the headset, with the relative allowable distance between them being dictated, and limited only by, the wireless communication protocol being employed. For example, the processor could be mounted on, or formed integrally with, the user's clothing, or instead located remotely from the user, either as a stand-alone unit or as an integral part of a larger control unit.

In general, the processor receives image data from the image capture devices, and renders and blends such image data, in real time, into a displayed three dimensional virtual environment. The concept of real time image blending for augmented or mixed reality is known, and several different techniques have been proposed. The present invention is not necessarily intended to be limited in this regard. However, for completeness, one exemplary method for image blending will be briefly described. Thus, in respect of an object or portion of a real world image to be blended into the virtual environment, a threshold function may be applied in order to extract that object from the background image. Its relative location and orientation may also be extracted and preserved by means of marker data. Next, the image and marker data are converted to a binary image, possibly by means of adaptive thresholding (although other methods are known). The marker data and binary image are then transformed into a set of coordinates that match the location within the virtual environment in which they will be blended. Such blending is usually performed using black and white image data. Thus, if necessary, colour data sampled from the source image can be backward warped, using homography, to each pixel in the resultant virtual scene. All of these computational steps require minimal processing and time and can, therefore, be performed quickly and in real (or near real) time. Thus, as the user's field of view and/or external surroundings change, image data within the mixed reality environment can be updated in real time.
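For illustration only, the following is a minimal sketch of how such a blending step might look in practice, using OpenCV as a stand-in: an object is separated from the captured frame by adaptive thresholding, warped into the virtual scene's coordinates via a homography, and composited over the rendered virtual frame. The function name, parameters and compositing rule are assumptions made for this example, not the method actually employed by the system described here.

```python
# Hedged sketch of the image-blending step described above (illustrative only).
import cv2

def blend_object_into_scene(camera_frame_bgr, virtual_frame_bgr, homography):
    """Blend a thresholded real-world object into a rendered virtual frame."""
    grey = cv2.cvtColor(camera_frame_bgr, cv2.COLOR_BGR2GRAY)
    # Adaptive thresholding separates the object of interest from the background.
    mask = cv2.adaptiveThreshold(grey, 255, cv2.ADAPTIVE_THRESH_GAUSSIAN_C,
                                 cv2.THRESH_BINARY, 11, 2)
    height, width = virtual_frame_bgr.shape[:2]
    # Warp both the binary mask and the colour data into virtual-scene coordinates.
    warped_mask = cv2.warpPerspective(mask, homography, (width, height))
    warped_colour = cv2.warpPerspective(camera_frame_bgr, homography, (width, height))
    # Composite: keep virtual pixels where the mask is empty, camera pixels elsewhere.
    blended = virtual_frame_bgr.copy()
    blended[warped_mask > 0] = warped_colour[warped_mask > 0]
    return blended
```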

Referring to Figure 2 of the drawings, an emergency guidance system according to a first exemplary embodiment of the present invention comprises at least one mixed reality headset 100 and an emergency response processing system 104, which may be integrated in or mounted on the headset 100 and/or provided at a fixed location within the building or structure and configured for wireless communication with the headset 100. It is envisaged that some exemplary embodiments may comprise a central emergency response processing system for communication with a single headset, or in parallel with a plurality of headsets. However, in alternative exemplary embodiments, the processing functionality of the emergency response processing system may be distributed, partly or fully, amongst individual processing units provided in or on the headsets, which may or may not be the same processing units used to provide on-screen mixed reality images of the wearer's environment derived from image data captured by the image capture devices on the headset, and the present invention is not necessarily intended to be in any way limited in this regard.

The processing system 104 is configured to receive, from one or more external sources 108, 110, data representative of, for example, the structural status of the building or structure, the health and/or status of (at least) key equipment therein, the nature and/or status of an emergency situation, the location of hazardous elements of an emergency situation (i.e. the location of a fire, for example), the location of other occupants within the building or structure, etc. Thus, the processing system 104 generally includes an interface to enable data to be transmitted therefrom and received thereby, in order that data that could potentially be changing dynamically is updated in real (or near real) time. Furthermore, the processing system 104 may be configured to receive, or have stored therein, a three-dimensional virtual model 112 of the building or structure.

It will be appreciated by a person skilled in the art that the processing functionality of the above-described emergency response processing system may be provided by means of more than one processor. Indeed, several processors may be required to facilitate embodiments of the present invention, some of which may be dedicated system processors, whether remote or onboard (i.e. mounted in or on the one or more headsets 100), and others of which may be processors or other data processing devices incorporated in the network infrastructure of the building, and the present invention is not necessarily intended to be limited in this regard. Indeed, the processing function may be provided by an entirely de-centralised network. For example, this functionality may be provided by a "mesh network", which is configured to self-initiate (in response to an emergency situation or otherwise) and build a network using distributed devices. Such a de-centralised network would continue to function even if, for example, the infrastructure of the building is damaged or destroyed: each node may pass data along to another available node, in the manner of a "daisy chain", such that not all nodes in the network need to be within communication range of each other.

The or each headset 100 may include an internal geo-location system for generating data representative of the relative location, within the building or structure, of the wearer of the respective headset and transmit such data, continuously or otherwise, to the processing system 104. In the event of an emergency situation, a user places a headset 100 over their eyes, and the processing system 104 is configured, based on the current location of the wearer within the building or structure, to calculate, using the above-mentioned 3D virtual model 112 of the building or structure, the safest and/or quickest route from the wearer's location. It will be appreciated that, in many cases, this may be an escape route, but it may also be a route toward the hazard or emergency depending on the role of the wearer within the situation.
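Purely to illustrate the "daisy chain" relaying of the de-centralised mesh network described above, the sketch below forwards a message (such as a location update) hop by hop through whichever nodes are currently reachable, so a headset need not be in direct range of the processing system. The network model, node identifiers and message handling are assumptions, not part of the described system.

```python
# Illustrative sketch of multi-hop "daisy chain" relaying over surviving nodes.
from collections import deque

def relay_path(source, destination, links):
    """Return a hop sequence from `source` to `destination`, or None if unreachable.

    `links` maps each node id to the set of node ids currently within its range."""
    frontier = deque([[source]])
    visited = {source}
    while frontier:
        path = frontier.popleft()
        node = path[-1]
        if node == destination:
            return path     # each node on this path passes the message to the next
        for neighbour in links.get(node, ()):
            if neighbour not in visited:
                visited.add(neighbour)
                frontier.append(path + [neighbour])
    return None             # no chain of surviving nodes currently connects the two
```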

Calculation of the above-mentioned route may be performed in a similar manner to that used in in-car satellite navigation systems. Thus, in respect of data within the 3D virtual model 112, the processing system identifies the wearer's current location and the required destination. It then determines the current status of the connecting paths between those two locations, based on the 3D virtual model 112 (for permanent status aspects) and from data received from external sources (to take into account the dynamically changing environment). Status parameters may include whether or not two proximal paths or corridors are physically connected (or separated by a wall or locked door) and actually navigable (i.e. not blocked by an obstacle or too narrow to pass through safely). The processing system may also identify path-to-path 'cost', in terms of, for example, the number of turns and corners to be navigated, the presence or absence of doors, etc. The processing system then identifies the shortest and/or simplest route having the lowest 'cost'.
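A minimal sketch of such a lowest-'cost' search follows, assuming the interior layout has been reduced to a graph in which each edge carries a traversal cost (turns, doors, distance) and a navigability flag updated from the 3D model and external sensor data. Dijkstra's algorithm is used here purely as an illustration; the graph representation and names are assumptions, not taken from the application.

```python
# Hedged sketch: lowest-cost route over a floor-plan graph.
# graph[node] -> list of (neighbour, cost, navigable) tuples (assumed format).
import heapq

def recommended_route(graph, current_location, destination):
    """Return (path, cost) for the lowest-cost navigable route, or (None, inf)."""
    queue = [(0.0, current_location, [current_location])]
    best = {current_location: 0.0}
    while queue:
        cost, node, path = heapq.heappop(queue)
        if node == destination:
            return path, cost
        for neighbour, edge_cost, navigable in graph.get(node, []):
            if not navigable:        # separated by a wall, locked door or obstacle
                continue
            new_cost = cost + edge_cost
            if new_cost < best.get(neighbour, float("inf")):
                best[neighbour] = new_cost
                heapq.heappush(queue, (new_cost, neighbour, path + [neighbour]))
    return None, float("inf")        # no navigable route currently exists
```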

Once a route has been identified, the processing system 104 generates appropriate navigation instructions, generates virtual representations of such navigational instructions and overlays them, or otherwise blends them, in the virtual environment displayed on the screen within the headset 100. Thus, the wearer can see their immediate environment (derived from rendered and blended image data captured by the image capture devices on the headset) together with visual navigation aids directing them along the recommended route. The navigational image data may include indications of areas through which the wearer cannot pass, for example, a locked door. A still image of what the wearer may see on their screen, according to one exemplary embodiment of the invention, is illustrated in Figure 3 of the drawings.
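As an illustrative sketch of placing one such navigational symbol, the code below projects the next waypoint, known in the coordinates of the 3D virtual model, into the wearer's current view using a simple pinhole camera model and draws a marker at the resulting pixel. The pose inputs, camera matrix and drawing style are assumptions made for this example only, not the rendering pipeline of the described system.

```python
# Hedged sketch: project a 3D waypoint into the displayed frame and mark it.
import cv2
import numpy as np

def draw_waypoint(frame_bgr, waypoint_xyz, rotation_3x3, translation_xyz, camera_matrix):
    """Overlay a marker at the projected screen position of `waypoint_xyz`."""
    point = np.asarray(waypoint_xyz, dtype=np.float64).reshape(1, 1, 3)
    rvec, _ = cv2.Rodrigues(np.asarray(rotation_3x3, dtype=np.float64))
    tvec = np.asarray(translation_xyz, dtype=np.float64).reshape(3, 1)
    dist_coeffs = np.zeros(5)          # assume an undistorted (or pre-rectified) image
    pixels, _ = cv2.projectPoints(point, rvec, tvec,
                                  np.asarray(camera_matrix, dtype=np.float64),
                                  dist_coeffs)
    u, v = pixels[0, 0]
    height, width = frame_bgr.shape[:2]
    if 0 <= u < width and 0 <= v < height:   # only draw symbols inside the field of view
        cv2.drawMarker(frame_bgr, (int(u), int(v)), (0, 255, 0),
                       cv2.MARKER_TRIANGLE_UP, 40, 3)
    return frame_bgr
```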

The visual navigational aids may be supplemented, or even replaced, with voice guidance emitted through speakers provided within the headset.

As the wearer moves within their environment, along the recommended route, the 3D environment displayed on the screen is continuously updated, in real time, using images captured by the image capture devices. In addition, the processing system 104 is continuously updated with the wearer's current location, such that the displayed navigation data can also be updated accordingly.
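As a purely illustrative sketch of this continuous update cycle, the loop below captures a frame, blends it into the virtual environment, polls the positioning module and refreshes the navigation overlay whenever the reported location changes. Every call it makes is a placeholder standing in for the modules described above, not an API defined by the application.

```python
# Hypothetical main loop tying the described modules together; all method
# names (capture_frame, blend_into_virtual_environment, read_location,
# update_route, draw_navigation_overlay, display, is_worn) are placeholders.
def guidance_loop(headset, positioning_module, processing_system, destination):
    route = None
    last_location = None
    while headset.is_worn():
        frame = headset.capture_frame()                        # image capture devices
        scene = headset.blend_into_virtual_environment(frame)  # mixed reality frame
        location = positioning_module.read_location()          # wearer's current position
        if location != last_location:
            # Re-derive navigation data whenever the wearer has moved.
            route = processing_system.update_route(location, destination)
            last_location = location
        if route is not None:
            scene = headset.draw_navigation_overlay(scene, route, location)
        headset.display(scene)                                 # show on the headset screen
```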

Furthermore, structural sensors and equipment health reporting systems, as well as other sensors, including, in some exemplary embodiments, sensors provided on the headset itself, may be used to supply relevant data to the processing system 104, such that the wearer's route can be dynamically updated to take into account changing conditions. Thus, for example, in the case of a fire, data from heat sensors (or other means) can be used to identify the location of the fire within the structure, such that the calculated route is configured to avoid it (or the route re-calculated, as required). Equally, if data from structural sensors indicates that a part of the structure has become unsafe, or an obstruction has been identified, the processing system is configured to recalculate the route accordingly, to ensure that the wearer avoids any hazard. The processing system may be configured to generate and insert a visual representation of an obstacle or hazard in a user's vicinity into the 3D virtual environment displayed on their screen.
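A minimal sketch of this kind of dynamic re-routing, assuming the floor-plan graph of the earlier route-calculation sketch and a table of node coordinates in metres: edges whose endpoints fall within a chosen radius of a reported hazard are marked non-navigable, after which the recommended route is simply recomputed with the same lowest-cost search. The radius and distance model are illustrative assumptions.

```python
# Hedged sketch: block graph edges near a reported hazard before re-routing.
import math

def block_edges_near_hazard(graph, node_positions, hazard_xyz, radius=5.0):
    """Mark any edge with an endpoint within `radius` metres of the hazard as blocked."""
    def near(node):
        return math.dist(node_positions[node], hazard_xyz) <= radius

    for node, edges in graph.items():
        graph[node] = [
            (neighbour, cost, navigable and not (near(node) or near(neighbour)))
            for neighbour, cost, navigable in edges
        ]
    # After this call, re-running the earlier `recommended_route` search yields
    # a route that circumnavigates the reported hazard, if one exists.
    return graph
```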

It will be apparent to a person skilled in the art that the nature, type and number of external data sensors required to detect, identify and classify key data relevant to the generation of an optimum route within a dynamically changing environment will be dependent on the building or structure itself, its infrastructure and equipment therein, the types of emergency situations envisaged, etc. However, in general, the processing system 104 is configured to collate available sensor data from appropriate sources with the aim of ensuring that the wearer of the headset is guided around blockages, breaches, fire, heat, chemical spills and/or other potential hazards, as appropriate.

As stated above, the system includes a 3D virtual model 112 of the building or structure. Thus, in accordance with some exemplary embodiments of the invention, the processing system may be configured to overlay image data representative of permanent features (such as walls, corners, stairs, etc.) onto the 3D mixed reality images displayed to the user (generated from the images captured by the image capture devices on their headsets), using knowledge of the user's absolute location and/or one of a number of image matching techniques. Thus, the wearer would still be able to see permanent features of the internal infrastructure within their field of view, even if thick smoke, for example, is obscuring the images captured by the image capture devices. Thus, referring to Figure 4 of the drawings, the user 200 may be presented with a 3D virtual image 202 of their immediate environment including a visual representation 204 of the recommended route, an overlayed image 206 of permanent features of the building infrastructure, and known (or identified) hazards 208.

It is envisaged that, in an exemplary embodiment of the invention, each headset may include a seal such that the visor can be sealed over the wearer's eyes, in use, thereby preventing smoke and other noxious substances from reaching their eyes and potentially compromising their vision. Furthermore, the headset could include a respiration filter within a mask portion for covering the user's nose and mouth, to prevent inhalation of smoke or other irritant or toxic substances, and aid breathing.

In summary, embodiments of the present invention provide an emergency guidance system and method, wherein a mixed reality headset is provided with a live connection to the building infrastructure via, for example, a central control system which integrates several such headsets and employs distributed sensors, machine health information modules, and pre-existing emergency detection systems, functionally integrated therein, so as to provide a mixed reality system which generates and displays a live route in an emergency situation and which can also identify dangers and hazards in a dynamically changing environment. The headset itself may be provided with sensors such as ambient temperature sensors, oxygen quality sensors and even health monitors in respect of the wearer. Multi-spectral cameras may also be provided to identify additional sources of heat, and even radiation sensors could be employed, depending on the environment for which the system is intended. It is envisaged, in accordance with some exemplary embodiments, that multiple headsets would be communicably coupled to a central control system and to each other to enable gathered data to be shared, thereby to increase the overall situational awareness of the system.

It will further be appreciated, as briefly mentioned above, that a single processing system can be used to generate dynamically updated, optimum routes in respect of a number of different users (and headsets), as illustrated schematically in Figure 2, wherein the route calculated and updated for each wearer will be dependent on their individual respective location within the building or structure and their role within the emergency situation, using data from external, static sensors within the infrastructure of the environment and/or data from sensors mounted on-board their respective headsets. In addition, in some exemplary embodiments of the present invention, sensor data from other headsets within the system may additionally be used by the processing system to identify data relevant to a particular user. In other exemplary embodiments of the invention, the processing system may be configured to coordinate multiple users' locations, movements and routes, such that each individual user's route can be calculated taking into account the location and movement of other users so as to ensure, for example, that localised crowds or bottlenecks within the only or principal thoroughfares can be avoided or at least minimised. In this case, the main processing system will be remote from, and wirelessly coupled to, the headsets 100, either in a fixed location within the building or structure or in or on one of the headsets (intended for example for use by a team leader or safety officer). However, each headset may include a local processing system with similar (or possibly reduced) functionality, such that the headsets can still function adequately, in the event of a main system failure, to guide the wearer to safety.
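One possible way to realise such coordination, sketched below under the assumption that the earlier lowest-cost search is supplied as `route_fn`: users are routed in turn, and every edge already assigned to earlier users accrues a congestion penalty, nudging later routes away from forming bottlenecks. The penalty model and names are assumptions for illustration only.

```python
# Hedged sketch: congestion-aware routing of several users over the same graph.
from collections import defaultdict

def route_all_users(graph, user_locations, destination, route_fn, penalty=2.0):
    """user_locations maps user id -> current node; returns user id -> route."""
    load = defaultdict(int)   # number of users already routed along each directed edge
    routes = {}
    for user_id, start in user_locations.items():
        # Build a view of the graph with a congestion surcharge on busy edges.
        congested = {
            node: [(nbr, cost + penalty * load[(node, nbr)], navigable)
                   for nbr, cost, navigable in edges]
            for node, edges in graph.items()
        }
        path, _ = route_fn(congested, start, destination)
        routes[user_id] = path
        if path:
            for a, b in zip(path, path[1:]):
                load[(a, b)] += 1
                load[(b, a)] += 1
    return routes
```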

Thus, an exemplary embodiment of the present invention, as illustrated schematically in Figure 5 of the drawings, may provide an emergency guidance system which can be used to coordinate the movements of several people, whereby each user's headset is communicably coupled to the processing system and also, optionally, to each other. Thus, in calculating the recommended route for each individual in a group, the size, location and nature of the group as a whole can additionally be taken into account, thus, for example, enabling the management and generation of alternative routes to allow emergency crews access to relevant areas.

In addition to the features of the individual systems described above, such as the provision of feedback between the stored 3D virtual model of the building or structure and the mixed reality depth information obtained from the environment to identify changes (such as blockages) which may affect a route, the resultant system can potentially report safe routes that have already provided others with a safe escape, provide an active list of all people still within the building or structure, and report building health and hazard locations to emergency crews for coordination purposes. Still further, sensors worn on each user's person may be configured to transmit data representative of the respective user's vital signs and/or health status to the system, for provision to, for example, the emergency services, so as to potentially enable diagnosis of injuries such as burns or lung damage.

Various exemplary embodiments of the present invention are envisaged for use in various different environments, including, but not limited to, buildings, surface and sub-surface marine vessels, offshore oil rigs, oil refineries, and other complex internal and external environments.

It will be appreciated by a person skilled in the art, from the foregoing description, that modifications and variations can be made to the described embodiments without departing from the scope of the invention as claimed.