Title:
FACILITY MONITORING BY A DISTRIBUTED ROBOTIC SYSTEM
Document Type and Number:
WIPO Patent Application WO/2018/051349
Kind Code:
A1
Abstract:
Methods and systems of detecting a facility condition based on status data sensed within a facility using a plurality of mobile sensing platforms. The status data include sensing data and positions of the sensing. The facility condition is determined by operation of a computer processor using an associating data structure, which associates status data patterns to facility conditions. The status data patterns are defined in a combination of the status data from more than one of the mobile sensing platforms, and can form a status data pattern not defined in status data from any single one of the mobile sensing platforms.

Inventors:
EINAV OMER (IL)
ROSENMANN SHMUEL (IL)
TALISMAN DROR (IL)
BEN BASAT TAL HAIM (IL)
Application Number:
PCT/IL2017/051045
Publication Date:
March 22, 2018
Filing Date:
September 14, 2017
Assignee:
R A S ROBOTICS ARTIFICIAL INTELLIGENCE LTD (IL)
International Classes:
G08B23/00; B25J9/00; G06N5/00
Foreign References:
US 2014/0266669 A1 (2014-09-18)
US 2014/0350890 A1 (2014-11-27)
US 2013/0297071 A1 (2013-11-07)
US 2014/0091811 A1 (2014-04-03)
US 2015/0283702 A1 (2015-10-08)
Attorney, Agent or Firm:
EHRLICH, Gal et al. (IL)
Claims:
WHAT IS CLAIMED IS:

1. A method of detecting a facility condition, comprising:

sensing status data within a facility using mobile sensing platforms, wherein the status data include sensing data and positions of the sensing; and

determining a facility condition by operation of a computer processor using an associating data structure associating status data patterns to facility conditions, based on a status data pattern within the status data, and wherein the status data pattern includes at least one relationship among status data collected from more than one of the mobile sensing platforms.

2. The method of claim 1, wherein the associating data structure comprises weights learned by a machine learning algorithm based on previous observations of association between the facility condition and status data.

3. The method of any one of claims 1-2, wherein the at least one relationship comprises at least one of the group consisting of: a commonality of sensed target, a commonality of being collected from a shared sampling population, and a commonality of sensing a parameter having a component of variance explained by a same source.

4. The method of any one of claims 1-3, wherein the facility condition comprises human occupancy of one or more locations of the facility.

5. The method of claim 4, wherein the facility condition comprises an indication of the number of humans occupying the one or more locations of the facility.

6. The method of claim 4, wherein the determining comprises identifying a signature of a particular individual associated with the human occupancy.

7. The method of claim 6, wherein the human occupancy is furthermore determined to indicate a potential security problem in the facility, based on a status data pattern comprising the recurrence of the signature within a plurality of locations within the facility.

8. The method of claim 6, wherein the human occupancy is furthermore determined to indicate a potential security problem in the facility, based on a status data pattern comprising the signature not matching one or more predetermined criteria.

9. The method of claim 8, wherein the predetermined criteria comprise matching of a signature authorized for access to the one or more locations.

10. The method of any one of claims 8-9, wherein the predetermined criteria comprise matching of a signature of an individual known to have entered the one or more locations through an authorized access way.

11. The method of claim 6, wherein the human occupancy is furthermore determined to indicate a potential security problem in the facility, based on a status data pattern comprising the recurrence of a signature of a particular individual within a single location within the facility, observed by different mobile sensing platforms.

12. The method of any one of claims 7-11, wherein the signature comprises a physical appearance of the particular individual.

13. The method of claim 12, wherein the physical appearance is determined by computerized face recognition.

14. The method of any one of claims 7-13, wherein the signature comprises an audible or visual characteristic of an individual gait.

15. The method of any one of claims 7-13, wherein the signature comprises a characteristic of an individual voice.

16. The method of any one of claims 7-13, wherein the signature comprises a behavior of the particular individual.

17. The method of claim 16, wherein the behavior comprises noise.

18. The method of any one of claims 16-17, wherein the behavior comprises physical contact with elements of the facility.

19. The method of any one of claims 7-18, wherein identifying the signature comprises:

sensing, by a first mobile platform, status data indicating human occupancy of a location;

dispatching a second mobile platform to the location;

sensing, by the second mobile platform, status data identifying the signature.

20. The method of any one of claims 4-19, wherein the sensing is performed during interaction of a mobile sensing platform with a human occupant, and the status data indicates behaviors elicited during the interaction.

21. The method of claim 1, wherein the facility condition comprises the presence of fire in the facility.

22. The method of any one of claims 1-21, wherein the mobile sensing platforms include robots, and the positions are moved to by the robots at least partially autonomously.

23. The method of claim 22, wherein positions moved to by the robots are selected for performance of a task comprising at least one of cleaning and carrying.

24. The method of any one of claims 1-23, wherein the status data is sensed by at least one of the group consisting of a motion sensor, an obstruction sensor, and a temperature sensor.

25. The method of any one of claims 1-24, wherein the status data is sensed by at least one of the group consisting of a sound sensor, a visible light camera, and an infrared camera.

26. The method of claim 25, wherein the status data is sensed by processing by a processor of data received from the at least one of the group consisting of a sound sensor, a visible light camera, and an infrared camera.

27. The method of any one of claims 1-25, comprising moving the mobile sensing platforms, based on the determining.

28. The method of claim 27, wherein the mobile sensing platforms are moved in locations surrounding positions wherein the status data is sensed.

29. The method of any one of claims 27-28, wherein the mobile sensing platforms are moved in facility spaces away from positions wherein the status data is sensed.

30. The method of any one of claims 27-29, comprising interacting with human individuals using the moved mobile sensing platforms, wherein the interacting is based on the determining.

31. The method of any one of claims 27-29, comprising interfering with behaviors of human individuals using the moved mobile sensing platforms, wherein the interfering is based on the determining.

32. The method of any one of claims 27-29, comprising guiding behaviors of human individuals using the moved mobile sensing platforms, wherein the guiding is based on the determining.

33. The method of any one of claims 1-25, comprising adjusting sensing by the mobile sensing platforms, based on the determining.

34. The method of claim 33, comprising sensing additional status data after the adjusting; and determining a new facility condition, based on the additional status data.

35. A system for detecting a facility condition, comprising:

a plurality of mobile sensing platforms configured to sense status data comprising facility status indications associated with positions of the mobile sensing platforms; and

a computer processor in data communication with the mobile sensing platforms, and configured to determine a facility condition using an associating data structure associating status data patterns to facility conditions, based on a status data pattern within the status data, and wherein the status data pattern includes at least one relationship among status data collected from more than one of the mobile sensing platforms.

36. The system of claim 35, wherein the mobile sensing platforms include robots.

37. The system of claim 36, wherein mobility of the robots is at least partially autonomous.

38. The system of claim 35, wherein the associating data structure comprises weights learned by a machine learning algorithm based on previous observations of association between the facility condition and status data.

39. The system of any one of claims 36-37, wherein the robots comprise robots configured to perform facility operations tasks.

40. The system of claim 39, wherein the facility operations tasks comprise at least one of vacuum cleaning, carpet cleaning, food carrying, item delivery, item dispensing, luggage carrying, and laundry carrying.

41. A method of associating sensed status data to a facility condition, comprising:

sensing status data within a facility using mobile sensing platforms, wherein the status data include positions of the mobile sensing platforms associated with facility status indications sensed from the positions;

determining one or more facility conditions co-occurring with the sensing; and

calculating an associating data structure associating status data patterns, defined in a combination of the status data positions and indications, to the one or more facility conditions, wherein the defined pattern is based on at least one relationship among status data collected from more than one of the mobile sensing platforms.

42. The method of claim 41, wherein the associating data structure comprises weights of a neural network.

43. The method of claim 41, wherein the calculating comprises machine learning by using the facility conditions as training feedbacks to a neural network.

44. A method of managing a facility condition, comprising:

sensing first status data within a facility using at least a first mobile sensing platform;

after an interval of at least one minute, sensing second status data within the facility using at least a second mobile sensing platform;

wherein the first and second status data include positions of the mobile sensing platforms associated with facility status indications sensed from the positions;

determining a facility condition by operation of a computer processor using a combination of the first and second status data; and

operating at least one mobile sensing platform to modify the facility condition, based on the determining.

45. The method of claim 44, comprising interacting with human individuals using the operated at least one mobile sensing platform, wherein the interacting is based on the determining.

46. The method of any one of claims 44-45, comprising interfering with behaviors of human individuals using the operated at least one mobile sensing platform, wherein the interfering is based on the determining.

47. The method of any one of claims 44-45, comprising guiding behaviors of human individuals using the operated at least one mobile sensing platform, wherein the guiding is based on the determining.

48. A method of identifying an unmonitored zone for a security monitoring system of a facility, comprising:

monitoring, from an automatic monitoring station of the security monitoring system, and using sensors of the security monitoring system, a mobile robot as it moves through a security area;

identifying loss of monitoring of the mobile robot;

signaling to the robot that monitoring has been lost; and

receiving, from the robot, the position of the robot when monitoring is lost, as an indication of a position of the unmonitored zone.

49. The method of claim 48, comprising using the mobile robot to monitor the indicated unmonitored zone.

50. The method of claim 48, comprising dispatching a second robot to a position monitoring the indicated unmonitored zone.

51. A method of calibrating sensors among a heterogeneous plurality of robots, comprising iteratively:

measuring one or more parameters, using sensors of each of the plurality of robots;

calibrating the measurements according to a set of corrections to produce calibrated measurements; and

adjusting the set of corrections, based on inconsistency of the calibrated measurements with one or more reference values produced from the calibrated measurements;

wherein the reference values are produced using a weighted combination of the calibrated measurements, and wherein the weights assigned are relatively larger for calibrated measurements from sensors which previously have produced calibrated measurements having relatively more consistency with the one or more reference values.

Description:
FACILITY MONITORING BY A DISTRIBUTED ROBOTIC SYSTEM

RELATED APPLICATIONS

This application claims the benefit of priority under 35 USC § 119(e) of U.S. Provisional Patent Application No. 62/394,772, filed September 15, 2016, the contents of which are incorporated herein by reference in their entirety.

FIELD AND BACKGROUND OF THE INVENTION

The present invention, in some embodiments thereof, relates to the field of robotics and more particularly, to distributed robotic systems.

Autonomous robotic systems capable of self-maintenance (e.g., charge maintenance), environmental sensing, task performance, and/or autonomous navigation have found uses in architectural facility contexts for services such as floor cleaning, pool maintenance, lawn mowing, delivery, social interaction, and/or facility patrolling.

Some types of distributed robotic systems have been described, for example robotic swarms, wherein emergent behavior is produced from a plurality of robots acting together while following simple rules. Another robotic paradigm has been described as a "ubiquitous robot" paradigm (e.g., Kim et al. Ubiquitous Robot: A New Paradigm for Integrated Services, 2007 IEEE International Conference on Robotics and Automation, Roma, Italy, 10-14 April 2007), in which robotic roles are distributed among a plurality of different robotic types. Split robotic systems comprise robotic parts which can divide and recombine, for example to assume different functional configurations.

SUMMARY OF THE INVENTION

There is provided, in accordance with some embodiments of the present disclosure, a method of detecting a facility condition, comprising: sensing status data within a facility using mobile sensing platforms, wherein the status data include sensing data and positions of the sensing; and determining a facility condition by operation of a computer processor using an associating data structure associating status data patterns to facility conditions, based on a status data pattern within the status data, and wherein the status data pattern includes at least one relationship among status data collected from more than one of the mobile sensing platforms.

In some embodiments, the associating data structure comprises weights learned by a machine learning algorithm based on previous observations of association between the facility condition and status data.

In some embodiments, the at least one relationship comprises at least one of the group consisting of: a commonality of sensed target, a commonality of being collected from a shared sampling population, and a commonality of sensing a parameter having a component of variance explained by a same source.

In some embodiments, the facility condition comprises human occupancy of one or more locations of the facility.

In some embodiments, the facility condition comprises an indication of the number of humans occupying the one or more locations of the facility.

In some embodiments, the determining comprises identifying a signature of a particular individual associated with the human occupancy.

In some embodiments, the human occupancy is furthermore determined to indicate a potential security problem in the facility, based on a status data pattern comprising the recurrence of the signature within a plurality of locations within the facility.

In some embodiments, the human occupancy is furthermore determined to indicate a potential security problem in the facility, based on a status data pattern comprising the signature not matching one or more predetermined criteria.

In some embodiments, the predetermined criteria comprise matching of a signature authorized for access to the one or more locations.

In some embodiments, the predetermined criteria comprise matching of a signature of an individual known to have entered the one or more locations through an authorized access way.

In some embodiments, the human occupancy is furthermore determined to indicate a potential security problem in the facility, based on a status data pattern comprising the recurrence of a signature of a particular individual within a single location within the facility, observed by different mobile sensing platforms.

In some embodiments, the signature comprises a physical appearance of the particular individual.

In some embodiments, the physical appearance is determined by computerized face recognition.

In some embodiments, the signature comprises an audible or visual characteristic of an individual gait.

In some embodiments, the signature comprises a characteristic of an individual voice.

In some embodiments, the signature comprises a behavior of the particular individual.

In some embodiments, the behavior comprises noise.

In some embodiments, the behavior comprises physical contact with elements of the facility.

In some embodiments, identifying the signature comprises: sensing, by a first mobile platform, status data indicating human occupancy of a location; dispatching a second mobile platform to the location; sensing, by the second mobile platform, status data identifying the signature.

In some embodiments, the sensing is performed during interaction of a mobile sensing platform with a human occupant, and the status data indicates behaviors elicited during the interaction.

In some embodiments, the facility condition comprises the presence of fire in the facility.

In some embodiments, the mobile sensing platforms include robots, and the positions are moved to by the robots at least partially autonomously.

In some embodiments, positions moved to by the robots are selected for performance of a task comprising at least one of cleaning and carrying.

In some embodiments, the status data is sensed by at least one of the group consisting of a motion sensor, an obstruction sensor, and a temperature sensor.

In some embodiments, the status data is sensed by at least one of the group consisting of a sound sensor, a visible light camera, and an infrared camera.

In some embodiments, the status data is sensed by processing by a processor of data received from the at least one of the group consisting of a sound sensor, a visible light camera, and an infrared camera.

In some embodiments, the method comprises moving the mobile sensing platforms, based on the determining.

In some embodiments, the mobile sensing platforms are moved in locations surrounding positions wherein the status data is sensed.

In some embodiments, the mobile sensing platforms are moved in facility spaces away from positions wherein the status data is sensed.

In some embodiments, the method comprises interacting with human individuals using the moved mobile sensing platforms, wherein the interacting is based on the determining.

In some embodiments, the method comprises interfering with behaviors of human individuals using the moved mobile sensing platforms, wherein the interfering is based on the determining.

In some embodiments, the method comprises guiding behaviors of human individuals using the moved mobile sensing platforms, wherein the guiding is based on the determining.

In some embodiments, the method comprises adjusting sensing by the mobile sensing platforms, based on the determining.

In some embodiments, the method comprises sensing additional status data after the adjusting; and determining a new facility condition, based on the additional status data.

There is provided, in accordance with some embodiments of the present disclosure, a system for detecting a facility condition, comprising: a plurality of mobile sensing platforms configured to sense status data comprising facility status indications associated with positions of the mobile sensing platforms; and a computer processor in data communication with the mobile sensing platforms, and configured to determine a facility condition using an associating data structure associating status data patterns to facility conditions, based on a status data pattern within the status data, and wherein the status data pattern includes at least one relationship among status data collected from more than one of the mobile sensing platforms.

In some embodiments, the mobile sensing platforms include robots.

In some embodiments, mobility of the robots is at least partially autonomous.

In some embodiments, the associating data structure comprises weights learned by a machine learning algorithm based on previous observations of association between the facility condition and status data.

In some embodiments, the robots comprise robots configured to perform facility operations tasks.

In some embodiments, the facility operations tasks comprise at least one of vacuum cleaning, carpet cleaning, food carrying, item delivery, item dispensing, luggage carrying, and laundry carrying.

There is provided, in accordance with some embodiments of the present disclosure, a method of associating sensed status data to a facility condition, comprising: sensing status data within a facility using mobile sensing platforms, wherein the status data include positions of the mobile sensing platforms associated with facility status indications sensed from the positions; determining one or more facility conditions co-occurring with the sensing; and calculating an associating data structure associating status data patterns, defined in a combination of the status data positions and indications, to the one or more facility conditions, wherein the defined pattern is based on at least one relationship among status data collected from more than one of the mobile sensing platforms.

In some embodiments, the associating data structure comprises weights of a neural network.

In some embodiments, the calculating comprises machine learning by using the facility conditions as training feedbacks to a neural network.

There is provided, in accordance with some embodiments of the present disclosure, a method of managing a facility condition, comprising: sensing first status data within a facility using at least a first mobile sensing platform; after an interval of at least one minute, sensing second status data within the facility using at least a second mobile sensing platform; wherein the first and second status data include positions of the mobile sensing platforms associated with facility status indications sensed from the positions; determining a facility condition by operation of a computer processor using a combination of the first and second status data; and operating at least one mobile sensing platform to modify the facility condition, based on the determining.

In some embodiments, the method comprises interacting with human individuals using the operated at least one mobile sensing platform, wherein the interacting is based on the determining.

In some embodiments, the method comprises interfering with behaviors of human individuals using the operated at least one mobile sensing platform, wherein the interfering is based on the determining.

In some embodiments, the method comprises guiding behaviors of human individuals using the operated at least one mobile sensing platform, wherein the guiding is based on the determining.

There is provided, in accordance with some embodiments of the present disclosure, a method of identifying an unmonitored zone for a security monitoring system of a facility, comprising: monitoring, from an automatic monitoring station of the security monitoring system, and using sensors of the security monitoring system, a mobile robot as it moves through a security area; identifying loss of monitoring of the mobile robot; signaling to the robot that monitoring has been lost; and receiving, from the robot, the position of the robot when monitoring is lost, as an indication of a position of the unmonitored zone.

In some embodiments, the method comprises using the mobile robot to monitor the indicated unmonitored zone.

In some embodiments, the method comprises dispatching a second robot to a position monitoring the indicated unmonitored zone.

There is provided, in accordance with some embodiments of the present disclosure, a method of calibrating sensors among a heterogeneous plurality of robots, comprising iteratively: measuring one or more parameters, using sensors of each of the plurality of robots; calibrating the measurements according to a set of corrections to produce calibrated measurements; and adjusting the set of corrections, based on inconsistency of the calibrated measurements with one or more reference values produced from the calibrated measurements; wherein the reference values are produced using a weighted combination of the calibrated measurements, and wherein the weights assigned are relatively larger for calibrated measurements from sensors which previously have produced calibrated measurements having relatively more consistency with the one or more reference values.

Unless otherwise defined, all technical and/or scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which the invention pertains. Although methods and materials similar or equivalent to those described herein can be used in the practice or testing of embodiments of the invention, exemplary methods and/or materials are described below. In case of conflict, the patent specification, including definitions, will control. In addition, the materials, methods, and examples are illustrative only and are not intended to be necessarily limiting.

As will be appreciated by one skilled in the art, aspects of the present disclosure may be embodied as a system, method or computer program product. Accordingly, aspects of the present disclosure may take the form of an entirely hardware embodiment, an entirely software embodiment (including firmware, resident software, micro-code, etc.) or an embodiment combining software and hardware aspects that may all generally be referred to herein as a "circuit," "module" or "system." Furthermore, some embodiments of the present disclosure may take the form of a computer program product embodied in one or more computer readable medium(s) having computer readable program code embodied thereon. Implementation of the method and/or system of some embodiments of the disclosure can involve performing and/or completing selected tasks manually, automatically, or a combination thereof. Moreover, according to actual instrumentation and equipment of some embodiments of methods, systems, and/or computer program products of the present disclosure, several selected tasks could be implemented by hardware, by software or by firmware and/or by a combination thereof, e.g., using an operating system.

For example, hardware for performing selected tasks according to some embodiments of the present disclosure could be implemented as a chip or a circuit. As software, selected tasks according to some embodiments of the present disclosure could be implemented as a plurality of software instructions being executed by a computer using any suitable operating system. In an exemplary embodiment, one or more tasks according to some exemplary embodiments of method and/or system as described herein are performed by a data processor, such as a computing platform for executing a plurality of instructions. Optionally, the data processor includes a volatile memory for storing instructions and/or data and/or a non-volatile storage, for example, a magnetic hard-disk and/or removable media, for storing instructions and/or data. Optionally, a network connection is provided as well. A display and/or a user input device such as a keyboard or mouse are optionally provided as well.

Any combination of one or more computer readable medium(s) may be utilized for some embodiments. The computer readable medium may be a computer readable signal medium or a computer readable storage medium. A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. More specific examples (a non-exhaustive list) of the computer readable storage medium would include the following: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or Flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the context of this document, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device.

A computer readable signal medium may include a propagated data signal with computer readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated signal may take any of a variety of forms, including, but not limited to, electromagnetic, optical, or any suitable combination thereof. A computer readable signal medium may be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.

Program code embodied on a computer readable medium and/or data used thereby may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, RF, etc., or any suitable combination of the foregoing.

Computer program code for carrying out operations for some embodiments of the present disclosure may be written in any combination of one or more programming languages, including an object oriented programming language such as Java, Smalltalk, C++ or the like and conventional procedural programming languages, such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer or entirely on the remote computer or server. In the latter scenario, the remote computer may be connected to the user's computer through any type of network, including a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet Service Provider).

Some embodiments of the present disclosure may be described below with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems) and computer program products. It will be understood that each block of the flowchart illustrations and/or block diagrams, and combinations of blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks.

These computer program instructions may also be stored in a computer readable medium that can direct a computer, other programmable data processing apparatus, or other devices to function in a particular manner, such that the instructions stored in the computer readable medium produce an article of manufacture including instructions which implement the function/act specified in the flowchart and/or block diagram block or blocks.

The computer program instructions may also be loaded onto a computer, other programmable data processing apparatus, or other devices to cause a series of operational steps to be performed on the computer, other programmable apparatus or other devices to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide processes for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks.

BRIEF DESCRIPTION OF THE SEVERAL VIEWS OF THE DRAWINGS

Some embodiments of the present disclosure are described, by way of example only, with reference to the accompanying drawings. With specific reference now to the drawings in detail, it is stressed that the particulars shown are by way of example, and for purposes of illustrative discussion. In this regard, the description taken with the drawings makes apparent to those skilled in the art how embodiments of the present disclosure may be practiced.

In the drawings:

FIG. 1 is a flowchart schematically illustrating a method of determining a facility condition based on distributed sensing of facility status data, according to some embodiments of the present disclosure;

FIG. 2 schematically represents a system for distributed sensing of a facility condition, e.g., according to the method of Figure 1, according to some embodiments of the present disclosure;

FIG. 3 is a flowchart schematically representing a method of identifying individual facility occupants based on partial matching of a signature, according to some embodiments of the present disclosure;

FIG. 4 schematically represents a method of distributed capacity utilization sensing, and optionally actions to modify capacity utilization, according to some embodiments of the present disclosure;

FIG. 5 schematically represents a method of interacting with identified individual facility occupants, based on a profile of past interactions with the facility occupant, according to some embodiments of the present disclosure;

FIG. 6 schematically represents a method of detecting and optionally responding to a safety and/or security concern, according to some embodiments of the present disclosure;

FIG. 7 is a flowchart schematically describing a method for estimation and calibration of robotic sensor bias, according to some embodiments of the present disclosure;

FIG. 8 is a schematic representation of an environment of a distributed robotic system, according to some embodiments of the present disclosure;

FIG. 9 is a top view of a multi-location robotic environment (e.g., an environment comprising distinct and optionally separated areas), according to some embodiments of the present disclosure;

FIG. 10 schematically represents the floor plan of an architectural facility in which a distributed robotic system is operating, according to some exemplary embodiments of the invention; and

FIG. 11 schematically represents the floor plan of an architectural facility in which a distributed robotic system is operating during a period of safety concern, according to some exemplary embodiments of the invention.

DESCRIPTION OF SPECIFIC EMBODIMENTS OF THE INVENTION

The present invention, in some embodiments thereof, relates to the field of robotics and more particularly, to distributed robotic systems.

Overview

A broad aspect of some embodiments of the invention relates to distributed robotic systems in one or more of the following aspects and/or applications:

• Distributed data collection;

• Applications in the hospitality industry, for example guest/facility interactions;

• Security, optionally including detection of abnormal and/or suspicious behavior;

• Behavior monitoring in elderly care;

• Position estimation in a distributed robotic system environment;

• Commercial advertising;

• Human/robot interfacing in work environments;

• Medical care/assistance in a hospital setting;

• Navigation, crew support, mechanical maintenance, and/or cargo handling in air transport.

An aspect of some embodiments of the present invention relates to systems and methods using a plurality of mobile sensing platforms for status data collection. From the status data, including from a relationship between status data collected from more than one of the mobile sensing platforms, there is determined a condition of an architectural facility (herein, "facility"). In the determining, data collected from the plurality of mobile sensing platforms are combined to determine a facility status pattern; and this facility status pattern indicates a facility condition to a level of detail and/or certainty potentially unavailable from any of the status data sources individually.

In some embodiments, status data combined to determine a facility status pattern are obtained at different times and/or places, optionally including at times and/or places which are disjoint. For example, status data in the pattern are collected and/or processed in periods separated by at least 1 minute, 2 minutes, 5 minutes, 10 minutes, or another interval. Status data are optionally collected from locations separated by being in separate rooms of a facility, and/or separated by a distance, for example at least 3 meters, 5 meters, 10 meters, 15 meters, or another distance.

Types of relationships among status data collected by different mobile sensing platforms and included in a single facility status pattern optionally include a commonality of sensed target (e.g., the same person, group, object, and/or room), a commonality of being collected from a shared sampling population (e.g., a population of facility occupants within a certain portion of the facility), and/or a commonality of sensing a parameter having a component of variance explained by a same source (e.g., a same fire at a certain location explaining variance in temperature measurements at different locations measured by different mobile sensing platforms).
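By way of illustration only, the following Python sketch shows one hypothetical way such status records, and the cross-platform and time-separation properties of a status pattern, might be represented; all names, fields, and thresholds are assumptions of the example rather than details taken from the disclosure.

# A hypothetical representation of status data records: each record pairs a
# sensed facility status indication with the position and time of the sensing.
from dataclasses import dataclass

@dataclass
class StatusRecord:
    platform_id: str   # which mobile sensing platform produced the record
    position: tuple    # (x, y) position of the sensing, in shared-map coordinates
    timestamp: float   # time of the sensing, in seconds
    indication: dict   # sensed indication, e.g. {"temperature_c": 41.5}

def spans_multiple_platforms(records):
    """A facility status pattern, as described above, combines status data
    from more than one mobile sensing platform."""
    return len({r.platform_id for r in records}) > 1

def time_separated(records, min_interval_s=60.0):
    """True if some of the records were collected in periods separated by at
    least the given interval (e.g., the 1-minute separation mentioned above)."""
    times = sorted(r.timestamp for r in records)
    return any(later - earlier >= min_interval_s
               for earlier, later in zip(times, times[1:]))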

The status pattern indicates the facility condition by association, e.g., association learned using deep learning algorithms from the field of machine learning (ML). Use of association learning provides potential advantages for flexibility and/or expanded use cases of distributed robotic systems comprising optionally heterogeneous collections of robots and associated mobile sensing platforms.
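As a rough indication of what a learned "associating data structure" could look like in practice, the sketch below maintains a weight matrix, trained by a simple softmax/cross-entropy rule, that associates status-pattern feature vectors with facility-condition labels. The fixed-length feature encoding, the dimensions, and the condition labels are illustrative assumptions, not details taken from the disclosure.

import numpy as np

rng = np.random.default_rng(0)
n_features, n_conditions = 16, 4   # e.g., occupancy / fire / spill / none (illustrative)
W = rng.normal(scale=0.01, size=(n_features, n_conditions))   # the associating weights

def softmax(z):
    z = z - z.max()
    e = np.exp(z)
    return e / e.sum()

def train_step(x, condition_idx, lr=0.1):
    """One supervised update: x encodes an observed status pattern,
    condition_idx the facility condition logged for that epoch."""
    global W
    p = softmax(x @ W)
    grad = np.outer(x, p)
    grad[:, condition_idx] -= x   # gradient of the cross-entropy loss
    W -= lr * grad

def predict_condition(x):
    """Match a status pattern to its most strongly associated condition."""
    return int(np.argmax(x @ W))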

Mobile sensing platforms of a distributed robotic system, according to some embodiments of the invention, include a plurality of robots capable of navigating within the facility, and/or robot-attached sensing hardware. Movement may be by any suitable method (e.g., by use of wheels, propellers, and/or leg appendages). Optionally, robots may act under substantially autonomous control, and/or under the control of one or more central controllers. Regardless of the distributed architecture type implemented, there is sharing of data from a plurality of the mobile sensing platforms to one or more processing nodes (the robots themselves and/or one or more central processors) for the determination of facility conditions. Insofar as a mobile sensing platform is optionally integrated with control of its own navigation hardware, it "is" a robot for purposes of descriptions herein. Self-navigation may provide capabilities for enhancing sensing functions, such as autonomous navigation to engage and/or inspect a target of interest (e.g., a portion of a facility space, a human occupant of the facility) from a different position. In some embodiments, at least some mobile sensing platforms are passive riders on a robot, without precluding that data sensed by passive-riding mobile sensing platforms may be used by an intermediate controller (e.g., a central controller) of the distributed robotic system as input for navigation or other control of the robot being ridden.

Optionally, the passive-riding mobile sensing platforms are physically attached to robots (e.g., as add-on modules), but otherwise operate to sense and process data separately from the robot itself. Optionally or alternatively, the passive-riding mobile sensing platforms take advantage of data (e.g., wireless connectivity), power (e.g., power over USB), and/or other services provided by the navigable robot. Optionally, any passive (non-navigation dependent) task described herein that may be undertaken by a mobile sensing platform may be additionally or alternatively performed with the use of a suitably located stationary-location sensing platform (e.g., a so-called "smart" device or system, such as an "internet of things" (IoT)-enabled camera). By "stationary-location" is meant that the sensing platform generally remains in a single location (does not autonomously transport itself), but this does not exclude movements-in-place such as camera panning. Some embodiments of the distributed robotic system may operate using such stationary-location sensing platforms. However, it is a particular feature and potential advantage of some embodiments of the present invention that a plurality of sensing platforms is mobile, for example as explained in the descriptions of Figure 1.

In some embodiments, one or more of the robots in the distributed robotic system is configured for at least one facility operations task apart from use in distributed sensing. Examples of robotic facility operations tasks include: vacuum cleaning (of floors and/or other surfaces), carpet shampooing, other floor cleaning, food carrying, luggage carrying, laundry carrying, dispensing/distribution (e.g., dispensing and/or distribution of soaps, towels and/or linens, reading materials, coupons, and/or gift shop items such as flowers), and/or facility address (room/building) delivery (e.g., of packages, envelopes, and/or shopping deliveries).

Optionally, one or more of the robots in the distributed robotic system is equipped for social robotics: interactive exchanges with human facility occupants. Such exchanges may occur as part of another facility operations task, or as a facility operations task in itself. Equipment used to provide social robotics capabilities to a robot comprises, for example, a display (e.g., an electronic screen, scanning laser, patterned light projector, and/or image projector), user input controls, speech production and/or recognition capability, and/or a facial, posture, and/or gesture expression generator.

The method of facility condition determination is optionally used to assist one or more robotic facility operations functions. Optionally, status data sensing by a particular robot (and/or a mobile sensing platform it carries) is carried out as a function separate from a robot's facility operations task(s).

Optionally, a facility operations task comprises a robot's regular task function (optionally also a robot's primary design purpose). Regular task functions are designated, for example, by task-specialized tooling and/or form factors (such as a vacuum cleaner, manipulator, carrying compartment and/or facial expression generator) and/or by a task function which governs at least 50% of the activities and/or repositioning movements (e.g., as measured by distance and/or time of movement) of the robot among spaces of the facility.

Herein, a "facility condition" relates to a condition of facility architectural structures and/or object contents of a facility, and/or to conditions involving human occupants that affect functioning of the facility. Other terms paired with "condition" described herein refer to particular instances and/or types of facility conditions. Types and examples of facility conditions are described in relation to the figures, e.g. , beginning with facility condition types defined in the descriptions of Figure 1.

"Facility status data" (also referred to as "status data") comprises data sensed using mobile sensing platforms and used in determining an associated facility condition. Both raw collected data and collected data processed to produce an intermediate result are considered status data; for example, a raw video feed optionally is provided as status data; also a data structure indicating detection of some recognized target (such as a face, door, piece of furniture, or another object) optionally is provided as status data. In either case, the status data is referred to herein as being "sensed" by a "sensing capability".

The mutually equivalent terms "facility status pattern", "status data pattern", and "status pattern" refer to monitoring data (and optionally monitoring-adjunct data, e.g., position, time, and/or scheduling information) which potentially indicate facility conditions, in some embodiments of the present invention. More particularly, status patterns comprise status data in suitable combinations useful for facility condition indication. Other terms paired with "status" described herein refer to particular instances and/or types of status data and/or status patterns. Types and examples of status data and status patterns are described in relation to the figures, e.g., optional types of core sensing capabilities listed in relation to Figure 1.

In some embodiments of the invention, another feature of status data sensed using one or more sensing modalities is that they include (e.g., are logged along with) positions within a facility of the mobile sensing platform at the time of sensing. Optionally or additionally (for remote sensing modalities), a position being sensed remotely is logged. Optionally, the positions are described in the status data so that they can be related to positions within a navigation map used in common among the mobile sensing platforms.
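For illustration only, assuming planar (2-D) geometry and a range/bearing style of remote sensing (neither of which is specified above), a remotely sensed position might be projected into the common navigation-map frame before being logged with the status data:

import math

def sensed_position_on_map(platform_xy, platform_heading_rad, range_m, bearing_rad):
    """Project a range/bearing observation, taken from a platform's pose on
    the shared navigation map, into map coordinates."""
    theta = platform_heading_rad + bearing_rad
    return (platform_xy[0] + range_m * math.cos(theta),
            platform_xy[1] + range_m * math.sin(theta))

# e.g., a heat source sensed 4 m away, 0.5 rad left of the platform's heading:
# sensed_position_on_map((12.0, 3.5), 1.2, 4.0, 0.5)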

An aspect of some embodiments of the present invention relates to systems and methods using a plurality of robots and/or associated mobile sensing platforms for distributed agent management of facility conditions. In some embodiments, sensing of facility conditions is distributed among multiple mobile (and optionally stationary-location) sensing platforms (for example as just described), while actions to manage the conditions are optionally distributed to agents not involved in the sensing, and/or to agents which provide only fragmentary portions of status data used in determining the facility condition.

In some embodiments, the "agents" comprise automatic agents, in particular robots used as and/or with mobile sensing platforms, and optionally also other robots (for example, robots specialized by their appearance and/or capabilities for robot-human interaction) computer-driven displays, speakers, and/or kiosks.

Agent actions may include any suitable action initiated to adjust a facility condition. In some embodiments, actions are optionally directed toward human occupants of a facility to, for example, inform, persuade, and/or guide them. With respect to human occupants determined by the system to be undertaking actions considered undesirable, actions may be directed to, for example, discourage, interfere with, and/or reprimand them. Actions may include notifying and/or attracting the attention of facility occupants (e.g., staff members) to adjust the facility condition. For robotic agents, actions may include performing a normal facility-oriented activity of the robotic agent, e.g., cleaning a spill, delivery (e.g., of food, drink, toiletry incidentals, packages and/or envelopes), collecting a laundry bag, or another task appropriate to the capabilities of the individual robot.

An aspect of some embodiments of the present invention relates to the enhancement of automatic security monitoring within a facility by self-detection of security monitoring dead zones. Herein, a security dead zone (alternatively and equivalently, "dead zone" or "unmonitored zone") is considered to exist in a facility location which a security monitoring system for a facility is unable to monitor for some reason. Dead zones may be dynamic. For example, private guest rooms may be restricted from security monitoring except under defined circumstances such as an emergency, and/or during room maintenance by a facility employee. Dead zones may also occur due to malfunctions within the security monitoring system. Dead zones may be known limitations of the system, due, for example, to constraints on allocation of resources.

In some embodiments, a security system comprises one or more robotic agents which also act as mobile sensing platforms. The security system is configured, optionally using a networked combination of stationary-location and mobile sensors, to monitor activities (e.g., by video recording) within monitored regions of an architectural facility. In some embodiments, one or more automatic monitoring stations are provided which are in functional communication with the robotic agents and any other sensors of the security system. The security system may be dynamic in the areas of monitoring coverage and/or in coverage area requirements (e.g., due to movement of people and/or objects within the facility). In some embodiments, robotic agents entering an area for which more complete security coverage is potentially required are configured to indicate their presence to detectors of the security system (and through them to the monitoring station), for example by flashing light, making sensor-triggering movements, making noise, or another method. When the robot's indication disappears from the security system's monitoring, yet the robot remains within an area designated for more complete coverage, the security system determines that a security dead zone exists. Optionally, the robot continues to map the security dead zone, creating a map which may be used as a basis for corrective action, for example, installation of an additional camera, or dispatch of a mobile sensing platform. Optionally, the robot used to probe the dead zone is itself reconfigured to become part of the security monitoring network, e.g., while monitoring and/or occupying the identified dead zone.
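The dead-zone determination described above might be skeletonized as follows; the function and argument names are hypothetical, intended only to make the sequence concrete.

def check_for_dead_zone(indication_visible, robot_in_coverage_area,
                        robot_reported_position, dead_zone_positions):
    """One monitoring-station check: if the robot's indication (flashing
    light, noise, etc.) has disappeared while the robot reports itself inside
    an area designated for coverage, its self-reported position is recorded
    as an indication of an unmonitored zone."""
    if robot_in_coverage_area and not indication_visible:
        dead_zone_positions.append(robot_reported_position)
        return True   # the station signals the robot, which may continue mapping the zone
    return False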

An aspect of some embodiments of the present invention relates to mutual calibration of sensing by mobile sensing platforms within a distributed robotic system, used for position control of the mobile sensing platforms within the distributed robotic system. In some embodiments, one or more aspects of the calibration are calculated using machine learning techniques, for example, one or more deep learning algorithms.

A single robot (considered as an example of a mobile sensing platform) is limited to its own measurement abilities. However, in some embodiments of the invention, a plurality of robots can compensate for each other's possible limitations by suitable processing, filtering and analyzing of data collected by the aggregate.

In some embodiments, positions of mobile sensing platforms are determined relative to a navigation map of the facility (or other area being monitored). The navigation map is shared in common among the plurality of mobile sensing platforms, and dynamically updated using sensing data from the mobile sensing platforms themselves. Optionally, sensing data from different mobile sensing platforms are calibrated based on a consensus-finding calibration algorithm to produce a consistent map. For example, the consensus-finding calibration algorithm may comprise an average, and optionally an average weighted according to accuracies and/or precisions of mobile sensing platform position information. In some embodiments, the weighting is determined according to past consistency: weights are relatively larger for measurements from sensors whose previously calibrated measurements were relatively more consistent with the one or more reference values.
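A minimal numerical sketch of one iteration of such a consensus-finding calibration follows; the particular update and weighting rules are assumptions chosen to mirror the qualitative description above, not a prescribed algorithm.

import numpy as np

def calibration_iteration(raw, corrections, weights, lr=0.5):
    """raw, corrections, weights: 1-D arrays with one entry per sensor across
    the robots. Returns the adjusted corrections and weights."""
    calibrated = raw + corrections
    reference = np.average(calibrated, weights=weights)   # weighted consensus reference
    error = calibrated - reference                        # inconsistency with the reference
    new_corrections = corrections - lr * error            # adjust the set of corrections
    new_weights = weights / (np.abs(error) + 1e-6)        # favor historically consistent sensors
    return new_corrections, new_weights / new_weights.sum()

# e.g., three sensors measuring a shared quantity:
# c, w = calibration_iteration(np.array([20.3, 19.8, 23.0]), np.zeros(3), np.ones(3) / 3)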

An aspect of some embodiments of the present invention relates to systems and methods for machine learning of status data patterns indicating facility conditions.

In some embodiments, status data sensing is performed using sensing capabilities which are general purpose and/or raw (e.g., image collection, audio collection, position sensing, motion detection), and/or derived by computer processing of raw input, e.g., to recognize visible, audible, and/or otherwise indicated environmental features and/or their positions. The processing may be performed by use of any method suitable for the features; for example, convolution/threshold based image processing, and/or neural-network based object detection.

In some embodiments of the invention, status data patterns are based on raw status data and/or features processed from raw status data. Use of raw status data provides a potential advantage for preserving maximum latent information for pattern identification. However, in large quantities, raw data can be cumbersome to transmit and store (especially for a system on which real-time and in-the-field performance demands are placed). Raw data also leaves open the matter of salience, which can make such data difficult to use for deep learning.

Processed features, at least as identified using current computer-based technologies, generally need to belong to classes of environment features (such as objects of a particular type) for which a recognition capability has been explicitly provided. However, such processed features are likely to be selected because they are considered salient within their environment, and their descriptions can potentially be transmitted, stored, and/or subjected to further processing with substantially less use of resources than raw data.

In some embodiments of the invention, the nature of the association between patterns in status data and facility conditions is one of correlation, rather than of direct identification. In some embodiments, parameters of the correlation, and optionally even its existence, are determined through machine learning. In some embodiments, a procedure for teaching a distributed robotic system a new association between status data patterns and facility conditions comprises supervised learning: e.g., tagging epochs of status data used in training according to one or more facility conditions which are known (e.g., identified from an automatically and/or manually recorded log, and/or identified by manual review of the status data itself) to have occurred during acquisition of the status data. Optionally, status data are filtered based on their suspected salience (e.g., by human selection), and/or salience is learned during the machine learning process. For example, in facility security conditions, positions of identified individuals may be of particular salience, and machine learning is carried out on observations of a large number of different individual position patterns, each of which has been separately scored for its relevance to one or more security concerns.

In some embodiments, unsupervised learning is performed, e.g., to distinguish different classes of patterns latent in the status data. For example, cluster analysis (e.g., machine learning implemented cluster analysis) is optionally carried out to assign cluster classifications to different patterns of identified facility occupant positions, based on any suitable model of clustering (e.g., connectivity, centroid, distribution, density, subspace, group, graph-based, and/or neural). Identified classes are optionally used as a basis for supervised learning. For example, upon recognition that recorded status data are capturing a certain identifiable class of status data pattern (e.g., a pattern of daily towel usage), it may be separately determined (e.g., by machine learning) which facility conditions (e.g., as manually and/or automatically logged) are correlated with that particular pattern (e.g., capacity utilization of a pool the next day; and/or, perhaps more surprisingly, capacity utilization of a restaurant that turns out to be correlated with use of the pool).
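As one concrete (and purely illustrative) instance of the unsupervised step, the sketch below clusters occupant-position patterns, summarized here as fixed-length vectors, using a small k-means routine; the choice of the centroid (k-means) model from among the clustering models listed above, and the feature encoding, are assumptions of the example.

import numpy as np

def kmeans(patterns, k=3, iters=20, seed=0):
    """Assign cluster classifications to status data patterns.
    patterns: array of shape (n_patterns, n_features)."""
    rng = np.random.default_rng(seed)
    centers = patterns[rng.choice(len(patterns), size=k, replace=False)]
    for _ in range(iters):
        # nearest-center assignment, then center update
        labels = np.argmin(((patterns[:, None, :] - centers) ** 2).sum(axis=2), axis=1)
        centers = np.stack([patterns[labels == j].mean(axis=0) if np.any(labels == j)
                            else centers[j] for j in range(k)])
    return labels, centers   # identified classes may then seed supervised learning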

In some embodiments, facility conditions are determined by an iterative ("chaining") process in which the activation of a previously inactive mode of collecting/processing status data used for determining a second facility condition is triggered by the determination of a first facility condition. The second facility condition optionally represents, for example, a verification, refinement, and/or follow-up of the first facility condition. For example, a first facility condition may be based on crude identification of an obstacle in a certain place, and a second facility condition represents identification of the obstacle as a fallen facility occupant.

In some embodiments, collection of status data is controlled to support at least partially automated development of facility condition chains. For example, certain low-level identifications (e.g., an unexpected obstacle) are optionally associated with a facility condition which may be expressed as "system should investigate this facility condition". This can trigger actions related to investigation, such as: bringing in one or more additional robots which may be better suited to investigatory tasks (e.g., having a wider range of status data collection methods, greater processing power, and/or higher quality sensing) than the originally detecting robot; sending more raw data back to be stored and/or processed by a central monitoring processor; changing a priority of the processing of status data of one or more particular types; or taking another action related to status data collecting and/or processing. Potentially, the automatic adjustment of the data collecting mode results in the provision of status data which correlate with a more detailed second facility condition, e.g., to distinguish an obstacle as a facility occupant, and/or to distinguish the facility occupant as conscious or unconscious.
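Schematically, the chaining trigger could be organized as a table from determined conditions to follow-up collection/processing modes, as in this sketch (the condition and mode names are invented for illustration):

FOLLOW_UP_MODES = {
    # first facility condition -> collection/processing modes to activate
    "unexpected_obstacle": ["dispatch_better_equipped_robot",
                            "forward_raw_data_to_central_processor",
                            "raise_status_data_processing_priority"],
}

def on_condition_determined(condition, activate_mode):
    """Activate previously inactive modes whose status data may support a more
    detailed second determination (e.g., obstacle -> fallen occupant,
    conscious or unconscious)."""
    for mode in FOLLOW_UP_MODES.get(condition, []):
        activate_mode(mode)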

Before explaining at least one embodiment of the invention in detail, it is to be understood that the invention is not necessarily limited in its application to the details of construction and the arrangement of the components and/or methods set forth in the following description and/or illustrated in the drawings. The invention is capable of other embodiments or of being practiced or carried out in various ways.

Method of Distributed Sensing of Facility Conditions

Reference is now made to Figure 1, which is a flowchart schematically illustrating a method of determining a facility condition based on distributed sensing of facility status data, according to some embodiments of the present disclosure.

In overview:

At block 101, in some embodiments, a data structure associating status data patterns to at least one facility condition is received.

At block 103, in some embodiments, a plurality of mobile sensing platforms sense status data.

At block 105, in some embodiments, a status pattern in the sensed status data is matched to a facility condition, using the associating data structure.
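As an informal illustration of blocks 101-105 (not part of the disclosure), the associating data structure is reduced below to a plain dictionary from status data patterns to conditions; a learned model could stand in its place, and all names are hypothetical:

```python
# Illustrative sketch only, paraphrasing blocks 101-105 of Figure 1.

def receive_associating_structure():                     # block 101
    return {("crowd_high", "noise_high"): "over_capacity",
            ("crowd_low", "noise_low"): "under_capacity"}

class Platform:
    def __init__(self, feature, position):
        self.feature, self.position = feature, position
    def sense(self):
        # Each platform reports (sensing data, position of the sensing).
        return {"feature": self.feature, "position": self.position}

def sense_status_data(platforms):                        # block 103
    return [p.sense() for p in platforms]

def match_condition(status_data, associating):           # block 105
    pattern = tuple(sorted(d["feature"] for d in status_data))
    return associating.get(pattern, "no condition determined")

platforms = [Platform("crowd_high", (1, 2)), Platform("noise_high", (3, 4))]
associating = receive_associating_structure()
print(match_condition(sense_status_data(platforms), associating))
```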

Terms and activities mentioned in Figure 1 are now described in additional detail.

Facility conditions

In some embodiments of the invention, the at least one facility condition of blocks 101 and 105 can include one or more conditions from a potentially diverse range of facility condition types.

Certain architectural facilities (for example, the buildings, grounds, rooms, and/or other spaces of a hotel, motel, resort, spa, cruise ship and/or port facility, concert venue, sports venue, conference center, visitor center, shopping center, care facility (e.g., nursing home), supervised living facility, airport, hospital, restaurant, museum, theme park and/or other tourist attraction, warehouse, assembly line, and/or office complex (e.g., a high technology office complex)) are subject to event-specific and/or usage conditions, particularly event and/or usage conditions related to human occupancy, which have a potential impact on functions of the facility, comfort and/or convenience of its users, and/or on the safety and/or security of persons and/or property within the facility. In some embodiments of the present invention, one or more of the following categories of facility conditions are indicated based on the determination of one or more corresponding facility status patterns.

Capacity utilization conditions: Among intended uses of a facility, there may be more- and less-preferred conditions in the utilization of facility capacity. Less-preferred capacity utilization conditions may be, for example, those which lead to inconveniences for facility occupants, and/or inefficiencies and/or loss of revenue for facility operators.

For example, spaces (e.g., rooms, hallways, pools, exercise facilities, spas, business centers, courtyards and/or parking garages) and/or services (e.g., reception, mobility assistance, transportation assistance, luggage porting, laundry services, room cleaning services, food services, and/or concierge services) of the facility may be underused, overused, and/or subject to oscillations and/or unpredictability in capacity utilization. A capacity utilization condition may be assessed relative to a utilization balance among spaces and/or services. In a less-preferred condition, for example, one of two facility restaurants may be crowded while the other remains relatively empty.

Maintenance conditions: Spaces and/or services may also be subject to maintenance requirements which are potentially irregular; e.g., depending on unpredictable and/or uncontrolled events; and/or depending on a non-constant rate of wear and/or exhaustion of supplies. In some cases, there may be a preference to schedule maintenance actions for a space and/or service so that they do not coincide with and/or do not unduly interfere with heavy capacity utilization, or even any utilization of the space and/or service. For example, use of the space and/or service is optionally restricted for maintenance, and/or maintenance is performed when the space and/or service is otherwise unengaged.

Security concern conditions: Another type of less-preferred condition includes activities of authorized and/or unauthorized facility occupants which are security concerns. Security concerns related to the activities of individuals and/or particular groups include, for example, behaviors liable to disturb other occupants, destroy property, monopolize or otherwise inappropriately access facility services, and/or amount to outright theft. These behaviors can potentially occur alongside and/or masquerading as legitimate activities of facility occupants, so that the problem of detecting them is not necessarily suitable to sensing methods such as motion and/or intrusion detection. Moreover, certain behaviors are potentially security concerns even though they are not intrinsically harmful; for example: loitering, following or otherwise tracking fellow facility occupants, and/or presence in multiple areas without explanation.

Additionally or alternatively, certain security concern conditions relate to the state of the facility itself and/or its object contents. In particular, security concern conditions optionally arise from transient vulnerabilities: e.g., doors and/or windows left open, power and/or communication losses, and/or movement and/or variable locations of high-value assets (e.g., tools, audio-visual equipment, and/or art objects). In some embodiments, certain events related to normal functioning of the facility are optionally also considered security concern conditions, e.g., activity during shipping and/or receiving, and/or activity related to securing a facility space for closure (e.g., nightly closing of a shopping area).

Safety concern conditions: Another class of less-preferred conditions comprises facility states and/or occupant activities raising safety concerns. Many types of security concerns are also safety concerns. Ordinary concerns more specific to safety include over-occupancy and/or maintenance of clear passageways. Emergency safety concerns include flooding, fire, and/or individual medical emergencies. As for security concern conditions, safety concern conditions may be related to normal functioning of a facility; for example, when a facility is crowded, medical safety preparedness may reach a higher level of safety concern (e.g., indicating putting an emergency medical team on alert).

Examples relating to these general types of conditions are provided, for example, in the discussion of Figures 3-6 and 9-10.

Status data

In some embodiments, status data sensing is performed using sensing capabilities which are general purpose and/or raw (e.g., image collection, audio collection, position sensing, motion detection), and/or derived by computer processing of raw input, e.g., using a software library providing a capability for detection of a class of common features of a facility environment such as human faces, human forms, voices, walls, corners, and/or "obstacles" (as a generic class).

Optionally, sensing data processing capable of recognizing another class of object (for example, one more particular to a specific facility environment) is performed, for example by using an object recognition neural network trained to recognize the object (this may be implemented, for example, using the Darknet open source neural network framework implementation of the YOLO real-time object detection system). Examples of objects which may be optionally selected for recognition include, for example, doors (open or closed), windows (open or closed), persons in states of particular interest (e.g., sleeping, sitting, standing, and/or wearing facility staff uniforms), maintenance equipment (e.g., laundry bins, vacuum cleaners, cleaning carts, and/or ladders), furniture (e.g., tables, chairs, sofas, and/or beds), other free-standing items (e.g., trash receptacles, barriers such as rope barrier line dividers, signage, and/or bulletin boards), loose articles (e.g., wallets, purses, bags, and/or umbrellas), or another object class.
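For illustration, one widely used way to run a Darknet-format YOLO model is through OpenCV's DNN module; the sketch below assumes hypothetical local files yolo.cfg, yolo.weights, and frame.jpg, and is not the specific implementation of the disclosure:

```python
# Illustrative sketch only: single-frame YOLO inference via OpenCV DNN.
import cv2

net = cv2.dnn.readNetFromDarknet("yolo.cfg", "yolo.weights")
frame = cv2.imread("frame.jpg")

# Darknet-style preprocessing: scale to [0,1], resize, BGR -> RGB.
blob = cv2.dnn.blobFromImage(frame, 1 / 255.0, (416, 416),
                             swapRB=True, crop=False)
net.setInput(blob)
outputs = net.forward(net.getUnconnectedOutLayersNames())

for output in outputs:
    for detection in output:
        scores = detection[5:]            # per-class confidences
        class_id = scores.argmax()
        if scores[class_id] > 0.5:
            # detection[0:2] is the (relative) box center.
            print("class", class_id, "at center", detection[0:2],
                  "confidence", scores[class_id])
```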

Other sensing capabilities may include, for example: detection and/or identification of noises; speech-to-text; detection of vibration as such (e.g., amplitudes of floor vibrations and/or ambient noise levels); identification of detected faces, voices, and/or other identifying features of individual human occupants; segmentation of human forms and/or objects from background (a separate problem from detecting their presence); atmospheric conditions such as temperature, smoke (e.g., by particle detection), and/or chemical detection ("smell"); motion; position-finding within the facility; and/or telemetry data of robotic movements which are potentially affected by the presence of nearby human occupants (e.g., adjusted to avoid collisions with human facility occupants).

Processing of raw data optionally includes onboard and/or remotely assisted processing of raw sensing data to produce intermediate status data results such as facial recognition, object recognition, feature identification, and/or speech-to-text. Both processing locations are potentially subject to limitations on use. For a given robot, each additional "recognizer" implemented represents an additional load for processing, which in general is limited by considerations, for example, of hardware cost, available electrical power budget, and/or competing processing priorities. On the other hand, live streaming of a sensing modality such as high definition video to a server which is less limited by processing bandwidth is potentially limited by available network bandwidth (especially if this bandwidth has to be shared by several streams simultaneously) and/or available input/output bandwidth.

Moreover, aside from limits on processing, sensors themselves (e.g., cameras, distance sensors, and/or microphone arrays) may be limited in one or more of resolution, field of view, accuracy, precision, noise rejection, or another technical feature; for example, due to cost considerations and/or engineering tradeoffs.

In some embodiments, this results in a situation where any individual robot in a distributed robotic system is capable of reporting status data for its environment only in a fragmented fashion, limited by considerations of, for example, economics, processing power, and/or bandwidth.

A mitigation of this, in some embodiments, comprises performing status data sensing in a tiered and/or modal fashion, wherein certain sensed conditions trigger a change in how sensing is performed and/or processed. For example, a robot in a mode relying on a simple processing capability (e.g., an object recognition network trained to generically recognize imaged obstructions of a certain size range) is optionally elevated (upon detection of an obstruction) to a mode wherein more particular recognition is possible: for example by triggering streaming of full images from the robot to a more powerful central processor, and/or by triggering a slower but more complete attempt to perform object identification on-board the robot. In some cases, this "elevation" can be based on a single robot's own status data sensing.
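A minimal sketch of such tiered ("elevating") sensing, with invented thresholds and mode names (none of which come from the disclosure):

```python
# Illustrative sketch only: a cheap always-on recognizer elevates the
# robot to a costlier sensing/processing mode when it fires.

class TieredSensor:
    def __init__(self):
        self.mode = "generic"            # cheap obstruction detector only

    def process(self, frame):
        if self.mode == "generic":
            if self._cheap_obstruction_score(frame) > 0.8:
                self.mode = "elevated"   # trigger streaming / full recognition
                return "obstruction detected; elevating"
            return "nothing of interest"
        # Elevated mode: slower, more complete identification, and/or
        # streaming of full frames to a central processor.
        return self._full_identification(frame)

    def _cheap_obstruction_score(self, frame):
        return 0.9 if frame.get("large_blob") else 0.1

    def _full_identification(self, frame):
        return "fallen occupant" if frame.get("human_form") else "cart"

sensor = TieredSensor()
print(sensor.process({"large_blob": True}))                      # elevates
print(sensor.process({"large_blob": True, "human_form": True}))  # refines
```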

However, in some embodiments, there are status data detections of potential interest for deeper inspection which, for whatever reason, cannot or should not be elevated without more information. For example, a cleaning robot working in a heavily trafficked hallway may be equipped to recognize some basic representation of a human form in front of it, e.g., in order to avoid collisions. However, the same robot would potentially be overwhelmed if its processing were interrupted to perform detailed identification of each individual facility occupant that it encountered. In such a case, there may be an advantage to waiting for and/or gating on some second status data element before further processing is performed. For example, if there is a determined or otherwise provided facility condition (e.g., an impending children's event) that makes it worthwhile to engage a certain class of facility occupant (e.g., a child) in robotic social interaction, then a portion of processing and/or network bandwidth is optionally spent on "child identification" which otherwise would not be performed.

In some embodiments, a new facility condition is built up in this way, including changing of status data processing patterns as facility conditions are recognized and/or change. For example, a collection of generic moving-object status data pattern recognitions indicates a pattern of unexplained, rapidly moving objects in some space of the facility (e.g., a hallway). In some embodiments, this triggers the distributed robotic system to enter a state where it requests robots entering the space to perform additional processing and/or transmitting of image data to a central server and/or to security monitoring staff. Potentially, this leads to a clearer identification of the source of the earlier-noticed pattern: e.g., children playing a game, an animal running loose, or a panicked facility occupant. Optionally, the system specifically directs a robot (e.g., one which is unengaged, or engaged nearby in a lower priority task) to move to perform the status data sensing.

Potential advantages of mobile sensing platforms for collecting status data

From discussions and examples of facility conditions provided herein, it may be understood that any space of a facility is potentially implicated in some condition of interest. In the case of security concern conditions, there may even be an active attempt by a misbehaving facility occupant to seek out-of-the-way spaces. However, a capability for full, continuous, and direct monitoring (e.g., sound and/or visual monitoring) of all parts of the facility using stationary-position sensors is potentially expensive to install, configure, and/or maintain, even if it were to be performed automatically. Human supervision of large numbers of sensing nodes is also expensive, and can become overwhelming relative to benefit for a sufficiently large facility. Transmission bandwidth itself can be limiting for a sufficiently large facility. In some cases, legal considerations prevent cameras from being put into certain areas for privacy reasons, yet under other circumstances (e.g., in response to safety and/or security concerns), it may become permissible to monitor an otherwise off-limits facility space.

Accordingly, mobile sensing platforms (e.g., robots, and/or sensors mounted to robots) provide certain potential advantages. They can optionally direct themselves and/or be directed to investigate any platform-accessible space in the facility, and may do so according to dynamic needs and/or permissions. Wheeled robots can move throughout large areas of most architectural facilities; additionally or alternatively, robots are equipped with legs or another type of motility (e.g., are operable as airborne drones).

Mobile sensing platforms, in some embodiments, also provide an ability to adjust position in response to conditions (e.g., to provide different perspectives on a space), and/or to influence them (e.g., to engage with a human occupant, potentially eliciting further behaviors that indicate whether the human occupant is relevant to some particular facility condition).

Mobile sensing platforms also may have internal resources sufficient for onboard analysis of data, such that results can be provided to a central repository and/or to other members of a distributed system of individually (at least partially) autonomous robots as high level summaries. For example, the result of face recognition can be transmitted to and/or retained by the system as a summary data structure, which can be a much more compact representation than a live video stream.

Incidental/auxiliary sensing of status data

The status data of block 103 sensed by a mobile sensing platform are optionally acquired by sensing which is incidental and/or auxiliary to a separate regular (and optionally primary) task function of the mobile sensing platform, and/or of a robot to which the mobile sensing platform is attached.

Current robotic solutions tend to find market success where they are constructed and tooled to primarily address relatively specialized problems, e.g., one of manufacturing, cleaning (often just a particular aspect of cleaning such as floor cleaning), face-to-face human interaction, telepresence, vending, carrying, and navigation. However, basic sensing capabilities such as video and sound are becoming commoditized, while modular packages supporting certain common sensory processing tasks such as person/scene segmentation and face recognition are also increasingly available.

Machine learning and creation of associating data structures

In some embodiments, determining of facility conditions comprises the use of machine learning methods (e.g., convolutional neural networks) which take status data patterns as indicators-by-association of facility conditions, without necessarily requiring direct sensing of the determined facility conditions as such. Such methods are potentially well suited for (while not being limited to) the use of status data acquired from a distributed network of mobile robots, optionally acquired incidental and/or auxiliary to facility operations tasks of the robots.

Security-oriented robots performing at least some monitoring functions have been previously described, and in a few cases placed on the market. However, in the awareness of the inventors, there remains a significant development problem in diversifying the capabilities of robotic systems to automatically recognize more of the wide variety of facility conditions which (human) facility managers might find valuable to automatically identify.

Considering a mundane pair of examples: the machine vision problem of recognizing that a bathroom is running out of toilet paper may be solvable using a mobile sensing platform and suitable algorithmic development relying on current technologies (e.g., a suitable guidance algorithm coupled to a suitably weighted object recognition implementation such as YOLO). However, it is a significantly different problem (from the standpoint of R&D effort, and potentially even of underlying algorithms) than the seemingly closely related problem, again cast as a machine vision problem, of recognizing that a napkin dispenser in a restaurant is running out of paper napkins. The number of such "nice to solve" problems which facility functions raise could be understood as indicating potential for commercial opportunity, but is also potentially much larger than could reasonably be individually addressed by even a fairly large algorithm development team. The problems themselves might not even be apparent to someone working away from the actual (and perhaps even individual) facilities in which they arise.

The inventors have realized that a class of problems having potentially significant commercial value may be addressed by using data sensed from a plurality of mobile sensing platforms as indications-by-association of one or more facility conditions. Optionally, the sensing occurs as the mobile sensing platforms go about other tasks (e.g., facility operations tasks that take them through a monitored space only occasionally and/or transiently; and/or tasks that rely on a plurality of simultaneous or near-simultaneous perspectives). Potentially, no individual platform by itself provides sufficient indication of the facility condition.

In some embodiments, an ML technique (e.g., a deep learning algorithm) and/or another method such as manual labeling is used to recognize patterns in status data that indicate (e.g., are associated with, in an associating data structure such as is mentioned in block 101 of Figure 1) one or more facility conditions. A potential advantage of using indication of facility conditions by association with status data patterns is that the facility condition does not have to be directly determined, so long as it is statistically correlated or otherwise known to be co-occurring with certain status data patterns, at least to a sufficient extent to be usefully predicted. For example (to continue examples introduced above), daily occupant traffic along a corridor leading to a restroom or restaurant, as intermittently sensed by a plurality of robots of a distributed robotic system, may be determined to correlate with the use of toilet paper and napkins, respectively, allowing estimates of these facility conditions to be obtained from intermittent crowd counts (with the understanding that even rough estimates are potentially of assistance in scheduling supply replenishment). Optionally, the sensing robots are primarily engaged in tasks other than crowd counting, but perform partial counts as they transit the corridor in the course of those other tasks.
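As a toy illustration of such an indication-by-association (the numbers below are invented, not data from the disclosure), intermittent corridor counts pooled across robots can be regressed against logged supply usage:

```python
# Illustrative sketch only: associating intermittent corridor crowd counts
# with supply usage, so that usage can be predicted without direct sensing.
import numpy as np
from sklearn.linear_model import LinearRegression

# Daily totals of partial counts made by robots transiting a corridor.
corridor_counts = np.array([[120], [95], [210], [160], [80], [240]])
# Corresponding logged consumption of a supply (e.g., napkin packs used).
supply_usage = np.array([14, 11, 25, 19, 9, 28])

model = LinearRegression().fit(corridor_counts, supply_usage)

# Even a rough estimate can assist in scheduling replenishment.
print("predicted usage for a 180-count day:",
      model.predict(np.array([[180]]))[0])
```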

Thus, a potential advantage of using an associating data structure to make facility assessments is that direct determination of certain facility conditions indicated by status data may be bypassed (including bypassing of targeted development of special algorithms, motion routines, etc. dedicated to identification of the condition). Another potential advantage is that the association is optionally learned directly from status data reflecting conditions within a facility, bypassing several potential sources of data noise, ambiguity, and/or modeling error.

Machine learning techniques referred to herein optionally include so-called "deep learning" algorithms, which use a cascade of many layers of nonlinear processing units for feature extraction and transformation. Each successive layer uses the output from the previous layer as input. Deep learning algorithms may be supervised or unsupervised; and applications include pattern analysis (unsupervised) and classification (supervised). Deep learning algorithms are based in particular on the (unsupervised) learning of multiple levels of features or representations of the data. Higher level features are derived from lower level features to form a hierarchical representation. Deep learning algorithms typically use some form of gradient descent for training. Machine learning algorithms, and the algorithms of deep learning in particular, are optionally interpreted in terms of probabilistic inference (Bayesian inference). Examples of software frameworks (software libraries) which enable deep learning methods include TensorFlow and PyTorch, both freely available open source implementations of a collection of machine learning capabilities. More specific network models available include, for example, AlexNet, ResNet, GoogLeNet, and VGG. In some embodiments, object recognition is performed using the YOLO real-time object detection system, which has been implemented, for example, using the Darknet open source neural network framework.
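Purely as an illustration of the kind of network such frameworks provide (layer sizes, class count, and data are placeholders, not the disclosure's model), a small PyTorch classifier trained by gradient descent might look like:

```python
# Illustrative sketch only: a minimal network standing in for the deep
# learning variant of the associating data structure.
import torch
import torch.nn as nn

model = nn.Sequential(              # a cascade of nonlinear layers
    nn.Linear(12, 32), nn.ReLU(),
    nn.Linear(32, 32), nn.ReLU(),
    nn.Linear(32, 3),               # three hypothetical facility conditions
)
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)  # gradient descent
loss_fn = nn.CrossEntropyLoss()

X = torch.rand(500, 12)             # placeholder status-data features
y = torch.randint(0, 3, (500,))     # placeholder condition labels

for epoch in range(100):
    optimizer.zero_grad()
    loss = loss_fn(model(X), y)
    loss.backward()                 # backpropagation
    optimizer.step()
print("final training loss:", loss.item())
```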

In some embodiments, supervised machine learning training feedback (e.g., of a convolutional neural network) may be provided in the course of a collection of scenarios in which the distributed robotic system observes (using its usual mobile sensing platform sensors) a certain range of facility conditions, and is informed for each scenario how the scenario should be classified. Scenarios may be actual, or enacted for purposes of the training. Optionally, scenarios are collected across a plurality of distributed robotic system installations (e.g., different hotels in a chain). In some embodiments, training feedback uses partially or fully simulated scenarios, e.g., a plurality of separately recorded "tracks" of status data from particular human occupants superimposed to simulate different scenarios.

In some embodiments, unsupervised training is optionally performed, e.g., on status data recorded in the course of normal operations (from one or more facilities), allowing the identification of clusters of status data patterns. Optionally, clusters are associated to particular facility conditions by a human system operator. Optionally, after training, sensed status data patterns which do not fall within a known cluster are flagged for the attention of a human supervisor when they occur.
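A sketch of this cluster-then-flag approach, assuming hypothetical pattern features and an arbitrary novelty threshold (neither comes from the disclosure):

```python
# Illustrative sketch only: cluster status data patterns from normal
# operations, then flag new patterns that fall outside every known cluster.
import numpy as np
from sklearn.cluster import KMeans

history = np.random.rand(1000, 8)          # placeholder pattern features
clusters = KMeans(n_clusters=5, n_init=10).fit(history)

def flag_if_unfamiliar(pattern, threshold=1.0):
    # Distance from the pattern to its nearest cluster centroid.
    distances = np.linalg.norm(clusters.cluster_centers_ - pattern, axis=1)
    if distances.min() > threshold:
        print("unfamiliar status data pattern; notifying supervisor")
    else:
        print("pattern matches known cluster", distances.argmin())

flag_if_unfamiliar(np.random.rand(8))      # likely familiar
flag_if_unfamiliar(np.full(8, 10.0))       # far from all clusters
```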

In some embodiments, status data are combined with indications of significance such as the use of geofencing and/or schedule-dependent tags; e.g., stairwells or other locations may be marked as being of particular significance in status data patterns for identifying a facility condition of "loitering occupant", while the lobby of a building may be marked as being of decreased significance. Times of public business for a facility are optionally treated differently than times when a facility or portion thereof is closed.
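One possible (hypothetical) encoding of such geofence and schedule significance weights, for illustration only:

```python
# Illustrative sketch only: weight a detection's significance by geofenced
# zone and facility schedule. Zones, hours, and weights are invented.
from datetime import datetime

ZONE_WEIGHTS = {"stairwell": 2.0, "lobby": 0.5}   # loitering salience
OPEN_HOURS = range(8, 22)                          # public business hours

def significance(base_score, zone, when=None):
    when = when or datetime.now()
    weight = ZONE_WEIGHTS.get(zone, 1.0)
    if when.hour not in OPEN_HOURS:
        weight *= 3.0        # after-hours detections weigh more heavily
    return base_score * weight

print(significance(0.4, "stairwell", datetime(2017, 9, 14, 23, 30)))
```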

Optionally, a status data pattern is at least initially unconnected with a particular facility condition, but is "unusual" in some way (e.g., an unusual spatial distribution of a number of detections of a new face).

Example System for Distributed Sensing

Reference is now made to Figure 2, which schematically represents a system for distributed sensing of a facility condition, e.g. , according to the method of Figure 1, according to some embodiments of the present disclosure.

Facility 250 is the facility for which one or more facility conditions 261 are to be determined. A plurality of mobile sensing platforms 251 (e.g., robots) are positioned within facility 250, each operated to move, over the course of its operations, along its own path 252 (paths 252 are represented schematically; they indicate that different mobile sensing platforms 251 move through different but sometimes overlapping spaces within facility 250).

Status data 255 sensed by mobile sensing platforms 251 are collected (e.g., continuously collected), and sent to a collection node 257. Processing node 259 matches patterns within the collected status data to a facility condition 261 through an associating data structure 253, for example, an associating data structure which is a product of a machine learning algorithm (a deep learning and/or convolutional neural network implementation, for example).

Optionally, at block 263, some action is taken in response to the facility condition. The action is optionally performed by robotic elements (e.g. , instances of mobile sensing platform 251) of the system, and/or by a separate system for recording, control, and/or notification. Optionally, the response comprises notification of a human supervisor of the facility condition. Optionally, the system acts autonomously and/or under at least partial supervision to change or otherwise react to the facility condition 261.

The above configuration is for a distributed but at least partially centrally managed robotic system. In some embodiments, the system uses fully autonomous robots, with no designated collection node 257, and wherein status data 255 are maintained on individual mobile sensing platforms 251, which in this case are in direct communication with each other. Optionally, when operating in an individually autonomous mode, processing node 259 is implemented by processors associated with the individual mobile sensing platforms 251.

Distributed recognition of individual facility occupants

Reference is now made to Figure 3, which is a flowchart schematically representing a method of identifying individual facility occupants based on partial matching of a signature, according to some embodiments of the present disclosure.

At block 350, in some embodiments, an initial collection of status data relating to an unidentified individual in a facility space is made.

At block 352, in some embodiments, the status data is matched to an individual signature, based on a high-confidence identification method. The high-confidence identification method comprises, for example, facial recognition, and/or identification based on a digital identification method such as bar code and/or RFID tag.

It is noted that along with the status data used in making the high-confidence identification, there is also collected some status data which is, in itself, "low-confidence" for use in identification: for example, the location of the identified individual, or a common and/or transient feature such as hair color, clothing color, or another feature.

At block 354, in some embodiments, new status data is collected related to an (initially) unidentified individual (who happens to be the individual identified in block 352).

At block 356, in some embodiments, the new status data is again matched to identify the individual, but this time only a low-confidence identification method is used, e.g., a method based on potentially ambiguous and/or non-unique information. Nevertheless, the overall confidence in the identification is increased, because the previous identification (of block 352) is judged to be sufficiently relevant to be relied upon. For example, it was within the same general location (e.g., the individual is in/at the same room, chair, and/or table as before) and/or within a reasonable time period (e.g., within the last minute, 5 minutes, 10 minutes, hour, or another period). Optionally, the confidence is increased because of external knowledge, e.g., that there is no one else in the area who is likely to match the low-confidence method.

Optionally, the signature used at the time of the high-confidence identification is augmented with data then-obtained to provide a basis for the later low-confidence identification. For example, features of clothing are recorded when the high-confidence identification is made, and these are later used for making the low-confidence identification of block 356.

A potential advantage of the method of Figure 3 is that it allows a composite "signature" (comprising high-confidence and low-confidence identifying characteristics) to be used for tracking an individual over the course of a plurality of sensing encounters, even when some of the sensing encounters are, in themselves, insufficient to unambiguously identify the individual. Optionally, the composite signature is assembled over the course of a plurality of encounters between mobile sensing platforms and the individual being identified.

In some embodiments of the present invention, individual facility occupants (e.g., guests and/or staff) encountered by particular instances of mobile robotic platforms belonging to the distributed robotic system are routinely characterized and/or identified. Characterization is optionally according to general population characteristics such as apparent gender and/or age, based on voice and/or appearance. Optionally, the distributed robotic platform may use these characteristics as a basis for interacting with facility occupants (optionally applying suitable caution in phrasing to allow for the possibility of mischaracterizations). Optionally, population characteristics are treated as elements of status data used in determining facility conditions, for example, to help identify demographics attracted to particular spaces and/or event offerings of the facility.

In some embodiments, characterization is carried out to the level of individual identification (e.g., corresponding to blocks 352 and 356 of Figure 3); for example, based on facial recognition, distinguishing marks, specific size, clothing (which, though changeable, is potentially identifying over time scales of a few hours), voice patterns (for example, speech patterns recognized using modules for speech-to-text and/or natural language processing), footfall sounds (e.g., the sound of each footfall, footfall intervals), other gait features (e.g., stride length), and/or other characteristics. Optionally, individuals may carry and/or display identification tags readable by machine, for example bar coded and/or radio frequency identification tagging. Identification is not necessarily performed to a level of absolute confidence; for example, 95% accurate identification may be sufficient for purposes of identifying patterns related to general facility use. Individual-identifying characteristics collectively are also referred to herein as comprising a "signature" of an individual facility occupant. New signatures for new facility occupants are optionally constructed from status data sensed on the fly (e.g., whenever there is no confident identification from a particular sighting). Optionally, at least portions of signatures are generated by explicit procedures; e.g., photography performed and/or forms provided at check-in, and/or greeting robots and/or kiosks which closely interact with facility occupants as they arrive (or with already-arrived occupants who are not immediately recognized based on past signature characteristics). Non-signature (that is, non-identifying) information may also be associated with the signature, for example, preferences and/or attributes which potentially relate to operations of the facility.

Optionally, partial signature recognition of individuals is performed (e.g., identification based on clothing when a face is not currently observed and/or recognized, but the clothing pattern has recently been observed by the system in association with a face that the system considers identifying with high confidence).

Insofar as characteristics of a signature are associated to each other, a distributed robotic sensing network can help maintain a more constant identification of facility occupants across a plurality of robot/occupant encounters. For example, an observed characteristic of an occupant may be ambiguous as to an associated identification in and of itself. However, previous associations of the characteristic with higher confidence identifications, and/or associations of the characteristic with constraints such as position and/or time of a current observation relative to a recent high-confidence identification, may substantially remove this ambiguity. A potential advantage of this is to allow even poorer-quality status data from mobile sensing platforms to be useful in performing identifications, e.g., because a particular mobile sensing platform is not equipped for high-confidence sensing/processing, and/or because sensing conditions are not optimal (e.g., an identified individual is too far away from a sensor for facial recognition, and/or the sensor is sensing an identified individual only incidentally to the performance of another task).

In some embodiments, associations among characteristics of a signature are set up by explicit rules: for example, confidence functions under which the confidence contributed by a characteristic which is only weakly identifying on its own decays as a function of time and/or distance since that characteristic was observed in conjunction with a higher-confidence identification characteristic. Optionally, identifying conjunctions are transitive through a multiplicity of characteristics. For example, a face is first identified in conjunction with a shirt color; the shirt color is later observed in conjunction with a shoe color; and still later the shoe color is observed alone. In this case, the shoe color may be sufficient for the system to make a high-confidence identification, depending on influencing conditions such as the number of persons present and/or the time elapsed.
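A sketch of one possible explicit-rule confidence function of this kind, with an invented exponential decay and invented scores (the disclosure does not specify a particular functional form):

```python
# Illustrative sketch only: a low-confidence characteristic inherits
# confidence from its last co-observation with a high-confidence one,
# decaying with elapsed time; co-observations chain transitively.
import math
import time

class Signature:
    def __init__(self):
        self.links = {}   # characteristic -> (confidence, timestamp)

    def co_observe(self, characteristic, confidence):
        # Record a characteristic seen together with a trusted identification.
        self.links[characteristic] = (confidence, time.time())

    def identify(self, characteristic, half_life_s=3600.0):
        if characteristic not in self.links:
            return 0.0
        conf, t0 = self.links[characteristic]
        elapsed = time.time() - t0
        return conf * math.exp(-elapsed * math.log(2) / half_life_s)

sig = Signature()
sig.co_observe("red_shirt", 0.95)     # seen alongside a face recognition
sig.co_observe("white_shoes", sig.identify("red_shirt"))  # transitive link
print("shoes-only confidence:", sig.identify("white_shoes"))
```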

In some embodiments, associations among characteristics of a signature used in identification are set up by use of a machine learning technique, for example, a deep learning algorithm and/or convolutional neural network. In some embodiments, the machine learning is trained on actual status data, using human supervision to verify identifications. Optionally, a training set is at least partially synthesized; e.g., the machine learning is based on actual and/or simulated paths of facility occupants through a map of the facility, with simulated determinations of signature characteristics available at different times/positions along those paths.

In some embodiments, once an individual is (at least tentatively) identified from a signature, a distributed robotic system can take other actions related to the individual.

In some embodiments, a robot (not necessarily the one performing the most recent status data sensing from which the individual is identified) is dispatched to interact with the identified individual. Optionally, this is in order to confirm the identification, increase a level of confidence in the identification, and/or obtain auxiliary information to be associated with an identifying signature of the individual.

In some embodiments, the interacting robot acts to gather more information about the individual which is not necessarily signature (that is, identification-related) information. This may be survey-type data (e.g., to help assess facility occupant preferences), and/or data related to a current state of the individual, for example, to assess their openness to suggestions (e.g., for use of facilities), their status with respect to potential security concerns (e.g., are they demonstrating avoidance and/or disturbing behaviors), and/or their status with respect to potential safety concerns (e.g., are they ill, lost, and/or distressed). Herein, non-signature data associated to an individual is referred to as "profile" data.

In particular, if there is an existing "less-preferred" facility condition (related to capacity utilization, security concerns and/or safety concerns, for example), robotic interactions with an individual may be directed toward changing the facility condition based on attributes and/or preferences particularly associated as profile data to the individual's signature.

Signatures and/or profiles are optionally stored using a central storage and/or processing facility (this corresponds, in some embodiments, to a distributed robotic system managed from a central server). Optionally, storage is distributed; for example, mobile sensing platforms are configured to query each other (and to be queried) to receive and/or provide signature and/or profile information. This corresponds, in some embodiments, to an individually autonomous distributed robotic system configuration, wherein each mobile sensing platform accesses the other sensing platforms of the system as resources available to address a signature identification and/or profile-related situation that it faces.

Distributed robotic systems and capacity utilization sensing

Reference is now made to Figure 4, which schematically represents a method of distributed capacity utilization sensing, and optionally actions to modify capacity utilization, according to some embodiments of the present disclosure.

At block 402, in some embodiments, a first mobile sensing platform collects status data that is related to utilization of a facility capacity.

At block 404, in some embodiments, an at least second mobile sensing platform collects status data that is related to facility capacity utilization.

At block 406, in some embodiments, the status data from the first mobile sensing platform and the at least second mobile sensing platform are combined, and associated via an associating data structure to a facility condition, which is a determined facility condition.

Optionally, at block 408, in some embodiments, an action is taken (optionally by a robot which is networked together with the mobile sensing platforms) to modify the determined facility condition.

Reference is now made to Figure 10, which schematically represents the floor plan of architectural facility 1000 in which a distributed robotic system is operating, according to some exemplary embodiments of the invention.

Facility 1000 is an example of a hospitality facility (e.g., a hotel) including examples of typical elements such as a ground floor 1001, an upper floor 1002, two guest wings 1020, 1030 (each comprising a hallway 1045 and a plurality of guest rooms 1040), and stairwells 1010, 1010A, 1010B. Also indicated are main lobby 1007, elevator lobby 1014, elevator shafts 1012, and office 1051 (in some embodiments, a central controller for a distributed robotic system is located in office 1051). The special capacity spaces 1003, 1004, 1005, 1006 optionally represent restaurants, auditoriums, or other special use spaces, for example as variously described in relation to certain examples herein (for purposes of description, some of these spaces 1003, 1004, 1005, 1006 are assigned different roles in different examples). Mobile sensing platforms (e.g., robots, in some examples and/or embodiments) are indicated by open squares 1052, and facility occupants are represented by solid circles 1054.

In some embodiments, a distributed robotic system operates within a hospitality facility; optionally a facility providing one or more hospitality services mixed with another function: for example, a hotel, motel, resort, spa, cruise ship and/or port facility, concert venue, sports venue, conference center, visitor center, shopping center, care facility (e.g., nursing home), supervised living facility, airport, hospital, restaurant, museum, theme park and/or other tourist attraction, warehouse, assembly line, and/or office complex.

Optionally, a distributed robotic system is used as a portable system moved among a plurality of facilities, e.g., accompanying and/or transported along with a tour group for deployment to provide tour-group services at locations visited by the group. In some embodiments, the hospitality facility comprises a combined work/communal housing environment, for example a drilling platform, cargo ship, research station, emergency services station, barracks, and/or maintenance outpost. The hotel-like floor plan of Figure 10 is used as an example of any of the previously mentioned facilities.

In some embodiments, distributed sensing comprises recognition of a level of utilization of a facility capacity. For example, a pattern of space occupancy (by facility occupants 1054) corresponds, in some embodiments, to underutilization or overutilization (actual and/or impending) of a facility capacity such as a restaurant, bar, spa, business center, game room, swimming pool, sauna, game court, room service, porter service, laundry service, gift shop, reception desk, concierge desk, snack room, or another facility capacity.

The recognition optionally comprises status data matching a pattern associated to a capacity utilization condition of the facility, for example by using an associating data structure. The pattern optionally comprises status data indicating actual current utilization of the capacity (e.g., a crowd count), and/or status data which are associated to a certain future level of capacity utilization; e.g., a level of footfall noises in a hallway during a certain period which is associated (e.g., via machine learning) with likely near-term utilization (e.g., within the next few minutes) of a facility restaurant.

Optionally, a level of facility capacity utilization is recognized explicitly (e.g., a restaurant manager signals a restaurant capacity utilization problem which results in activation of the distributed robotic system to address the problem using methods which potentially redirect the facility utilization pattern). It is noted that explicit recognitions are optionally used as training feedback to the distributed robotic system, potentially leading to automatic recognition in the future.

Optionally, the capacity utilization noted is a combination of explicit recognition and conditions attributed to distributed sensing: for example, a heavily occupied lobby is noted (and signaled to the system) by a facility manager, while an underutilized restaurant is noted by the system itself. The pattern in this case is potentially different (and may result in different actions and/or recommendations) than a pattern where the heavily occupied lobby is additionally paired with heavy utilization of elevators, porting services, and/or transportation services, which may indicate a bottleneck, rather than a clientele looking for something to do.

In some embodiments, a distributed robotic system interacts with facility occupants 1054 in order to affect utilization of one or more capacities of the facility.

In an example of such a scenario, a stage entertainment is scheduled for a certain time (e.g., scheduled to occur in special capacity space 1004). Management (e.g., operating out of office 1051) has the option to explicitly instruct the robotic system to take actions to begin encouraging facility occupants 1054 to move toward the stage area starting at some time before the show. At about that time, the distributed robotic system checks a current facility utilization state (determined from distributed status data sensing, e.g., as described in Figure 1). This shows where facility occupants 1054 currently are (for example, in Figure 10, a concentration of them is located in special capacity space 1006, which is optionally a restaurant), and thus to where advertising efforts for the stage entertainment may optionally be directed. Advertising efforts optionally include, for example, interactions with the sensing robots 1052 themselves, announcements, and/or electronic signage changes. Optionally, the system suggests locations for human attention, e.g., sending stage entertainers to active locations to advertise their show. In some embodiments, facility management has the option to discourage the system from pulling facility occupants 1054 away from other revenue-generating centers of the facility (for example, special facility space 1003, which optionally is a gambling casino), and/or to encourage the system to preferentially direct its efforts at over-utilized capacities of the facility. In general, the resulting system acts to manage facility capacity utilization according to provided goals, with specific actions governed by facility states observed from a distributed network of mobile sensing platforms 1052.

Other types of robotic actions optionally include issuing of discount coupons, striking up social robotic "conversations" to distract (and potentially delay and/or retain) facility occupants 1054, engaging in diverting behaviors (e.g., cleaning robots engaging in a choreographed or spontaneously generated "dance"), engaging in annoying behaviors (e.g., the same cleaning robots actually beginning to clean), and/or engaging in implicit signaling (e.g., the cleaning robots do a dance before they clean, allowing patrons time to realize what is happening and leave the room before noise begins).

In some embodiments, machine learning is applied to the results of interactions with facility occupants 1054; for example, using as feedback actual changes in facility capacity utilization resulting from particular actions suggested and/or performed by the distributed robotic system, and/or other metrics such as revenue.

In another example, in some embodiments, some robotic members 1052 of a distributed robotic system are optionally configured to perform routine maintenance/housekeeping tasks for a facility (e.g., cleaning of floors and carpets in corridors 1045 and guest rooms 1040). Optionally, status data collected by the robots 1052 provide an indication of the performance of tasks by human staff, for example, rates of work and/or indications of tasks being actually performed (e.g., linen piles indicating sheets being stripped and replaced, changes to bathroom surfaces indicating wiping, visual inspections of results, etc.). Moreover, staff members associated with the work are optionally identified, e.g., by matching of a signature and/or portion thereof. A potential use of this is to match work quality and/or efficiency to individual workers, with a possibility for increased housekeeping quality and/or efficiency. Insofar as the system is configured to use distributed sensing, to make inferences based on pattern associations, and/or to make use of probabilistic identifications, the monitoring does not necessarily require constant pairing of a robot monitor 1052 with a human worker; rather, observations from multiple robots 1052 can be used to assemble work profiles associated with each individual worker.

Distributed robotic systems and profile awareness

Reference is now made to Figure 5, which schematically represents a method of interacting with identified individual facility occupants 1054, based on a profile of past interactions with the facility occupant, according to some embodiments of the present disclosure.

At block 502, in some embodiments, a first robotic interaction with an individual facility occupant produces an observation of a profile element, which is recorded with a profile of the individual.

At block 504, in some embodiments, a potential second robotic interaction with an individual facility occupant is initiated (and/or is put under consideration to initiate), as part of a new encounter. The second robotic interaction is with a different robotic unit.

At block 506, in some embodiments, the second robotic interaction is selected and performed, based on the profile element recorded at block 502. Alternatively, the interaction is avoided, based on the recorded profile element.

Optionally, learning is applied to selecting methods of robotic interaction with specific individual facility occupants 1054. For example, profiles of recognized individuals and/or types of individuals (optionally profiles associated with distributed signatures) may be used to select an interaction method, based on what has previously been observed to produce an intended result. The profiles may be built up from experience with the individual, and/or with other individuals who are determined to be "like" that individual, for example based on statistical categorization using machine learning techniques.
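A minimal sketch of such profile-gated interaction selection, with hypothetical profile fields and interaction names (the disclosure does not fix a particular schema):

```python
# Illustrative sketch only: select (or avoid) a robotic interaction based
# on a profile element recorded by a different robot in a prior encounter.

def choose_interaction(profile):
    if profile.get("disliked_robot_contact"):
        return None                        # avoid direct engagement
    if profile.get("enjoys_conversation"):
        return "converse_and_offer_coupon"
    return "general_announcement_nearby"   # indirect, low-pressure option

profile = {"enjoys_conversation": True}    # recorded at block 502
action = choose_interaction(profile)       # selected at block 506
print("selected interaction:", action or "none (interaction avoided)")
```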

Optionally, profiles are used to fine-tune actions of the distributed robotic system, for example to assist in the management of capacity utilization within the facility. For example (with continued reference to Figure 10), in a capacity utilization situation where there are two facility restaurants, one nearing full capacity (e.g., a grill restaurant, optionally corresponding to special facility space 1006) and the other not (e.g., a Chinese restaurant, optionally corresponding to special facility space 1003 for purposes of this example), a robot 1052 being used to advertise the underutilized restaurant may selectively approach facility occupants 1054 standing in line at the grill restaurant 1006 who are likely to change their minds.

For example, the robot 1052 may recognize an individual who (according to their profile) has not yet tried the Chinese restaurant, and/or who has eaten at the grill restaurant often enough that they may be open to, and/or simply unaware of, other possibilities: either the Chinese restaurant, or (e.g., as a second option to at least reduce capacity overutilization) another local restaurant in the area.

The method of approach is optionally also selected based on the profile. For example, a facility occupant who has previously demonstrated comfort with and/or eagerness for human/robot interactions may be, for example, engaged in conversation with the robot 1052 and/or offered a coupon from a dispenser on the robot 1052. A profile demonstrating less willingness to directly engage with robots may result in the conversation being replaced with a general announcement, a change to a nearby display (optionally part of the robot 1052 itself; optionally a stationary-location display controlled by a system in communication with the robot 1052), or another method of communication.

In another example (also described with respect to Figure 10), a robot 1052 may encounter a group of people in or near elevators 1012 (e.g., in upper floor elevator lobby 1014) who (1) have not been to the gym yet (according to their profiles), but (2) are apparently headed to the gym, e.g., as indicated by apparel, handbags, towels, or another indication, which is optionally a target of an object recognition algorithm implemented by the distributed robotic system. Optionally, the robot 1052 may ask if the group wants guidance to the gym (e.g., special facility space 1005, in this example), and in the case of assent, may give verbal instructions and/or lead them to that location physically.

In some embodiments of the present invention, robots 1052 are configured to carry out a facility operation task which is "ambient": that is, made available to facility occupants 1054 by robots, not necessarily at the instigation of the facility occupant. Such a facility operation task is serving food to guests a la carte, for example (this may be a function of a robot 1052 located in special capacity space 1006, for example). Optionally, a robot 1052 offers items to an identified facility occupant based on that person's preferences as observed by a fellow robot 1052 that has interacted with that guest before (for example, a preference which is associated with that person's profile).

Optionally, such a robot 1052 may refrain from approaching a person that did not seem to like a previous robot interaction (e.g., the person was reticent, and/or it was apparent that the interaction did not give a satisfactory result). The robot 1052 optionally directs such a person to a table served by human staff, instead.

In another example, a robot 1052 plays a game with a child 1054 in the hotel (for example the robot/child pair shown in main lobby 1007 of Figure 10) that was liked by the child while interacting with other robots 1052. Optionally, the game is used as an introduction to other actions by the robot— for example, to inform the child and/or its family of other services offered by the facility, e.g., a service which represents a potentially underutilized capacity of the facility.

Profiles are optionally maintained for individuals across a plurality of visits to a facility, and/or shared between facilities under common management and/or a profile sharing arrangement.

In some embodiments, robot(s) 1052 (and/or the system as a whole) provide status data indicating a certain behavior by an individual that may require attention, based on past behavioral analysis (for example, by using deep learning). The type of reaction and the behavioral analysis are potentially improved upon by sharing of status data among robots 1052 and/or among robotic system installation sites.

Safety/security-related distributed sensing applications

Reference is now made to Figure 6, which schematically represents a method of detecting and optionally responding to a safety and/or security concern, according to some embodiments of the present disclosure.

At block 602, in some embodiments, a first mobile sensing platform collects status data that is related to a safety and/or security condition.

At block 604, in some embodiments, an at least second mobile sensing platform collects status data that is related to the safety and/or security condition.

At block 606, in some embodiments, the status data from the first mobile sensing platform and the at least second mobile sensing platform are combined, and associated via an associating data structure to indicate the safety and/or security condition, which is a determined facility condition.

Optionally, at block 608, in some embodiments, an action is taken (optionally by a robot which is networked together with the mobile sensing platforms) to modify the determined facility condition.

Distributed robotic system in safety concern conditions

Reference is now made to Figure 11, which schematically represents the floor plan of architectural facility 1000 in which a distributed robotic system is operating during a period of safety concern, according to some exemplary embodiments of the invention. Designations of various parts of facility 1000 are the same as described in relation to Figure 10, herein. The safety concern illustrated is represented as a fire 1110, including representations of positions 1112 where smoke is present and/or optionally sensed, and positions 1114 where heat is present and/or optionally sensed. It is noted that robots 1052 are not necessarily all in the places shown at one moment in time; the positions shown relate to where sensing occurs, potentially with differences in the time of sensing.

Multiple robots 1052 may interact together to detect safety concern conditions, and/or to take actions in response. Some safety concern conditions, in some embodiments, relate to rules and/or regulations rather than to an immediate threat to health and/or safety. For example, the method of Figure 1 is optionally performed for crowd counting to help monitor facility space capacity ratings.

In some embodiments, mobile sensing platforms 1052 have sensing capabilities that directly (e.g., by explicit counting within an imaged field of view) or indirectly (e.g., by sensing of footfall vibrations, noise, and/or an atmospheric condition such as temperature) provide status data that yield patterns correlating with crowd size (that is, the number of human occupants within a space, such as special capacity space 1005), thereby allowing crowd size estimation. This has a potential advantage for verifying that occupancy remains within regulated safety requirements for a room, and/or for triggering activities (robotic or otherwise) to ensure that safety requirements are restored and/or not exceeded (e.g., opening another space, calling on supervising staff and/or emergency standby personnel).
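As an illustration only, with invented coefficients standing in for a learned association, crowd size might be estimated from indirect readings pooled across robots and checked against a rated capacity:

```python
# Illustrative sketch only: estimate crowd size in a space from indirect
# status data (floor vibration amplitude, ambient noise) and compare it
# against a rated capacity. All numbers are invented placeholders.
def estimate_crowd(vibration_amp, noise_db):
    return 3.2 * vibration_amp + 1.4 * (noise_db - 40.0)

RATED_CAPACITY = 150                      # e.g., special capacity space 1005

readings = [(12.0, 72.0), (15.5, 75.0)]   # (vibration, noise) from two robots
estimates = [estimate_crowd(v, n) for v, n in readings]
crowd = sum(estimates) / len(estimates)

print(f"estimated occupancy: {crowd:.0f}")
if crowd > RATED_CAPACITY:
    print("over capacity: open another space / alert supervising staff")
```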

In some embodiments, safety concern conditions relate to one or more emergency safety concerns; for example, fire, flood, storm (hurricane and/or tornado, for example), earthquake, medical emergency, and/or a more facility function-specific condition such as loss of containment of material held within the facility. For purposes of explanation, a fire emergency is used as the primary example in the following discussion; however, it should be understood that the detection and/or response methods also apply, changed as necessary, to other emergency conditions, for example the conditions just listed.

With respect to sensing:

• In some embodiments, robots 1052 equipped with smoke and/or heat sensing devices (or another sensing device appropriate to the emergency type) may convey status data indicating a potential emergency condition, collected at different times and/or with different indicated intensities. Potentially, this helps to locate the source of the emergency, e.g., at a position nearer to the robots 1052 sensing it first and/or most intensely (e.g., in the corridor 1045 of wing 1020 of ground floor 1001), and away from a direction of later and/or lower-intensity sensing (e.g., in the corridor 1045 of wing 1030 of ground floor 1001, and/or in a room of wing 1020 of upper floor 1002). Additionally or alternatively, in some embodiments, robots 1052 monitor facility locations during an already identified emergency condition to provide status data which may be used (e.g., according to the method of Figure 1) to suggest one or more target locations and/or access routes for emergency services personnel. For example, a pattern of smoke recognition and/or temperature sensing is used as a basis for inferring locations of fire within a facility, allowing identification of target areas for a firefighter to address, and/or provision of a route to target areas taking into account, e.g., lengths of available routes and/or potential obstacles.
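
As one hypothetical illustration of such source localization (the decay constant and the readings are invented), a weighted centroid of sensing positions can be computed, weighting earlier and more intense readings more heavily:

# Hypothetical source localization: a weighted centroid of sensing
# positions, weighting earlier and more intense readings more heavily.
# The time-decay constant and the readings are invented for the example.
import math

def estimate_source(readings, time_decay_s=60.0):
    """readings: list of (x, y, intensity, time_of_sensing)."""
    t0 = min(t for _, _, _, t in readings)
    wx = wy = wsum = 0.0
    for x, y, intensity, t in readings:
        w = intensity * math.exp(-(t - t0) / time_decay_s)
        wx, wy, wsum = wx + w * x, wy + w * y, wsum + w
    return (wx / wsum, wy / wsum)

# smoke sensed early and strongly in one corridor, later and weakly elsewhere
print(estimate_source([(10, 4, 0.9, 0.0), (30, 4, 0.3, 45.0), (12, 14, 0.2, 60.0)]))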

In some embodiments of the invention, robots 1052 are activated to help address the emergency condition. For example, they take action to mitigate an emergency directly (for example, extinguish a fire), guide emergency personnel, and/or help ensure human safety. The robots 1052 are optionally activated according to emergency characteristics determined using distributed sensing and/or facility condition determination. Robotic tasks optionally include:

• Taking up positions where robots 1052 can act (e.g., by providing displayed and/or audible warnings/instructions, by providing lighting, by leading along an evacuation route, and/or by physically blocking access to dangerous areas) to guide facility occupants 1054 (non-emergency personnel) away from emergency areas. For example, robot 1052 within the corridor 1045 of wing 1030 of upper floor 1002 is leading a group of facility occupants 1054 toward a stairwell 1010B which is located in a direction away from the fire 1110. Optionally, robots 1052 are aware of individuals having profiles that indicate a need for special assistance, and may be dispatched to seek, identify (e.g., using an identifying signature) and/or assist those individuals preferentially.

• Taking up positions where robots 1052 can act (by providing warnings/instructions, by providing maps, by calling out to attract attention, and/or by actually leading the way) to guide emergency personnel to a facility portion where the emergency is taking place, and/or where non-emergency facility occupants 1054 are in potential need of further assistance (such as medical attention and/or human assistance to leave an area). For example, the robot 1052 located between the fire 1110 and main lobby 1007 optionally takes a role to attract emergency services personnel, while the robot 1052 located near the elevators 1012 on upper floor 1002 optionally takes a role of warning facility occupants away from use of the elevators and/or the nearby endangered stairwell 1010.

• Directly acting to mitigate the emergency; e.g., a robot 1052 is equipped with and/or equips itself with a fire-fighting apparatus (water hose, fire extinguisher), and uses it to direct material against a wall, door, and/or another location to help achieve containment of a fire. For example, the robot 1052 positioned in special capacity space 1003 senses heat on the wall, and is directed by the system to spray water to cool the wall down.

• Taking up positions where the robots 1052 can continue to monitor a situation (e.g., a fire) to provide data to emergency and/or rescue workers, such as the fire's location and magnitude, and/or the presence of trapped individuals. Any robot in corridor 1045 of wing 1020 of ground floor 1001 could take up such a role (and/or another robot 1052 could be sent there, or to another location). For example, the robot 1052 shown detecting heat within a room 1040 of wing 1020 of the upper floor 1002 is optionally sent to the room next door to determine if the fire has spread or is in danger of spreading upward.

• In some embodiments of the present invention, the robot 1052 may open doors to ensure the best passage for necessary personnel and/or for evacuation.

• Other actions to assist facility occupants 1054 and/or emergency services personnel; for example clearing debris and/or acting as supports and/or bulwarks against collapse and/or debris.

Distributed robotic system for security concerns

In some embodiments of the present invention, a distributed robotic system comprising mobile sensing platforms 1052 is used to perform monitoring related to security concerns. Optionally, the security concern relates to a pattern of human occupancy in a portion of the facility. Examples of human occupancy patterns which may be associated with security concerns include:

• One or more individuals observed (by a succession of mobile sensing platforms 1052, optionally on the basis of an identifying signature, e.g., a signature appearance as detected by a recognition technology such as clothing and/or face recognition) to be present in a particular area for an unusual length of time (potentially loitering, illness, and/or medical emergency, depending on circumstances; e.g., an individual lying on the ground vs. standing against a wall is optionally a better match to a pattern of "ill").

• One or more individuals observed (by a succession of mobile sensing platforms 1052, optionally on the basis of an identifying signature) to be making noise in a particular place and/or time, and/or making violent or otherwise unusual (e.g., stumbling into and/or using for support) physical contact with facility elements like walls and/or furnishings (potentially drunken and/or disorderly behavior).

• One or more individuals observed (by a succession of mobile sensing platforms 1052, optionally on the basis of an identifying signature) to be present in a range of locations without explanation (potentially someone lost, and/or potentially looking for an opportunity to commit theft). Optionally, patterns of unexplained movement are further characterized (e.g., to distinguish different subtypes of "unexplained wanderers") by one or more additional characteristics such as speed of movement, occurrence of back-tracking, and/or nature of the areas visited.

• One or more individuals observed (by a succession of mobile sensing platforms 1052, optionally on the basis of an identifying signature) to be present in a location without an associated indication in a tracked history of the facility that the same individual(s) are authorized, and/or had passed through one or more other facility locations acting as checkpoints (potentially someone who has entered a portion of the facility without authorization, and/or has gained access through an unexpected and/or non-approved entryway).

• Disappearance for an extended time of one or more individuals previously noted in a space of the facility, without that individual passing through certain facility locations acting as checkpoints (potentially someone who has left the facility through a non-approved egress, taken up hiding within the facility, and/or encountered a problem that has left them motionless or otherwise difficult to detect).

A plurality of observed patterns of movement and/or other behavior of individuals identified (for whatever reason) as a potential security concern may be subjected to machine learning so that new instances of the same pattern may also be identified as a potential security concern, potentially before the pattern develops to the point of being an overt security concern.

In some embodiments, identified patterns of movement and/or other behavior which indicate a security concern are statistically (e.g., by machine learning) or otherwise associated with one or more future movements/behaviors. Optionally, security actions to address the security concern (such as one of those described below) are dispatched on the basis of predicted behavior (e.g., to a particular location and/or using a particular security resource, human and/or robotic).
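
One simple stand-in for such statistical association is sketched below: a first-order transition-count model rather than any particular machine learning method of the disclosure, with invented location names:

# A first-order transition-count model as a simple stand-in for the
# statistical association described above; location names are invented.
from collections import defaultdict, Counter

class MovementPredictor:
    def __init__(self):
        self.transitions = defaultdict(Counter)

    def observe(self, track):
        """track: sequence of locations visited by one tracked individual."""
        for here, there in zip(track, track[1:]):
            self.transitions[here][there] += 1

    def predict_next(self, location):
        """Most likely next location, e.g., for pre-positioning a resource."""
        counts = self.transitions[location]
        return counts.most_common(1)[0][0] if counts else None

predictor = MovementPredictor()
predictor.observe(["lobby", "stairwell_B", "corridor_2", "storeroom"])
predictor.observe(["lobby", "stairwell_B", "corridor_2", "storeroom"])
predictor.observe(["lobby", "stairwell_B", "corridor_2", "exit_3"])
print(predictor.predict_next("corridor_2"))   # -> "storeroom"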

For example, in some embodiments, a distributed robotic system is configured to match patterns associated with status data-indicated behaviors and/or profiles, such as a drunken lost guest, a group of youngsters creating noise, and/or a person sleeping on a lobby couch.

An identifying signature, in some embodiments, comprises one or more of a plurality of different status data types and/or identifications. For example, one robot 1052 may image one side of an individual while another has imaged the other side, under circumstances where it is likely that both robots 1052 are imaging the same individual (e.g., simultaneously, in rapid succession, and/or when only one motion signature is present in the same building region). Other examples include spotting a special mark (e.g., a tattoo or item of apparel), a voice recording, gait, and/or height noted next to some reference point of the facility. It is noted that the identifying signature (or any portion thereof) is not necessarily uniquely identifying; rather, its recurrence is one piece of information that helps to establish the likelihood of a certain facility status condition existing and/or developing.
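
A sketch of how partial, multi-modal signature observations might be scored for recurrence follows; the feature names and the acceptance threshold are assumptions for the example, and a high score indicates likely (not proven) recurrence:

# Illustrative signature-matching sketch: a signature here is a bag of
# partial features from different modalities; feature names and the
# acceptance threshold are assumptions for the example.
def signature_match_score(sig_a, sig_b):
    """Fraction of agreeing features among modalities observed in both."""
    shared = set(sig_a) & set(sig_b)
    if not shared:
        return 0.0          # no overlapping modalities: no evidence either way
    return sum(1 for k in shared if sig_a[k] == sig_b[k]) / len(shared)

obs_1 = {"jacket_color": "red", "gait_class": "limping", "height_band": "tall"}
obs_2 = {"jacket_color": "red", "height_band": "tall", "voice_cluster": 7}
if signature_match_score(obs_1, obs_2) > 0.8:
    print("likely recurrence of the same signature; update the pattern")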

In some embodiments (e.g., for use with mobile sensing platforms 1052 without a camera, for identification of individuals outside of the view of a camera, and/or for mobile sensing platforms 1052 without visual processing algorithms suitable to identify individuals), the identifying signature may be based on motion detection, vibration, voice detection and/or identification, and/or another sensing modality. Optionally, signature identification is omitted; e.g., a sequence of observations noting unexplained motion in a stairwell is optionally flagged as a security concern whether or not there is an identifying signature available to determine that the motion is due to a particular individual, or even necessarily to human movement.

In some embodiments, one or more robots 1052 of the system are operated in a security patrol mode, e.g., according to a patrol route which may be predetermined (e.g., from a central controller), and/or may include autonomous, central controller-instructed, and/or arbitrary movements between locations, for example in reaction to status data collected, and/or in order to avoid an overly stereotyped patrol pattern.

In some embodiments, a robotic mobile sensing platform 1052 (robot) detects status data related to security concerns incidentally, e.g., in the course of performing another task (a "primary task", e.g., a task which guides at least 50%, 75% or more of movements of the robot 1052 between different locations). In some embodiments, the robotic mobile sensing platform 1052 performs the primary task autonomously. Optionally, a controller of a distributed robotic system to which the robot 1052 belongs interrupts the primary task to move the robotic mobile sensing platform 1052 to perform a movement related to monitoring of a security concern. The interruption is optionally periodic during performance of a primary task (e.g., every 5 minutes, 10 minutes, or another interval), episodic, and/or opportunistic (e.g., optionally triggered by other status data patterns, and/or when the robot 1052 comes within a certain distance of a potential monitoring target in the course of performing a primary task).
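
The interruption logic described above might be sketched as follows, with all parameters (period, proximity radius, coordinates) being illustrative assumptions:

# Hypothetical interruption logic for a robot performing a primary task;
# the period, proximity radius, and coordinates are invented parameters.
def should_interrupt(now, last_check, monitor_targets, robot_pos,
                     period_s=300.0, near_m=5.0, alert=False):
    if alert:                             # episodic: a status data pattern fired
        return True
    if now - last_check >= period_s:      # periodic, e.g., every 5 minutes
        return True
    def dist(a, b):                       # opportunistic: passing near a target
        return ((a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2) ** 0.5
    return any(dist(robot_pos, t) <= near_m for t in monitor_targets)

# e.g., a cleaning robot checks a stairwell it happens to pass
if should_interrupt(now=920.0, last_check=700.0,
                    monitor_targets=[(12.0, 3.5)], robot_pos=(10.5, 3.0)):
    print("pause primary task; perform monitoring movement; then resume")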

Actions (such as an action described below) to address a security condition (besides sensing) are also optionally performed as interruptions to a primary task. In some embodiments, the robot returns to the primary task after a control interruption related to monitoring a facility security condition and/or addressing it by another action.

Upon noticing a security concern, the distributed robotic system (or any of its member robots/mobile sensing platforms 1052) optionally acts to inform a supervising human, for example, a security staff member. Optionally, robots 1052 comprising and/or carrying a mobile sensing platform 1052 are used to perform one or more actions upon the detection of a security concern. It is noted that some of these actions may elicit from the individual a response behavior (e.g., aggression toward the robot 1052, ignoring the robot 1052, and/or vacating the facility space) which itself is associated with a status data pattern that indicates a security condition. Optionally, robots 1052 are deliberately controlled to elicit a response from an individual in an uncertain security condition (for example, is an individual lost, or is the individual looking for a chance to steal unattended belongings?), in order to obtain information. Examples include:

• Moving a robot 1052 to block and/or open doors (and/or windows). Potentially, this acts to confine, contain, and/or guide individuals, for example intruders seeking to evade contact with security equipment and/or personnel.

• Moving a robot 1052 to stand in the way of an individual, as an additional or alternative way to guide and/or discourage individuals.

• Moving a robot 1052 out of hiding or otherwise suddenly into view ("ambush"), to potentially disrupt and/or guide an individual's path through the facility.

• Moving a robot 1052 to follow and/or observe an individual.

• Moving a robot 1052 to safely approach an individual.

• Operating a robot 1052 to otherwise interact with (e.g., speak to, display a message to, and/or gesticulate to) an individual. Optionally, the robot 1052 offers assistance, e.g., to an individual who is potentially lost, distressed, and/or experiencing a medical emergency. Optionally, the robot 1052 interaction is polite and/or deliberately "dumb", even if a less innocent security condition is of concern: for example, a suspected unauthorized intruder is asked to await assistance (e.g., to potentially give time for greater resources to be brought to bear on the situation). Optionally, the interaction is more assertive; for example: a demand for identification, or an order to leave or to remain in place, as may be considered appropriate to a security policy. Optionally, interactions are selected according to a profile of an individual, and/or (for example, if the individual is unidentified) according to an identified characteristic of the present individual, such as age (e.g., a child is addressed in a parental tone; a potentially elderly person is addressed with emphasis on determining if there is a medical emergency).

• Operating a robot 1052 to create a disturbance; for example, flash a light, orient a potentially dazzling light at an individual, and/or sound an alarm (siren, voice or other sound).

• Changing any behavior of a robot 1052 in the presence of an individual, without necessarily taking an action that is oriented toward the individual. For example, a cleaning robot 1052 optionally begins cleaning a stairwell floor occupied by a potential loiterer.

• Moving one or more robots 1052 to monitor areas away from the noted individual(s) implicated in a security condition, e.g., exits and/or security observation dead zones.

The above activities may be undertaken by more than one robot 1052; for example, a plurality of robots 1052 may be used to raise a disturbance, follow an individual, block an entrance, or perform another action. In some embodiments, at least one robot 1052 of a distributed robotic system is a dedicated security robot 1052 which is preferentially dispatched to perform actions in response to security conditions. In some embodiments, robots 1052 specialized (e.g., in tooling and/or programming) for one or more facility operations tasks are repurposed for security condition responses, e.g., according to a rotation schedule and/or availability.
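
For illustration, a hypothetical dispatch policy that selects a response action and prefers a dedicated security robot when one is available might look like the following; the condition names, action names, and robot records are invented:

# Hypothetical dispatch sketch: maps a detected security condition to a
# response action, preferring a dedicated security robot when available.
# Condition names, action names, and robot records are invented.
POLICY = {
    "possible_lost_guest": "approach_and_offer_assistance",
    "possible_intruder":   "follow_and_observe",
    "noise_disturbance":   "interact_politely",
    "unexplained_motion":  "move_to_monitor",
}

def dispatch(condition, robots):
    action = POLICY.get(condition, "notify_security_staff")
    security = [r for r in robots if r["dedicated_security"]]
    robot = (security or robots)[0]        # prefer dedicated security robots
    return robot["id"], action

robots = [{"id": "cleaner_4", "dedicated_security": False},
          {"id": "guard_1", "dedicated_security": True}]
print(dispatch("possible_intruder", robots))  # -> ('guard_1', 'follow_and_observe')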

It is noted that for purposes, e.g., of pattern training and/or security drills, robots may themselves be used to test and simulate alarm and security systems.

In some embodiments, one or more robots are moved to discover "dead zones" (e.g., blind routes) in a security system (e.g., when the robot is no longer apparent within any other monitoring sensor's output, it stands within a dead zone). The robot optionally maps dead zones, and/or acts itself to monitor the dead zone. This may be used, e.g., to counteract deliberate de-activation of security monitors by an intruder.
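
A minimal sketch of such dead-zone discovery follows, assuming an invented coverage model of circular sensor ranges:

# Illustrative dead-zone discovery sketch: a robot's own position is a
# dead zone whenever no monitoring sensor currently reports seeing it.
# The circular-range coverage model and all values are assumptions.
def is_dead_zone(robot_pos, sensors):
    """sensors: list of (x, y, range_m) for fixed monitoring sensors."""
    def dist(a, b):
        return ((a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2) ** 0.5
    return all(dist(robot_pos, (sx, sy)) > r for sx, sy, r in sensors)

def map_dead_zones(patrol_points, sensors):
    """Visit candidate points and record those unseen by any sensor."""
    return [p for p in patrol_points if is_dead_zone(p, sensors)]

sensors = [(0.0, 0.0, 8.0), (20.0, 0.0, 8.0)]
points = [(4.0, 2.0), (10.0, 1.0), (18.0, 3.0)]
print(map_dead_zones(points, sensors))   # -> [(10.0, 1.0)]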

Distributed Calibration

Reference is now made to Figure 7, which is a flowchart schematically describing a method for estimation and calibration of robotic sensor bias, according to some embodiments of the present disclosure. Sensors for a plurality of mobile sensing platforms (robotic and/or robotically carried) are potentially different from one another in at least two senses: (1) different sensors may produce systematically different measurement values from each other for the same measurement target, and (2) different sensors may produce measurement values which have different degrees of consistency (e.g., variance for measurement of a same target), even after taking into account systematic differences. Moreover, both systematic differences and consistency differences can change over time.

As an example: a plurality of robots navigates an environment, using wheel rotation encoders as a measure of distances travelled. A wheel encoder for a particular robot that is used to indicate travel distances may be attached to a wheel that intermittently (e.g., once per rotation on average) loses rolling contact with the ground, resulting in measurement errors. In some embodiments of the invention, this produces a systematic difference in measurements of distances compared to other robots, which may be understood as a bias and/or calibration error. Over time (e.g., within a single period of use), the bias may change, for example, due to wear on the wheel, changes in temperature, pickup of material from the environment, and/or redistribution of wheel bearing lubrication. Moreover, the intermittent loss of rolling contact may be more or less regular; for example, it may occur exactly once per rotation, once per rotation with some probability, and potentially also for a longer or shorter distance on each rotation. Other types of sensors may have variable calibration and/or consistency due to the same and/or different factors.

In some embodiments, systematic measurement differences among mobile sensing platforms are corrected for by an iterative adjustment of calibration values as new data is collected. Optionally, differences in consistency of measurements are accounted for by iterative adjustment of weighting given to different mobile sensing platforms, and/or individual sensors thereof.

Potentially, the weighting improves result accuracy: emphasizing accurate robots by assigning them greater weight in a weighted mean (or other combination) of measurements, so that they contribute more to the result. A single mobile sensing platform comprising a plurality of sensors can have more than one weight; for example, one robot can have a larger weight for one measure and a smaller weight for another. Lowering the relative weighting of less accurate robots potentially minimizes their impact.

At block 100, in some embodiments, mobile sensing platforms of the system perform data collection. The data collected optionally comprises any quantitative measurement, for example: mass, volume, distance, velocity, temperature, etc. Measurements which provide a direct or indirect estimate of mobile sensing platform position (and/or objects in a target environment encountered by such mobile sensing platforms) are an embodiment of particular interest; position sensing is a common application of robotic sensor measurements.

Sensing is performed by a plurality of mobile sensing platforms, each a plurality of times, within a target environment. The target environment comprises a plurality of targets for measurement by sensing. Moreover, the sensing produces a multiplicity of measurements from different mobile sensing platforms which can be compared to one another. Comparison may be between measurements of the same target (e.g., a feature of a same target, and/or the relative position of a same target, optionally expressed as a distance between two same targets). Optionally, comparison is facilitated by use of a method of interpolation and/or extrapolation to estimate values of measurements equivalent to one another (e.g., linear, bilinear, quadratic or higher-order, Lanczos, spline fitting, coherence-based modeling, or another method).

In some embodiments, mobile sensing platform position, and optionally time of measurement, are also recorded and stored together with quantitative measurements (it is noted that the quantitative measurements to be calibrated may actually be of position, and/or a parameter used to estimate position).

At block 110, in some embodiments, the system performs calibration of data from the individual mobile sensing platforms according to a most recently calculated set of mobile sensing platform calibration values (for example, calculated or recalculated as described with respect to block 140). Optionally, calibration is per mobile sensing platform type (or other grouping), per mobile sensing platform, and/or per sensor type carried by each mobile sensing platform. For example, distances to objects may be calculated from a single mobile sensing platform using a plurality of sensing methods, such as laser, LADAR, and machine vision (e.g., one or both of stereo imaging and triangulation).

At block 120, in some embodiments, the system performs analysis of the calibrated data in preparation for the updating of calibration values for the mobile sensing platforms. In overview, the updating (comprising blocks 120, 130, and 140) uses comparison of measurements from the various mobile sensing platforms with corresponding reference measurements at positions within the target environment. This can be done with a robotic management center which, e.g., performs statistical (e.g., machine-learning based) data analysis, and re-distributes results to individual mobile sensing platforms, e.g., in the form of correction and/or calibration tables and/or other data structures.

Optionally, the plurality of mobile sensing platforms are also used (e.g., via network computing) to provide processors which perform the calibration, analysis and/or estimation operations of blocks 110, 120, and/or 140. In some embodiments, the analysis produces a weighted mean 130, which acts as the new reference value. In some embodiments, a reference value is calculated by another method, and weighted mean 130 should be understood to be optionally substituted with any such alternatively produced reference value. At block 140, differences from the weighted mean 130 for individual mobile sensing platforms are used to recalculate calibration values before returning to data collection at block 100.

In more detail: measurement, in some embodiments, occurs within a target environment for which a known measurement reference state (for example a map reflecting a current best estimate of measurements for targets within the target environment) is available. The known measurement reference state optionally comprises one or more weighted means 130, but may be otherwise constructed. In some embodiments, reference measurements which the environment's reference state comprises are generated from previous measurements of the mobile sensing platforms operating within the measured target environment. Reference measurements are optionally interpolated and/or extrapolated to fill in gaps in available measurement data.

In some embodiments, the reference measurements used are generated as a weighted average, or other weighted combination, of measurements collected during iterations of block 100. Weighting of a particular measurement source is optionally increased according to increasing consistency (e.g., after calibration) with a corresponding reference value.

In some embodiments, weights are determined using a machine learning technique such as a deep learning and/or convolutional neural network algorithm. In an example implementation of an unsupervised learning algorithm, calibrated measurement values are used as inputs to a machine learning algorithm for a determination of mobile sensing platform-dependent weights. Teaching feedback to the machine learning algorithm for adjusting the weights is optionally provided by some metric of reference measurement consistency (e.g., over time and/or for repeated measurements of the same target) when measurements are combined (e.g., averaged) using the weights. Optionally, calibration parameters (estimated at block 140) and weights are co-determined using a deep learning machine learning algorithm.

Optionally, seed weights are provided; for example, a previously selected subset of robots which are known to have high accuracy and/or precision are given maximum weightings; while other, potentially less accurate and/or precise robots are given lower weightings. Weightings themselves are optionally adjusted over time: for example, as robots become better calibrated, they may begin to make a more reliable contribution (a contribution which introduces less variance) to the reference measurements. In some embodiments, weighting is increased for new and/or recently maintained mobile sensing platforms as they acquire an established record of measurements consistent with the reference measurements.

In some embodiments, a distributed robotic system calculates a weighted mean estimation of the value of one or more measured parameters (that is, to obtain reference measurements), for example according to the equation:

$v_{mean} = \frac{\sum_{i} w_i \cdot v_i}{\sum_{i} w_i}$

wherein: $v_i$ = measured value of robot $i$; $w_i$ = weight of robot $i$; and $v_{mean}$ = weighted mean value.

In some embodiments, another algorithm of obtaining weighted reference measurements is used, for example, maximum likelihood estimation.
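
In code, the weighted mean reference above might be computed as follows; this is a minimal Python sketch, and the per-measure weight table is an invented example illustrating that one robot can carry different weights for different measures:

# Minimal sketch of the weighted mean reference above; note that a single
# robot may carry a different weight for each measured parameter.
def weighted_reference(measurements, weights, measure):
    """measurements: {robot_id: value}; weights: {robot_id: {measure: w}}."""
    num = sum(v * weights[r][measure] for r, v in measurements.items())
    den = sum(weights[r][measure] for r in measurements)
    return num / den

weights = {"robot_1": {"distance": 0.9, "temperature": 0.2},
           "robot_2": {"distance": 0.4, "temperature": 0.8}}
distance_readings = {"robot_1": 10.2, "robot_2": 10.9}
print(weighted_reference(distance_readings, weights, "distance"))  # ~10.42
# robot_1, being more consistent for distance, dominates the reference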

At block 140, in some embodiments, the system provides new estimations of calibration factors for the robots of the self-calibrating system. Re-estimation of calibration at block 140 is calculated for one or more of: single robots, a robotic group (e.g., a group of robots which are expected to have similar calibration values), and/or a plurality of different robot groups, e.g., groups defined by robots which are expected to be similar among themselves in calibration, but potentially distinct from robots in other groups.
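
A compact sketch of one full iteration of the loop (blocks 100-140) follows; the learning rate, the weight-update rule, and all values are assumptions for illustration, not the disclosed method:

# Hypothetical sketch of one iteration of Figure 7 (blocks 100-140):
# calibrate raw values (110), form a weighted reference (120-130), and
# re-estimate per-robot bias and weight (140). The learning rate and the
# weight-update rule are assumptions for illustration.
def calibration_step(raw, bias, weight, lr=0.2):
    """raw, bias, weight: {robot_id: value} for one shared target."""
    calibrated = {r: raw[r] - bias[r] for r in raw}                 # block 110
    reference = (sum(calibrated[r] * weight[r] for r in raw)
                 / sum(weight.values()))                            # block 130
    for r in raw:                                                   # block 140
        error = calibrated[r] - reference
        bias[r] += lr * error          # nudge calibration toward agreement
        # robots deviating more from the reference gradually lose weight
        weight[r] = max(0.05, weight[r] * (1.0 - lr * abs(error)
                                           / (abs(reference) + 1e-9)))
    return reference

bias = {"robot_1": 0.0, "robot_2": 0.0}
weight = {"robot_1": 1.0, "robot_2": 1.0}
for raw in [{"robot_1": 10.1, "robot_2": 10.8}] * 5:                # block 100
    reference = calibration_step(raw, bias, weight)
print(round(reference, 2), {r: round(b, 2) for r, b in bias.items()})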

The flowchart optionally iterates for as long as robots of the distributed robotic system remain active, resulting in continuous monitoring and/or updating of calibrations, weighting, and/or the measured state of the monitored environment. Optionally, histories of evolving results are stored in detail. Optionally, stored histories are reviewed from time to time, for example, to identify situations and/or aspects of robot performance in real-world environments which are prone to leading to errors, and which may be targeted for development of algorithmic and/or hardware improvements.

Reference is now made to Figure 8, which is a schematic representation of an environment 200 of a distributed robotic system, according to some embodiments of the present disclosure. Environment 200 optionally comprises a warehouse. Optionally, the method of Figure 7 is carried out within the context of environment 200.

In some embodiments, environment 200 comprises a robotic warehouse— that is, a warehouse wherein warehousing operations (e.g., placing, picking, rearranging and/or inventorying warehouse stock) are performed at least partially using a distributed robotic system.

In some embodiments, environment 200 comprises static objects 210, movable (e.g., portable) objects 220, robots 230, and sensors 240.

Robots 230 operating as a distributed robotic system within environment 200 (e.g., as part of an automatic warehouse) can use the method of Figure 7 to potentially improve navigation accuracy and/or environmental mapping (locations of objects 220, 210). In some embodiments, each robot 230 constantly measures static and/or movable objects 210, 220, transferring the data to an analysis computer (not shown).

Static sensors 240 optionally also provide measurements; static sensors 240 optionally comprise cameras, or any other sensor-equipped installation, for example an internet of things (IoT) device. The analysis computer can be located anywhere in the world. The analysis computer returns to the robots an estimation of positions within environment 200 which has been adjusted for calibration and/or consistency as described in relation to Figure 7.

Distributed Calibration Examples

Environment 200 is optionally a type of environment other than a warehouse. Examples include:

Observable feature dimension estimation: In some embodiments, aircraft (airplane, drone, and/or helicopter aircraft, for example) travel above and/or among 3-D objects such as mountains, buildings, etc. While moving above and/or among these objects, they measure the physical and/or angular dimensions of various features, and/or estimated distances of aircraft travel. Robots (e.g., robots between two given buildings) provide measurements to a computer configured to carry out the method of Figure 7, which then returns a recalculated estimate of actual distances to the two buildings. The robots may also plan their own movements using calibration adjustments used by the computer.

Crowd counting: In some embodiments, humans form a crowd within an area also occupied and/or surrounded by robots and optionally "internet of things" (IoT) devices (online cameras and/or sensors). The robots and/or devices are configured to sense the number of people located in the area; e.g., based on floor vibrations, ambient sound levels, voice analysis, visual scene analysis, and/or another measurement method. Each sensor sends an estimation of the number of people to a computer configured to carry out the method of Figure 7, optionally along with a specification of the region of the area which the estimate samples. The computer produces an estimate of the actual number of people in the crowd; optionally returning it to the robots or otherwise providing it, for example, as a value used as a basis for determining further action, as a record of area occupancy, or for another use. Optionally, robots adjust their reports according to calibration adjustments provided by the computer.

Temperature measurement: In some embodiments, hospital service robots move among and take measurements of a plurality (e.g., dozens) of patients over time scales which repeat measurements, e.g., every few minutes. Measurements may be, for example, of the body temperature of patients (e.g., using infrared thermal sensing). Each robot sends temperature estimate results to a computer configured to carry out the method of Figure 7, which then outputs an estimate of the actual body temperature of each measured patient. It is noted that in this case measurements need not be referenced to actual positions; rather, they may be referenced to individual patients identified by another method, for example, an electronically read ID token (such as an ID bracelet), and/or a recognition method such as face recognition. Optionally, robots adjust their temperature reports according to calibration adjustments provided by the computer.

Outdoor Applications: In some embodiments, robots move within the setting of an agricultural field, potentially comprising great distances and/or a dirty environment liable to produce inaccuracies in sensor results, e.g., due to contamination and/or irregularity of the environment.

Robotic platforms may be configured, for example, to harvest, inspect, plant, spray, weed, and/or otherwise tend a crop. During operation, the platforms collect measurements indicating estimated locations of objects. For example, each robot sends measurement estimates to a computer configured to carry out the method of Figure 7, which then outputs an estimate of actual positions of objects in the field based on sensing from a plurality of robotic platforms (as well as appropriate calibration factors). Optionally, robots use the calibration values and/or centrally estimated positions to guide their own navigation.

Reference is now made to Figure 9, which is a top view of a multi-location robotic environment 201 (e.g., distinct areas 200A, 200B, 200C, 200D), according to some embodiments of the present disclosure. In some embodiments, the method of Figure 7 is carried out across a plurality of locations, for example locations 200A, 200B, 200C, 200D of Figure 9.

Server 205 (used for performing processor-computed actions of Figure 7) is optionally coupled to receive data from and/or distribute results to all the distinct areas 200A, 200B, 200C, 200D. Alternatively or additionally, server cloud 206 is used to perform processor-computed actions of Figure 7.

In some embodiments, mobile sensing platforms that are part of a same distributed robotic system but moving within different areas can interact in one or more of the following manners:

• Calibration is performed across types of robots; calibration values determined for robots in one area are optionally distributed to robots in other areas. For example, at multiple sites (like the distinct areas 200A, 200B, 200C, 200D), robots of the same and/or related models and/or comprising similar implementations of features (e.g., shared software, same model(s) of hardware sensors) are installed. Data from robots with similarities are optionally grouped for calibration/weighting in order to calculate trends and robot behavior. For example, some robot type may be found to systematically measure too short, so that all robots of that type can be assigned a calibration factor that compensates. It is noted that the calibration factor is optionally condition dependent; e.g., dependent on a travel surface (carpet vs. hard floor), temperature, or another factor.

• Individual robots can move from one area to another; their previous calibration values are optionally maintained upon entry into a new area.

As used herein with reference to quantity or value, the term "about" means "within ±10% of".

The terms "comprises", "comprising", "includes", "including", "having" and their conjugates mean: "including but not limited to".

The term "consisting of means: "including and limited to".

The term "consisting essentially of means that the composition, method or structure may include additional ingredients, steps and/or parts, but only if the additional ingredients, steps and/or parts do not materially alter the basic and novel characteristics of the claimed composition, method or structure.

As used herein, the singular form "a", "an" and "the" include plural references unless the context clearly dictates otherwise. For example, the term "a compound" or "at least one compound" may include a plurality of compounds, including mixtures thereof.

The words "example" and "exemplary" are used herein to mean "serving as an example, instance or illustration". Any embodiment described as an "example" or "exemplary" is not necessarily to be construed as preferred or advantageous over other embodiments and/or to exclude the incorporation of features from other embodiments.

The word "optionally" is used herein to mean "is provided in some embodiments and not provided in other embodiments". Any particular embodiment may include a plurality of "optional" features except insofar as such features conflict.

As used herein the term "method" refers to manners, means, techniques and procedures for accomplishing a given task including, but not limited to, those manners, means, techniques and procedures either known to, or readily developed from known manners, means, techniques and procedures by practitioners of the chemical, pharmacological, biological, biochemical and medical arts.

As used herein, the term "treating" includes abrogating, substantially inhibiting, slowing or reversing the progression of a condition, substantially ameliorating clinical or aesthetical symptoms of a condition or substantially preventing the appearance of clinical or aesthetical symptoms of a condition.

Throughout this application, embodiments of this disclosure may be presented with reference to a range format. It should be understood that the description in range format is merely for convenience and brevity and should not be construed as an inflexible limitation on the scope of the disclosure. Accordingly, the description of a range should be considered to have specifically disclosed all the possible subranges as well as individual numerical values within that range. For example, description of a range such as "from 1 to 6" should be considered to have specifically disclosed subranges such as "from 1 to 3", "from 1 to 4", "from 1 to 5", "from 2 to 4", "from 2 to 6", "from 3 to 6", etc.; as well as individual numbers within that range, for example, 1, 2, 3, 4, 5, and 6. This applies regardless of the breadth of the range.

Whenever a numerical range is indicated herein (for example "10-15", "10 to 15", or any pair of numbers linked by another such range indication), it is meant to include any number (fractional or integral) within the indicated range limits, including the range limits, unless the context clearly dictates otherwise. The phrases "range/ranging/ranges between" a first indicated number and a second indicated number and "range/ranging/ranges from" a first indicated number "to", "up to", "until" or "through" (or another such range-indicating term) a second indicated number are used herein interchangeably and are meant to include the first and second indicated numbers and all the fractional and integral numbers therebetween.

Although the invention has been described in conjunction with specific embodiments thereof, it is evident that many alternatives, modifications and variations will be apparent to those skilled in the art. Accordingly, it is intended to embrace all such alternatives, modifications and variations that fall within the spirit and broad scope of the appended claims.

All publications, patents and patent applications mentioned in this specification are herein incorporated in their entirety by reference into the specification, to the same extent as if each individual publication, patent or patent application was specifically and individually indicated to be incorporated herein by reference. In addition, citation or identification of any reference in this application shall not be construed as an admission that such reference is available as prior art to the present invention. To the extent that section headings are used, they should not be construed as necessarily limiting.

It is appreciated that certain features, which are, for clarity, described in the context of separate embodiments within the present disclosure, may also be provided in combination in a single embodiment. Conversely, various features, which are, for brevity, described in the context of a single embodiment, may also be provided separately or in any suitable subcombination or as suitable in any other described embodiment of the present disclosure. Certain features described in the context of various embodiments are not to be considered essential features of those embodiments, unless the embodiment is inoperative without those elements.