

Title:
CONTROL SYSTEMS FOR AUTOMATIC BARRIERS
Document Type and Number:
WIPO Patent Application WO/2022/211793
Kind Code:
A1
Abstract:
Barrier control systems and methods are disclosed herein that utilize a sensor subsystem with at least one sensor to capture data from a region proximate to a barrier. The sensor subsystem may detect objects and/or the open or closed state of a barrier. The system may determine minimum aperture values, such as barrier widths and/or heights, for each of a plurality of future time increments. For example, the system may calculate the size of the object and/or predict the trajectory of the object using a trained occupancy prediction machine learning model. The system may utilize the size and/or a predicted trajectory of the objects to determine aperture values for future time increments to avoid collisions and/or minimize airflow. A motor control interface may transmit the aperture values to a motor control unit that directly controls the opening and closing of the barrier.

Inventors:
HAYES ALEXANDER MARK (AT)
Application Number:
PCT/US2021/025021
Publication Date:
October 06, 2022
Filing Date:
March 31, 2021
Assignee:
HAYES ALEXANDER MARK (AT)
International Classes:
E05F15/73; E05F15/74; E05F15/79; E06B11/02; E06B11/06; G06T7/20; G06T15/08
Domestic Patent References:
WO2020229175A1 2020-11-19
WO2018191260A1 2018-10-18
WO2019238718A1 2019-12-19
Foreign References:
DE102016119339A1 2018-04-12
US20190271185A1 2019-09-05
US20180321758A1 2018-11-08
US9879466B1 2018-01-30
CN110454027A 2019-11-15
Attorney, Agent or Firm:
FLANAGAN, Justin (US)
Claims:
What is claimed is:

1. A barrier control system, comprising: a sensor subsystem that includes imaging sensors to image a region proximate a roll-up door; an aperture estimation subsystem to: calculate a height of an object imaged by the sensor subsystem, predict a trajectory of the object using an occupancy prediction machine learning model, identify future time increments, based on the predicted trajectory, during which the object is predicted to occupy a barrier actuation region of the roll-up door, and determine a minimum height opening state for the roll-up door for the future time increments based on the calculated height of the object; and a motor control unit to actuate the roll-up door to attain the determined minimum height opening state during the future time increments during which the object is predicted to occupy the barrier actuation region of the roll-up door.

2. A barrier control system, comprising: a sensor subsystem, with at least one sensor to capture sensor data, to detect an object within a threshold distance of a barrier; an aperture analysis subsystem to utilize the sensor data captured by the sensor subsystem to: identify future time increments during which the object is predicted to occupy a barrier actuation region of the barrier, and determine an aperture value for the barrier to attain during the future time increments, wherein the barrier actuation region includes a region within which at least a portion of the barrier moves during actuation; and a motor control interface to transmit, to a motor control unit, the aperture value for the barrier to attain during the future time increments.

3. The barrier control system of claim 2, wherein the aperture analysis subsystem is configured to identify the future time increments during which the object is predicted to occupy a barrier actuation region of the barrier by utilizing stored sensor data that includes (i) a current position of the object and (ii) at least two previous positions of the object.

4. The barrier control system of claim 2, wherein the aperture analysis subsystem is configured to identify the future time increments during which the object is predicted to occupy a barrier actuation region of the barrier by utilizing the sensor data to calculate at least one vectorial component of: a second derivative of position associated with the object, a third derivative of position associated with the object, a fourth derivative of position associated with the object, a fifth derivative of position associated with the object, and a sixth derivative of position associated with the object.

5. The barrier control system of claim 2, wherein the aperture value comprises a minimum aperture value to accommodate a dimension of the object.

6. The barrier control system of claim 2, wherein the barrier actuation region is defined as one of: a region of space beneath a roll-up door, and a region of space in which a pivoting door traverses when opening.

7. The barrier control system of claim 2, wherein the aperture analysis subsystem is configured to: utilize a trained occupancy prediction machine learning model to identify the future time increments during which the object is predicted to occupy a barrier actuation region of the barrier.

8. The barrier control system of claim 7, wherein the occupancy prediction machine learning model comprises a trained path prediction machine learning model to predict a trajectory of the object that traverses the barrier actuation region of the barrier.

9. The barrier control system of claim 7, further comprising: a machine learning training subsystem to: detect, via the sensor subsystem, a route traversed by the object that includes time increments during which the object occupied the barrier actuation region, and provide the detected route of the object as training feedback to the occupancy prediction machine learning model.

10. The barrier control system of claim 7, wherein the occupancy prediction machine learning model is trained using object trajectory data provided by a plurality of barrier control systems installed in multiple locations.

11. The barrier control system of claim 7, wherein the occupancy prediction machine learning model receives low-level sensor data from the sensor subsystem.

12. The barrier control system of claim 7, further comprising at least one preprocessing subsystem to preprocess low-level sensor data from the sensor subsystem to generate high-level abstractions of the sensor data for delivery to the occupancy prediction machine learning model.

13. The barrier control system of claim 12, wherein the at least one preprocessing subsystem comprises one of: an object position tracker to provide high-level abstraction data indicating a position of the detected object, and a leaning angle calculator to provide high-level abstraction data indicating an angle at which a human is leaning.

14. The barrier control system of claim 12, wherein the at least one preprocessing subsystem comprises an object type identifier to identify a characteristic of the object.

15. The barrier control system of claim 12, wherein the at least one preprocessing subsystem comprises a point cloud generator to generate high-level abstraction data in the form of a multidimensional point cloud of each of a plurality of detected objects segmented by object type.

16. The barrier control system of claim 2, wherein the sensor subsystem comprises a LiDAR sensor.

17. The barrier control system of claim 2, wherein the sensor subsystem comprises a plurality of optical imaging sensors.

18. The barrier control system of claim 17, wherein at least one of the optical imaging sensors comprises an infrared imaging sensor, and wherein another of the plurality of optical imaging sensors comprises a visible light imaging sensor.

19. The barrier control system of claim 2, wherein the barrier comprises a roll-up door, and wherein the minimum aperture value comprises a height value for a bottom edge of the roll-up door to, at least, attain.

20. The barrier control system of claim 2, wherein the barrier comprises at least one of: a sectional panel door, a hinged door, a sliding door, a rotatable turnstile, a pivot arm, retractable plates, a folding door, retractable spikes, a revolving door, a passenger access barrier, a train door, and a vehicle door.

21. A barrier control system, comprising: a processor; and a non-transitory computer-readable medium with instructions stored thereon that, when executed by the processor, cause the barrier control system to: identify, using an occupancy prediction machine learning model, future time increments during which an object detected by a sensor subsystem is predicted to occupy a barrier actuation region of a barrier; predict a barrier opening state to avoid collisions between the barrier and the object for the future time increments during which the object detected by the sensor subsystem is predicted to occupy the barrier actuation region; determine if the occupancy prediction machine learning model is sufficiently trained; in response to a determination that the occupancy prediction machine learning model is sufficiently trained:

(i) transmit, to a motor control unit, the predicted barrier opening state for the identified future time increments during which the object is predicted to occupy the barrier actuation region; in response to a determination that the occupancy prediction machine learning model is insufficiently trained:

(i) calculate, via a preprogrammed trajectory algorithm, a barrier opening state for the future time increments based on a calculated trajectory, and

(ii) transmit, to the motor control unit, the minimum barrier opening state for the future time increments based on the calculated trajectory, and track, via the sensor subsystem, subsequent object movement data and provide the movement data as training data for the occupancy prediction machine learning model.

22. The barrier control system of claim 21, wherein the motor control unit employs a model predictive controller which plans its future control inputs to minimize a cost function of collision likelihoods over a prediction horizon.

23. The barrier control system of claim 21, wherein the predicted barrier opening state is used by the motor control unit as a minimum constraint to be exceeded during the future time increments.

24. The barrier control system of claim 23, wherein the motor control unit utilizes a model predictive controller that plans future control signals to minimize a cost function associated with at least one of: avoiding collisions and airflow transmission.

25. The barrier control system of claim 23, wherein the sensor subsystem comprises at least one of: a LiDAR sensor, a visible light imaging sensor, an infrared imaging sensor, an ultrasonic sensor, and a millimeter-wave sensor.

26. The barrier control system of claim 23, wherein the preprogrammed trajectory algorithm uses a worst-case trajectory to calculate the barrier opening state for the future time increments based on a calculated trajectory.

27. The barrier control system of claim 23, wherein the predicted barrier opening state for the identified future time increments during which the object is predicted to occupy the barrier actuation region comprises a predicted minimum barrier opening state to avoid collision.

28. The barrier control system of claim 23, wherein the instructions, when executed by the processor, further cause the barrier control system to: mark the occupancy prediction machine learning model as: insufficiently trained while a sequential count of accurate predictions is below a threshold training value, and sufficiently trained while the sequential count of accurate predictions exceeds the threshold training value.

29. The barrier control system of claim 23, wherein the motor control unit employs a model predictive controller which plans its future control inputs to minimize an airflow-based cost function over a prediction horizon.

30. A method, comprising: predicting, using an occupancy prediction machine learning model, a barrier opening state to avoid collisions between an object detected by a sensor subsystem and a barrier for future time increments during which the object is predicted to occupy a barrier actuation region of the barrier; and transmitting, to a motor control unit, the predicted barrier opening state for the future time increments during which the object is predicted to occupy the barrier actuation region of the barrier.

31. A barrier control system, comprising: a plurality of sensors to generate sensor information associated with objects detected proximate a barrier; an aperture analysis subsystem to determine a minimum aperture value for the barrier to attain at future time increments based on the sensor information, wherein the aperture analysis subsystem comprises a machine learning model trained with a collision-based cost function; and a motor control interface to transmit control instructions to a motor control unit, wherein the control instructions include the minimum aperture value for the future time increments.

32. The barrier control system of claim 31, wherein the machine learning model of the aperture analysis subsystem is further trained with an airflow-based cost function.

33. The barrier control system of claim 32, wherein the airflow-based cost function comprises a cost term to minimize airflow from one side of the barrier to another side of the barrier.

34. The barrier control system of claim 32, wherein the airflow-based cost function comprises a cost term to minimize energy consumption associated with climate control of at least one side of the barrier.

35. The barrier control system of claim 31, wherein the machine learning model of the aperture analysis subsystem is further trained with a cost term to minimize wear and tear on at least one mechanical component associated with the barrier.

36. The barrier control system of claim 31, wherein the sensor information comprises position information from one or more of a LiDAR sensor, a time-of-flight camera, an induction loop, a floor scale, and a reflector tracker.

37. A barrier control system, comprising: a processor; and a non-transitory computer-readable medium with instructions stored thereon that, when executed by the processor, cause the barrier control system to: in response to a determination that an occupancy prediction machine learning model is sufficiently trained: transmit, to a motor control unit, a barrier opening state for future time increments during which an object is predicted to be within a barrier actuation region of a barrier by the occupancy prediction machine learning model; and in response to a determination that the occupancy prediction machine learning model is insufficiently trained: calculate, via a preprogrammed algorithm, a barrier opening state for future time increments based on a calculated trajectory of the object, and transmit, to the motor control unit, the calculated barrier opening state for the future time increments based on the calculated trajectory of the object.

Description:
Control Systems for Automatic Barriers

RELATED APPLICATIONS

[0001] None

TECHNICAL FIELD

[0002] This disclosure relates to control systems for barriers that open and close, such as roll-up doors, sliding doors, gates, turnstiles, and the like. Additionally, this disclosure relates to artificial intelligence, machine learning, and model predictive control.

BRIEF DESCRIPTION OF THE DRAWINGS

[0003] FIG. 1 illustrates a diagram of the state of a barrier relative to time, according to one embodiment.

[0004] FIG. 2A illustrates actuation of a barrier according to different control systems for detected objects and predicted future objects, according to one embodiment.

[0005] FIG. 2B illustrates actuation of the barrier according to different control systems, according to one embodiment.

[0006] FIG. 2C illustrates a diagram of the actuation of lateral-moving door panels by different control systems, according to one embodiment.

[0007] FIG. 3A illustrates a block diagram of an example of a barrier control system, according to one embodiment.

[0008] FIG. 3B illustrates a block diagram of another example of a barrier control system, according to one embodiment.

[0009] FIG. 3C illustrates a block diagram of another example of a barrier control system, according to one embodiment.

[0010] FIG. 4A illustrates a flow chart of a method for calculating a minimum opening value for a barrier based on a trajectory calculated using a static, preprogrammed trajectory algorithm, according to one embodiment.

[0011] FIG. 4B illustrates a flow chart of a method for calculating a minimum opening value for a barrier based on a trajectory predicted by a path prediction machine learning model, according to one embodiment.

[0012] FIG. 4C illustrates a flow chart of a method for calculating a minimum opening value for a barrier before and after training of a machine learning model, according to one embodiment.

[0013] FIG. 4D illustrates a flow chart of a method for calculating a minimum opening value for a barrier based on future time increments during which an object is predicted to occupy a barrier actuation region using an occupancy prediction machine learning model, according to one embodiment.

[0014] FIG. 5A illustrates a block diagram of a barrier control system utilizing a machine learning model, according to one embodiment.

[0015] FIG. 5B illustrates a block diagram of another example of a barrier control system utilizing an end-to-end machine learning model, according to one embodiment.

[0016] FIG. 5C illustrates a block diagram of another example of a barrier control system utilizing an end-to-end machine learning model with some pre-processed data, according to one embodiment.

[0017] FIG. 5D illustrates a block diagram of another example of a barrier control system utilizing an end-to-end machine learning model with pre-processed data, according to one embodiment.

[0018] FIG. 6A illustrates a block diagram of a barrier control system that includes integrated sensors, according to one embodiment.

[0019] FIG. 6B illustrates an example of a barrier control system with integrated sensors mounted above a roll-up door, according to one embodiment.

[0020] FIG. 6C illustrates an example of a barrier control system with an integrated sensor mounted to the side of a roll-up door, according to one embodiment.

[0021] FIG. 6D illustrates the roll-up door partially rolled up to show a barrier actuation region that includes the leading edge of the roll-up door, according to one embodiment.

[0022] FIG. 7 illustrates an example field of view of sensors integrated into a barrier control system, according to one embodiment.

[0023] FIG. 8 illustrates a barrier control system analyzing trajectory information of nearby equipment and a person, according to one embodiment.

[0024] FIG. 9A illustrates a modeled trajectory with no acceleration, according to one embodiment.

[0025] FIG. 9B illustrates modeled trajectory possibilities of an object with relatively small acceleration possibilities, according to one embodiment.

[0026] FIG. 9C illustrates modeled trajectory possibilities of an object with relatively large acceleration, according to one embodiment.

[0027] FIG. 9D illustrates modeled trajectory possibilities of an object with relatively large acceleration with a known obstacle, according to one embodiment.

[0028] FIG. 10 illustrates an example diagram of predicted trajectories of an object relative to a barrier, according to one embodiment.

[0029] FIG. 11A illustrates a graphical representation of a predicted minimum required opening height and an actual minimum required opening height with respect to time, according to one embodiment.

[0030] FIG. 11B illustrates a graphical representation of measurements between the predicted minimum opening heights and the actual minimum opening heights used to train a machine learning process, according to one embodiment.

[0031] FIG. 12A illustrates a graph of the opening aperture of a door (e.g., door height) relative to time for two different control systems, according to one embodiment.

[0032] FIG. 12B illustrates a graph of the difference in the aperture opening of the doors controlled by the two different control systems of FIG. 12A, according to one embodiment.

DETAILED DESCRIPTION

[0033] Barriers, such as doors and gates, are frequently used to control access to, for example, secure locations and climate-controlled locations. For example, a roll-up door may be used to control access to a refrigerated room. When the door is rolled down (e.g., closed), the door provides a barrier that helps keep the refrigerated room cold. The roll-up door can be manually or automatically rolled up, at least partially, to allow a person or vehicle to enter the refrigerated room. The door may remain open until the person or vehicle leaves the refrigerated room, or the door may be closed until the person or vehicle is ready to leave the refrigerated room. Similar principles can be applied to a wide range of barriers configured to prevent access by unauthorized individuals, maintain controlled climate conditions, and/or otherwise maintain a separation between the two sides of the barrier. In some instances, a barrier may exist to prevent entry by unauthorized persons. In many instances, a barrier may only be present to prevent or reduce the flow of a fluid (e.g., a gas or a liquid) between two rooms (or between a room and outside of the room).

[0034] The barrier control systems and methods described herein are applicable to a wide variety of motorized and electronically opened barriers including, without limitation, automatic industrial doors, panel doors, roll-up doors, rolling doors, pivot turnstiles, folding doors, sliding doors, lifting gates, swinging doors, revolving doors, passenger access barriers, train doors, bus doors, subway doors, sectional panel doors, pivot arms, rotatable turnstiles, and the like. For the sake of brevity, many of the examples described herein are provided in the context of an electronically controlled (motorized) roll-up door that controls access to a refrigerated room. However, it is appreciated that many of the same principles, functionalities, systems, methods, subsystems, and other components of the barrier control systems and methods described herein are equally applicable or can be adapted for use with other types of barriers, including the various types of barriers listed above.

[0035] According to various embodiments, a barrier control system includes a sensor subsystem with one or more sensors to detect an object, such as a person, an animal, or a vehicle (e.g., a forklift, truck, car, or the like). The sensor subsystem may determine various information about the object, including, without limitation, the location of the object relative to a barrier, the width of the object, the height of the object, velocity information, trajectory information, acceleration information, and/or characteristics of the object (e.g., the direction a person is looking, the direction a person's head is turned, the angle at which a person is leaning, etc.). The sensor subsystem may also include sensors to monitor moisture, air pressure, temperature, and other environmental characteristics of one or both sides of the barrier and/or of the detected objects.

[0036] As described herein, in some embodiments raw low-level sensor data is provided directly to a machine learning model (such as an occupancy prediction machine learning model and/or a path prediction machine learning model). In other embodiments, the raw low-level sensor data is pre-processed by specialized algorithms or machine learning models to generate high-level abstractions that are provided as inputs to the machine learning model. For example, raw low-level sensor data may be pre-processed by algorithms or specialized machine learning models to determine any of a wide variety of object characteristics which can be provided as inputs to the machine learning model for predicting an object's trajectory. Examples of object characteristics that can be determined via the pre-processing algorithms or specialized machine learning models (or by the machine learning model that receives the raw data) include, but are not limited to, object position, object velocity, object acceleration, object heights, object widths, object depths, object types (e.g., identification), object orientation, a direction in which a pedestrian or a vehicle operator is looking, a size of a load, identification of tools being carried, hand movements of a pedestrian or operator of a vehicle (e.g., movement of a steering wheel), angle of a body (e.g., the direction a pedestrian or operator of a vehicle is leaning), and/or the like. In some embodiments, the object characteristics may include other derivatives of position in addition to velocity and acceleration including, for example, the object's snap or jounce, the object's crackle, and/or the object's pop.
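The higher derivatives of position mentioned above (velocity, acceleration, jerk, snap/jounce, crackle, pop) can be estimated from regularly sampled position data by repeated finite differencing. The sketch below is illustrative only and is not taken from the application; the function name and sampling assumptions are hypothetical:

```python
import numpy as np

def position_derivatives(positions, dt, max_order=4):
    """Estimate successive time derivatives of a 1-D position signal
    sampled at a fixed interval dt, by repeated finite differencing.

    Returns a dict mapping derivative order (1 = velocity,
    2 = acceleration, 3 = jerk, 4 = snap/jounce, ...) to the
    differenced signal (each pass shortens the signal by one sample).
    """
    derivatives = {}
    signal = np.asarray(positions, dtype=float)
    for order in range(1, max_order + 1):
        signal = np.diff(signal) / dt  # raise the derivative order by one
        derivatives[order] = signal
    return derivatives

# Example: an object under constant acceleration of 2 m/s^2, sampled at 10 Hz
dt = 0.1
t = np.arange(0.0, 1.0, dt)
x = 1.0 * t**2  # x = (a/2) * t^2 with a = 2
d = position_derivatives(x, dt)
```

For this constant-acceleration track, the second-order estimate is 2 m/s^2 everywhere and the third-order estimate (jerk) is zero; in practice, sensor noise amplifies quickly with each differencing pass, which is one reason higher derivatives would typically be smoothed or filtered first.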

[0037] In some embodiments, any combination of the above-identified object characteristics may be provided as pre-calculated high-level abstraction inputs into the machine learning model. In other embodiments, the machine learning model may be trained using raw data and the above-identified object characteristics may be explicitly determined by the machine learning model, impliedly determined by the machine learning model, or not determined at all if they are not used by the trained neural network.

[0038] Examples of suitable sensors include, but are not limited to, LiDAR sensors, stereo cameras (e.g., visible light or infrared), time of flight cameras, ultrasound sensors, induction coils, pressure sensors, radar, imaging millimeter-wave sensors (e.g., mmWave sensors), temperature sensors, humidity sensors, and the like. The sensor subsystem may utilize the collected data to generate two-dimensional point clouds, three-dimensional point clouds, visible light images, false-color images, tables or graphs representing object positions and velocities as a time series, frequency responses of measured objects, and the like.

[0039] Processing of the sensor data can be performed locally (i.e., within the barrier control system) and/or remotely (e.g., in remote servers or computing devices). In various embodiments, the sensor subsystem includes at least one sensor to monitor the location of a leading edge of the barrier being opened. For example, at least one sensor may monitor the relative location of the bottom edge of a roll-up door or the opposing edges of a two-panel pivoting door that opens inward or outward.

[0040] According to various embodiments, a sensor subsystem captures images or other sensory data to detect an object and detect the relative location of at least one edge of a barrier. An object analysis subsystem may utilize the data from the sensor subsystem to calculate trajectory information of the object. For example, the sensor subsystem may capture images via a red, green, blue (RGB) image sensor and/or generate a point cloud via a LiDAR sensor. The object analysis subsystem may process the data from the sensor subsystem to calculate trajectory information of an object proximate to the barrier and/or size information of the object.
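As one concrete illustration of how calculated trajectory information might be turned into an occupancy prediction, the sketch below extrapolates an object's one-dimensional track at constant velocity and reports the future time increments during which it would fall inside the barrier actuation region. All names are hypothetical, and a real implementation would use richer motion models and sensor fusion:

```python
def predict_occupancy_increments(samples, region_x, dt, horizon_steps):
    """Predict which future time increments an object will spend inside
    a barrier actuation region, using constant-velocity extrapolation.

    samples: list of (t, x) observations along the approach axis,
             most recent last.
    region_x: (edge_a, edge_b) bounds of the actuation region along x.
    Returns the list of step indices k (1-based, k*dt in the future)
    during which the extrapolated position lies inside the region.
    """
    (t0, x0), (t1, x1) = samples[-2], samples[-1]
    v = (x1 - x0) / (t1 - t0)  # constant-velocity assumption
    lo, hi = sorted(region_x)
    occupied = []
    for k in range(1, horizon_steps + 1):
        x_pred = x1 + v * k * dt
        if lo <= x_pred <= hi:
            occupied.append(k)
    return occupied

# An object 5 m away, approaching the region [0 m, 1 m] at 1 m/s:
steps = predict_occupancy_increments(
    [(0.0, 6.0), (1.0, 5.0)], (0.0, 1.0), dt=1.0, horizon_steps=10)
```

Here the object is predicted to occupy the region during increments 4 and 5; an object moving away from the barrier yields an empty list, so the door need not open at all.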

[0041] Any of a wide variety of machine learning algorithms may be utilized for the occupancy prediction and/or path prediction machine learning models described herein. For example, a feedforward artificial neural network, a long short-term memory artificial neural network, or another neural network can be trained to predict the minimally required opening degree of an aperture (e.g., the height of a door, the width of a door, the pivot angle of a gate, etc.) based on object size and predicted trajectory. The output of the machine learning model may provide minimally required opening state information for each of a plurality of future increments of time. As the predicted trajectory of a given object changes, the minimally required opening state information for a given increment of time may also change. A motor control unit may determine opening and/or closing actions to take at a given time based on the minimally required opening state information provided by the machine learning model. The motor control unit may utilize principles of model predictive control to control the opening and closing of the barrier, as discussed in greater detail below.
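The paragraph above describes a model that outputs a minimally required opening state for each future time increment. One way a motor control unit could act on such a schedule is sketched below: a backward pass lifts each constraint so that a rate-limited door can still reach it in time, and a forward pass then produces the smallest feasible opening profile. This is an illustrative stand-in under simplifying assumptions (fixed time step, symmetric maximum door speed), not the application's actual controller:

```python
def plan_opening_profile(min_heights, max_rate, dt, start=0.0):
    """Compute the smallest feasible opening-height profile meeting a
    per-increment minimum-height constraint with a rate-limited door.

    min_heights: required minimum opening height per future increment.
    max_rate: maximum door speed (height units per second).
    """
    n = len(min_heights)
    # Backward pass: if a constraint cannot be reached in one step, the
    # door must already be part-way open in the preceding increments.
    req = list(min_heights)
    for k in range(n - 2, -1, -1):
        req[k] = max(req[k], req[k + 1] - max_rate * dt)
    # Forward pass: rate-limit the actual motion toward each target.
    profile, prev = [], start
    for k in range(n):
        step = max(-max_rate * dt, min(max_rate * dt, req[k] - prev))
        prev = prev + step
        profile.append(prev)
    return profile
```

For example, with a 1 m/s door, a 1 s time step, and a 2 m clearance required during increments 3 and 4, the planner starts opening at increment 2 (so the height is reachable in time) and closes again immediately afterward, minimizing the open aperture and hence airflow.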

[0042] A basic automatic door may include a sensor that opens the door when it detects a moving object within specific regions on each side of the door. For example, a basic automatic door for a refrigerated room may open to a maximum height at a fixed opening speed in response to detecting a person or a vehicle within an external triggering region. The door may remain fully open until the person or vehicle is no longer within the external triggering region. In some instances, the person or vehicle may not enter the refrigerated room, in which case the door opened unnecessarily. If the person or vehicle does enter the refrigerated room, the door may remain fully open until the person or vehicle is outside of a corresponding internal triggering region. In various circumstances, a basic automatic door may open to a larger aperture than necessary, open too soon (e.g., for a slow-moving person), open too late (e.g., for a fast-moving vehicle), remain open for longer than necessary, and/or otherwise provide suboptimal operation.

[0043] As compared to existing automatic door controllers, the presently described barrier control systems and methods reduce the passage of air or other fluids during opening cycles, reduce the chance of collisions between vehicles, pedestrians, and the barrier, reduce temperature fluctuations, reduce moisture fluctuations, save energy, reduce the exchange of contaminants, reduce the likelihood of tailgating by unauthorized individuals and animals, provide controlled access, provide notifications of unexpected behaviors, provide notifications of accidents, and/or facilitate automatic maintenance scheduling.

[0044] Some of the infrastructure that can be used with embodiments disclosed herein is already available, such as general-purpose computers, computer programming tools and techniques, digital storage media, and communication links. Many of the systems, subsystems, modules, components, and the like that are described herein may be implemented as hardware, firmware, and/or software. Various systems, subsystems, modules, and components are described in terms of the function(s) they perform because such a wide variety of possible implementations exist. For example, it is appreciated that many existing programming languages, hardware devices, frequency bands, circuits, software platforms, networking infrastructures, and/or data stores may be utilized alone or in combination to implement a specific control function.

[0045] It is also appreciated that two or more of the elements, devices, systems, subsystems, components, modules, etc. that are described herein may be combined as a single element, device, system, subsystem, module, or component. Moreover, many of the elements, devices, systems, subsystems, components, and modules may be duplicated or further divided into discrete elements, devices, systems, subsystems, components, or modules to perform subtasks of those described herein. Any of the embodiments described herein may be combined with any combination of other embodiments described herein. The various permutations and combinations of embodiments are contemplated to the extent that they do not contradict one another.

[0046] As used herein, a computing device, system, subsystem, module, or controller may include a processor, such as a microprocessor, a microcontroller, logic circuitry, or the like. A processor may include one or more special-purpose processing devices, such as application-specific integrated circuits (ASICs), a programmable array logic (PAL), a programmable logic array (PLA), a programmable logic device (PLD), a field-programmable gate array (FPGA), or another customizable and/or programmable device. The computing device may also include a machine-readable storage device, such as non-volatile memory, static RAM, dynamic RAM, ROM, CD-ROM, disk, tape, magnetic, optical, flash memory, or another machine-readable storage medium. Various aspects of certain embodiments may be implemented or enhanced using hardware, software, firmware, or a combination thereof.

[0047] The components of some of the disclosed embodiments are described and illustrated in the figures herein to provide specific examples. Many portions thereof could be arranged and designed in a wide variety of different configurations. Furthermore, the features, structures, and operations associated with one embodiment may be applied to or combined with the features, structures, or operations described in conjunction with another embodiment. In many instances, well-known structures, materials, or operations are not shown or described in detail to avoid obscuring aspects of this disclosure. The right to add any described embodiment or feature to any one of the figures and/or as a new figure is explicitly reserved.

[0048] The embodiments of the systems and methods provided within this disclosure are not intended to limit the scope of the disclosure but are merely representative of possible embodiments. In addition, the steps of a method do not necessarily need to be executed in any specific order, or even sequentially, nor do the steps need to be executed only once.

[0049] FIG. 1 illustrates a diagram 100 of the state of a barrier relative to time, according to one embodiment. In the illustrated diagram 100, the vertical axis represents the actuation of the barrier between a closed state and an open state. The horizontal axis represents time and includes a current time marker 105. As illustrated, a detected object 107 requires the barrier to be opened to a first actuation state. A safety margin 110 can be applied to ensure clearance and avoid collisions between the barrier and the object traversing it. Additional objects 150 are predicted to traverse the barrier in the near future, as described in greater detail below.

[0050] A traditional barrier control system, such as the motion-based control system 115, shows an actuation that is slightly too late, intersecting the safety margin zone, and ultimately opening much more than necessary. In contrast, a barrier control system based on a trained machine learning model (as described in greater detail herein) opens to the height specified according to the safety margin, but no farther.

[0051] FIG. 2A illustrates actuation of a barrier according to different control systems 220 and 230 for detected objects 210 and predicted future objects 250, according to one embodiment. In the illustrated diagram, the detected objects 210 and predicted future objects 250 are shown with safety margins (if any) included. As illustrated, the traditional, motion-based controller 220 opens much more than necessary (e.g., beyond 6 meters). In contrast, the machine-learning-based control system 230 opens only as much as needed to avoid collisions.

[0052] The area beneath the lines of the motion-based controller 220 and the machine-learning-based control system 230 corresponds to possible airflow that traverses the barrier. In the case of refrigerated rooms, minimizing airflow that traverses the barrier can reduce cooling costs and/or temperature fluctuations.

[0053] FIG. 2B illustrates actuation of the barrier according to different control systems, according to one embodiment. As illustrated, the predicted objects 250 from FIG. 2A have now traversed the barrier as actual objects 251. Again, the motion-based controller 220 opens much more than necessary (e.g., beyond 6 meters), while the machine-learning-based control system 230 opens only as much as needed to avoid collisions.

[0054] FIG. 2C illustrates a diagram 203 of the actuation of lateral-moving door panels 280 and 285 by different control systems, according to one embodiment. As illustrated, a traditional, motion-based controller (with a timer) 264 causes the lateral-moving door panels 280 and 285 to open much more than necessary. The machine-learning-based control system 262 opens only as much as needed 260 to avoid collisions, or more specifically to a width that includes a fixed or learned safety margin 261.

[0055] For example, the machine-learning-based control system 262 may learn from forklift operators that slow down due to the doors being opened only just enough (e.g., to the width of the object 260). The slower forklift driving may result in the doors being opened longer. The machine learning model may self-adjust based on a minimize-airflow cost function to open the doors wider (e.g., dynamically adjusting the safety margin 261) so that the forklift drivers slow down less (or not at all), as determined to minimize airflow.

[0056] FIG. 3A illustrates a block diagram of an example of a barrier control system 301 that includes a sensor subsystem 310, an aperture analysis subsystem 320, a motor control interface 330, a path prediction machine learning model 340, an occupancy prediction machine learning model 345, and a machine learning training subsystem 350. According to various embodiments, the sensor subsystem 310 includes one or more sensors or sensor arrays to capture data from a region or regions proximate to a barrier. For example, the sensor subsystem 310 may detect an object in motion, such as a vehicle or pedestrian, that is within a threshold distance from the barrier (e.g., within the range and field of view of the sensors in the sensor subsystem 310). In various embodiments, the sensor subsystem 310 may capture images of a region proximate to a barrier (e.g., via an RGB image sensor or a LiDAR scanning sensor).
[0057] In some embodiments, at least one sensor of the sensor subsystem 310 is positioned and configured with a field of view that allows for detection of an edge of the barrier that moves as the barrier is opened and closed. For example, the sensor subsystem 310 may include a camera that is positioned with a field of view sufficient to capture images of objects as they approach a region in front of a roll-up door and images of the leading edge of the roll-up door as it transitions from a closed state to an open state.

[0058] The sensor subsystem 310 may comprise any of a wide variety and/or combination of sensors. Examples of suitable sensors include, but are not limited to, LiDAR sensors, stereo cameras (e.g., visible light or infrared), time-of-flight cameras, ultrasound sensors, induction coils, pressure sensors, radar, imaging millimeter-wave sensors (e.g., mmWave sensors), temperature sensors, humidity sensors, and the like. The sensor subsystem may utilize the collected data to generate three-dimensional images, two-dimensional point clouds, three-dimensional point clouds, visible light images, false-color images, tables or graphs representing object positions and velocities as a time series, frequency responses of measured objects, and/or the like.

[0059] In one embodiment, the sensor subsystem 310 includes two cameras positioned above or beside a door (e.g., a stereo camera system) to capture images outside of a room (i.e., on one side of the door) and two more cameras positioned above or beside the door to capture images inside of the room (i.e., on the other side of the door). A framerate for image capture (e.g., 15, 30, 60, 100, or 240 frames per second) may be selected to provide adequate time resolution for object position, velocity, and/or acceleration calculations, while (in some embodiments) minimizing data collection, storage, and processing. Each stereo camera system may, for example, provide a 70° vertical field of view and a 120° lateral field of view. Each stereo camera system may be positioned such that one edge of the vertical field of view can capture images of the leading edge of the door when it is fully open or approximately fully open. In such a position, the stereo camera system can be used to monitor the position of door panels or other portions of a barrier and detect objects at a relatively far distance from the barrier without any blind spots.
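The selected framerate sets the time resolution available for estimating object motion between frames. As a brief illustrative sketch (not part of the disclosure; the function name and position format are assumptions), velocities and accelerations can be derived from successively sampled positions by finite differences:

```python
# Illustrative only: estimate per-frame velocity and acceleration vectors
# from (x, y) positions sampled at a fixed framerate.
def finite_differences(positions, fps):
    dt = 1.0 / fps  # time between consecutive frames, in seconds
    velocities = [
        ((x1 - x0) / dt, (y1 - y0) / dt)
        for (x0, y0), (x1, y1) in zip(positions, positions[1:])
    ]
    accelerations = [
        ((vx1 - vx0) / dt, (vy1 - vy0) / dt)
        for (vx0, vy0), (vx1, vy1) in zip(velocities, velocities[1:])
    ]
    return velocities, accelerations
```

Higher framerates shrink `dt` and sharpen the estimates at the cost of additional data collection, storage, and processing, consistent with the trade-off described above.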

[0060] According to various embodiments, the sensor subsystem 310 can track the position of multiple objects at the same time and calculate velocity vectors for each object. The sensor subsystem 310 may generate a three-dimensional bounding box for each object that specifies a height, width, and/or depth. In some embodiments, the sensor subsystem 310 can detect and distinguish between different objects (e.g., detect an object as either a pedestrian or a forklift). In other embodiments, the path prediction machine learning model 340 may be trained to detect and distinguish between different objects. In various embodiments, the sensor subsystem 310 includes temperature compensation circuits to compensate for temperature fluctuations and/or an accelerometer to align the coordinate system and facilitate detection of the floor of a facility.

[0061] In some embodiments, the aperture analysis subsystem 320 may calculate size information of an object (e.g., height information relevant to the height a roll-up door must open, or width information relevant to the width pivoting lateral doors must open). In some embodiments, the size information may be provided by the sensor subsystem 310, while in other embodiments, the aperture analysis subsystem 320 may calculate the size information using images or point cloud information provided by the sensor subsystem.

[0062] The aperture analysis subsystem 320 may predict a trajectory of each detected object using the trained path prediction machine learning model 340. In some embodiments, the path prediction machine learning model 340 may receive images or a point cloud as an input and be trained to detect objects, determine object locations, and predict trajectories. In other embodiments, the data input into the path prediction machine learning model 340 may include high-level detection results that can serve to expedite training on trajectory predictions. In embodiments in which raw or low-level sensor data is provided as an input to the machine learning model 340, the machine learning model 340 may be trained to implicitly develop its own object detection and position tracking functionalities. In embodiments in which raw or low-level sensor data is pre-processed to generate data with a higher level of abstraction (e.g., data associated with detected objects, object positions, object dimensions, object velocities, etc.), the machine learning model 340 can be more quickly trained and/or refined to predict object trajectories.

[0063] Based on the size information of each object and the predicted trajectory of each detected object, the aperture analysis subsystem 320 may determine a minimum aperture value for the barrier to attain for each of a plurality of future time increments. The motor control interface 330 may communicate the minimum aperture values for each future time increment to a motor control unit. The motor control unit may initiate the opening and closing of the barrier to ensure the minimum aperture values are attained at each respective time increment. The actual route of the object may be tracked by the sensor subsystem 310 and provided to a machine learning training subsystem 350. The machine learning training subsystem 350 can use the actual route of the object to further improve or refine the path prediction machine learning model 340.

[0064] As a specific example, the sensor subsystem 310 may detect a forklift and two pedestrians proximate the barrier. The forklift may be determined to have a height of 3 meters and each pedestrian may be determined to have a height of 2 meters. The aperture analysis subsystem 320 may utilize the path prediction machine learning model 340 to predict trajectories of all three objects. The predicted trajectory of the forklift may indicate that the forklift will traverse a roll-up door from time increments 3 to 5. Accordingly, the barrier control system may inform the motor control unit, via the motor control interface 330, that the roll-up door should be opened to a minimum height of 3 meters during time increments 3 to 5.

[0065] The predicted trajectory of the first pedestrian may indicate that the first pedestrian does not intend to enter the room via the roll-up door. Accordingly, no minimum height information may be transmitted to the motor control unit. A trajectory of the second pedestrian, calculated by itself using a static trajectory algorithm, might normally indicate that the second pedestrian will traverse the roll-up door from time increments 5 to 9. However, the training of the path prediction machine learning model may have included numerous training data in which a pedestrian and a forklift are entering a room at approximately the same time. In this training data, the pedestrians may have frequently yielded to the forklift. Accordingly, without necessarily evaluating the “why” of it, the path prediction machine learning model 340 may predict a trajectory for the second pedestrian indicating that the second pedestrian will traverse the roll-up door from time increments 6 to 10. Accordingly, the barrier control system may inform the motor control unit, via the motor control interface 330, that the roll-up door should be opened to a minimum height of 2 meters during time increments 6 to 10.
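The worked example above can be sketched as a simple schedule computation. The following sketch is illustrative only (the function name and data layout are assumptions, not the claimed implementation): each object contributes its height to every time increment in its predicted traversal window, and overlapping requirements resolve to the larger value:

```python
def aperture_schedule(objects):
    """Each object is (height_m, (first_increment, last_increment)), or
    (height_m, None) if no traversal is predicted. Returns a mapping of
    time increment -> minimum opening height in meters."""
    schedule = {}
    for height, window in objects:
        if window is None:
            continue  # e.g., a pedestrian not predicted to enter
        first, last = window
        for t in range(first, last + 1):
            # If two objects overlap, keep the larger required opening.
            schedule[t] = max(schedule.get(t, 0.0), height)
    return schedule

objects = [
    (3.0, (3, 5)),   # forklift, predicted to traverse at increments 3-5
    (2.0, None),     # first pedestrian: no traversal predicted
    (2.0, (6, 10)),  # second pedestrian, predicted to yield to the forklift
]
```

In this example, the second pedestrian's window already reflects the model's prediction that the pedestrian yields to the forklift.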

[0066] According to various embodiments, the barrier control system 301 may utilize the occupancy prediction machine learning model 345 instead of or in addition to the path prediction machine learning model 340. The occupancy prediction machine learning model 345 may utilize low-level data, raw data, pre-processed data (high-level data) and/or other sensor data collected by a sensor subsystem. The occupancy prediction machine learning model 345 may identify future time increments (e.g., a set of continuous or discontinuous time increments) during which a detected object (or objects) is predicted (by the machine learning model 345) to occupy a barrier actuation region of a barrier. The barrier control system 301 may predict barrier opening states suitable to avoid collisions between the barrier and the object for the future time increments during which the object detected by the sensor subsystem is predicted to occupy the barrier actuation region.

[0067] FIG. 3B illustrates a block diagram of another example of a barrier control system 302 that further includes the motor control unit 335, a preprogrammed trajectory algorithm subsystem 360, and a model predictive control subsystem 370. In the illustrated embodiment, the barrier control system 302 includes an integrated motor control unit 335. In contrast, the barrier control system 301 of FIG. 3A communicates with an external and independently functioning motor control unit.

[0068] The barrier control system 302 also includes a preprogrammed trajectory algorithm subsystem 360 that can calculate a worst-case trajectory based on fixed variables. In some embodiments, a barrier control system may be deployed for operation in a facility with minimal or no training of the machine learning model 341 (e.g., the path prediction machine learning model 340 (FIG. 3A) and/or the occupancy prediction machine learning model 345 (FIG. 3B)). In such embodiments, the untrained or undertrained machine learning model 341 may not yet be capable of accurately predicting object trajectories. To ensure that the barrier control system 302 can be immediately used, the barrier control system may initially utilize a static, preprogrammed trajectory algorithm subsystem 360 to calculate the trajectories of detected objects (e.g., a worst-case trajectory). The aperture analysis subsystem 320 may determine aperture values (e.g., minimum aperture values) for the barrier to attain during future time increments based on the trajectory calculated for each detected object.

[0069] In various embodiments, the preprogrammed trajectory algorithm subsystem 360 may calculate a worst-case trajectory based on a function of the current position and velocity of a detected object and default or preprogrammed maximum possible acceleration values. For example, while a trained neural network may correctly predict that a forklift driver driving away from a barrier is unlikely to turn around and go through the barrier, a worst-case trajectory algorithm may indicate that such a trajectory is possible and, therefore, inform the motor control unit to open the barrier to a specific height during specific time increments “just in case” the worst-case trajectory is realized.
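Such a worst-case trajectory can be sketched with elementary kinematics. The sketch below is illustrative (the function names and one-dimensional simplification are assumptions): the detected object is assumed to begin accelerating toward the barrier immediately at its maximum possible acceleration, up to a maximum velocity:

```python
def worst_case_distance(v0_mps, a_max_mps2, v_max_mps, t_s):
    """Distance covered toward the barrier in t_s seconds if the object
    immediately accelerates at a_max_mps2, capped at v_max_mps. v0_mps is
    the current velocity component toward the barrier (negative if the
    object is presently moving away)."""
    # Time needed to reach the velocity cap from the current velocity.
    t_cap = max(0.0, (v_max_mps - v0_mps) / a_max_mps2)
    if t_s <= t_cap:
        return v0_mps * t_s + 0.5 * a_max_mps2 * t_s ** 2
    d_accel = v0_mps * t_cap + 0.5 * a_max_mps2 * t_cap ** 2
    return d_accel + v_max_mps * (t_s - t_cap)

def could_reach_barrier(distance_m, v0_mps, a_max_mps2, v_max_mps, t_s):
    """True if, in the worst case, the object could reach the barrier
    within t_s seconds, so the barrier must be open then."""
    return worst_case_distance(v0_mps, a_max_mps2, v_max_mps, t_s) >= distance_m
```

Note that an object currently moving away (a negative `v0_mps`) can still satisfy the check given enough time, matching the forklift example above.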

[0070] The actual route of the forklift in the example above may be used as training data to improve the machine learning model(s) 341. Once the machine learning model 341 has demonstrated an ability to predict trajectories of objects with sufficient accuracy (in the case of a path prediction machine learning model) or predict time increments during which the object is calculated/estimated/predicted to occupy the barrier actuation region of the barrier (in the case of the occupancy prediction machine learning model), the barrier control system 302 may begin utilizing the trajectories predicted by the path prediction machine learning model and/or the occupancy predictions generated by the occupancy prediction machine learning model instead of the trajectories calculated using the preprogrammed trajectory algorithm subsystem 360.

[0071] The barrier control system 302 also includes a model predictive control subsystem 370 to enhance the operation of the motor control unit 335. In some embodiments, including those that do not utilize model predictive control, the motor control unit 335 may attempt to open the barrier to the minimum aperture value specified for each time increment by the aperture analysis subsystem 320. In other embodiments, the barrier control system may utilize principles of model predictive control to determine an optimal control input to be executed by the motor control unit 335, as described in greater detail below.

[0072] FIG. 3C illustrates a block diagram of another example of a barrier control system 303, according to one embodiment. As illustrated, the barrier control system 303 includes a bus 305 that connects a processor 307, a memory 309, and a network interface 311 to a computer-readable storage medium 390. The computer-readable storage medium 390 includes a sensor subsystem module 391, a machine learning model 392 (e.g., for path prediction and/or occupancy prediction), a minimum barrier opening state calculation module 393, a preprogrammed trajectory algorithm module 394, a trajectory training data feedback module 395, and a prediction accuracy module 396.

[0073] Many of the functions of the modules 391-396 are similar to those of the previously described subsystems with similar or identical names. For example, the sensor subsystem module 391 may receive and process data from one or more sensors to detect objects proximate to the barrier, categorize or identify the object, detect a relative location of an edge of the barrier or otherwise determine a state of openness of the barrier, calculate size or other dimension information of the object, and the like. The machine learning model 392 may be trained using various training data from local sources and/or training data gathered from other, remote locations by other barrier control systems. The machine learning model 392 may be used to predict a trajectory and/or occupancy of one or more detected objects. For example, an occupancy prediction machine learning algorithm may predict time increments during which the object is predicted to occupy a barrier actuation region of the barrier (e.g., a region of space where a roll-up or sliding door moves).

[0074] In various embodiments, the machine learning model 392 predicts the trajectory and/or occupancy of a given object based on the current object position, current object velocity, type of object, size of object, and/or the predicted trajectories of other objects. Accordingly, in some instances, the machine learning model may be trained such that a predicted trajectory or occupancy state (e.g., future time increments during which the object is predicted to occupy the barrier actuation region of the barrier) of an object is based, at least in part, on the relative locations of other objects and/or the predicted trajectories of the other objects.

[0075] A minimum barrier opening state calculation module 393 may determine a minimum aperture value for the barrier to attain for each of a plurality of future time increments based on the calculated size information and predicted trajectory of the object. In the event that more than one object is expected to traverse the barrier at the same time or approximately the same time, the minimum aperture value for the barrier during the relevant time increments will be the greater of the individual minimum aperture values to ensure that neither of the two objects collides with the barrier.

[0076] The preprogrammed trajectory algorithm module 394, as previously described, may be used to ensure safe actuation and opening of the barrier until the machine learning model is fully or sufficiently trained. The preprogrammed trajectory algorithm module 394 may calculate a trajectory of detected objects based on a worst-case scenario in which detected objects accelerate toward the barrier at a selected maximum acceleration value and/or maximum velocity value. The selected maximum acceleration and/or velocity values may be theoretical maximum values for the given type of object, or they may be lower, fixed values manually selected as reasonable for the type of object in question. For instance, the maximum theoretical velocity value of a pedestrian may be around 11 m/s; however, the preprogrammed trajectory algorithm module 394 may utilize a manually assigned value of 6 m/s based on the assumption that pedestrians are sufficiently unlikely to sprint through the barrier. Similarly, the preprogrammed trajectory algorithm module 394 may utilize other, reasonable acceleration and velocity values to calculate worst-case trajectories for each of a variety of different types of objects.

[0077] The training data feedback module 395 may utilize actual route information of an object, as detected by the sensor subsystem, as training data to improve the machine learning model 392. The prediction accuracy module 396 may evaluate the trajectory predictions and/or occupancy predictions made by the machine learning model 392 against the actual routes of objects to determine when the barrier control system 303 can switch from using the preprogrammed trajectory algorithm module 394 to the machine learning model 392. For example, the prediction accuracy module 396 may determine that the machine learning model 392 is sufficiently trained based on a sequential count of accurate trajectory and/or occupancy predictions exceeding (e.g., equal to or greater than) a threshold value.
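A sequential count of this kind might be sketched as follows (illustrative only; the class name and interface are assumptions, and the disclosure specifies only that a streak of accurate predictions must meet or exceed a threshold):

```python
class PredictionAccuracyTracker:
    """Marks the model sufficiently trained once the count of
    *consecutive* accurate predictions reaches a threshold; any
    inaccurate prediction resets the streak."""

    def __init__(self, threshold):
        self.threshold = threshold
        self.streak = 0

    def record(self, prediction_was_accurate):
        # Extend the streak on an accurate prediction, otherwise reset it.
        self.streak = self.streak + 1 if prediction_was_accurate else 0
        return self.streak >= self.threshold
```

Once `record` returns True, the system could switch from the preprogrammed trajectory algorithm module to the machine learning model, as described above.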

[0078] FIG. 4A illustrates a flow chart of a method 401 for calculating a minimum opening value for a barrier based on a trajectory calculated using a static, preprogrammed trajectory algorithm, according to one embodiment. As illustrated, a sensor subsystem is used to detect, at 410, an object proximate a barrier. The system calculates, at 425, an object trajectory using a static, preprogrammed trajectory algorithm. The calculated trajectory is used to calculate, at 435, a minimum opening value for each of a plurality of future time increments for the barrier. The minimum opening values for each of the future time increments are transmitted, at 470, to a motor control unit.

[0079] FIG. 4B illustrates a flow chart of a method 402 for calculating a minimum opening value for a barrier based on a trajectory predicted by a path prediction machine learning model, according to one embodiment. The sensor subsystem is still used to detect, at 410, the object proximate the barrier. The system then predicts, at 420, an object trajectory using a path prediction machine learning model. The system calculates, at 430, a minimum opening value for each of a plurality of future time increments for the barrier. The minimum opening values for each of the future time increments are transmitted, at 460, to a motor control unit. The system then provides, at 480, an actual route of the object as detected by the sensor subsystem as training data for the path prediction machine learning model.

[0080] FIG. 4C illustrates a flow chart of a method for calculating a minimum opening value for a barrier before and after training of a machine learning model, according to one embodiment. FIG. 4A above outlines a method 401 that uses an algorithm to calculate a trajectory based on, for example, a worst-case scenario. FIG. 4B illustrates a method 402 that uses a trained machine learning model. FIG. 4C relates to a method in which a preprogrammed trajectory algorithm is used to calculate a trajectory while the path prediction machine learning model is being trained. According to various embodiments, the system may be immediately deployed for operation. During the initial training period, the barrier opens to prevent collisions using the preprogrammed trajectory algorithm. However, the initial operation may cause the barrier to open earlier than necessary, open unnecessarily, open to a wider aperture than needed, close later than possible, and/or exhibit other inefficiencies. Once the path prediction machine learning model is sufficiently trained, the system may discontinue using the preprogrammed trajectory algorithm in favor of the path prediction machine learning model.

[0081] As illustrated, the sensor subsystem detects, at 410, an object proximate to the barrier. In parallel (or sequentially), a path prediction machine learning model is used to predict, at 420, an object trajectory, and an object trajectory is calculated, at 425, using the static, preprogrammed trajectory algorithm. Minimum opening values are calculated, at 430 and 435, for each of a plurality of future time increments based on the predicted and calculated trajectories, respectively. If the machine learning model is sufficiently trained, at 450, then the minimum opening values calculated based on the trajectory predicted by the path prediction machine learning model are transmitted, at 460, to a motor control unit.

[0082] Otherwise, if the machine learning model is not yet sufficiently trained, at 450, then the minimum opening values calculated based on the trajectory calculated using the static, preprogrammed trajectory algorithm are transmitted, at 470, to the motor control unit. In some embodiments, once the machine learning model is identified or marked as sufficiently trained, elements 425, 435, and 470 may be omitted from the flow to avoid unnecessary computations. The sensor subsystem may be used to continually track the route of the object, and the actual route of the object may be provided, at 480, as training data to improve the accuracy of the path prediction machine learning model. The path prediction machine learning model may be marked, at 490, as sufficiently trained once the sequential count of accurate predictions exceeds a threshold training value, as described herein.

[0083] FIG. 4D illustrates a flow chart of a method for calculating a minimum opening value for a barrier based on future time increments during which an object is predicted to occupy a barrier actuation region using an occupancy prediction machine learning model, according to one embodiment. The sensor subsystem is still used to detect, at 411, the object proximate the barrier. The system then identifies, at 421, future time increments during which an object is predicted to occupy a barrier actuation region using an occupancy prediction machine learning model. The system calculates, at 431, an opening value (e.g., a minimum opening value) for the future time increments. The opening values for each of the future time increments are transmitted, at 461, to a motor control unit. The system then provides, at 481, an actual route or other position information of the object as detected by the sensor subsystem as training data for the occupancy prediction machine learning model.

[0084] FIG. 5A illustrates a block diagram of a machine learning model for training a barrier control system, according to one embodiment. The illustrated embodiment includes a sensor array 510 that may, for example, be a part of a sensor subsystem. The sensor array captures raw or low-level data that can be directly transmitted 517 to the aperture analysis subsystem 550 that includes a machine learning model 530. Alternatively, the raw or low-level data 515 may be transmitted to specialized pre-trained algorithms 520 that perform specific tasks, such as tracking positions, determining the leaning angles of people, estimating a forklift load type and/or weight, and/or generating a three-dimensional point cloud. The high-level abstractions 525 generated by the specialized pre-trained algorithms 520 may be provided as inputs to the machine learning model 530.

[0085] While the machine learning model 530 could potentially be trained using all the raw or low-level data 515, the training would likely require more training samples to accurately predict object trajectories. Specific sensor data may be pre-processed by specialized pre-trained algorithms that operate independently of the facility in which the barrier control system is installed. For example, a pre-trained algorithm 520 may be used to detect an object's position and provide a high-level abstraction 525 of the object's position as an input into the machine learning model 530.
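This division of labor can be sketched as a small pre-processing stage (illustrative only; the detector interface and names are assumptions): each pre-trained algorithm runs over the raw frames, and its high-level outputs become named inputs to the machine learning model:

```python
def build_model_inputs(raw_frames, detectors):
    """Apply each pre-trained algorithm ("detector") to every raw frame
    and collect the resulting high-level abstractions as named inputs."""
    return [
        {name: detector(frame) for name, detector in detectors.items()}
        for frame in raw_frames
    ]

# Toy example: a "detector" that simply echoes a tracked position.
toy_inputs = build_model_inputs(
    [{"pos": (0, 0)}, {"pos": (1, 2)}],
    {"position": lambda frame: frame["pos"]},
)
```

Real detectors (position trackers, leaning-angle estimators, point-cloud generators) would replace the toy lambda; the downstream model sees only the resulting abstractions.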

[0086] In other embodiments, the data captured by the sensor array 510 can be recorded and transmitted to the machine learning model 530 in a raw or low-level format. The data may be transmitted using wireless networking infrastructure, wired network infrastructure, various wireless protocols, via the Internet, etc. The machine learning model 530 may receive the raw or low-level data and be trained to predict object trajectories. The relative significance of the received raw or low-level data does not need to be determined ahead of time.

[0087] In contrast, some or all of the raw data may be low-level processed on-site or in the cloud prior to being provided to the machine learning model 530. For instance, the machine learning model 530 may be trained to predict object trajectories using pre-processed high-level abstractions 525, such as object positions, the leaning angles of persons, forklift load types, forklift weight estimations, a three-dimensional point cloud, etc. In such embodiments, the machine learning algorithm 530 does not need to learn the underlying processes of object detection, the leaning angles of persons, etc. Instead, the machine learning model 530 can leverage existing trained models and algorithms to make better use of the training data, which is especially important during initialization and early usage when the machine learning model is “data-starved.”

[0088] Accordingly, the proposed aperture analysis subsystem may utilize a modular approach that includes a machine learning model 530 that receives, as inputs, high-level abstractions from pre-built or pre-trained algorithms 520 for specialized analysis and raw low-level sensor data 515. For example, the machine learning model may be a path prediction machine learning model or an occupancy prediction machine learning model, as described herein.

[0089] In some embodiments, the machine learning model 530 may utilize the high-level abstractions 525 from the pre-built or pre-trained algorithms 520 during initial training and then use raw or low-level sensor data 517 to refine the training over time. In such embodiments, duplicate data (pre-processed high-level abstractions 525 and raw data 517) may be available to the machine learning model 530. In some embodiments, the barrier control system may limit or restrict data transmission and/or storage to high-level abstraction data 525 to intentionally decrease the training data size, reduce data bandwidth requirements, and/or reduce data storage requirements.

[0090] FIG. 5B illustrates a block diagram of another example of a barrier control system utilizing an end-to-end machine learning model, according to one embodiment. In the illustrated embodiment, the sensor array 510 provides raw or low-level sensor information 515 to the aperture analysis subsystem 550. The aperture analysis subsystem 550 includes an end-to-end machine learning model that, using one or more cost functions (e.g., a weighted average of cost functions), determines aperture openings for a barrier based on the provided raw or low-level sensor data 515. Barrier control instructions 540 can be sent to a barrier control unit to control the actuation of the barrier.

[0091] Accordingly, the barrier control system includes a plurality of sensors that generate any of a wide variety of sensor information about objects proximate a barrier. The aperture analysis subsystem 550 uses the end-to-end trained machine-learning model to determine a minimum aperture value for the barrier to attain at each of a plurality of future time increments based on the sensor information. For example, the end-to-end machine-learning model may be trained to minimize a cost function to avoid collisions, minimize a cost function associated with airflow traversing the barrier, minimize a cost function associated with energy costs to cool one or both sides of the barrier, minimize a cost function of wear and tear on mechanical components, or weighted combinations of such cost functions. A motor control interface, as described herein, may transmit the barrier control instructions with the minimum aperture values for the future time increments to a motor control unit.
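As a minimal illustrative sketch (the cost names, sample values, and weights below are assumptions, not part of this disclosure), such a weighted combination of cost terms might be computed as:

```python
# Illustrative sketch: combining several training cost terms into a single
# weighted loss. All names and values below are hypothetical assumptions.

def combined_cost(costs, weights):
    """Weighted average of individual cost terms."""
    total_weight = sum(weights.values())
    return sum(weights[k] * costs[k] for k in costs) / total_weight

costs = {
    "collision": 0.8,   # penalty for predicted collisions
    "airflow": 0.3,     # penalty for open-door airflow
    "energy": 0.2,      # penalty for cooling-energy loss
    "wear": 0.1,        # penalty for mechanical wear and tear
}
weights = {"collision": 10.0, "airflow": 2.0, "energy": 1.0, "wear": 0.5}

loss = combined_cost(costs, weights)
```

A high collision weight, as assumed here, reflects that collision avoidance would typically dominate the other cost terms during training.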

[0092] FIG. 5C illustrates a block diagram of another example of a barrier control system utilizing an end-to-end machine learning model with some pre-processed data, according to one embodiment. In the illustrated embodiment, the sensor array 510 provides raw or low-level sensor information 515 to the aperture analysis subsystem 550. Additionally, some of the data is preprocessed 517 to generate a high-level data stream. The high-level data stream is provided to the aperture analysis subsystem 550 together with the raw or low-level sensor information 515. Again, the aperture analysis subsystem 550 may include an end-to-end machine learning model that, using one or more cost functions (e.g., a weighted average of cost functions), determines aperture openings for a barrier based on the provided raw or low-level sensor data 515. Barrier control instructions 540 can be sent to a barrier control unit to control the actuation of the barrier.

[0093] FIG. 5D illustrates a block diagram of another example of a barrier control system utilizing an end-to-end machine learning model with pre-processed data, according to one embodiment. In the illustrated embodiment, the sensor array 510 provides raw or low-level sensor information 515 to various data preprocessing subsystems to generate various high-level data streams 517. The high-level data streams 517 are provided to the aperture analysis subsystem 550. Again, the aperture analysis subsystem 550 may include an end-to-end machine learning model that, using one or more cost functions (e.g., a weighted average of cost functions), determines aperture openings for a barrier based on the preprocessed or high-level data streams 517. Barrier control instructions 540 are sent to a barrier control unit to control the actuation of the barrier.

[0094] According to various embodiments, the sensor information from the sensor array 510 may comprise actual position data. In some embodiments, the sensor array 510 may comprise radar sensors or other sensor types that provide velocity information (e.g., Doppler). The aperture analysis subsystem 550 may explicitly extrapolate position data (and explicitly determine a trajectory), as described in the embodiment in FIG. 5A. In other embodiments, the end-to-end machine-learning model of the aperture analysis subsystem 550 of FIGS. 5B-D may determine the minimum aperture values for future time increments without explicit intermediary calculations.

[0095] According to various embodiments of FIGS. 5A, 5C, and 5D, the processed sensor data from the sensor array 510 may be used to calculate at least one vectorial component of a derivative of position. For example, the system (or subsystems thereof) may calculate a velocity vector of an object (the first derivative of position), an acceleration vector of an object (the second derivative of position), a jerk vector of an object (the third derivative of position), a snap vector of an object (the fourth derivative of position), a crackle vector of an object (the fifth derivative of position), and/or a pop vector of an object (the sixth derivative of position). Vector components of derivatives of position may be used by the aperture analysis subsystem 550 to determine minimum aperture values for future time increments during which an object is predicted to be within a barrier actuation region of a barrier.
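As a minimal sketch of such calculations (the sampling period and position samples are assumed values, not from this disclosure), successive derivatives of position can be estimated by repeated finite differencing:

```python
# Hedged sketch: estimating successive derivatives of position from sampled
# position data by repeated finite differencing. The sampling period `dt`
# and the sample values are illustrative assumptions.

def derivative(samples, dt):
    """First-order finite difference of a list of scalar samples."""
    return [(b - a) / dt for a, b in zip(samples, samples[1:])]

dt = 0.1  # assumed seconds between sensor frames
x = [0.0, 0.5, 1.2, 2.1, 3.2]            # positions toward the barrier (m)
velocity = derivative(x, dt)             # first derivative of position
acceleration = derivative(velocity, dt)  # second derivative of position
jerk = derivative(acceleration, dt)      # third derivative of position
```

The same differencing step could, in principle, be applied again to estimate snap, crackle, and pop, though noise amplification makes higher derivatives progressively less reliable from raw samples.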

[0096] Vectorial components are generally defined orthogonal to each other. Accordingly, the system may determine a vector component of a velocity, acceleration, jerk, snap, crackle, and/or pop of an object in a direction orthogonal to a plane of the barrier actuation region, in a direction defined by the shortest distance from the object to the barrier actuation region, or in a direction defined by an arbitrary coordinate system.

[0097] FIG. 6A illustrates a block diagram of a barrier control system 600 that includes integrated sensors 605, 610, 615, and 625, according to one embodiment. For example, the barrier control system 600 may include dual or stereo cameras 605 for capturing images of objects proximate to a barrier. Block diagram elements 610, 615, and 625 represent any of the wide variety of sensors listed herein. In some embodiments, the barrier control system comprises disparate components that are not integrated into a single unit. For example, the sensor subsystem of a barrier control system may be mounted near a door and communicate with other portions of the barrier control system that are located nearby or in remote locations (e.g., cloud-based servers).

[0098] FIG. 6B illustrates an example of a barrier control system 600 with integrated sensors mounted above a roll-up door 650, according to one embodiment. While the illustrated example only shows one side of the roll-up door 650, in many embodiments a second set of sensors, or even a second barrier control system 600, may be positioned on the other side of the roll-up door 650 to provide actuation of the door for pedestrians and vehicles approaching the roll-up door 650 from either side.

[0099] FIG. 6C illustrates an example of a barrier control system 600 with an integrated sensor mounted to the side of a roll-up door 650, according to one embodiment. According to various embodiments, one or more sensors of the barrier control system 600 may be physically mounted (e.g., positioned) and configured with a field of view sufficient to capture images of the panels of the roll-up door 650 as it is rolled up and down. For example, the cameras 605 may be configured and positioned to capture images of the leading edge 655 of the roll-up door 650 as the aperture of the roll-up door 650 changes (i.e., as it is rolled up and down). While the illustrated embodiment includes a leading edge 655 of a roll-up door 650 that is straight, it is appreciated that the leading edge of other types of barriers may not be straight and may not move in straight lines.

[00100] FIG. 6D illustrates the roll-up door 650 partially rolled up with the leading edge 655 of the roll-up door 650 slightly open. A barrier actuation region is shown that includes a volume extending up from the shaded region 699 on the floor to encompass a region within which the barrier moves when the barrier transitions between open and closed states. The barrier actuation region 699 includes the region within which at least a portion of the barrier moves during actuation.

[00101] As may be appreciated by one of skill in the art, the barrier actuation region may be different for an arm that pivots up and down or for pivoting doors. For instance, the barrier actuation region for a pivoting door that swings outward and/or inward would be relatively large.

[00102] In various embodiments, the barrier control system utilizes captured images of the leading edge(s) of a barrier to develop mathematical models of the barrier actuation dynamics. For example, in embodiments in which the barrier control system includes an integrated motor control unit, the barrier control system may test the barrier actuation dynamics to determine, for example, barrier opening speed, barrier closing speed, barrier acceleration, barrier inertia upwards, barrier inertia downwards, barrier direction changing delays, etc. Furthermore, the barrier control system may utilize captured images of the leading edge(s) of a barrier to maintain an accurate model and calibrated control parameters and/or to provide alerts or notifications if the barrier operation changes (e.g., to detect fault conditions due to damage, collisions, or wear). In some embodiments, a separate sensor is used to monitor the leading edge(s) of a barrier.

[00103] In some embodiments, the barrier control system may model some barriers using linear models. However, the barrier control system may model other barriers using nonlinear mathematical models to account for nonlinearities in the aperture actuation. For example, the barrier control system may model rapid roller doors using nonlinear models to account for the nonlinear change in coil radius as the door is rolled up. According to various embodiments, the barrier control system may utilize a separate barrier actuation artificial neural network to model barriers with nonlinear aperture actuations. In some embodiments, barrier actuation dynamics may be known and provided as inputs to the barrier control system as built-in grey-box models to speed up calibration processes.
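As an illustrative sketch of the nonlinear coil effect noted above (the drum radius, door thickness, and spiral approximation are assumptions, not part of this disclosure), the length of door wound onto the drum per rotation grows as the coil thickens:

```python
# Hedged sketch: a simple nonlinear roll-up door model in which the length
# of door rolled onto the drum per radian grows as the coil radius increases
# with each wrap. All dimensions are illustrative assumptions.
import math

def rolled_length(theta, r0, thickness):
    """Length of door wound onto the drum after rotating by `theta` radians,
    approximating the coil as a spiral whose radius grows by `thickness`
    per full turn: arc length ~= integral of r0 + thickness*t/(2*pi) dt."""
    return r0 * theta + thickness * theta ** 2 / (4 * math.pi)

# The same drum rotation lifts the leading edge farther when the coil is thick:
first_turn = rolled_length(2 * math.pi, r0=0.10, thickness=0.005)
tenth_turn = (rolled_length(20 * math.pi, 0.10, 0.005)
              - rolled_length(18 * math.pi, 0.10, 0.005))
```

A linear door model would treat these two quantities as equal; the growing difference between them is the nonlinearity a grey-box or neural model would need to capture.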

[00104] In some embodiments, the machine learning model may be pretrained with data from one or more other barriers in one or more different environments. The pretrained machine learning model may be fine-tuned or augmented with training data during actual use after installation on a specific barrier.

[00105] FIG. 7 illustrates an example field of view 750 of sensors integrated into a barrier control system 700, according to one embodiment. The field of view may be defined relative to a plane 775 in which the barrier aperture is opened and closed. For example, the sensors of the barrier control system 700 may be positioned above a barrier that exists on the plane 775. The field of view 750 may be adapted for a particular application, the size of the room(s) on either side of the barrier, the expected velocities of the objects traversing the barrier, and/or other factors. In example implementations, the field of view 750 may include a lateral dimension between 45 degrees (e.g., for a barrier at the end of a narrow hallway or for a barrier that is only entered from straight on) and 180 degrees (e.g., along the entire plane 775 of the barrier actuation). In one example embodiment, the lateral field of view is approximately 120 degrees and the vertical field of view is approximately 70 degrees.

[00106] FIG. 8 illustrates a barrier control system 800 positioned above a roll-up door 825 analyzing trajectory information of a forklift 875 and a pedestrian 850. Based on predicted trajectories of the forklift 875 and/or the pedestrian 850, the barrier control system 800 has caused the roll-up door 825 to open to the illustrated height.

[00107] FIG. 9A illustrates a modeled trajectory 901 with no acceleration, according to one embodiment. Specifically, an object at position 10 along the vertical axis is shown with minimal deviation with respect to time along the horizontal axis.

[00108] FIG. 9B illustrates modeled trajectory 902 possibilities of an object with relatively small acceleration possibilities, according to one embodiment. As illustrated, the possible positions of the object vary slightly from position 10 over time.

[00109] FIG. 9C illustrates modeled trajectory 903 possibilities of an object with relatively large acceleration, according to one embodiment. As illustrated, the possible positions of the object vary widely depending on the specific trajectory taken by the object.

[00110] FIG. 9D illustrates modeled trajectory 904 possibilities of an object with relatively large acceleration with a known obstacle, according to one embodiment. Again, the possible positions of the object vary widely depending on the specific trajectory taken by the object. However, the obstacle makes some trajectories unlikely or impossible. Collectively, FIGS. 9A-9D illustrate the additional complexity in predicting future locations of objects, especially as the possible acceleration is assumed to be high.

[00111] FIG. 10 illustrates an example diagram of predicted trajectories 1000 of an object relative to a barrier 1050, according to one embodiment. As illustrated, at each “current position” indicated by a circle, the barrier control system predicts a trajectory. While many predicted trajectories 1000 do not correspond to the actual future location of the vehicle, some of the predicted trajectories 1000 indicate the vehicle will traverse the barrier 1050. In such instances, the barrier control system causes the barrier to open to allow the vehicle to traverse without collision. In some embodiments, the barrier control system may receive raw data and determine (or receive high-level abstractions that indicate) positions, velocity, accelerations, and/or other object characteristics (as described herein) at each location of the vehicle.

[00112] The machine learning model of the barrier control system may explicitly output the full future trajectory of the detected object(s) in the form of a time series, in which case a cost function can compare the predicted and recorded future trajectories and penalize deviations. The machine learning model may additionally or alternatively identify future time increments during which an object or objects are predicted to occupy a barrier actuation region. Ultimately, the barrier control system operates a barrier or causes a barrier to operate to allow an object to traverse the barrier regardless of how fast the object approaches or accelerates, and without relying on human attentiveness. The barrier control system may also operate to minimize the time the barrier is open and/or minimize the degree to which the barrier is open. According to various embodiments, the barrier control system determines the degree to which the barrier must be open (e.g., determines the height to which the barrier must be open) prior to the barrier beginning to open. The barrier control system may generate time-series data that indicates the minimum opening degree (e.g., minimum aperture value, minimum height value, or minimum barrier opening state) required at each time interval in the time series.

[00113] A possible cost function is the mean of the squared Euclidean distances between the predicted position and the reference position at every given time step, for one and the same object, for all present objects, and for all recorded time steps. A weighting function of various shapes can be applied to the squared distances that, for example, penalizes deviations more heavily the closer they occur to the door. An example can be seen in Equation 1 below:

[00114] Equation 1

$$C^{\text{position}} = \frac{1}{N}\sum_{i=1}^{N}\frac{1}{n_i}\sum_{t=1}^{n_i} w_{i,t}^{\text{position}}\left(d_{i,t}\right)^2$$

[00115] In Equation 1, $d_i$ denotes a time-series vector belonging to object $i$ whose elements are the Euclidean distances between the object's predicted and actual positions. $w_i^{\text{position}}$ is a vector containing weights that may, for instance, depend on factors such as the object type and velocity (indicating the possible severity of a collision). The $w_i^{\text{position}}$ may inversely scale with the duration between the time at which the prediction is made and the time at which the object enters the aperture of the barrier (i.e., traverses the barrier). If the duration is longer than the prediction horizon of the controller or the duration of time that the door would require to open to said object's height (e.g., if the controller does not employ a prediction horizon), incorrect actuations pose no risk and therefore should not be penalized. As an object gets closer to the door, the importance of making accurate predictions increases since there is less time for the barrier to open. Accordingly, in some embodiments, the proximity of the object to the door at the time the prediction is made is used. The lengths of $d_i$ and $w_i^{\text{position}}$ vary depending on the object.
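As an illustrative sketch of such a weighted position cost (the sample positions and weight values are assumptions, not part of this disclosure), the weighted mean of squared Euclidean distances for one object may be computed as:

```python
# Hedged sketch of the position cost: the weighted mean of squared Euclidean
# distances between predicted and actual positions at each time step, for
# one object. The data and weights below are illustrative assumptions.

def position_cost(predicted, actual, weights):
    """predicted/actual: lists of (x, y) per time step; weights: per step."""
    sq = [w * ((px - ax) ** 2 + (py - ay) ** 2)
          for (px, py), (ax, ay), w in zip(predicted, actual, weights)]
    return sum(sq) / len(sq)

pred = [(0.0, 0.0), (1.0, 1.0), (2.0, 2.5)]   # predicted positions
ref = [(0.0, 0.1), (1.1, 1.0), (2.0, 2.0)]    # recorded reference positions
w = [0.5, 1.0, 2.0]   # heavier weight nearer the door (assumed)
cost = position_cost(pred, ref, w)
```

The increasing weights model the text's point that deviations close to the door, where there is little time left to react, should be penalized more heavily.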

[00116] Some objects, such as forklifts, can alter their height as they drive. Pedestrians and vehicles approaching an aperture at an angle or around a bend of a curved path may vary their effective widths as they move. Some forklifts can also change their effective width while they are traveling a straight path. The barrier control system may predict the width or height of the object that is expected at the time the object traverses the barrier. The barrier control system may cause the barrier to open in accordance with the predicted width, height, or another size characteristic.

[00117] The barrier control system may include an algorithm or trained neural network to predict a height (or other dimension or size characteristic relevant to the aperture of the barrier) of an object for the time increments during which the object is predicted to traverse the barrier. Size predictions can still be useful when objects do not change size as they approach the barrier. In such instances, the algorithm effectively filters out noise from successive height measurements. In some embodiments, the largest measured height during any time the object was visible is used as the basis for determining the minimum height opening state. Accordingly, the barrier control system may utilize a cost function of an approaching object's height as set forth in Equation 2 below:

[00118] Equation 2

$$C^{\text{height}} = \frac{1}{N}\sum_{i=1}^{N}\frac{1}{n_i}\sum_{t=1}^{n_i} w_{i,t}^{\text{height}}\left(\hat{h}_{i,t} - h_{i,t}\right)^2$$

[00119] In Equation 2, $\hat{h}_i$ denotes the predicted heights of object $i$ as a time-series vector, $h_i$ represents the corresponding measured reference heights, and $w_i^{\text{height}}$ represents the weight vector.

[00120] The prediction algorithm of the barrier control system may be trained using the combined cost functions of position and height, $C^{\text{position}}$ and $C^{\text{height}}$, such that the training cost function $C$ is defined in Equation 3 below:

[00121] Equation 3

$$C = C^{\text{position}} + C^{\text{height}}$$
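One of the height policies described above, using the largest measured height as the basis for the minimum height opening state, can be sketched as follows (the safety margin and sample readings are assumptions, not part of this disclosure):

```python
# Hedged sketch: deriving a minimum opening height from noisy successive
# height measurements by taking the largest height ever measured while the
# object was visible. The margin and sample values are assumptions.

def min_opening_height(measured_heights, margin=0.05):
    """Largest measured height plus a small assumed safety margin (meters)."""
    return max(measured_heights) + margin

samples = [2.01, 1.98, 2.04, 2.00, 1.99]  # noisy forklift-height readings (m)
opening = min_opening_height(samples)
```

Taking the maximum is a conservative noise filter: measurement jitter can only raise the commanded opening, never lower it below the tallest observation.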

[00122] A prediction of the future 2-D positions, 5), of all objects by the sensor subsystem can he converted to a prediction of the minimally required aperture opening degree using matrices. For example, S i may be a 2 x n matrix corresponding to the object i with n denoting the number of time steps for which the prediction is made, and the top and bottom of each column contain the coordinates respectively for each time step.

[00123] While the principles can generally be extended to a wide variety of aperture types and a wide variety of barrier types, the following example uses a vertically opening folding door. The minimally required opening degree is equal to the height of the tallest object about to traverse or currently traversing the plane of a door panel. A height vector as a time series, $h^{\text{predicted}}$, can be calculated from the set of $S_i$.

[00124] As another example, the barrier control system may detect three objects proximate the door and predict future positions $S_1$, $S_2$, and $S_3$ for $n$ time increments. The barrier control system may calculate $h^{\text{predicted}}$ starting from a $1 \times n$ vector filled with zeros. Each entry in the vector that corresponds to a time increment at which at least one of the objects is about to enter or is currently within a set minimum distance of the plane of door panel movement may be replaced by the tallest predicted height of all objects fulfilling this condition. In other embodiments, a non-circular shape is used instead of a minimum distance, and any object within the shape is considered within the plane of door movement. The shape can be defined larger for fast-moving objects. The proposed systems and methods allow the barrier control system to target individual opening degrees for each panel independently of each other. Not all panels must open to the same height.
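As a minimal sketch of this procedure (the door-plane location at x = 0, the minimum distance, and the sample trajectories are assumptions, not part of this disclosure), the predicted height vector can be computed as:

```python
# Hedged sketch: build a length-n vector of zeros and, for each time
# increment at which an object is predicted to be within `min_dist` of the
# door plane (assumed at x = 0), record the tallest such object's height.
# All trajectories and heights below are illustrative assumptions.

def h_predicted(object_paths, heights, n, min_dist=1.0):
    """object_paths: per object, a list of n (x, y) positions; heights: per object."""
    h = [0.0] * n
    for path, height in zip(object_paths, heights):
        for t, (x, y) in enumerate(path):
            if abs(x) <= min_dist:        # near the plane of door panel movement
                h[t] = max(h[t], height)  # keep the tallest qualifying object
    return h

paths = [
    [(3.0, 0.0), (1.5, 0.0), (0.5, 0.0), (0.0, 0.0)],  # approaching forklift
    [(4.0, 1.0), (2.5, 1.0), (2.0, 1.0), (0.8, 1.0)],  # approaching pedestrian
]
heights = [2.1, 1.8]
series = h_predicted(paths, heights, n=4)
```

In the final time increment, both objects are near the door plane, so the forklift's height governs the required opening, matching the "tallest object fulfilling the condition" rule above.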

[00125] FIG. 11A illustrates a graphical representation 1100 of a predicted minimum required opening height and an actual minimum required opening height with respect to time, according to one embodiment. The barrier control system may utilize a cost function that compares the shapes of the predicted minimum required opening degree and the actual minimum required opening degree in terms of the vector space defined with time on one axis (the horizontal axis as illustrated) and the "minimal required opening height" on the other axis (the vertical axis as illustrated).

[00126] FIG. 11B illustrates a graphical representation 1150 of measurements between the predicted minimum opening heights and the actual minimum opening heights used to train a machine learning process, according to one embodiment. Specifically, for each point of the predicted curve, the system may calculate the smallest Euclidean distance to any point of the actual curve. The cost function used for training the machine learning model may be the sum of the distances for all points. In some embodiments, the axes are scaled to ensure the cost function does not cause the training to produce an always-closed prediction.

[00127] FIG. 12A illustrates a graph 1200 of the opening aperture of a door (e.g., door height) relative to time for two different control systems, according to one embodiment. The "predictive door" graph shown as a solid line represents the aperture of the door with respect to time based on control signals from a barrier control system according to the various embodiments described herein. The short-dashed line represents the actual height of objects passing through the aperture of the door. The long-dashed line shows the aperture actuations under the control of a traditional automatic door controller.
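The curve-comparison cost described for FIG. 11B can be sketched as follows (the sample curves and the axis scaling factors are illustrative assumptions, not part of this disclosure):

```python
# Hedged sketch of the FIG. 11B cost: for each point on the predicted
# opening-height curve, find the smallest Euclidean distance to any point
# on the actual curve, then sum over all predicted points. The scaling
# factors model the axis scaling mentioned in the text.
import math

def curve_cost(predicted, actual, t_scale=1.0, h_scale=1.0):
    """predicted/actual: lists of (time, height) points on each curve."""
    total = 0.0
    for (tp, hp) in predicted:
        total += min(math.hypot(t_scale * (tp - ta), h_scale * (hp - ha))
                     for (ta, ha) in actual)
    return total

pred = [(0.0, 0.0), (1.0, 2.0), (2.0, 2.0)]  # predicted opening heights
act = [(0.0, 0.0), (1.0, 2.0), (2.0, 2.5)]   # actual required heights
cost = curve_cost(pred, act)
```

Scaling the height axis up relative to the time axis (via `h_scale`) is one way to keep a trivially always-closed prediction from scoring well, as the text cautions.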

[00128] FIG. 12B illustrates a graph 1250 of the difference in the aperture opening of the doors controlled by the two different control systems of FIG. 12A, according to one embodiment. Again, the solid line represents the aperture of the door with respect to time based on control signals from a barrier control system according to the various embodiments described herein. The long-dashed line shows the aperture actuations under the control of a traditional automatic door controller. The short-dashed line represents the difference in aperture opening provided by the two different control systems.

[00129] This disclosure has been made with reference to various exemplary embodiments, including the best mode. However, those skilled in the art will recognize that changes and modifications may be made to the exemplary embodiments without departing from the scope of the present disclosure. While the principles of this disclosure have been shown in various embodiments, many modifications of structure, arrangements, proportions, elements, materials, and components may be adapted for a specific environment and/or operating requirements without departing from the principles and scope of this disclosure. These and other changes or modifications are intended to be included within the scope of the present disclosure.

[00130] This disclosure is to be regarded in an illustrative rather than a restrictive sense, and all such modifications are intended to be included within the scope thereof. Likewise, benefits, other advantages, and solutions to problems have been described above with regard to various embodiments. However, benefits, advantages, solutions to problems, and any element(s) that may cause any benefit, advantage, or solution to occur or become more pronounced are not to be construed as a critical, required, or essential feature or element. This disclosure should, therefore, be determined to encompass at least the following claims.