

Title:
BAGGAGE AND PARCEL LOADING END EFFECTORS, SYSTEM, AND METHODS OF USE
Document Type and Number:
WIPO Patent Application WO/2021/183958
Kind Code:
A1
Abstract:
A system (10) for sequential loading of objects (14) into a container (18), the system comprising: a first robot (22) including an engaging end effector (40), the engaging end effector (40) operable to apply a vacuum force to engage an object (14); a second robot (24) including a conveyor end effector (56), the conveyor end effector (56) operable to receive the object (14) from the engaging end effector (40) and convey the object (14) into the container (18); and a control system (30) including a processor (300) to control the first robot (22) and the second robot (24) to coordinate transfer of the object (14) into the container (18).

Inventors:
SWANSON BRIAN (US)
KILIBARDA VELIBOR (US)
KINSELLA MARTIN (US)
FINZEL BRYAN (US)
TAPPO FREDDIE (US)
FRAZER WILLIAM (US)
HAMEL KENNETH (US)
Application Number:
PCT/US2021/022219
Publication Date:
September 16, 2021
Filing Date:
March 12, 2021
Assignee:
LEIDOS SECURITY & AUTOMATION LTD (GB)
SWANSON BRIAN (US)
International Classes:
B25J9/16; B25J15/00; B25J15/06; B64F1/36; B65G61/00
Domestic Patent References:
WO2020040103A12020-02-27
WO2018075884A12018-04-26
WO2012092939A12012-07-12
Foreign References:
JPH06171762A1994-06-21
US4242025A1980-12-30
DE10012090A12001-09-27
DE102007062534A12009-06-25
DE102007052012A12009-05-07
EP1145805A22001-10-17
Attorney, Agent or Firm:
BURNS, David R. et al. (US)
Claims:
What is claimed is:

1. A system for sequential loading of objects into a container, the system comprising: a first robot including an engaging end effector, the engaging end effector operable to apply a vacuum force to engage an object; a second robot including a conveyor end effector, the conveyor end effector operable to receive the object from the engaging end effector and convey the object into the container; and a control system including a processor to control the first robot and the second robot to coordinate transfer of the object into the container.

2. The system of claim 1, wherein the engaging end effector comprises: a mounting plate; an actuator connected to the mounting plate defining an actuator axis of rotation; and a coupler connected to the actuator.

3. The system of claim 2, wherein the engaging end effector comprises: a base connected to the coupler operable to selectively engage and disengage the object, the actuator operable to selectively rotate the base relative to the mounting plate about the actuator axis of rotation; and a vacuum pad connected to the base, the vacuum pad operable to selectively generate a vacuum force perpendicular to the base to selectively engage and disengage the object.

4. The system of claim 3, wherein the vacuum pad comprises: at least two vacuum zones, the at least two vacuum zones being selectively and independently controlled by the control system.

5. The system of claim 3, wherein the engaging end effector comprises: a plurality of pins through which the vacuum force is created, at least subsets of the plurality of pins being selectively and independently controlled by the control system.

6. The system of claim 3, wherein the first robot comprises: a wrist; and an extension arm having a first end connected to the wrist and a second end connected to the mounting plate, the first end defining an extension arm axis of rotation operable for selected rotation of the extension arm and the base relative to the wrist, the extension arm axis of rotation being independent of the actuator axis of rotation.

7. The system of claim 1, wherein the conveyor end effector comprises: a mounting plate; an actuator connected to the mounting plate defining an actuator axis of rotation; and a coupler connected to the actuator.

8. The system of claim 7, wherein the conveyor end effector comprises: a base connected to the coupler, the actuator operable to selectively rotate the base relative to the mounting plate about the actuator axis of rotation; a pair of opposing arms connected to the base extending outward from a first end to a second end in a lateral direction, the arms longitudinally separated relative to the base along a longitudinal direction; a first roller rotatably connected to the pair of arms adjacent the arms first end; a second roller rotatably connected to the pair of arms adjacent the arms second end, one of the first or the second roller comprising a powered roller; and a belt engaged to the first and the second rollers, the belt operable to selectively transfer the object positioned on the belt to selectively position and deposit the object into the container.

9. The system of claim 8, wherein the second robot comprises: a wrist; and an extension arm having a first end connected to the wrist and a second end connected to the mounting plate, the first end rotatable about an extension arm axis of rotation to rotate the extension arm and conveyor end effector relative to the wrist, the extension arm axis of rotation being independent of the actuator axis of rotation.

10. The system of claim 8, wherein the conveyor end effector comprises: a pusher connected to the belt, the pusher operable to engage the object positioned on the belt to transfer the object relative to the base.

11. The system of claim 1, wherein the control system comprises: a memory storing an image of the object, object data for the object, and an identifier associated with the object.

12. The system of claim 11, wherein the processor of the control system is programmed to extract data from the image corresponding to features of the object and store the extracted data.

13. The system of claim 1, wherein the processor of the control system is programmed to: execute a machine learning model to output a prediction as to whether the engaging end effector is capable of engaging and transferring the object.

14. The system of claim 13, wherein the processor of the control system is programmed to: operate in a training and test mode in which the processor controls the engaging end effector to attempt to engage and transfer the object independent of the prediction; determine an outcome of whether the engaging end effector was successful in engaging and transferring the object; and train the machine learning model based on the outcome.

15. The system of claim 13, wherein the processor of the control system is programmed to: operate in a production mode in which the processor controls the engaging end effector to attempt to engage and transfer the object in response to the prediction indicating that the engaging end effector can successfully engage and transfer the object.

16. The system of claim 1, wherein the processor of the control system is programmed to: determine an available interior volume of the container based on one or more images of the container; determine whether the object is capable of fitting in the available interior volume; and determine a location at which the conveyor end effector is to deposit the object in the container in response to determining that the object is capable of fitting in the available interior volume.

17. The system of claim 16, wherein the processor of the control system is programmed to determine whether the object is capable of fitting in the available interior volume and a location at which the object is to be placed using a heuristic function.

18. The system of claim 16, wherein the processor of the control system is programmed to: generate a point cloud of the interior volume of the container using the one or more images; and determine the available interior volume based on the point cloud.

19. A method for sequential loading of objects into a container, the method comprising: controlling, via a control system, an operation of a first robot including an engaging end effector to apply a vacuum force to engage an object; and controlling, via the control system, a second robot including a conveyor end effector to receive the object from the engaging end effector and convey the object into the container.

20. The method of claim 19, further comprising: extracting data from an image of the object, the data corresponding to features of the object; and executing a machine learning model to output a prediction as to whether the engaging end effector is capable of engaging and transferring the object based at least in part on the data extracted from the image.

21. The method of claim 20, further comprising: controlling the engaging end effector to attempt to engage and transfer the object independent of the prediction; determining an outcome of whether the engaging end effector was successful in engaging and transferring the object; and training the machine learning model based on the outcome.

22. The method of claim 20, further comprising: controlling the engaging end effector to attempt to engage and transfer the object in response to the prediction indicating that the engaging end effector can successfully engage and transfer the object.

23. The method of claim 19, further comprising: determining an available interior volume of the container based on one or more images of the container; determining whether the object is capable of fitting in the available interior volume; and determining a location at which the conveyor end effector is to deposit the object in the container in response to determining that the object is capable of fitting in the available interior volume.

24. A conveyor end effector for use in an automated loading device for sequentially loading objects into a container, comprising: a mounting plate connected to an automated loading device; an actuator connected to the mounting plate defining an actuator axis of rotation; an object conveyor device comprising: a coupler connected to the actuator; a base connected to the coupler, the actuator operable to selectively rotate the base relative to the mounting plate about the actuator axis of rotation; a pair of opposing arms connected to the base extending outward from a first end to a second end in a lateral direction, the arms longitudinally separated relative to the base along a longitudinal direction; a first roller rotatably connected to the pair of arms adjacent the arms first end; a second roller rotatably connected to the pair of arms adjacent the arms second end, one of the first or the second roller comprising a powered roller; and a belt engaged to the first and the second rollers, the powered roller operable to selectively rotate the belt relative to the base, the belt operable to selectively transfer an object positioned on the belt relative to the base to selectively position and deposit the object into a container.

25. An engaging end effector for use with an automated loading device for selectively engaging objects, comprising: a mounting plate connected to an automated loading device; an actuator connected to the mounting plate defining an actuator axis of rotation; an object engaging device comprising: a coupler connected to the actuator; a base connected to the coupler operable to selectively engage and disengage the object, the actuator operable to selectively rotate the base relative to the mounting plate about the actuator axis of rotation; and a vacuum pad connected to the base, the vacuum pad operable to selectively generate a vacuum force perpendicular to the base to selectively engage and disengage the object.

Description:
BAGGAGE AND PARCEL LOADING END EFFECTORS, SYSTEM,

AND METHODS OF USE

RELATED APPLICATION

[0001] The present application claims priority to and the benefit of U.S. Provisional Application No. 62/988,633, filed on March 12, 2020, the disclosure of which is incorporated by reference herein in its entirety.

TECHNICAL FIELD

[0002] This disclosure relates to a system and end effectors for use with automated loading devices. In one application, the system and end effectors are useful for loading travel passenger checked baggage into containers for loading onto passenger aircraft.

BACKGROUND

[0003] In today’s global and fast-moving economies, passenger mass transit, and in particular air travel, continues to rapidly increase. With the increase in passenger airline travel, there is increased pressure on airlines and airports to move the passengers and their checked luggage through the airports as quickly and efficiently as possible.

[0004] Following check-in and security screening in a main terminal, each checked bag is routed, and further sorted and gathered, typically by flight number. The sorted bags are then loaded into movable containers, for example containers commonly called unit load devices (ULDs), for transfer onto the airplane. Where it is not possible or efficient to use ULDs, other containers such as baskets and/or trays are used to transfer the flight-sorted bags for loading into the designated airplane.

[0005] Even in the most sophisticated and automated baggage handling systems, at several places between baggage check-in and loading/unloading of the airplane, bags must be manually handled by operators for various reasons and purposes. Due in large part to the baggage size, weight and variations thereof, the level of human physical effort and complex ergonomic movements to complete these physical bag loading/unloading tasks are high.

[0006] One area typically requiring manual bag handling (or human intervention) is the loading of the flight-sorted bags into the containers (for example ULDs described above). This is due to many reasons, including the almost unlimited differences in the sizes, shapes, rigidity, volumes, and weights of passenger bags. For example, the high variation in the physical characteristics of passenger bags has made it very difficult to automate, for example using programmable robots, the physical transfer of high volumes of the flight-sorted bags into a container. Further difficulties in automating loading of the containers exist in that the containers have a definite size and interior volume space, the available volume space for the next bag decreasing, and changing in three-dimensional shape, as each bag is deposited into the container.

[0007] There is a need for devices and methods that would solve or improve on the difficulties and disadvantages in the area of loading objects into movable containers, for example checked airline passenger bags into ULDs, for further processing of the objects and/or bags.

SUMMARY

[0008] Disclosed herein is an engaging end effector, a conveyor end effector, and a system for engaging and loading objects into a container using the engaging and conveyor end effectors.

[0009] In one example, an engaging end effector is used in an automated loading device for selectively engaging objects. The exemplary engaging end effector includes a mounting plate connected to an automated loading device, for example a multi-axis programmable robot. The engaging end effector further includes an actuator connected to a coupler and a base, the actuator operable to selectively rotate the base relative to the mounting plate about an axis of rotation. The base is operable to selectively engage an object, for example a travel passenger bag, positioned within a path of travel of the base.

[0010] In one example, the engaging end effector base includes a vacuum pad operable to selectively generate a vacuum force to selectively engage and disengage an object positioned along the base path of travel adjacent to the vacuum pad. In another example, the engaging end effector includes an extension arm positioned between the mounting plate and the automated loading device to extend the reach or path of travel of the base. The exemplary extension arm includes an axis of rotation relative to the automated loading device which is independent of the actuator axis of rotation further increasing the path of travel of the base.

[0011] In one example, a conveyor end effector is used in an automated loading device for selectively loading objects into a container, for example an airport checked passenger bag. The exemplary conveyor end effector includes a mounting plate connected to an automated loading device, for example a multi-axis programmable robot. The conveying end effector further includes an actuator connected to a conveyor operable to selectively rotate the conveyor relative to the mounting plate.

[0012] In one example of the conveyor end effector, the conveyor end effector includes a base connected to the actuator and having a pair of opposing arms extending outward from the base. A first roller and a second roller are rotatably connected to the pair of arms. One of the first and second roller comprises a power roller for selectively rotating a belt operable to transfer an object positioned on the belt to selectively position and deposit the object into the container within the conveyor path of travel.

[0013] In another example, the conveying end effector includes an extension arm positioned between the mounting plate and the automated loading device to extend the reach or path of travel of the base. The exemplary extension arm includes an axis of rotation relative to the automated loading device which is independent of the actuator axis of rotation further increasing the path of travel of the base.

[0014] In one example of a system for engaging and selectively positioning and depositing objects in a container, the system uses a first automated loading device including an engaging end effector having a path of travel and a second automated loading device including a conveyor end effector having a path of travel. In one example, the object is a travel passenger bag.

[0015] The exemplary system engaging end effector includes a mounting plate connected to an automated loading device, for example a multi-axis programmable robot. The engaging end effector further includes a base operable to selectively engage the object, for example a travel passenger bag, positioned within the engaging end effector path of travel. The system conveyor end effector further includes a mounting plate connected to a conveyor having a base and a powered belt operable to position and deposit an object on the conveyor into a container within the conveyor end effector path of travel. In one example, the engaging end effector path of travel is in overlapping communication with the conveyor end effector path of travel.

[0016] In one example of operation of the system, the engaging end effector autonomously and selectively engages an object and, in coordination with the conveyor end effector, disengages and deposits the object onto the conveyor end effector conveyor. The conveyor end effector autonomously positions the conveyor and transfers the object relative to the base to selectively position and deposit the object into an available space in the container.
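The coordinated handoff described in this example can be sketched as a simple step sequence run by the control system; the step names, robot labels, and abort behavior below are illustrative assumptions, not taken from the disclosure.

```python
# Coordinated engage-transfer-deposit sequence (illustrative sketch only).
HANDOFF_STEPS = [
    ("engage", "first robot"),    # vacuum end effector grips the bag
    ("transfer", "first robot"),  # bag moved over the conveyor end effector
    ("release", "first robot"),   # bag deposited onto the belt
    ("position", "second robot"), # conveyor aimed at the free container space
    ("convey", "second robot"),   # belt runs, bag slides into the container
]

def run_handoff(execute):
    """Run the coordinated sequence; `execute(step, robot)` returns True on success."""
    for step, robot in HANDOFF_STEPS:
        if not execute(step, robot):
            return f"aborted at {step}"
    return "loaded"

# A trivial executor in which every step succeeds
outcome = run_handoff(lambda step, robot: True)
```

A real implementation would dispatch each step to the corresponding robot controller and wait on sensor feedback before advancing; the linear list is only a stand-in for that coordination.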

[0017] In one example of the system engaging end effector and the conveyor end effector, each of the engaging and conveyor end effectors includes an actuator connected to the respective mounting plate and the base. Each actuator is operable to selectively rotate the respective base about an axis of rotation relative to the mounting plate to increase the path of travel of each of the engaging and conveyor end effectors. In another example of the system, each of the engaging end effector and the conveyor end effector includes an extension arm connected to the respective mounting plate and respective first or second automated device. Each extension arm includes an axis of rotation relative to the respective first or second automated device to increase the path of travel of the respective engaging end effector and the conveyor end effector.

[0018] In one example, systems and methods are disclosed for sequential loading of objects into a container. The systems and methods can include a first robot or automated loading device with an engaging end effector and a second robot or automated loading device with a conveyor end effector. The engaging end effector is operable to apply a vacuum force to engage an object, and the conveyor end effector is operable to receive the object from the engaging end effector and convey the object into the container. A control system including a processor controls the first robot and the second robot to coordinate transfer of the object into the container. A memory stores an image of the object, object data for the object, and an identifier associated with the object.

[0019] In one example, the processor of the control system is programmed to extract data from the image corresponding to features of the object and store the extracted data.
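As a rough illustration of extracting feature data from an object image, the following sketch computes a few geometric features from a binary object mask; the feature names, the mask representation, and the helper itself are assumptions for illustration, not the disclosed method.

```python
import numpy as np

def extract_object_features(mask: np.ndarray) -> dict:
    """Extract simple geometric features from a binary object mask.

    `mask` is a 2-D array where nonzero pixels belong to the object.
    Feature names here are illustrative, not from the specification.
    """
    ys, xs = np.nonzero(mask)
    if ys.size == 0:
        return {"area_px": 0, "width_px": 0, "height_px": 0, "aspect": 0.0}
    width = int(xs.max() - xs.min() + 1)
    height = int(ys.max() - ys.min() + 1)
    return {
        "area_px": int(ys.size),   # pixel count of the object
        "width_px": width,         # bounding-box width
        "height_px": height,       # bounding-box height
        "aspect": round(width / height, 3),
    }

# Example: a 3x5 rectangular "bag" in an 8x8 frame
frame = np.zeros((8, 8), dtype=np.uint8)
frame[2:5, 1:6] = 1
features = extract_object_features(frame)
```

The stored feature dictionary would then accompany the object image and identifier in the control system memory.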

[0020] In one example, the processor of the control system is programmed to execute a machine learning model to output a prediction as to whether the engaging end effector is capable of engaging and transferring the object.

[0021] In one example, the processor of the control system is programmed to operate in a training and test mode in which the processor controls the engaging end effector to attempt to engage and transfer the object independent of the prediction, determine an outcome of whether the engaging end effector was successful in engaging and transferring the object, and train the machine learning model based on the outcome.
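The training and test mode described above can be sketched as an online-learning loop in which every object is attempted regardless of the prediction and the measured outcome updates the model; the minimal logistic model, learning rate, and synthetic outcomes below are illustrative assumptions, not the disclosed model.

```python
import math
import random

class GraspPredictor:
    """Minimal logistic model predicting engage-and-transfer success.

    A sketch only: the feature set and update rule are assumptions,
    not taken from the disclosure.
    """
    def __init__(self, n_features: int, lr: float = 0.1):
        self.w = [0.0] * n_features
        self.b = 0.0
        self.lr = lr

    def predict_proba(self, x):
        z = self.b + sum(wi * xi for wi, xi in zip(self.w, x))
        return 1.0 / (1.0 + math.exp(-z))

    def update(self, x, outcome: int):
        """One SGD step from an observed attempt outcome (1 = success)."""
        err = self.predict_proba(x) - outcome
        self.w = [wi - self.lr * err * xi for wi, xi in zip(self.w, x)]
        self.b -= self.lr * err

# Training/test mode: attempt every object independent of the prediction,
# then learn from the measured outcome (synthetic data for illustration).
random.seed(0)
model = GraspPredictor(n_features=2)
for _ in range(500):
    x = [random.random(), random.random()]
    outcome = 1 if x[0] > 0.5 else 0  # synthetic ground truth
    model.update(x, outcome)

p_good = model.predict_proba([0.9, 0.5])
p_bad = model.predict_proba([0.1, 0.5])
```

After training, the model assigns a clearly higher success probability to the feature vector the synthetic rule labels graspable.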

[0022] In one example, the processor of the control system is programmed to operate in a production mode in which the processor controls the engaging end effector to attempt to engage and transfer the object in response to the prediction indicating that the engaging end effector can successfully engage and transfer the object.
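The production-mode gating can be sketched as follows; the 0.5 threshold, the function names, and the rerouting fallback are illustrative assumptions rather than anything stated in the disclosure.

```python
def production_step(predict, attempt, features, threshold=0.5):
    """Attempt engagement only when the model predicts success.

    `predict` returns a success probability; `attempt` commands the
    robot. Both are placeholders; the 0.5 threshold is an assumption.
    """
    p = predict(features)
    if p >= threshold:
        return attempt(features)  # robot tries the grasp
    return "rerouted"             # e.g. diverted to manual handling

# Stub predictor/actuator pairs for illustration
result_hi = production_step(lambda f: 0.9, lambda f: "engaged", [1.0])
result_lo = production_step(lambda f: 0.2, lambda f: "engaged", [1.0])
```

In this sketch, objects the model scores below the threshold are never attempted, matching the production-mode behavior described above.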

[0023] In one example, the processor of the control system is programmed to determine an available interior volume of the container based on one or more images of the container; determine whether the object is capable of fitting in the available interior volume; and determine a location at which the conveyor end effector is to deposit the object in the container in response to determining that the object is capable of fitting in the available interior volume. The processor of the control system can be programmed to determine whether the object is capable of fitting in the available interior volume, and a location at which the object is to be placed, using a heuristic function, and/or to generate a point cloud of the interior volume of the container using the one or more images and determine the available interior volume based on the point cloud.
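One way to sketch the point-cloud-based volume estimate and heuristic fit check is a coarse voxel grid over the container interior; the voxel size, margin factor, and function names below are assumptions for illustration, not the disclosed algorithm.

```python
import numpy as np

def available_volume_voxels(points, container_dims, voxel=0.1):
    """Estimate free interior volume from a point cloud via a voxel grid.

    `points` are (x, y, z) samples of occupied surfaces inside the
    container; dimensions in metres. A coarse sketch only.
    """
    nx, ny, nz = (int(round(d / voxel)) for d in container_dims)
    occupied = np.zeros((nx, ny, nz), dtype=bool)
    idx = np.floor(np.asarray(points) / voxel).astype(int)
    idx = np.clip(idx, 0, [nx - 1, ny - 1, nz - 1])
    occupied[idx[:, 0], idx[:, 1], idx[:, 2]] = True
    free_voxels = occupied.size - int(occupied.sum())
    return free_voxels * voxel ** 3

def fits(object_dims, free_volume, margin=1.2):
    """Heuristic: object fits when its volume (with margin) < free volume."""
    w, d, h = object_dims
    return w * d * h * margin < free_volume

# 1 m cube container with a few occupied points in one corner
cloud = [(0.05, 0.05, 0.05), (0.15, 0.05, 0.05), (0.05, 0.15, 0.05)]
free = available_volume_voxels(cloud, (1.0, 1.0, 1.0), voxel=0.1)
```

A placement heuristic would then search the free voxels for a region matching the object footprint; only the volume bookkeeping is shown here.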

[0024] Any combination and/or permutation of the embodiments is envisioned. Other objects and features will become apparent from the following detailed description considered in conjunction with the accompanying drawings. However, it is to be understood that the drawings are designed as an illustration only and not as a definition of the limits of the present disclosure.

BRIEF DESCRIPTION OF THE DRAWINGS

[0025] To assist those of skill in the art in making and using embodiments of the present disclosure, reference is made to the accompanying figures, wherein elements are not to scale so as to more clearly show the details, and wherein like reference numbers indicate like elements throughout the several views.

[0026] FIG. 1 is a schematic plan view of an example of a system for engaging and selectively positioning and depositing objects into a container using first and second automated devices according to embodiments of the present disclosure.

[0027] FIG. 2 is a front perspective view of one example of an engaging end effector according to embodiments of the present disclosure.

[0028] FIG. 3 is a plan view of the engaging end effector of FIG. 2.

[0029] FIG. 4 is a front view of the engaging end effector of FIG. 3.

[0030] FIG. 4A is a bottom perspective view of the end effector of FIG. 3.

[0031] FIG. 4B is a bottom view of the end effector of FIG. 3.

[0032] FIG. 4C is a bottom perspective view of one example of an embedded pin shown in

FIG. 4.

[0033] FIG. 4D is a front perspective view of another example of an engaging end effector according to embodiments of the present disclosure.

[0034] FIG. 4E is a perspective view of an alternate example of an engaging end effector in an exemplary use for passenger baggage according to embodiments of the present disclosure.

[0035] FIG. 4F is an enlarged partial perspective view of the engaging end effector in FIG. 4E.

[0036] FIG. 4G is an enlarged partial perspective view of the engaging end effector in FIG. 4F.

[0037] FIG. 4H is a bottom perspective view of the engaging end effector in FIG. 4G.

[0038] FIG. 4I is a front perspective view of another alternate example of an engaging end effector according to embodiments of the present disclosure.

[0039] FIG. 5 is a schematic front view of an example of a conveyor end effector in use with an automated loading device for positioning and depositing objects into a container according to embodiments of the present disclosure.

[0040] FIG. 6 is a schematic plan view of the conveyor end effector in FIG. 5.

[0041] FIG. 7 is a front perspective view of the conveyor end effector in FIG. 6.

[0042] FIG. 8 is a rear perspective view of the conveyor end effector in FIG. 7.

[0043] FIG. 9 is a plan view of the conveyor end effector in FIG. 7.

[0044] FIG. 10 is a front view of the conveyor end effector in FIG. 7.

[0045] FIG. 11 is a right side view of the conveyor end effector in FIG. 7.

[0046] FIG. 12 is a schematic of an example of a central control system according to embodiments of the present disclosure.

[0047] FIG. 13 is a schematic flow chart of one example of a method for engaging, positioning and depositing objects into a container according to embodiments of the present disclosure.

[0048] FIG. 14 is a schematic flow chart of one example of a method for training a machine learning model to predict whether the engaging end effector will be successful in engaging and transporting a bag, in accordance with embodiments of the present disclosure.

[0049] FIG. 15 is a schematic flow chart of one example of a method for deploying a trained machine learning model to control an operation of the engaging end effector, in accordance with embodiments of the present disclosure.

DETAILED DESCRIPTION

[0050] Referring to Fig. 1, one example of a system 10 for engaging and selectively positioning and depositing objects, for example travel passenger checked bags or parcels 14 (hereinafter “bags 14”), into a container 18 using first 22 and second 24 automated devices is shown. In the example, the first automated device 22 includes an inventive engaging end effector 40, and the second automated device 24 includes an inventive conveyor end effector 56, each described below.

[0051] In one example embodiment of system 10, the first automated load device 22 and second automated load device 24 are positioned in an automated loading cell 20. In one example, automated loading cell 20 is a portion of a make-up module in a large municipal or mass transit passenger airport. The make-up module is where, for example, passenger checked bags have already been pre-sorted by a predetermined metric, for example flight number, and are loaded into containers 18, for example unit load devices (ULDs). In one example, the make-up module may include a manual loading station where the containers are partially filled in the automated load station and then the remaining spaces are manually filled by bag handlers. The filled containers 18 are then transferred out of the make-up module to an aircraft stand where the filled containers 18, or individual bags 14, are loaded into an aircraft hold for flight.

[0052] In the example automated load cell 20, the first automated load device 22 and second automated load device 24 are positioned adjacent to, and in communication with, a path of travel 26 whereby, in one example, the bags 14 travel sequentially (one after another), in a single-file line, into and through the automated load cell 20. In one example, a plurality of autonomously propelled and navigated devices, for example automated guided vehicles (AGVs) 28, each carry a bag 14 into the automated load cell 20. A central and/or one or more local control systems (individually and collectively referred to as a central control system 30), in electronic communication through hardwire or wireless communication protocols with the AGVs 28 and the first automated load devices 22 and second automated load devices 24, control the operation, movement and coordination as known by those skilled in the art.

[0053] In one example, the first automated device 22 includes a robot 34 (for example, see Fig. 4D) and the engaging end effector 40, and the second automated device 24 includes a robot 34 and the conveyor end effector 56. The first automated load device 22 and second automated load device 24, including the respective engaging end effector 40 and conveyor end effector 56, are in communication with the central control system 30. The system 10 is operable to use the engaging end effector 40 and the conveyor end effector 56 in a cooperative way to load objects, for example bags 14, into a container 18.

[0054] Referring to Figs. 2-4, one example of an inventive engaging end effector 40 is shown. In one exemplary application, engaging end effector 40 is used with the first automated load device 22 in the automated load cell 20 (Fig. 1). It is understood that engaging end effector 40 can be used in applications other than to engage/disengage travel passenger bags 14 in a make-up module in airports. For example, engaging end effector 40 can be used with automated devices 22, 24 to selectively engage/disengage other objects, for example boxes, packages or parcels, that require transfer and deposit into a container for further transfer, shipping or processing.

[0055] Referring to Fig. 1, exemplary engaging end effector 40 is used with an automated device, for example the programmable multi-axis robot 34. Robot 34 includes a plurality of arms 36 (one shown for ease of illustration only) rotatable relative to one another defining a plurality of axes of freedom with a distal arm including a wrist 38. Wrist 38 includes a mounting plate for mounting an end effector, for example engaging end effector 40 or conveyor end effector 56. An actuator, for example an electric motor, may be included to impart rotation or movement to the end effector 40, 56 relative to the robot 34. Robot 34, through wrist 38, may further provide electrical and data wiring harnesses to provide an electrical power source or data signals, and/or other supply lines, for example, pressurized pneumatic, hydraulic, and water cooling lines, through the wrist 38 to the end effector 40, 56 to suit the particular application as known by those skilled in the art. It is understood that first automated load device 22 can include other automated devices other than robot 34 to suit the particular application and performance requirements.

[0056] In one example of automated loading cell 20, the first automated device 22 and robot 34 can be in electronic and/or data communication with the central control system 30 to provide signals and/or instructions to operate and control the robot 34 and respective attached end effector 40 or 56 to guide the end effector 40 or 56 to positions along a longitudinal (x-coordinate) direction 44, a lateral (y-coordinate) direction 46, and a vertical (z-coordinate) direction 48. Control system 30 may further receive feedback signals from sensors (e.g., accelerometers, gyroscopes, optical sensors, acoustic sensors, encoders, cameras, pressure and piezoelectric sensors) on the robot 34 and/or attached respective end effector 40 or 56.
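A minimal sketch of a positional command along the longitudinal, lateral, and vertical directions described above; the message layout, millimetre units, and cubic reach-envelope clamp are illustrative assumptions, not part of the disclosed control system.

```python
from dataclasses import dataclass

@dataclass
class EndEffectorTarget:
    """Target pose sent from the control system to a load device.

    Axis names follow the longitudinal/lateral/vertical directions in
    the text; the message layout itself is an assumption.
    """
    x_mm: float   # longitudinal (x-coordinate) direction
    y_mm: float   # lateral (y-coordinate) direction
    z_mm: float   # vertical (z-coordinate) direction
    rotation_deg: float = 0.0

def clamp_to_envelope(t: EndEffectorTarget, reach_mm: float) -> EndEffectorTarget:
    """Clamp a commanded target into a cubic reach envelope (illustrative)."""
    def lim(v):
        return max(-reach_mm, min(reach_mm, v))
    return EndEffectorTarget(lim(t.x_mm), lim(t.y_mm), lim(t.z_mm), t.rotation_deg)

# An out-of-reach longitudinal command gets clamped before dispatch
cmd = clamp_to_envelope(EndEffectorTarget(2500.0, -400.0, 1200.0), reach_mm=2000.0)
```

Feedback from the sensors listed above would close the loop on such commands; only the outbound message shape is sketched here.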

[0057] Referring to Figs. 2-4, engaging end effector 40 includes a mounting plate 80 connected to an extension arm 82. Exemplary mounting plate 80 may be made from aluminum, steel or other materials. The configuration, construction and materials used for mounting plate 80 may vary to suit the particular application and performance specifications.

[0058] Exemplary extension arm 82 includes a first end 83 connected to the robot wrist 38. In one example of engaging end effector 40, extension arm 82 is selectively rotatable relative to the robot wrist 38 about an axis of rotation 86 by a robot actuator, thereby moving the engaging end effector 40 about an arcuate path of travel 88 (Fig. 1). An advantage of the extension arm 82 is that it increases the reach range of the robot 34 and keeps the robot wrist 38 from having to enter the container 18, which could subject the wrist 38 to collision with the container and possible damage. In one non-limiting example, extension arm 82 is about 39 inches (1000 millimeters (mm)) in length and made from aluminum or steel. It is understood that extension arm 82 may be of different configurations and shapes, and of different lengths, longer or shorter, and made from different materials to suit the particular application and performance standards. It is further understood that extension arm 82 can be eliminated and the mounting plate 80 can be mounted directly to robot wrist 38.

[0059] Exemplary engaging end effector 40 further includes an actuator 90 connected to the mounting plate 80. In one example, actuator 90 is an electric motor in communication with an electrical or other power source from the first automated load device 22, for example robot 34. In one example, actuator 90 is selectively activated or energized by communication signals or other instructions received from central control system 30 to selectively rotate engaging end effector 40 along arcuate path of travel 108 (Fig. 1) relative to extension arm 82 and first automated load device 22. In the example shown, arcuate path 108 is a minimum reach or range of first automated device 22, for example when the robot 34 arms 36 are the least extended or fully contracted. Other devices, configurations and functions for actuator 90 may be used to suit the particular application and performance requirements.

[0060] Exemplary engaging end effector 40 further includes a coupler 94 connected to the actuator 90, the actuator 90 operable to selectively rotate the coupler 94 about an axis of rotation 100 relative to the mounting plate 80. In one example, a base 106 is rigidly connected to the coupler 94. On selected activation of the actuator 90, the coupler 94 and base 106 rotate relative to the mounting plate 80 about axis of rotation 100, thereby moving the base 106 about an arcuate path of travel 108 (Fig. 1) relative to the mounting plate 80 and extension arm 82.

[0061] Referring to Figs. 2, 4, 4A and 4B, an embodiment of the engaging end effector 40 can be in the form of an engaging vacuum or suction end effector. Engaging vacuum end effector 40 can further include a vacuum engagement pad 110 connected to the base 106. In one example, engagement pad 110 is a pneumatic vacuum or suction pad including through vacuum air holes 112 extending from a pad engaging surface 114, each in air flow communication with a pneumatic vacuum source provided by the first automated load device 22 (Fig. 1), for example robot 34. In one example, engaging pad 110 is made from a compressible, open-cell foam. It is understood that other types of compressible foams and/or compressible materials which have a characteristic to at least partially conform to the exterior contour of the bag 14 (or objects as described herein), without scratching or damaging the bag 14, can be used. Where engaging end effector 40 is in the form of a vacuum/suction device, materials that assist in forming an air and/or vacuum seal against the bag 14 can be used, but are not necessarily required.

[0062] Vacuum air tubes secured to arm 82, in air flow communication with the vacuum source and the vacuum air holes 112, are used. In one example, with reference to Figs. 4F, 4G, 4H and 4I, an alternate extension arm 82B can be used. Exemplary arm 82B includes a hollow two-piece arm defining an interior cavity 116 housing the air tubes 118 (two shown), which are integrally positioned within the hollow arm 82B interior cavity 116. Air tubes 118 are in air flow communication with the vacuum source and vacuum air holes 112.

[0063] In one example, the vacuum source selectively generates a vacuum force 120 through the plurality of vacuum air holes 112 perpendicular to the engaging surface 114 sufficient to engage, secure and support a bag 14 to the engagement pad 110 against the force of gravity when the engagement pad 110 is placed against bag 14, or in close proximity thereto. In one example, a manifold or plenum having two or more, or a plurality of, air channels in communication with certain vacuum holes 112 can be used to more evenly distribute the vacuum force 120 or the flow of air between an air tube 118 and certain vacuum air holes 112. Different devices and methods for creating a vacuum force 120 and directing the flow of air from the engaging surface 114 through the air tubes 118 can be used. Referring to Figs. 4 and 4A-4C, engaging end effector 40 in the form of the vacuum engaging pad 110 includes a plurality of embedded pins 124 connected to the base 106 and extending perpendicular therefrom into the engaging pad 110. Clearance through-holes 132 extend through the engaging pad 110, allowing the engaging pad 110 to axially compress toward base 106 without substantial binding or friction against the respective pins 124.

[0064] An example of an embedded pin 124 is shown in Fig. 4C. In the example, each pin 124 includes a cylindrical-shaped base terminating in a frustoconical-shaped end 128 as generally shown. In the example, end 128 includes a concave portion 130 at a distal end. Depending on the application and bag 14, concave portion 130 may also serve to function as a suction cup and increase the engagement of the bag 14 to the engaging pad 110. Pins 124 also serve to support or stabilize the shape and/or structure of the pad 110 when engaged with a bag 14, for example when the first automated device 22, robot 34, laterally moves along a path of travel 88, 108 (Fig. 1).

[0065] In the example shown in Figs. 4, 4A, and 4B, three different axial length pins can be used, first 124A, second 124B and third 124C, and are selectively positioned about base 106 and engaging pad 110. As shown in Fig. 4, second pins 124B are longer than first pins 124A, and third pins 124C are axially longer than second pins 124B. In the example shown in Fig. 4B, the shortest first pins 124A (four shown) are positioned in a center area of the base 106, the second pins 124B (12 shown) are positioned generally radially outward of the first pins 124A on the base 106, and the third pins 124C (20 shown) are positioned generally radially outward of the second pins 124B adjacent a perimeter of pad 110 as generally shown. Pins 124 may be made from steel, nonferrous metals, composites, polymers, elastomers and other materials. It is understood that different pin shapes, lengths, configurations, numbers, materials and positional location relative to the pad 110, base 106 and other pins 124 can be used to suit the particular application.

[0066] In one example of use of vacuum engaging pad 110, for example engaging a soft-sided bag 14, on contact of a bag 14 with engaging surface 114 and generation of a vacuum force, pad 110 will axially compress toward the base 106. Provided there is enough contact between the bag 14 and engaging surface 114, the vacuum force can force axial compression of the pad 110 until at least one or more of each set of first 124A, second 124B and third 124C pins contacts the bag 14. Due to the exemplary different length pins 124 and exemplary positions as described, this forms a concave shape of the compressed pad 110, thereby generating more contact surface (and friction) by engaging surface 114 with the bag 14 to more securely engage the bag 14 to the pad 110.

[0067] In one example of operation, the above described pad 110 and pins 124 further provide stability in the engagement of pad 110 with an engaged bag 14 during movement. For example, when engaging end effector 40 and engaged bag 14 are moved by first automated device 22, 34, for example along paths of travel 88 or 108, the pin 124 ends 128 provide contact and friction resistance to relative movement between the bag 14 and pad 110 in the lateral or shear direction. Devices and methods other than the described pins 124 and pad 110 may be used to engage and secure a bag to the engaging end effector 40 to suit the particular application.

[0068] Exemplary engaging end effector 40 further includes one or more sensors (e.g., optical sensors, acoustic sensors, cameras) connected to the engagement pad 110, and/or the base 106, operable to detect the presence of an AGV 28 and/or object, for example bag 14, positioned within one or both of the arcuate paths of travel 88 and/or 108 (collectively referred to as the range of travel). The one or more sensors can be in communication with the control system 30.

[0069] Fig. 4D depicts another pad 110A configuration of an embodiment of the engaging end effector 40. The structure and operation of the embodiment of the engaging end effector shown in Fig. 4D can be the same as the structure and operation of the embodiment of the engaging end effector described with reference to Figs. 4 and 4A-C. As shown in Fig. 4D, the pad 110A can be formed by individual suction cups 136 for engaging a bag 14. Each suction cup can be operatively coupled to a vacuum source via pins or shafts 134 and the air tubes 118 as described herein to create a vacuum force 120. The suction cups 136 can be formed from a pliable and/or resilient material such as polymers (e.g., rubber, silicone, plastic, Kevlar). The pins or shafts 134 can be operatively coupled to the base 106 and can have an air channel extending along a longitudinal axis of the pins or shafts 134. A length of the pins or shafts 134 can be identical or can vary in the same manner as described herein with reference to pins 124. The pins or shafts 134 and associated suction cups 136 can be individually controllable, collectively controllable, or controlled in groups to apply varying cumulative vacuum forces 120 on bags 14. The suction applied can be varied based on the size, weight, and type of bag 14 to be engaged by the engaging end effector 40. In some embodiments, a guide and safety rod 138 can be disposed at a distal end of the engaging end effector 40 to prevent the engaging end effector from being damaged by incidental contact with objects other than bags 14 or other objects to be picked up by the engaging end effector 40.

[0070] Referring to Figs. 4E-4I, an alternate example of an engagement pad 110B in the form of a vacuum engaging pad is shown. As shown in Figs. 4E and 4F, pad 110B is equally useful with the first automated load device 22, in the form of the robot 34, including an extension arm 82A, the mounting plate 80, the actuator 90, and the coupler 94, and the base 106 (shown largely transparent in the figures for ease of illustration).

[0071] As seen in the Figs. 4G and 4H example, engaging pad 110B includes a separate first vacuum zone 140 and second vacuum zone 142, defined by an outer pad ring 150 connected to base 106 as generally shown. Exemplary outer pad ring 150 includes a continuous perimeter portion 154 adjacent an outer perimeter of the base 106, and a center portion 158 separating the first 140 and second 142 vacuum zones.

[0072] As best seen in Fig. 4H, each of the first vacuum zone 140 and second vacuum zone 142 can be in communication with a vacuum inlet 160 defined by the coupler 94. Each inlet 160 is in air flow communication with the air tube 118 and the vacuum source described above. It is understood that inlet 160 may include alternate configurations, numbers, orientations and positional locations than as shown to suit the particular application.

[0073] Each exemplary first vacuum zone 140 and second vacuum zone 142 further includes an inner pad ring 164 connected to the base 106 as generally shown and described for outer pad ring 150. In one example, each of the outer pad ring 150 and inner pad ring 164 can be made from the same foam material and axially compress toward base 106, as described above for engaging pad 110. It is understood that outer pad ring 150 and inner pad ring 164 can be of other configurations, shapes, sizes and materials than that of pad 110 and/or to suit the particular application and performance specifications.

[0074] In the example pad 110B, each of the first vacuum zone 140 and second vacuum zone 142 can be independently operated from one another through one or more sensors included in the pad 110B and/or engaging end effector 40. For example, as shown in Fig. 4F, in a normal operation, substantially all of each of first vacuum zone 140 and second vacuum zone 142 can be covered by bag 14 such that there is little to no vacuum leakage or loss of vacuum pressure (for example, where a portion of first vacuum zone 140 or second vacuum zone 142 does not overlap a portion of the bag, causing a leak in vacuum or suction pressure), providing for a very secure engagement between pad 110B and bag 14. In an alternate example, a small bag, or misaligned bag, overlaps first vacuum zone 140 but only a small portion of second vacuum zone 142. Pressure valves and sensors, for example force and/or pressure sensors positioned in the outer pad ring 150 and/or the inner pad ring 164, or at or near the vacuum inlets 160, could signal the central control system 30 (or the first automated load device 22 vacuum source) that a significant portion of second vacuum zone 142 is not used and turn off the vacuum force 120 for second vacuum zone 142. Alternately, increased vacuum force 120 can be applied to the first vacuum zone 140 to increase the engagement with bag 14. In another example, based on feedback signals from the described sensors, the control system 30 can vary and/or adjust the vacuum force 120 based on the zone(s) 140, 142 engaged and/or predetermined sensor readings.
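The zone shut-off and compensation logic described above can be sketched as follows. This is a non-limiting illustration: the coverage threshold, boost factor, and all names are assumptions, and actual behavior would be driven by the described force/pressure sensors rather than precomputed coverage fractions.

```python
# Hypothetical sketch of per-zone vacuum control: a largely uncovered zone is
# turned off (to stop the leak), and the remaining engaged zone(s) can receive
# increased vacuum force 120 to maintain a secure engagement.
def control_vacuum_zones(coverage_by_zone: dict[str, float],
                         base_force: float,
                         min_coverage: float = 0.5,
                         boost_factor: float = 1.5) -> dict[str, float]:
    """Return a commanded vacuum force per zone.

    coverage_by_zone maps each zone to the fraction (0..1) of its engaging
    surface covered by the bag, as inferred from the pad-ring sensors.
    """
    forces = {}
    any_disabled = False
    for zone, coverage in coverage_by_zone.items():
        if coverage < min_coverage:
            forces[zone] = 0.0       # significant leak: turn the zone off
            any_disabled = True
        else:
            forces[zone] = base_force
    if any_disabled:                 # compensate on the zones still engaged
        for zone, force in forces.items():
            if force > 0.0:
                forces[zone] = base_force * boost_factor
    return forces
```

For a misaligned bag covering only the first zone, the sketch disables the second zone and boosts the first, mirroring the two alternatives described in the paragraph.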

[0075] It is understood that variations in the size, shape, configuration, and components of pad 110B may be made to suit the particular application. It is also understood that the vacuum zones 140, 142, including but not limited to a greater or fewer number of zones, and the shape, configuration and orientation of the zones, may vary to suit the particular application and performance specification. As an example, Fig. 4I depicts an engaging pad 110C for an embodiment of the engaging end effector 40 that includes a single vacuum zone having a single outer pad ring 150A and the vacuum inlet 160.

[0076] It is understood that the exemplary engaging vacuum end effector 40 may take different forms, include different components, and operate differently to suit the particular application and performance specifications. For example, engagement pad 110 or 110A may be circular, square, rectangular, polygonal, H-shaped, U-shaped, concave, convex, or other shapes and configurations to suit the application. It is further understood that engaging end effector 40 can take different forms other than a vacuum device to engage or grasp an object, for example bag 14, to suit the particular application and performance specifications. In one alternate example, engagement pad 110 or 110A may take the form of one or more pneumatic suction cups (e.g., as shown in Fig. 4D). An alternate example of engaging end effector 40 utilizes a gripping claw or hand to grasp or otherwise engage bags 14.

[0077] Referring to Fig. 1, in one example of operation of engaging end effector 40, AGV 28 carrying a bag 14 along path of travel 26 can be directed by control system 30 to stop at an assigned or predetermined location within the engagement end effector 40 range of travel. In response to sensing or verifying the presence of the AGV 28 and/or bag 14 in the predetermined location, first automated load device 22 can autonomously move and position engaging end effector 40, thereby positioning engagement pad 110 or 110A in contact with, or in proximity to, the bag 14 in the vertical direction 48. In response to determining or verifying that the engagement pad is positioned in proximity to the bag 14, the control system 30 can generate the vacuum force 120 to engage the bag 14 with the engagement pad 110 or 110A. The first automated load device 22 thereafter can be directed by the control system 30 to transfer the engaging end effector 40 and engaged bag 14 to an assigned or predetermined position for further processing of the bag 14. Other methods and steps for the first automated device and engaging end effector 40 to engage and transfer bag 14 to a predetermined or assigned position can be used. Although inventive engaging end effector 40 is described in one example of use in system 10, and in coordination with conveyor end effector 56, it is understood that inventive engaging end effector 40 can be used independently of system 10 and/or conveyor end effector 56, and with other automated devices, and in other applications, than as shown and described herein.
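The pick sequence described above (verify AGV 28 and bag 14 presence, position the pad, apply vacuum force 120, then transfer) can be sketched as a simple decision function. The action names and boolean inputs are illustrative assumptions standing in for the sensor verifications described in the paragraph.

```python
# Non-limiting sketch of the engage-and-transfer sequence commanded by
# control system 30 for the engaging end effector 40.
def pick_sequence(agv_in_position: bool, bag_detected: bool,
                  pad_in_proximity: bool) -> list[str]:
    """Return the ordered actions the control system would command."""
    if not (agv_in_position and bag_detected):
        return ["wait_for_agv"]          # AGV 28 / bag 14 not yet verified
    actions = ["position_engaging_end_effector"]
    if pad_in_proximity:                 # pad verified against/near bag 14
        actions.append("apply_vacuum_force_120")
        actions.append("transfer_to_assigned_position")
    return actions
```

Each guard corresponds to one of the sensing/verifying steps in the paragraph; real sequencing would also handle faults (lost vacuum, missed detection) not modeled here.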

[0078] In Figs. 5-11, one example of an inventive conveyor end effector 56 is shown. In one exemplary application, conveyor end effector 56 is used with a second automated load device 24 in an automated load cell 20 as described herein for conveyor end effector 56 and second automated load device 24. It is understood that conveyor end effector 56 may be used in applications other than to engage/disengage travel passenger bags 14 in a make-up module in airports. For example, conveyor end effector 56 may be used with automated devices 22, 24 to selectively engage/disengage other objects, for example boxes, packages or parcels, that require transfer and deposit into a container for further transfer, shipping or processing.

[0079] In the example conveyor end effector 56, the second automated load device 24 can be generally similar to the first automated load device 22 and can include central control system 30, robot 34, robot arms 36, and wrist 38. Modifications to second automated device 24 to suit the conveyor end effector 56 can be used.

[0080] Referring to Figs. 5-7, one example of the conveyor end effector 56 includes a mounting plate 180 connected to an extension arm 182. Exemplary mounting plate 180 is similar in construction and materials to the mounting plate 80, and may vary to suit the application and performance specifications as described above for mounting plate 80.

[0081] Exemplary extension arm 182 includes a first end 183 connected to the robot wrist 38 (Fig. 5). In one example of conveyor end effector 56, extension arm 182 is selectively rotatable relative to the robot wrist 38 about an axis of rotation 186 by a robot 34 actuator, thereby moving the conveyor end effector 56 about an arcuate path of travel 188 (Fig. 1). In the example shown, arcuate path 188 is a minimum reach or range of second automated device 24, for example when the robot 34 arms are the least extended or fully retracted. As shown in Fig. 1, second automated device 24 and conveyor end effector 56 include a reach range 188A which includes the complete interior cavity of container 18. In one example, extension arm 182 can be generally similar in form and construction to extension arm 82 described above. Extension arm 182 can be modified from extension arm 82 to suit the particular application and performance specifications. It is further understood that extension arm 182 can be eliminated and the mounting plate 180 can be mounted directly to robot wrist 38.

[0082] Exemplary conveyor end effector 56 further includes an actuator 190 connected to the mounting plate 180. In one example, actuator 190 is an electric motor in communication with an electrical or other power source from the second automated load device 24, for example robot 34, as described above for actuator 90. Actuator 190 is in communication with, and is activated or energized by, central control system 30 as described above for actuator 90 to selectively move conveyor end effector 56 along arcuate path of travel 208 (Fig. 1) relative to extension arm 182 and second automated device 24, as generally described above for actuator 90. Alternate actuators, configurations, and functions for actuator 190 described above for actuator 90 may be used as known by those skilled in the art.

[0083] Exemplary conveyor end effector 56 further includes a conveyor portion 192 including a coupler 194 connected to the actuator 190, the actuator 190 operable to selectively rotate the coupler 194 about an axis of rotation 200 relative to the mounting plate 180. Conveyor portion 192 further includes a base 206 connected to the coupler 194. On selected activation of the actuator 190, the coupler 194 and base 206 rotate relative to the mounting plate 180 about axis of rotation 200, thereby moving the base 206 about an arcuate path of travel 208 (Fig. 1) relative to the mounting plate 180 and extension arm 182, as generally described above for engaging end effector 40.

[0084] Still referring to Figs. 5-7, exemplary conveyor end effector 56 conveyor portion 192 further includes a pair of opposing arms 210 connected to base 206 as generally shown. Exemplary arms 210 include a first end 212 and a distal cantilevered second end 214. In the example and orientation shown in Fig. 5, arms 210 extend outwardly from base 206 in the lateral direction 46. As shown in Figs. 6 and 7, arms 210 are separated from one another along the longitudinal direction. Arms 210 can be made from aluminum or steel. Other configurations, shapes, sizes and materials can be used.

[0085] As shown in Fig. 7, conveyor portion 192 further includes a first roller 216 rotatably connected to the respective arms 210 adjacent first end 212. A second longitudinal roller 226 is rotatably connected to the respective arms 210 adjacent the second end 214. In a non-limiting example, in use in an airport for an automated loading cell 20 and containers 18, the lateral direction 46 distance between the first roller 216 and second roller 226 can be approximately 27.5 inches (700 mm). It is understood that the distance between the first roller 216 and second roller 226 can be different lengths, longer or shorter, to suit the particular application and performance requirements.

[0086] In one example, one of the first roller 216 or second roller 226 can be a powered or driven roller used to forcibly move a belt 240 engaged with the first 216 and second 226 rollers and circumferentially positioned therearound. In one example, the powered one of the first 216 or second 226 rollers is a drum-type roller including an electrical motor and drive device inside the roller. The powered roller is in electrical communication with a power source provided by the second automated device 24, for example robot 34, and is selectively activated by central control system 30. In one example, both of the first roller 216 and second roller 226 can be powered rollers activated/deactivated in a synchronous manner.

[0087] In one example, the powered roller(s) include position sensors, for example encoders, which are in communication with the control system 30. In another example, the powered roller(s) include additional devices, for example a gearbox and an electromechanical brake to quickly slow and stop the roller from rotating, providing further control and flexibility in the use of the conveyor end effector 56. Other forms, sizes, positions and configurations of powered rollers, and rollers 216, 226, may be used to suit the particular application and performance specifications. In the above-described example, as the same control source is used for the conveyor end effector 56 and second automated device, for example robot 34, coordinated movements can be made between the robot 34 and conveyor end effector 56.

[0088] In the above example, belt 240 is a continuous or endless form of belt similar in materials and construction to industrial conveyor belts. Exemplary belt 240 provides a friction surface to frictionally engage objects, for example bags 14, positioned thereon to thereby move the object relative to the base 206 and second automated device 24 as further described below. Other forms, configurations, shapes, sizes and materials for belt 240 can be used to suit the particular application and performance specifications. It is understood that conveyor end effector 56 can be of different sizes, shapes, configurations, components, materials, and functions than that described and illustrated to suit the particular application.

[0089] Referring to Fig. 10, one example of conveyor end effector 56 and conveyor portion 192 includes a pusher device 250 connected to the belt 240. In the example, pusher 250 is an elongate bar extending in the longitudinal direction 44 horizontally across belt 240. As shown in Fig. 10, one example of pusher 250 extends in a vertical direction 48 as generally shown. In one example of operation, for example where a soft-sided bag extends partially off the belt 240 onto one or both of the arms 210, causing relative movement between the belt 240 and the bag, the pusher 250 can contact the bag 14 as the belt 240 continues to advance toward second end 214 and forcibly move the bag 14 toward the second end 214. In the example, belt 240 can continue to rotate in an endless manner as described above, or may reverse direction to reposition pusher 250 back toward base 206 to await deposition of another bag on conveyor portion 192. Other forms, sizes, shapes, and configurations of pusher 250 to suit the particular application can be used.

[0090] Exemplary conveyor end effector 56 further includes one or more sensors 244 (shown schematically in Fig. 12), including but not limited to image vision devices (e.g., cameras, depth cameras, infrared cameras, optical sensors, acoustic sensors), stationarily positioned at one or more places, for example on the arms 210 or mounting plate 180 (Fig. 5), operable to detect, for example, the presence of an object, for example bag 14, positioned on belt 240 as generally shown. One or more of the sensors 244 can be stationarily positioned near one or both of arms 210 and second end 214 to scan, and/or otherwise detect, available open spaces or volumes within container 18 that are available to receive a bag 14 or other object as more fully described below. The one or more sensors 244 are in electronic signal communication, hardwired or through wireless protocols, with the control system 30. Other sensors, in form, configuration, placement and function, can be used.

[0091] In an alternate example of conveyor end effector 56, conveyor 192 can include two (2) parallel, side-by-side belts. In the above example, belt 240 is shown and described as a single belt. In the alternate example, the two parallel belts can include separate and independent rollers, one of which would be a powered roller as described above, and can be activated and rotated independently of one another by control system 30. This two belt example can provide additional capability to reorient an object, for example bag 14, relative to the belt. For example, one belt can be rotated away from the base, and the other belt can be rotated in an opposite direction (toward the base) which can have the effect of rotating the bag 14 relative to the conveyor. Other forms and configurations of conveyor 192 to suit the particular application can be used.
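The reorientation effect of the two-belt example can be approximated with a differential-drive model: equal belt speeds translate the bag, opposite speeds rotate it. This is a non-limiting kinematic sketch; the model and all names are assumptions, not part of the disclosure.

```python
# Hypothetical sketch: motion of a bag spanning two independently driven,
# parallel belts, approximated as a differential drive.
def bag_motion(left_belt_speed: float, right_belt_speed: float,
               belt_separation: float) -> tuple[float, float]:
    """Return (forward_speed, rotation_rate) of the bag.

    Forward speed is the mean of the belt speeds; rotation rate is their
    difference divided by the belt separation (speeds in m/s, separation
    in m, rotation in rad/s).
    """
    forward = (left_belt_speed + right_belt_speed) / 2.0
    rotation = (right_belt_speed - left_belt_speed) / belt_separation
    return forward, rotation
```

Driving the belts in opposite directions, as the paragraph describes, gives zero net translation and pure rotation of the bag relative to the conveyor.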

[0092] In an alternate example of conveyor end effector 56, the conveyor 192 and belt 240 are transversely mounted relative to the mounting plate 180. Using Fig. 7 for illustration purposes only, the alternate example is structured so that the arms 210 and belt 240 can be positioned substantially parallel to the longitudinal direction 44 relative to the mounting plate 180 (Fig. 7 is structured so that the arms 210 and belt 240 are positioned parallel to the lateral direction 46). Other configurations and orientations of arms 210 and belt 240 relative to the mounting plate 180 to suit the application can be used.

[0093] Referring to Figs. 1, 5 and 6, in one example of operation of conveyor end effector 56, and as further discussed below for system 10, conveyor end effector 56, and more particularly conveyor 192, is positioned in a predetermined or assigned position by the second automated device 24 through communication signals from central control 30. In one example, conveyor 192 can be positioned within the reach or range 88 and/or 108 of engaging end effector 40 (Fig. 1). In this position, conveyor end effector 56 receives an object, for example bag 14, positioned on conveyor 192 and belt 240 as shown in Fig. 5. Other ways or methods for conveyor end effector 56 to receive an object or bag 14 on the conveyor 192 and belt 240 can be used.

[0094] As shown in Figs. 5 and 6, the illustrated container 18 includes 3-dimensional spaces or volumes: two (2) rows deep 260A, 260B (in lateral direction 46), three (3) columns across 266A, 266B, 266C (in longitudinal direction 44), and three (3) rows high 270A, 270B, 270C (in the vertical direction 48) as generally shown. It is understood that containers 18, including ULDs, can include more rows 260, columns 266, and rows 270 than as described and illustrated herein.

[0095] In the example, using the partially filled container 18 shown in Figs. 5 and 6, the second automated device 24 moves the conveyor end effector 56, and more particularly conveyor portion 192 including the bag 14, in proximity to the open-sided container 18. The above-described one or more sensors connected to the conveyor end effector 56 are, in one example, used to image or scan the interior of the container 18 and calculate, or otherwise determine, the available space(s) or volumes that are available to receive a bag 14. The sensor information is sent to the central control system 30 where, for example through software stored and executed in the central control system 30, the available space(s) and/or 3-dimensional volumes suitable to receive the bag 14 positioned on the conveyor portion 192 are identified. This determination/calculation of an available container space or volume may, in one example, take into consideration previously scanned and stored metrics, for example the exterior dimensions or size of the particular bag 14 positioned on belt 240, from the central control system 30. For the example illustrated in Figs. 5 and 6, the control system 30 can determine, (a) based on the scanned data from the interior of the container 18, and (b) based on the previously scanned and stored dimensions for the particular bag 14 positioned on belt 240, that the volume space at first row 260A, second column 266B, and second row 270B is suitable to receive the particular bag 14.
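The space-determination step described above can be sketched, in a deliberately simplified form, as a search over a container grid of depth rows 260, columns 266, and vertical rows 270 for the first unoccupied cell that can accommodate the bag's stored dimensions. The grid abstraction, cell dimensions, and all names are illustrative assumptions; the disclosed system works from scanned 3-dimensional volumes rather than a fixed grid.

```python
# Hypothetical sketch: find a container cell for a bag whose exterior
# dimensions were previously scanned and stored by control system 30.
def find_available_cell(occupied: set[tuple[int, int, int]],
                        cell_size: tuple[float, float, float],
                        bag_size: tuple[float, float, float],
                        rows_deep: int = 2, columns: int = 3,
                        rows_high: int = 3):
    """Return the first free (row, column, level) cell large enough for the
    bag, filling lower levels first, or None if no cell is suitable."""
    if any(b > c for b, c in zip(bag_size, cell_size)):
        return None  # bag exceeds a single cell in some dimension
    for level in range(rows_high):          # fill bottom vertical rows first
        for row in range(rows_deep):
            for col in range(columns):
                if (row, col, level) not in occupied:
                    return (row, col, level)
    return None
```

A real implementation would also consider partial volumes, bag orientation, and stacking constraints, none of which are modeled in this sketch.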

[0096] As shown in Fig. 5, control system 30 can send the appropriate signals and instructions to the second automated load device 24 to autonomously move and position conveyor end effector 56 conveyor portion 192 to the position as generally shown in Figs. 5 and 6. In response to detection and/or verification that conveyor end effector 56 is positioned in the predetermined or assigned position for the available container 18 space/volume, the powered roller, for example first roller 216, can be autonomously activated or energized to move belt 240 and deposit bag 14 into the calculated or determined open space at first row 260A, second column 266B, and second row 270B as generally shown. As described above, the reach range 188A of automated device 24 and conveyor end effector 56 is at least a distance, area, or volumetric envelope to include the full interior cavity of container 18. In one example, on detecting and/or verifying that the bag 14 is no longer positioned on belt 240, for example through one or more of the sensors 244, the control system 30 moves the conveyor end effector 56 from the interior of the container 18 to another assigned or predetermined destination, for example to receive another bag 14 as described herein.

[0097] In one example of autonomous operation of conveyor end effector 56, bags 14 (or other objects) can be autonomously and sequentially positioned to fill the first rows 260A and B, and first columns 266A, B and C, prior to beginning to position bags 14 in a second row 270B. It is understood that alternate methods and sequences to fill the container 18 interior spaces/volumes in rows 260, columns 266, and vertical rows 270 can be used to suit the particular application. In one example, central control system 30 can have preprogrammed sequences, for example by row 260, column 266 and vertical row 270, for the positioning of second automated device 24, 34 and conveyor end effector 56 for the deposit of bags 14 in the manner described. It is further understood that some human operator assistance and/or intervention can be employed, for example, to select the available spaces within container 18, and/or to direct the movement of the conveyor end effector 56 to the available container spaces, and/or in other ways.

[0098] In the example described and illustrated, second automated device 24 and conveyor end effector 56 can continue to receive and deposit bags 14 within empty spaces in container 18 until it is determined that container 18 is full and/or there are no more suitable spaces/volumes in which conveyor end effector 56 can deposit bags 14. In response to determining or verifying, for example through the imaging sensor(s) (e.g., one or more of sensors 244) on the conveyor end effector 56 described herein, that container 18 is full, the control system 30 can send signals directing second automated load device 24 to return conveyor end effector 56 to a start position, for example within the reach range 88 and/or 108 of the first automated load device 22, to receive another object, for example a bag 14, for loading into an empty container positioned within the reach range 188A of the second automated load device 24. Other devices and methods for implementation and use of the inventive conveyor end effector 56 can be used. Although the inventive conveyor end effector 56 is described in one example of use in system 10, and in coordination with engaging end effector 40, it is understood that the inventive conveyor end effector 56 can be used independently of system 10 and engaging end effector 40, with other automated devices, and in other applications than as shown and described herein.

[0099] Referring to Fig. 12, an exemplary embodiment of the central control system 30 is shown for use in system 10. The control system hardware components together, or combined with additional hardware and/or software, are useful for each of the central or local control systems described above (individually and collectively referred to as central control system 30).

[0100] In Fig. 12, control system 30 includes a computing device, or multiple computing devices working cooperatively. The exemplary control system includes hardware components including, but not limited to, a processor 300, a data memory storage device 302, one or more controllers (including but not limited to programmable logic controllers (PLC)) 304, a signal transmitter and receiver 306 for sending and receiving signals 316, actuators 308 (for example electric motors on the automated devices 22, 24 and actuators 90, 190), and sensors 310 (including for example sensors 244), such as accelerometers, gyroscopes, optical sensors, acoustic sensors, encoders, cameras, and/or pressure and piezoelectric sensors. These hardware components are in data signal communication with one another, either through hard wire connections or wireless communication protocols, through a bus 318 or other suitable hardware. Other hardware components, including additional input and output devices 312, can be used to suit the particular application and performance specifications. Examples of input devices include, but are not limited to, touch sensitive display devices, keyboards, imaging devices, and other devices that generate computer interpretable signals in response to user interaction. Examples of output devices include, but are not limited to, display screens, speakers, alert lights, and other audio or visually perceptible devices. Control system 30 is powered by the power source 314.

[0101] Exemplary processor 300 can be any type of device that is able to process, calculate or manipulate information, including but not limited to digital information, that is currently known or may be developed in the future. As one example, the processor can be a central processing unit (CPU). As another example, the processor can be a graphics processing unit (GPU). It is contemplated that multiple processors 300 and servers can be utilized to support, for example, automated loading cell 20. These processors can be on site at the airport, for example for security concerns, and/or in the “cloud” (cloud computing through remote servers and systems).

[0102] The exemplary data memory storage device 302 may include devices which store information, including but not limited to digital information, for immediate or future use by the processor 300. Examples of memory storage devices include either or both of random access memory (RAM) and read only memory (ROM) devices. The memory storage device may store information, such as program instructions that can be executed by the processor 300 and data that is stored by and recalled or retrieved by the processor 300. Additionally, an operating system and other applications can be stored in the data memory storage device 302. Non-limiting examples of memory storage device 302 include a hard disk drive or a solid state drive. Alternately, portions of the stored information may be stored in the cloud (remote storage devices or data centers) and selectively retrieved through hardwire and/or wireless protocols.

[0103] In one example of system 10, control system 30 includes a suitable software operating system and preprogrammed software to execute predetermined actions, functions or operations of the system 10 described herein. The operating system and various software may be stored in the data memory storage device 302, and processed and executed by the processor 300 through controller 304 and actuators 308.

[0104] In many of the above-described examples, system 10, or components thereof, for example automated devices 22, 24, engaging end effector 40 and conveyor end effector 56, sensors 310, AGV 28 and other system 10 devices described herein, receive operational instructions and commands through data signals hardwired or wirelessly streamed in real time from the central control system 30. Examples of communication networks that may be in use at an airport or other described applications may include, but are not limited to, local area networks (LAN) or wide area networks (WAN). Examples of wireless communication networks, systems and protocols usable with system 10 include wireless access points for communication based on IEEE standard 802.11 (also known as Wi-Fi). Other wireless communication protocols, for example BLUETOOTH, radio frequency control, or 4G or 5G LTE communications, including predecessor and successor systems, suitable for the particular application and performance specifications can be used as known by those skilled in the art. Other wired communication systems and components for communication may be based on IEEE standard 802.3 (also known as Ethernet) and may be used in certain applications. Other forms of communication networks, and other wired and wireless communication protocols, systems and devices, can be used.

[0105] In the example described above, the autonomous actions, positioning and movements of each automated device 22, 24, and of the engaging 40 and conveyor 56 end effectors, are, in one example, the result of receiving hard wired and/or wireless data signals from the central control system 30. In one example, the data signals from the central control system 30 can be supplemented or aided in part by data gathered by the individual automated devices 22, 24 and/or the engaging end effector 40 or conveyor end effector 56 and communicated to the central control system 30.
Data received from the above-described sensors 310 can be sent by hard wire or wirelessly to the central control system 30 for analysis or calculations to aid, supplement and/or determine the signals sent from the central control system 30 to the automated devices 22, 24, and/or the engaging end effector 40 or conveyor end effector 56 as described herein. In one example, artificial intelligence and/or machine learning software and/or systems may be used to assist system 10 in engaging, moving and transferring different types of bags 14. Additional, and alternate, hardware and software to support the devices and functions described herein can be used.

[0106] The control system 30 can receive images captured from one or more sensors, e.g., imaging devices, as the bags 14 are processed (e.g., at check-in, or as the bags 14 are being transported on one or more conveyors) before the bag reaches the engaging end effector, and the images can be stored and linked to identifiers encoded in a bag tag that is affixed to the bags (e.g., in memory 302). Additionally, bag data associated with the bags 14 can be collected and stored with the identifiers and the data extracted from the images (e.g., in memory 302). The bag data can include a size, weight, and/or type of bag. The control system 30, via the processor 300, can perform one or more image processing and/or machine vision techniques to process the images and extract features from the images. For example, the control system 30 can use stitching/registration, filtering, thresholding (including OTSU thresholding), pixel counting, segmentation, inpainting, edge detection, color analysis, blob discovery and manipulation, neural net processing, pattern recognition, optical character recognition, blurring, lighting normalization, greyscaling, erosion/dilation, convex hull conversion, contour detection, blob/mass calculation normalization, and/or gauging/metrology. Using the image processing and/or machine vision techniques, the control system 30, via the processor 300, can extract data including one or more features of the bags, e.g., handles, zippers, contours, materials, dimensions, labels, and the like, and including one or more boundaries and/or transitions between the features.
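Two of the techniques named above, OTSU thresholding and pixel counting, can be illustrated in a minimal, self-contained form. This is a hedged sketch of the standard Otsu method operating on a flat list of grey levels, not the patent's implementation; a production system would more likely use a machine vision library.

```python
def otsu_threshold(pixels):
    """Otsu's method: pick the grey level that maximizes between-class
    variance of the histogram. pixels is a flat iterable of 0-255 values."""
    hist = [0] * 256
    for p in pixels:
        hist[p] += 1
    total = sum(hist)
    sum_all = sum(i * h for i, h in enumerate(hist))
    sum_b = 0.0   # running weighted sum of the background class
    w_b = 0       # running background pixel count
    best_t, best_var = 0, -1.0
    for t in range(256):
        w_b += hist[t]
        if w_b == 0:
            continue
        w_f = total - w_b
        if w_f == 0:
            break
        sum_b += t * hist[t]
        m_b = sum_b / w_b                 # background mean
        m_f = (sum_all - sum_b) / w_f     # foreground mean
        var_between = w_b * w_f * (m_b - m_f) ** 2
        if var_between > best_var:
            best_var, best_t = var_between, t
    return best_t

def count_foreground(pixels, threshold):
    """Pixel counting: size of the segmented blob (e.g., a bag silhouette)."""
    return sum(1 for p in pixels if p > threshold)
```

Applied to a bag image, the threshold separates the bag blob from the conveyor background, and the foreground count feeds size estimation.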

[0107] In an example application, the processor 300 of the control system 30 can execute one or more machine learning models, and can train and validate the one or more machine learning models using the data extracted from images of bags and/or the bag data, to predict whether the engaging end effector 40 can successfully engage and move bags. The machine learning model(s) can be trained by the control system 30 to classify bags based on the predictions of whether the bags can be successfully engaged and transported using the end effector (as predicted pass or predicted fail). For bags which the trained machine learning model predicts the engaging end effector can successfully engage and transport, the control system 30 can instruct and/or control the engaging end effector to engage and transport the bags. For bags which the trained machine learning model predicts the engaging end effector cannot successfully engage and transport, the control system 30 can instruct and/or control the engaging end effector to skip the bags. The machine learning models can be continuously updated and/or optimized based on whether the machine learning model correctly predicts that a bag can be successfully engaged and transported by the engaging end effector.
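The pass/fail classification described in this paragraph can be sketched with the simplest possible online learner, a perceptron over bag features that is updated from actual engage/transport outcomes. The feature encoding, learning rate, and class name are illustrative assumptions; the patent does not specify a model architecture (a convolutional neural network is named later as one option).

```python
class BagEngageClassifier:
    """Minimal online pass/fail classifier: predicts whether the vacuum
    end effector can engage a bag, and learns from actual outcomes."""

    def __init__(self, n_features, lr=0.1):
        self.w = [0.0] * n_features  # one weight per bag feature
        self.b = 0.0
        self.lr = lr

    def predict(self, x):
        """True = 'predicted pass' (bag expected to engage successfully)."""
        s = self.b + sum(wi * xi for wi, xi in zip(self.w, x))
        return s >= 0.0

    def update(self, x, engaged_ok):
        """Feed back whether the end effector actually succeeded."""
        target = 1.0 if engaged_ok else -1.0
        pred = 1.0 if self.predict(x) else -1.0
        if pred != target:  # perceptron rule: learn only from mistakes
            self.w = [wi + self.lr * target * xi
                      for wi, xi in zip(self.w, x)]
            self.b += self.lr * target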

[0108] In an example embodiment, the processor 300 of the control system 30 can execute the machine learning model in two modes: a training and test mode and a production mode. In the training and test mode, the control system 30 can extract and collect data from images of bags being processed using one or more image processing and/or machine vision techniques and collect bag data including, for example, a size, weight, and type of bag captured in each image. The control system 30 can link the images of the bags, the data extracted from the images, and the respective bag data to identifiers assigned to the bags, which can be encoded in bag tags that are affixed to the bags. Based on the data extracted from the images and the bag data, the machine learning model can predict whether the engaging end effector 40 can successfully engage and move a bag and can store the prediction with the identifier of the bag. When the bags reach the engaging end effector, the identifiers on the bag tags can be scanned/read to retrieve the predictions for the bags. In the training and test mode, the control system 30 can instruct the engaging end effector to attempt to engage and transport the bags regardless of whether the machine learning model predicted the engaging end effector would be successful or not. Initially, predictions of whether or not the engaging end effector can successfully engage and transport a bag can be inaccurate until enough bags have been processed by the machine learning models and the actual outcomes (e.g., whether the engaging end effector was successful or not) are known. As more bags are processed, the accuracy of the machine learning models can improve.
Once the machine learning model predicts, to a specified accuracy threshold, whether or not bags can be successfully handled by the engaging end effector, the machine learning model is considered sufficiently trained, at which time the control system 30 can transition from the training and test mode to the production mode.
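The mode transition described above can be sketched as a small state manager that tracks rolling prediction accuracy and promotes the model to production once a threshold is met (and, per the next paragraph, demotes it if accuracy later falls). The threshold value, window size, and class name are assumed example values, not from the patent.

```python
from collections import deque

TRAINING, PRODUCTION = "training_and_test", "production"

class ModeManager:
    """Tracks whether recent predictions were correct and switches the
    machine learning model between training/test and production modes."""

    def __init__(self, accuracy_threshold=0.95, window=100):
        self.mode = TRAINING
        self.threshold = accuracy_threshold
        self.outcomes = deque(maxlen=window)  # True = prediction was correct

    def record(self, prediction_correct):
        self.outcomes.append(prediction_correct)
        acc = sum(self.outcomes) / len(self.outcomes)
        # Promote only after a full window of bags meets the threshold.
        if (self.mode == TRAINING
                and len(self.outcomes) == self.outcomes.maxlen
                and acc >= self.threshold):
            self.mode = PRODUCTION
        # Demote to re-train if production accuracy drops below threshold.
        elif self.mode == PRODUCTION and acc < self.threshold:
            self.mode = TRAINING
        return self.mode
```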

[0109] In the production mode, the control system 30 can rely on the predicted outcome output by the machine learning model when determining whether to instruct and/or control the engaging end effector, such that when the trained machine learning model predicts that a bag can be successfully engaged and transported by the engaging end effector, the engaging end effector attempts to engage and transport the bag, and when the trained machine learning model predicts that a bag cannot be successfully engaged and transported by the engaging end effector, the engaging end effector does not attempt to engage and transport the bag. In one embodiment, because the trained machine learning model can predict whether the engaging end effector can successfully engage and transport a bag before the bag reaches the engaging end effector, the control system 30 can use the prediction to alter or adjust the routing of bags. For example, if the machine learning model predicts that the engaging end effector cannot engage and transport a bag, the bag can be routed to a manual load cell, where the bag can be loaded into a container by a human. In the production mode, the machine learning model can continue to be trained by the control system, such that when the machine learning model incorrectly predicts that the engaging end effector can successfully engage and transport a bag, but the engaging end effector fails to engage or transport the bag, the incorrect prediction can be fed back into the machine learning model to adjust the outcome prediction. In the event that the machine learning model incorrectly predicts that the engaging end effector will be successful in engaging and/or transporting a specified percentage of bags, the control system 30 can transition operation of the machine learning model to the training and test mode to re-train the machine learning model.
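The production-mode routing and the re-train trigger described above can be sketched together: predicted-fail bags are diverted to the manual load cell, and a running false-positive rate (predicted pass, but the end effector failed) decides when to fall back to the training and test mode. Station names, the failure-rate threshold, and the counters are illustrative assumptions.

```python
class ProductionRouter:
    """Routes bags using the trained model's prediction and tracks how
    often predicted-pass bags actually fail at the end effector."""

    def __init__(self, retrain_failure_rate=0.05):
        self.retrain_failure_rate = retrain_failure_rate
        self.attempts = 0
        self.false_positives = 0  # predicted pass, end effector failed

    def route(self, predicted_engage_ok):
        """Predicted-fail bags never reach the robot."""
        return ("automated_load_cell" if predicted_engage_ok
                else "manual_load_cell")

    def report_attempt(self, engaged_ok):
        """Called after the end effector attempts a predicted-pass bag."""
        self.attempts += 1
        if not engaged_ok:
            self.false_positives += 1

    def needs_retraining(self):
        """True once the false-positive rate exceeds the threshold."""
        if self.attempts == 0:
            return False
        return self.false_positives / self.attempts > self.retrain_failure_rate
```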

[0110] In one non-limiting example, the machine learning algorithm from which the machine learning model is derived can be a convolutional neural network, although other machine learning algorithms can be employed. For example, the one or more machine learning algorithms utilized by the control system 30 can include, for example, supervised learning algorithms, unsupervised learning algorithms, artificial neural network algorithms, association rule learning algorithms, hierarchical clustering algorithms, cluster analysis algorithms, outlier detection algorithms, semi-supervised learning algorithms, reinforcement learning algorithms, and/or deep learning algorithms. Examples of supervised learning algorithms can include, for example, AODE; artificial neural networks, such as Backpropagation, Autoencoders, Hopfield networks, Boltzmann machines, Restricted Boltzmann Machines, and/or Spiking neural networks; Bayesian statistics, such as Bayesian networks and/or Bayesian knowledge bases; Case-based reasoning; Gaussian process regression; Gene expression programming; Group method of data handling (GMDH); Inductive logic programming; Instance-based learning; Lazy learning; Learning Automata; Learning Vector Quantization; Logistic Model Trees; Minimum message length (decision trees, decision graphs, etc.), such as Nearest Neighbor algorithms and/or Analogical modeling; Probably approximately correct (PAC) learning; Ripple down rules, a knowledge acquisition methodology; Symbolic machine learning algorithms; Support vector machines; Random Forests; Ensembles of classifiers, such as Bootstrap aggregating (bagging) and/or Boosting (meta-algorithm); Ordinal classification; Information fuzzy networks (IFN); Conditional Random Fields; ANOVA; Linear classifiers, such as Fisher's linear discriminant, Linear regression, Logistic regression, Multinomial logistic regression, Naive Bayes classifier, Perceptron, and/or Support vector machines;
Quadratic classifiers; k-nearest neighbor; Boosting; Decision trees, such as C4.5, Random forests, ID3, CART, SLIQ, and/or SPRINT; Bayesian networks, such as Naive Bayes; and/or Hidden Markov models. Examples of unsupervised learning algorithms can include Expectation-maximization algorithm; Vector Quantization; Generative topographic map; and/or Information bottleneck method. Examples of artificial neural networks can include Self-organizing maps. Examples of association rule learning algorithms can include Apriori algorithm; Eclat algorithm; and/or FP-growth algorithm. Examples of hierarchical clustering can include Single-linkage clustering and/or Conceptual clustering. Examples of cluster analysis can include K-means algorithm; Fuzzy clustering; DBSCAN; and/or OPTICS algorithm. Examples of outlier detection can include Local Outlier Factors. Examples of semi-supervised learning algorithms can include Generative models; Low-density separation; Graph-based methods; and/or Co-training. Examples of reinforcement learning algorithms can include Temporal difference learning; Q-learning; Learning Automata; and/or SARSA. Examples of deep learning algorithms can include Deep belief networks; Deep Boltzmann machines; Deep Convolutional neural networks; Deep Recurrent neural networks; and/or Hierarchical temporal memory.

[0111] The control system 30 can use images of the interior volume of a container to determine whether more bags can be inserted into the container and/or where to place the bags in the container. For example, one or more of the sensors 310 (for example sensor 244) can be used to capture images of the interior volume of the container, from which a point cloud can be generated. The point cloud can be used to determine whether there is any space available in the container, an available volume of space in the container, and/or where to place the next bag or a sequence of bags. For example, the processor 300 of the control system 30 can execute a heuristic function that receives bag dimensions and available interior volumes of the container as inputs and outputs a location at which a bag should be inserted into the container. In some embodiments, the control system can use the Jaccard Similarity or Index when determining whether one or more bags will fit within the available area of the container based on, for example, the dimensions of the available interior volume of the container and the dimensions of the one or more bags to be inserted into the available interior area.
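One hedged way to apply the Jaccard index here is to treat a candidate bag placement and an available container space as axis-aligned 3-D boxes and score their overlap as intersection volume over union volume; a score equal to the bag/space volume ratio means the space fully envelops the bag. The box representation and function names are illustrative assumptions, as the patent does not give the exact formulation.

```python
def box_volume(box):
    """Volume of an axis-aligned box given as ((x0,y0,z0), (x1,y1,z1))."""
    (x0, y0, z0), (x1, y1, z1) = box
    return max(0.0, x1 - x0) * max(0.0, y1 - y0) * max(0.0, z1 - z0)

def jaccard_3d(a, b):
    """Jaccard index of two axis-aligned boxes: |A ∩ B| / |A ∪ B|."""
    (ax0, ay0, az0), (ax1, ay1, az1) = a
    (bx0, by0, bz0), (bx1, by1, bz1) = b
    inter = ((max(ax0, bx0), max(ay0, by0), max(az0, bz0)),
             (min(ax1, bx1), min(ay1, by1), min(az1, bz1)))
    vi = box_volume(inter)
    vu = box_volume(a) + box_volume(b) - vi
    return vi / vu if vu else 0.0
```

For a unit-volume bag placed inside a two-unit available space, the score is 0.5, exactly the bag-to-space volume ratio, indicating a full fit; disjoint boxes score 0.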

[0112] In an example embodiment, the processor 300 can execute the heuristic function in two modes: an ad-hoc mode and a batch mode. The ad-hoc mode is used to process one bag at a time without taking into account the parameters of subsequent bags to be processed. In the ad-hoc mode, the interior volume of the container is determined as described herein and the heuristic function determines whether the bag will fit in the container, and if so, the location at which the bag should be inserted into the container. The batch mode is used to simultaneously process information (bag data) associated with multiple bags at once for determining whether the bags will fit in the container and the locations at which the bags should be placed in the container. In the batch mode, the control system 30 receives information about a specified number of bags (e.g., four bags) and information about the available interior volume of the container as inputs to the heuristic function, and outputs a determination of whether the bags will fit and the locations at which the bags will fit. In an example embodiment, when batch mode is used, after information for a batch of bags is processed by the heuristic function, the interior volume of the container can be imaged after each bag in the batch is placed in the container to generate an updated point cloud, and the heuristic function is re-run for each subsequent bag to be placed in the batch based on the updated point cloud. Re-scanning the interior volume, updating the point cloud, and re-executing the heuristic function for each bag allows the control system to accommodate imperfections in the container loading process. For example, when a bag is placed in the container, one or more bags can be compressed or distorted and/or can shift.
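A minimal first-fit sketch captures the two heuristic modes just described. The `scan` callback stands in for re-imaging the container to obtain an updated point cloud between placements; first-fit itself is an assumed stand-in for the unspecified heuristic function, and all names are illustrative.

```python
def first_fit(bag_dims, open_spaces):
    """Ad-hoc mode: place a single bag in the first open space it fits.
    Spaces and bags are (width, depth, height) tuples; returns an index
    into open_spaces, or None if the bag fits nowhere."""
    bw, bd, bh = bag_dims
    for i, (w, d, h) in enumerate(open_spaces):
        if bw <= w and bd <= d and bh <= h:
            return i
    return None

def batch_place(bags, scan):
    """Batch mode: place several bags in sequence, re-scanning the
    container between placements so compressed or shifted bags are
    accounted for before the next placement is chosen."""
    placements = []
    for bag in bags:
        spaces = scan()  # fresh interior scan after the previous deposit
        placements.append(first_fit(bag, spaces))
    return placements
```

In a real cell, `scan` would trigger the sensors 244 and return spaces derived from the updated point cloud rather than a static list.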

[0113] The amount of suction or vacuum force the engaging end effector applies to a bag can be controlled by the control system 30. For example, before the engaging end effector is instructed and/or controlled to engage and transport the bag, the bag tag of the bag can be scanned/read to extract the identifier encoded therein, and the control system 30 can retrieve the bag data for the bag (e.g., size, weight, and type of bag). Using the bag data, the processor 300 of the control system 30 can determine the amount of suction or vacuum force to apply to the bag. As an example, as the size and/or weight of the bags to be engaged by the engaging end effector decreases, less suction or vacuum force can be used, e.g., fewer of the vacuum zones can be used or suction is applied through fewer of the pins 124 or 134. As another example, as the size and/or weight of the bags to be engaged by the engaging end effector increases, more suction or vacuum force can be used, e.g., more of the vacuum zones can be used or suction is applied through more of the pins 124 or 134.
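The suction scaling just described can be sketched as a lookup from bag footprint to the number of vacuum zones to energize. The zone count, area bands, and function name are assumed example values for illustration; the patent only states that suction scales with bag size and/or weight.

```python
def active_vacuum_zones(bag_area_cm2, max_zones=4):
    """Map bag footprint area to how many vacuum zones to energize.
    Smaller bags need fewer zones (less suction); larger bags need more."""
    if bag_area_cm2 <= 900:        # small bag: a single zone suffices
        zones = 1
    elif bag_area_cm2 <= 2500:     # medium bag
        zones = 2
    elif bag_area_cm2 <= 4900:     # large bag
        zones = 3
    else:                          # oversized bag: full suction
        zones = max_zones
    return min(zones, max_zones)
```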

[0114] Referring to Figure 13, an inventive method 400 for engaging and selectively positioning and depositing objects in a container, for example traveler passenger bags 14 in a ULD container, is shown. In one example, method 400 includes use of a first automated device 22 including an engaging end effector 40, and a second automated device 24 including conveyor end effector 56 as described above. The first 22 and second 24 automated devices and respective engaging end effector 40 and conveyor end effector 56 are in communication with a control system 30 as described above.

[0115] In one example of method 400, in a first exemplary step 405, the first automated device 22 and engaging end effector 40 are positioned in proximity to an object, for example a bag 14, positioned along a path of travel 30 within the first automated device reach range 108 as described above. In one example, sensors can be used to detect or verify the position of the bag 14 on AGV 28.

[0116] In step 410, the engaging end effector 40 engages bag 14 and removes the bag from AGV 28. As described in the example above, engaging end effector 40 can be a vacuum end effector which is selectively activated by control system 30 to generate a vacuum force 120.

[0117] In exemplary step 415, the second automated device 24 with conveyor end effector 56 is moved to position conveyor portion 192 within the reach range 108 and/or 88 of the first automated device 22.

[0118] In exemplary step 420, the first automated device 22, engaging end effector 40 and the engaged bag 14 are moved and aligned to deposit bag 14 onto the conveyor portion 192 of the conveyor end effector 56. In one example, one or more sensors are used to detect and/or verify alignment of the engaging end effector 40 with the conveyor end effector 56. The engaging end effector 40 disengages bag 14 through, for example, cessation of the generated vacuum force, allowing bag 14 to fall by gravitational force onto conveyor portion 192.

[0119] In exemplary step 425, the second automated device 24 moves and positions the conveyor end effector 56 and onboard bag 14 in proximity to a container 18 including an interior cavity space or volume sufficient to receive the onboard bag 14.

[0120] In one example, an optional step 430 includes one or more sensors connected to the conveyor end effector 56 scanning and/or imaging the interior cavity of container 18. The data obtained can be used by control system 30 to calculate or determine the available open space(s) or volume(s) within container 18 of sufficient size to receive onboard bag 14.

[0121] In exemplary step 435, the second automated device and conveyor end effector 56 are positioned to deposit the onboard bag 14 in the predetermined or calculated available open space in container 18.

[0122] In exemplary step 440, the engaging end effector 40 and conveyor end effector 56 can be moved to a start position, for example for the engaging end effector 40 to engage a bag 14, and for the conveyor end effector 56 to receive a bag 14 from the engaging end effector.

[0123] It is understood that method 400 can include additional or alternate steps, or the described steps in an alternate order, to suit the particular application and performance specifications as known by those skilled in the art.

[0124] Referring to Figure 14, an inventive method 500 for training a machine learning model to predict whether the engaging end effector will be successful in engaging and transporting a bag is described in accordance with embodiments of the present disclosure. At step 502, bags are imaged, bag data is collected for each bag, and a bag tag encoded with a bag identifier is affixed to each bag. At step 504, data is extracted from the images of the bags by the processor 300 of the control system 30. For each bag, the image of the bag, the data extracted from the images of the bags, the bag data, and the identifier encoded in the bag tag are linked and stored in memory 302. At step 506, for each bag, the processor 300 executes the machine learning model to output a prediction as to whether the engaging end effector can successfully engage and transport the bag. At step 508, the prediction for each bag is linked to the corresponding identifier associated with the bag tag affixed to the bag, and the prediction is stored in memory 302 with the images, the data extracted from the images, the bag data, and the identifiers. At step 510, when a bag reaches the engaging end effector, the bag tag is scanned to extract the identifier, and the stored prediction is retrieved from memory using the identifier. At step 512, the engaging end effector attempts to engage and transport the bag, and at step 514, the control system 30 determines whether the prediction output by the machine learning model was correct, which can be used to reclassify or relabel the set of data associated with the bag to improve the predictions made by the machine learning model for bags in the future.
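The record-keeping that threads through steps 502-514, storing a prediction under the bag-tag identifier so it can be retrieved when the bag physically arrives at the end effector, can be sketched as a simple keyed store. The record field names and the bag identifier format are illustrative assumptions.

```python
# In-memory stand-in for the records kept in memory 302, keyed by the
# identifier encoded in each bag tag.
bag_records = {}

def store_prediction(bag_id, image_features, bag_data, predicted_ok):
    """Steps 504-508: link extracted features, bag data, and the model's
    prediction to the bag-tag identifier."""
    bag_records[bag_id] = {
        "features": image_features,
        "bag_data": bag_data,
        "predicted_ok": predicted_ok,
        "actual_ok": None,  # filled in after the engage attempt
    }

def on_bag_arrival(bag_id, actual_ok):
    """Steps 510-514: the tag is scanned, the stored prediction retrieved,
    the real outcome recorded, and correctness of the prediction reported
    so the training data can be relabeled."""
    rec = bag_records[bag_id]
    rec["actual_ok"] = actual_ok
    return rec["predicted_ok"] == actual_ok
```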

[0125] Referring to Figure 15, an inventive method 600 for deploying a trained machine learning model to control an operation of the engaging end effector is described in accordance with embodiments of the present disclosure. At step 602, an incoming bag is tagged with a bag tag encoded with an identifier, bag data is collected, and one or more images of the bag are captured. At step 604, data is extracted from the one or more images of the bag by the processor 300 of the control system 30, and at step 606, the image of the bag, the data extracted from the image of the bag, the bag data, and the identifier encoded in the bag tag are linked and stored in memory 302. At step 608, the processor 300 executes the trained machine learning model to output a prediction as to whether the engaging end effector can successfully engage and transport the bag. At step 610, the prediction for the bag is linked to the identifier associated with the bag tag affixed to the bag, and the prediction is stored in memory 302 with the one or more images of the bag, the data extracted from the one or more images, the bag data, and the identifier. At step 612, when the bag reaches the engaging end effector, the bag tag is scanned to extract the identifier, and the stored prediction is retrieved from memory using the identifier. At step 614, the prediction is processed to determine if the engaging end effector can engage and transport the bag. If the prediction indicates that the engaging end effector can successfully engage and transport the bag, at step 616, the control system 30 instructs and/or controls the engaging end effector to attempt to engage and transport the bag. If the prediction indicates that the engaging end effector cannot successfully engage and transport the bag, at step 618, the control system 30 instructs and/or controls the engaging end effector to skip the bag.
Alternatively, if the prediction output by the machine learning model indicates that the engaging end effector cannot successfully engage and transport the bag, the control system 30 can automatically have the bag routed to a manual load cell, at which the bag is manually loaded into a container by a human so that the bag is not presented to the engaging end effector.

[0126] While the invention has been described in connection with certain embodiments, it is to be understood that the invention is not to be limited to the disclosed embodiments but, on the contrary, is intended to cover various modifications and equivalent arrangements included within the scope of the appended claims, which scope is to be accorded the broadest interpretation so as to encompass all such modifications and equivalent structures as is permitted under the law.