

Title:
SYSTEMS AND METHODS FOR DYNAMIC PROCESSING OF OBJECTS PROVIDED IN VEHICLES WITH DUAL FUNCTION END EFFECTOR TOOLS
Document Type and Number:
WIPO Patent Application WO/2023/059828
Kind Code:
A9
Abstract:
An object processing system is disclosed for dynamically providing the removal of objects from a trailer (12) of a tractor trailer. The object processing system includes a load assessment system for assessing a load characteristic of a plurality of objects within the trailer and for providing load assessment data representative of the load characteristic, an object assessment system for assessing a relative position and relative environment of an object of the plurality of objects responsive to the load assessment data, and for providing object assessment data for the object, and a dynamic engagement system (10) for dynamically engaging the objects within the trailer with either of at least two different engagement systems (24, 26) responsive to the object assessment data.

Inventors:
ALLEN THOMAS (US)
COHEN BENJAMIN (US)
AMEND JOHN (US)
ROMANO JOSEPH (US)
MASON MATHEW (US)
WU YANCHUN (US)
SINGH JAGTAR (US)
Application Number:
PCT/US2022/045943
Publication Date:
February 22, 2024
Filing Date:
October 06, 2022
Assignee:
BERKSHIRE GREY OPERATING COMPANY INC (US)
International Classes:
B65G47/91; B25J9/00; B25J9/16; B65G67/24
Attorney, Agent or Firm:
HILTON, William, E. et al. (US)
Claims:
CLAIMS

1. An object processing system for dynamically providing the removal of objects from a trailer of a tractor trailer, said object processing system comprising: a load assessment system for assessing a load characteristic of a plurality of objects within the trailer and for providing load assessment data representative of the load characteristic; an object assessment system for assessing a relative position and relative environment of an object of the plurality of objects responsive to the load assessment data, and for providing object assessment data for the object; and a dynamic engagement system for dynamically engaging the objects within the trailer with either of at least two different engagement systems responsive to the object assessment data.

2. The object processing system as claimed in claim 1, wherein the load assessment system includes a plurality of perception units that provide perception data, and wherein the load characteristic includes a height of the plurality of objects.

3. The object processing system as claimed in any of claims 1 - 2, wherein the load assessment system includes a plurality of perception units that provide perception data, and wherein the load characteristic includes a proximity of the plurality of objects to a back end of the trailer.

4. The object processing system as claimed in any of claims 1 - 3, wherein the object assessment system includes at least one perception unit, and wherein the object assessment data includes data representative of whether the object includes a side surface that includes a portion that is not in contact with another object.

5. The object processing system as claimed in any of claims 1 - 4, wherein the object assessment system includes at least one perception unit, and wherein the object assessment data includes data representative of whether the object includes a back surface that includes a portion that does not appear to be in contact with another object.

6. The object processing system as claimed in any of claims 1 - 5, wherein the dynamic engagement system includes at least one dual purpose arm including a grasping portion for grasping a facing surface of the object, and a pulling portion for pulling a non-facing surface of the object.


7. The object processing system as claimed in claim 6, wherein the grasping portion includes at least one vacuum cup, and wherein the pulling portion is generally orthogonally disposed with respect to the at least one vacuum cup.

8. The object processing system as claimed in any of claims 1 - 7, wherein the object processing system further includes a securement detection system for detecting whether any of the plurality of objects within the trailer are secured from movement relative to any of the trailer or other objects of the plurality of objects.

9. The object processing system as claimed in claim 8, wherein the securement detection system determines whether a subset of the plurality of objects is provided on a pallet.

10. The object processing system as claimed in claim 9, wherein the object processing system further includes a pallet removal system for engaging the pallet and removing the pallet and the subset of the plurality of objects from the trailer.

11. The object processing system as claimed in claim 10, wherein the pallet removal system includes pallet lift forks that are mounted on a swing arm under the dynamic engagement system.

12. The object processing system as claimed in claim 9, wherein the securement detection system determines whether a subset of the plurality of objects is retained by a net within the trailer.

13. An object processing system for dynamically providing the removal of objects from a trailer of a tractor trailer, said object processing system comprising: a load assessment system for assessing a load characteristic of a plurality of objects within the trailer and for providing load assessment data representative of the load characteristic; an engagement system for engaging the objects within the trailer responsive to the load assessment data; and a securement detection system for detecting whether any of the plurality of objects within the trailer are secured from movement relative to any of the trailer or other objects of the plurality of objects.

14. The object processing system as claimed in claim 13, wherein the load assessment system includes a plurality of perception units that provide perception data, and wherein the load characteristic includes a height of the plurality of objects.


15. The object processing system as claimed in any of claims 13 - 14, wherein the load assessment system includes a plurality of perception units that provide perception data, and wherein the load characteristic includes a proximity of the plurality of objects to a back end of the trailer.

16. The object processing system as claimed in any of claims 13 - 15, wherein the dynamic engagement system includes at least one dual purpose arm including a grasping portion for grasping a facing surface of the object, and a pulling portion for pulling a non-facing surface of the object.

17. The object processing system as claimed in claim 16, wherein the grasping portion includes at least one vacuum cup, and wherein the pulling portion is generally orthogonally disposed with respect to the at least one vacuum cup.

18. The object processing system as claimed in any of claims 13 - 17, wherein the securement detection system determines whether a subset of the plurality of objects is provided on a pallet.

19. The object processing system as claimed in claim 18, wherein the object processing system further includes a pallet removal system for engaging the pallet and removing the pallet and the subset of the plurality of objects from the trailer.

20. The object processing system as claimed in claim 19, wherein the pallet removal system includes pallet lift forks that are mounted on a swing arm under the dynamic engagement system.

21. The object processing system as claimed in any of claims 13 - 20, wherein the securement detection system determines whether a subset of the plurality of objects is retained by a net within the trailer.

22. The object processing system as claimed in any of claims 13 - 21, wherein the object processing system further includes an object assessment system for assessing a relative position and relative environment of an object of the plurality of objects responsive to the load assessment data, and for providing object assessment data for the object.

23. The object processing system as claimed in any of claims 13 - 22, wherein the object assessment system includes at least one perception unit, and wherein the object assessment data includes data representative of whether the object includes a side surface that includes a portion that is not in contact with another object.


24. The object processing system as claimed in any of claims 13 - 23, wherein the object assessment system includes at least one perception unit, and wherein the object assessment data includes data representative of whether the object includes a back surface that includes a portion that does not appear to be in contact with another object.

25. An object processing system for dynamically providing the removal of objects from a trailer of a tractor trailer, said object processing system comprising: an object assessment system for assessing a relative position and immediate environment of an object of a plurality of objects, and for providing object assessment data for the object; a dynamic engagement system for dynamically engaging the objects within the trailer with either of at least two different engagement systems responsive to the object assessment data; and a securement detection system for detecting whether any of the plurality of objects within the trailer are secured from movement relative to any of the trailer or other objects of the plurality of objects.

26. The object processing system as claimed in claim 25, wherein the object assessment system includes at least one perception unit, and wherein the object assessment data includes data representative of whether the object includes a side surface that includes a portion that is not in contact with another object.

27. The object processing system as claimed in any of claims 25 - 26, wherein the object assessment system includes at least one perception unit, and wherein the object assessment data includes data representative of whether the object includes a back surface that includes a portion that does not appear to be in contact with another object.

28. The object processing system as claimed in any of claims 25 - 27, wherein the dynamic engagement system includes at least one dual purpose arm including a grasping portion for grasping a facing surface of the object, and a pulling portion for pulling a non-facing surface of the object.

29. The object processing system as claimed in claim 28, wherein the grasping portion includes at least one vacuum cup, and wherein the pulling portion is generally orthogonally disposed with respect to the at least one vacuum cup.


30. The object processing system as claimed in any of claims 25 - 29, wherein the securement detection system determines whether a subset of the plurality of objects is provided on a pallet.

31. The object processing system as claimed in claim 30, wherein the object processing system further includes a pallet removal system for engaging the pallet and removing the pallet and the subset of the plurality of objects from the trailer.

32. The object processing system as claimed in claim 31, wherein the pallet removal system includes pallet lift forks that are mounted on a swing arm under the dynamic engagement system.

33. The object processing system as claimed in any of claims 25 - 32, wherein the securement detection system determines whether a subset of the plurality of objects is retained by a net within the trailer.

34. The object processing system as claimed in any of claims 25 - 33, wherein the object processing system further includes a load assessment system for assessing a load characteristic of a plurality of objects within the trailer and for providing load assessment data representative of the load characteristic.

35. The object processing system as claimed in claim 34, wherein the load assessment system includes a plurality of perception units that provide perception data, and wherein the load characteristic includes a height of the plurality of objects.

36. The object processing system as claimed in claim 34, wherein the load assessment system includes a plurality of perception units that provide perception data, and wherein the load characteristic includes a proximity of the plurality of objects to a back end of the trailer.

37. The object processing system as claimed in any of claims 25 - 36, wherein the dynamic engagement system includes a collection panel that is elevationally and rotatably adjustable.

38. The object processing system as claimed in claim 37, wherein the collection panel is formed of multiple sub-panels that may be moved relative one another.

39. The object processing system as claimed in any of claims 25 - 38, wherein the object processing system further includes an obstruction removal system that develops a removal model based at least in part on force feedback from joints of an articulated arm.


40. The object processing system as claimed in any of claims 25 - 39, wherein the object processing system further includes an output perception system for providing perception data regarding objects provided by the dynamic engagement system, and a diverting system for diverting certain selected objects responsive to the perception data.


Description:
SYSTEMS AND METHODS FOR DYNAMIC

PROCESSING OF OBJECTS PROVIDED IN VEHICLES

WITH DUAL FUNCTION END EFFECTOR TOOLS

PRIORITY

[0001] The present application claims priority to U.S. Provisional Patent Application No. 63/252,807 filed October 6, 2021, the disclosure of which is hereby incorporated by reference in its entirety.

BACKGROUND

[0002] The invention generally relates to automated, robotic and other object processing systems such as sortation systems, and relates in particular to automated and robotic systems intended for use in environments requiring, for example, that a variety of objects (e.g., parcels, packages, and articles) be processed and distributed to several output destinations.

[0003] Many parcel distribution systems receive parcels from a vehicle, such as a trailer of a tractor trailer. The parcels are unloaded and delivered to a processing station in a disorganized stream that may be provided as individual parcels or parcels aggregated in groups such as in bags, and may be provided to any of several different conveyances, such as a conveyor, a pallet, a Gaylord, or a bin. Each parcel must then be distributed to the correct destination container, as determined by identification information associated with the parcel, which is commonly determined by a label printed on the parcel or on a sticker applied to the parcel. The destination container may take many forms, such as a bag or a bin.

[0004] The sortation of such parcels from the vehicle has traditionally been done, at least in part, by human workers who unload the vehicle, then scan the parcels, e.g., with a hand-held barcode scanner, and then place the parcels at assigned locations. For example, many order fulfillment operations achieve high efficiency by employing a process called wave picking. In wave picking, orders are picked from warehouse shelves and placed at locations (e.g., into bins) containing multiple orders that are sorted downstream. At the sorting stage individual articles are identified, and multi-article orders are consolidated, for example into a single bin or shelf location, so that they may be packed and then shipped to customers. The process of sorting these objects has traditionally been done by hand. A human sorter picks an object from an incoming bin, finds a barcode on the object, scans the barcode with a handheld barcode scanner, determines from the scanned barcode the appropriate bin or shelf location for the object, and then places the object in the so-determined bin or shelf location where all objects for that order have been defined to belong. Automated systems for order fulfillment have also been proposed, but such systems still require that objects be first removed from a vehicle for processing if they arrive by vehicle.

[0005] Such systems do not, therefore, adequately account for the overall process in which objects are first delivered to and provided at a processing station by a vehicle such as a trailer of a tractor trailer. Additionally, many processing stations, such as sorting stations for sorting parcels, are at times at or near full capacity in terms of available floor space and sortation resources, and there is therefore a further need for systems to unload vehicles and efficiently and effectively provide an ordered stream of objects.

SUMMARY

[0006] In accordance with an aspect, the invention provides an object processing system for dynamically providing the removal of objects from a trailer of a tractor trailer. The object processing system includes a load assessment system for assessing a load characteristic of a plurality of objects within the trailer and for providing load assessment data representative of the load characteristic, an object assessment system for assessing a relative position and relative environment of an object of the plurality of objects responsive to the load assessment data, and for providing object assessment data for the object, and a dynamic engagement system for dynamically engaging the objects within the trailer with either of at least two different engagement systems responsive to the object assessment data.

[0007] In accordance with another aspect, the invention provides an object processing system for dynamically providing the removal of objects from a trailer of a tractor trailer. The object processing system includes a load assessment system for assessing a load characteristic of a plurality of objects within the trailer and for providing load assessment data representative of the load characteristic, an engagement system for engaging the objects within the trailer responsive to the load assessment data, and a securement detection system for detecting whether any of the plurality of objects within the trailer are secured from movement relative to any of the trailer or other objects of the plurality of objects.

[0008] In accordance with a further aspect, the invention provides an object processing system for dynamically providing the removal of objects from a trailer of a tractor trailer. The object processing system includes an object assessment system for assessing a relative position and immediate environment of an object of a plurality of objects, and for providing object assessment data for the object, a dynamic engagement system for dynamically engaging the objects within the trailer with either of at least two different engagement systems responsive to the object assessment data, and a securement detection system for detecting whether any of the plurality of objects within the trailer are secured from movement relative to any of the trailer or other objects of the plurality of objects.

BRIEF DESCRIPTION OF THE DRAWINGS

[0009] The following description may be further understood with reference to the accompanying drawings in which:

[0010] Figure 1 shows an illustrative diagrammatic view of an object processing system in accordance with an aspect of the present invention;

[0011] Figure 2 shows an illustrative diagrammatic end view of the object processing system of Figure 1;

[0012] Figure 3 shows an illustrative diagrammatic functional flow diagram of a load assessment routine in a system in accordance with an aspect of the present invention;

[0013] Figure 4 shows an illustrative diagrammatic enlarged view of a dual-purpose tool in the object processing system of Figure 1 showing a pull side of the tool;

[0014] Figure 5 shows an illustrative diagrammatic enlarged view of the dual-purpose tool in the object processing system of Figure 4 showing the pull side of the tool engaging an object;

[0015] Figure 6 shows an illustrative diagrammatic enlarged view of a dual-purpose tool in the object processing system of Figure 1 showing a rake side of the tool;

[0016] Figure 7 shows an illustrative diagrammatic enlarged view of the dual-purpose tool in the object processing system of Figure 4 showing the rake side of the tool engaging an object;

[0017] Figure 8 shows an illustrative diagrammatic functional flow diagram of an object assessment routine in a system in accordance with an aspect of the present invention;

[0018] Figure 9 shows an illustrative diagrammatic view of the object processing system of Figure 1 with a first dual-purpose tool pulling an object onto the collection panel;

[0019] Figure 10 shows an illustrative diagrammatic view of the object processing system of Figure 1 with a second dual purpose tool pulling another object onto the collection panel;

[0020] Figures 11A and 11B show illustrative diagrammatic end views of the object processing system of Figure 1 with the end-effector engaging a plurality of objects in a cross-direction (Figure 11A), and moving the engaged plurality of objects by rotation (Figure 11B);

[0021] Figure 12 shows an illustrative diagrammatic end view of the object processing system of Figure 1 processing an upper level of a trailer;

[0022] Figure 13 shows an illustrative diagrammatic end view of the object processing system of Figure 1 processing a lower level of a trailer with the collection panel lowered;

[0023] Figure 14 shows an illustrative diagrammatic elevated view of a rotation control system for a collection panel of an object processing system in accordance with an aspect of the present invention;

[0024] Figure 15 shows an illustrative diagrammatic side view of the object processing system of Figure 1 with the collection panel raised;

[0025] Figure 16 shows an illustrative diagrammatic view of an object processing system in accordance with another aspect of the present invention that includes a collection panel with two articulated sub-panels in an elevated position;

[0026] Figure 17 shows an illustrative diagrammatic view of the object processing system of Figure 16 with the two sub-panel collection panel in a folded lowered position;

[0027] Figure 18 shows an illustrative diagrammatic side view of the object processing system of Figure 16 with the two sub-panel collection panel in an elevated position;

[0028] Figure 19 shows an illustrative diagrammatic side view of the object processing system of Figure 16 with the two sub-panel collection panel in a lowered position;

[0029] Figures 20A and 20B show illustrative diagrammatic side views of the object processing system in accordance with a further aspect of the present invention that includes a folding three sub-panel collection panel shown in an elevated position (shown in Figure 20A) and in a lowered position (shown in Figure 20B);

[0030] Figures 21A and 21B show illustrative diagrammatic side views of the object processing system in accordance with a further aspect of the present invention that includes a telescoping multi-panel collection panel shown in an elevated position (shown in Figure 21A) and in a lowered position (shown in Figure 21B);

[0031] Figure 22 shows an illustrative diagrammatic view of an object processing system in accordance with an aspect of the present invention engaging a particularly long object;

[0032] Figure 23 shows an illustrative diagrammatic view of an object processing system in accordance with an aspect of the present invention engaging an object that is blocked from movement;

[0033] Figure 24 shows an illustrative diagrammatic functional flow diagram of an obstruction resolution routine in a system in accordance with an aspect of the present invention;

[0034] Figure 25 shows an illustrative diagrammatic view of an obstructed object being subjected to applied forces in each of three mutually orthogonal directions;

[0035] Figure 26 shows an illustrative diagrammatic view of the obstructed object of Figure 25 being subjected to forces in each of yaw, pitch and roll rotational directions;

[0036] Figure 27 shows an illustrative diagrammatic view of an object processing system in accordance with an aspect of the present invention that includes a plurality of substitutable end-effector tools for use with the object processing system;

[0037] Figure 28 shows an illustrative diagrammatic view of the object processing system of Figure 27 with a plurality of substitutable end-effector tools, with one end-effector tool being accessed by a programmable motion device;

[0038] Figure 29 shows an illustrative diagrammatic elevated view of a retention detection system in an object processing system in accordance with an aspect of the present invention;

[0039] Figure 30 shows an illustrative diagrammatic end view of an object processing system in accordance with an aspect of the present invention encountering a netting within a trailer;

[0040] Figure 31 shows an illustrative diagrammatic end view of an object processing system in accordance with an aspect of the present invention encountering a loaded wrapped pallet within a trailer;

[0041] Figure 32 shows an illustrative diagrammatic enlarged front view of a pallet removal system in accordance with an aspect of the present invention with the pallet forks in a lowered position;

[0042] Figure 33 shows an illustrative diagrammatic enlarged front view of a pallet removal system in accordance with an aspect of the present invention with the pallet forks in a raised position;

[0043] Figure 34 shows an illustrative diagrammatic enlarged front view of the pallet removal system of Figure 32 with the pallet removal system in a partially rotated position;

[0044] Figure 35 shows an illustrative diagrammatic enlarged front view of the pallet removal system of Figure 32 with the pallet removal system in a fully rotated position;

[0045] Figure 36 shows an illustrative diagrammatic underside view of the pallet removal system of Figure 32 with the pallet removal system in a nonrotated position;

[0046] Figure 37 shows an illustrative diagrammatic underside view of the pallet removal system of Figure 32 with the pallet removal system in a fully rotated position;

[0047] Figure 38 shows an illustrative diagrammatic side view of an object processing system in accordance with an aspect of the present invention in which a wrapped pallet is being removed from the trailer;

[0048] Figure 39 shows an illustrative diagrammatic view of the object processing system of Figure 38 wherein the wrapped pallet is being lowered onto a shipping and receiving dock;

[0049] Figure 40 shows an illustrative diagrammatic view of the object processing system of Figure 38 with the pallet removal system in a lowered position;

[0050] Figure 41 shows an illustrative diagrammatic view of the object processing system of Figure 2, wherein the wrapped pallet is removed from the trailer; and

[0051] Figure 42 shows an illustrative diagrammatic view of the object processing system of Figure 38, wherein objects are diverted based on any of weight or incompatibility.

[0052] The drawings are shown for illustrative purposes only.

DETAILED DESCRIPTION

[0053] In accordance with various aspects, the invention provides a dynamic engagement system for engaging objects within a trailer of a tractor trailer. With reference, for example, to Figure 1, a dynamic engagement system 10 may engage objects within a trailer 12, and include a chassis 14 that couples to a warehouse conveyor 16 via couplings 18. The chassis 14 (and the conveyor 16) are movable on wheels for permitting the engagement system 10 to enter into (and back out of) the trailer 12. The wheels on the chassis 14 are powered and the control system is remotely coupled to one or more computer processing systems 100.

[0054] With further reference to Figure 2, the dynamic engagement system includes a collection panel 20 that may be pivoted about its bottom edge to facilitate drawing objects from within the trailer 12 onto the conveyor chassis 14. In particular, an upper edge of the collection panel 20 may be positioned adjacent an upper level of a stack of objects within the trailer using one or more powered rotational assist units 22 (e.g., two on each side as further shown in Figure 14). Each assist unit 22 may also include force torque sensor feedback for measuring any forces acting on the panel 20. The powered rotational assist units 22 rotate the panel upward and downward about an axis 19 at the bottom of the panel 20 (shown in Figure 20). Using, for example, the force torque sensor feedback, the system may lower the panel toward a stack of objects, detect that the panel has made contact with the stack, and may remain in position or back up a small distance until the panel is no longer contacting the stack of objects. Once the panel 20 is positioned adjacent a stack of objects (e.g., just below a top row of a stack of objects), two articulated arms 24, 26 are employed adjacent the panel to urge objects from the stack onto the panel 20 (which may include one or more guides 21).
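
The force-feedback seating behavior described above can be sketched as a simple feedback loop: lower the panel until the sensed force indicates stack contact, then back off slightly. This is a minimal illustration only; the function names (`seat_panel`, `read_force`, `rotate_panel`), thresholds, and step sizes below are assumptions and are not specified in the disclosure.

```python
# Illustrative sketch of panel seating via force/torque feedback.
# All thresholds and interfaces are hypothetical, not from the patent.

CONTACT_FORCE_N = 5.0   # assumed force threshold indicating stack contact (N)
STEP_DEG = 0.5          # assumed per-iteration lowering step (degrees)
BACKOFF_DEG = 1.0       # assumed retreat once contact is detected (degrees)


def seat_panel(read_force, rotate_panel, min_angle_deg=0.0, angle_deg=90.0):
    """Lower the panel until force feedback indicates contact with the stack,
    then back up a small distance so the panel no longer presses on it."""
    while angle_deg > min_angle_deg:
        angle_deg -= STEP_DEG
        rotate_panel(angle_deg)
        if read_force() >= CONTACT_FORCE_N:
            # Contact detected: retreat so the panel sits adjacent to,
            # but no longer contacting, the stack of objects.
            angle_deg += BACKOFF_DEG
            rotate_panel(angle_deg)
            return angle_deg
    return angle_deg  # reached the lower limit without contact
```

A caller would supply `read_force` and `rotate_panel` bound to the assist units 22; the returned angle is the seated panel elevation.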

[0055] Initially, the load of objects within a trailer may be assessed. With reference to Figure 3, a load assessment routine may begin (step 1000) by lowering the panel 20 to a position that is approximately horizontal (step 1002), the conveyor chassis 14 may move toward the trailer (step 1004), and the panel may then be raised to a generally vertical position (step 1006). This may ensure that the dynamic engagement system does not begin too close to the objects. The panel is then lowered until the top of the trailer is visible (step 1008), and then lowered further until at least one upper object is detected (step 1010). Distance and position detection sensors in the perception unit 28 are then used to determine a height of the at least one upper object (step 1012) as well as a distance to the at least one upper object (step 1014). The panel is then further lowered to determine whether (and if so where) any lower objects are provided in the trailer that are closer in distance to the dynamic engagement system than the at least one upper object (step 1016). The highest object height, distance to the highest object, and distances to any closer objects are noted (step 1018), and the system then sets the panel rotation elevation and distance to be moved forward toward the trailer for unloading responsive to the highest object height, distance to the highest object, and distances to any closer objects (step 1020). In particular, the panel is positioned to be near but below the closest and highest objects so that programmable motion devices (e.g., articulated arms) may be used to urge objects onto the panel 20, from which the objects will be guided along conveyor section 14 to warehouse conveyor 16.
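
The load assessment routine of steps 1000 - 1020 can be summarized as a short sequence over sensor, panel, and chassis interfaces. The sketch below is illustrative only: the interface methods (`lower_until`, `scan_for_closer_objects`, and so on) and the `LoadAssessment` record are assumed names, not part of the disclosure.

```python
# Hypothetical sketch of the load assessment routine (steps 1000 - 1020).
from dataclasses import dataclass


@dataclass
class LoadAssessment:
    highest_object_height: float   # noted at step 1018
    distance_to_highest: float     # noted at step 1018
    distances_to_closer: list      # any closer lower objects (step 1016)


def assess_load(sensors, panel, chassis):
    panel.lower_to_horizontal()                        # step 1002
    chassis.move_toward_trailer()                      # step 1004
    panel.raise_to_vertical()                          # step 1006
    panel.lower_until(sensors.trailer_top_visible)     # step 1008
    panel.lower_until(sensors.upper_object_detected)   # step 1010
    height = sensors.object_height()                   # step 1012
    distance = sensors.object_distance()               # step 1014
    closer = panel.scan_for_closer_objects(distance)   # step 1016
    data = LoadAssessment(height, distance, closer)    # step 1018
    # Step 1020: position the panel near, but below, the closest
    # and highest objects before unloading begins.
    panel.set_rotation_elevation(data)
    chassis.set_forward_distance(data)
    return data
```

The returned record then drives the panel elevation and forward travel for unloading.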

[0056] Each articulated arm 24, 26 may include a multi-purpose end effector 30 that includes a retrieving tool 34, a distal side of which includes one or more vacuum cups 32 coupled to a vacuum source, and a proximal side 36 that may be used to pull objects over an upper edge of the collection panel 20. In particular, Figure 4 shows a plurality of vacuum cups 32 on one side of the tool 34. The vacuum cups are employed (with vacuum from a vacuum source) to grasp objects and pull them over the upper edge of the collection panel 20 as shown in Figure 5. The object (e.g., 38) may then be dropped (as shown in Figure 9) onto the collection panel by turning off the vacuum to the cups 32. Figure 6 shows the multi-purpose end effector 30 of the articulated arm 26 (again with the vacuum cups 32 on the tool 34), and Figure 7 shows the second side 36 of the tool 34 used to pull one or more objects (e.g., 40, 42) over the upper edge of the collection panel 20 (as shown in Figure 9), onto which they fall (as shown in Figure 10), optionally guided by one or more guides 21.

[0057] A side of an object may also be engaged to dislodge one or more objects from a stack or set of objects onto the panel. For example, Figure 11A shows the side 36 of the end effector tool 34 engaging an object 44 from a side of the object. The side of the object 44 may have been associated with having an opening (e.g., 45 as also shown in Figure 6) adjacent a side of the object 44. Once engaged, the object 44 may be moved sideways by the tool 34, and then rotated toward the articulated arm to draw the object(s) over the panel 20. Figure 11A shows the tool 34 engaging not only object 44 but also objects 40 and 42, and urging all three objects against each other and against an inner wall of the trailer. Figure 11B shows all three objects 40, 42, 44 being rotated over the panel 20 such that the objects will fall onto the collection panel 20, optionally engaging guides 21, to be collected by the conveyor 14.

[0058] Once the panel is positioned, each facing object is assessed. In particular, for example, and with reference to Figure 8, an object assessment routine may begin (step 2000) by evaluating object boundaries. For a given panel elevation, for each object encountered top down and across, all boundaries of a front face of each object of interest are identified (step 2002). For each object of interest, the system will also determine any boundaries of a top face associated with the front face (step 2004). With this information, the system may determine whether the front face includes a surface suitable for vacuum cup grasping, and provide grasp assessment data (step 2006). The system may also determine whether any rear boundaries of the top face are spaced apart from any neighboring objects, and provide pull assessment data (step 2008). Additionally, the system may determine whether any side boundaries of the top face are spaced apart from any neighboring objects, and provide sideways move assessment data (step 2010). The system may then provide dynamic object engagement instructions responsive to the grasp assessment data, the pull assessment data, and the sideways move assessment data (step 2012).

[0059] The top edge 23 of the panel 20 should be positioned to permit objects (e.g., 38, 40, 42) to be moved over the panel 20 so that they may be dropped onto the panel (and thereby urged along the chassis conveyor 14 to the warehouse conveyor 16). The objects may generally be removed from top to bottom of an exposed stack of objects. As the objects are removed (and provided onto the panel 20), the panel is lowered to receive further objects. The panel 20 may be lowered by pivoting the panel (using the assist unit(s) 22 that rotate the panel with respect to a bottom edge thereof), as well as by using the powered wheels of the chassis 14 that move the dynamic engagement system backward to accommodate the lowering of the panel (as it rotates). In this way, a lower portion of the exposed stack of objects may be processed (as shown in Figure 12), and when further lowered and moved as discussed above, a still further lower portion of the exposed stack of objects may be processed (as shown in Figure 13).
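The per-object decision of the assessment routine can be sketched as a simple priority scheme. The boolean inputs and instruction names below are illustrative stand-ins; in the system described, the equivalents would be derived from perception data for the front and top faces of each object.

```python
def engagement_instruction(graspable_front, rear_gap, side_gap):
    """Steps 2006-2012: derive a dynamic engagement instruction from
    grasp, pull, and sideways move assessment data for one object.

    graspable_front: front face suits vacuum cup grasping (step 2006).
    rear_gap: rear boundaries of the top face are clear (step 2008).
    side_gap: side boundaries of the top face are clear (step 2010).
    """
    if graspable_front:          # vacuum cups 32 can grasp the front face
        return "vacuum_grasp"
    if rear_gap:                 # side 36 of tool 34 can reach behind
        return "pull_over_panel"
    if side_gap:                 # side 36 of tool 34 can engage a side face
        return "sideways_move"
    return "skip"                # note the object and move on
```

The priority order shown (grasp, then pull, then sideways move) is an assumption for illustration; the disclosure does not specify how the three assessments are ranked.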

[0060] With reference to Figure 14, control of the rotation of the panel 20 about the axis 19 (shown in Figure 15) at the bottom of the panel 20 is provided by the panel assist units 22, each of which includes, for example, an offset pair of actuators 70, 72. Each actuator 70, 72 is offset from the axis 80 by a small difference, and the combination of the movement of the actuators (in cooperation with the actuators 70, 72 on the other side of the engagement system 10) causes the panel 20 to be rotated upward or downward with respect to the conveyor of the chassis 14. One or each of the actuators may further include force torque sensors that measure any forces acting on the panel 20 other than gravity.
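The force-torque measurement just described can be illustrated with a one-axis simplification. Everything here (the function name, the single vertical axis, the sign convention) is an assumption for clarity, not a detail from the disclosure.

```python
def external_force_on_panel(measured_vertical_force, panel_mass, g=9.81):
    """Estimate the force acting on the panel other than gravity.

    measured_vertical_force: net upward force reported by the actuator
    force torque sensors (N); panel_mass in kg. With no contact on the
    panel, the sensors carry exactly the panel's weight, so subtracting
    that weight isolates any external load (e.g., an object resting on
    or pressing against the panel 20).
    """
    weight = panel_mass * g
    return measured_vertical_force - weight
```

A reading above the bare panel weight then indicates an external downward or upward load that the control system can react to.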

[0061] The objects may thereby be removed from the trailer using the articulated arms 24, 26 to move objects onto the panel 20 as the panel travels (rotationally and linearly) through the trailer. With reference again to Figures 6, 7, 9 and 10, when the perception system 28 detects that an opening (e.g., at 41, 43 in Figures 6 and 9) may exist behind objects (e.g., 40, 42) sufficient to receive at least part of a tool 34 of an end effector 30, the system may position a tool of an end effector behind the object such that the second side 36 of the tool 34 may be used to pull one or more objects over the panel 20. Similarly, if an opening is determined to exist adjacent a side face of an object, the system may position a tool of an end effector adjacent the object such that the second side 36 of the tool may be used to urge one or more objects over the panel 20. If an object cannot be moved (or, for example, cannot be grasped, or the end effector tool 34 may not be able to get behind the object), then the system will note this and move on to another object.

[0062] Movement of the dynamic engagement system is provided through the one or more processing systems 100 in communication with the perception system 28, the articulated arms 24, 26, the rotational assist units, and the conveyor wheel actuators (e.g., 15 shown in Figures 21 and 33). Rotational movement of the panel 20 about axis 19 is generally shown at A in Figure 15, and linear movement of the dynamic engagement system is generally shown at B in Figure 15.

[0063] In accordance with further aspects, the collection panel may include sub-panels that may be rotatable with respect to one another such that the panel may be collapsed when it is lowered toward the floor of the trailer. This may facilitate reaching objects without extending the articulated arms 24, 26 significant distances to clear the edge of the collection panel. For example, Figure 16 shows an object processing system that includes a collection panel 50 with two sub-panels 52, 54. When extended to an upper elevation (as shown in Figure 16), the sub-panels are maintained in an extended position (end-to-end) by actuators 53. The panel 50 may include guides 51 to facilitate dropping objects onto the chassis 14. With reference to Figure 17, when the actuators 53 release the upper sub-panel 52, it will swing under and be captured on the underside of the sub-panel 54, leaving only the sub-panel 54 to extend toward the objects within the trailer. The outer edge of the sub-panel 54 is therefore now the leading edge of the collection panel 50, so that the articulated arms 24, 26 are not required to reach as far away from the chassis 14. Figure 18 shows a side view of the collection panel 50 in the elevated position, and Figure 19 shows the collection panel 50 in the folded and lowered position.

[0064] The collection panel may include any number of such folding sub-panels. Figures 20A and 20B show side views of an object processing system with a collection panel 60 that includes three sub-panels 62, 64, 66. Figure 20A shows the collection panel 60 in an elevated position with each sub-panel 62, 64, 66 extending end-to-end, and Figure 20B shows the collection panel 60 with the sub-panel 62 folded with respect to the sub-panel 64, and the sub-panel 64 folded with respect to the sub-panel 66. In Figure 20B, the panel assembly is in a lowered position such that the outer edge of the sub-panel 66 is the leading edge of the collection panel 60, so that the articulated arms 24, 26 are not required to reach as far away from the chassis 14.

[0065] The collection panel may further include any number of telescoping sub-panels. Figures 21A and 21B show side views of an object processing system with a collection panel 60’ that includes multiple telescoping sub-panels 62’, 64’, 66’. Figure 21A shows the collection panel 60’ in an elevated position with each sub-panel 62’, 64’, 66’ extending end-to-end, and Figure 21B shows the collection panel 60’ with the sub-panels 62’, 64’, 66’ collapsed in a telescoping manner. Any guides, e.g., 61’, are mounted in stand-offs with sufficient clearance to permit the sub-panels to be drawn together. In Figure 21B, the panel assembly is in a lowered position such that the outer edge of the sub-panel 66’ is the leading edge of the collection panel 60’, so that the articulated arms 24, 26 are not required to reach as far away from the chassis 14.

[0066] In various applications, obstructions may be encountered, and these obstructions may be addressed in any of a variety of ways using modelling and machine learning. For example, a particularly large object may be encountered (e.g., one that is very long) as shown in Figure 22. The long object 72 may be encountered when only an exposed side is visible, or it may be apparent when the object is encountered that it will be long (e.g., an exposed end of a kayak). If the system is unable to move an object, it will move on to the task of moving other objects (as discussed above) until the object is sufficiently free. Additionally, there may be further objects (e.g., 74) on top of the object 72 that are not yet reachable by the articulated arms 24, 26. In further applications, obstructions may be encountered where the object is too heavy to be moved or cannot be freed from surrounding objects. Figure 23 shows an object 76 that is blocked from being moved by the end-effector 30 by surrounding objects 73, 75, 77, 78, 79.


[0067] In either of these situations, the system may apply a maximum normal run-time vacuum pressure, and if this fails, the system may set a signal indicating the need for human personnel intervention. Alternatively, the system may conduct some analyses and develop a removal model. The system may characterize the end-effector movements in terms of the forces and torques it can apply to the load, and then consider the ensemble of objects, the wall, all the places the effector could be placed, and the forces and torques that could be applied. The system may estimate what resulting motion would occur. Sometimes an object moves, e.g., lifts up, slides out of the wall, or slides onto the platform. Sometimes the object may pivot to a more accessible pose. Sometimes, however, the object may pivot to become cocked and harder to remove, an outcome the model should identify so that it may be avoided. Sometimes multiple objects move, which is generally acceptable. Simulation modules characterize the possible outcomes of feasible end-effector actions. Machine learning may further be used to learn a mapping from loads to good end-effector behaviors, given the wide variability of events such as the object not moving, the object being heavier than anticipated, the friction being more significant than anticipated, or neighboring objects moving in unwanted ways. Each modelled outcome may be observed and integrated into the modeling system so that removal models may be developed accordingly.

[0068] For example, Figure 24 shows the functional process of an obstruction resolution routine that may begin (step 3000) by noting, for each insufficient grasp or insufficient move, the perception data regarding the obstructed object (step 3002). The system may then grasp the object and try to move it in each of the x, y and z directions, noting feedback from joint force torque sensors on the robot (step 3004). The system may then try to move the object in each of the yaw, pitch and roll directions, noting feedback from joint force torque sensors on the robot (step 3006). This sensor feedback may provide significant data that facilitates not only identifying an efficient removal model, but may also help classify objects, facilitating the handling of unknown objects. The system may then access the database regarding any modelled motion (step 3008), and if no removal model is found, the system may rock the object horizontally to try to free the object from side obstructions (step 3010). Such rocking may sufficiently loosen the object for removal. The system may then record image(s) of the rocking and any movement of surrounding objects (step 3012). If it is determined that a different end-effector should be used, the system may swap out the end-effector for any desired different end-effector (step 3014), as also shown in Figures 27 and 28. The system may then access the machine learning database regarding data collected for the object (step 3016) and then develop an obstruction removal model (step 3018).
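The overall control flow of the obstruction resolution routine can be sketched as follows. The callbacks are hypothetical stand-ins for subsystems the disclosure describes only at a functional level, and the return values are illustrative labels, not terms from the disclosure.

```python
def resolve_obstruction(obj, ft_probe, find_model, rock_free):
    """Sketch of the obstruction resolution routine (steps 3002-3018).

    ft_probe(obj, axis): trial move along one axis, returning joint
        force-torque feedback (steps 3004-3006).
    find_model(feedback): look up any previously modelled motion in the
        database (step 3008); returns None if no model is found.
    rock_free(obj): rock the object horizontally and report whether it
        was loosened (step 3010).
    """
    # Steps 3004-3006: probe translations and rotations, noting feedback.
    feedback = {axis: ft_probe(obj, axis)
                for axis in ("x", "y", "z", "yaw", "pitch", "roll")}
    # Step 3008: consult the database for a modelled motion.
    model = find_model(feedback)
    if model is not None:
        return model
    # Step 3010: no model found, so rock the object to free it.
    if rock_free(obj):
        return "freed_by_rocking"
    # Steps 3014-3018: swap end-effectors if needed and develop a model.
    return "develop_new_model"
```

In use, the real system would additionally record imagery of the rocking (step 3012) and feed the collected data to the machine learning database; those side effects are omitted here for brevity.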


[0069] With reference to Figure 25, for example, the end-effector 30 may try to move an obstructed object 76 in each of the x, y, and z directions, noting the feedback on joint force torque sensors on the articulated arm. With reference to Figure 26, the end-effector 30 may further try to move the obstructed object 76 in each of the yaw, pitch and roll directions, again noting the feedback on joint force torque sensors on the articulated arm. This non-visually observable feedback information may provide valuable insights for the machine learning system in developing efficient removal models.

[0070] Figure 27 shows an object processing system in accordance with an aspect of the present invention as discussed above that includes a pair of end-effector swap racks 48, 58 on which a plurality of further end-effectors 30’, 30” may be provided for use by the articulated arms 24, 26. As shown in Figure 28, each articulated arm (e.g., 24 as shown) may access each further end-effector for automatically swapping out the end-effector as the removal model may require.

[0071] A retention detection system may also be employed to determine whether a retention system is present within a trailer (e.g., a restraining net, a wall, or a set of objects that are wrapped together and provided on a pallet). With reference to Figure 29, the retention detection system begins (step 4000) by being triggered for each object that may not be sufficiently processed. In particular, for each insufficient grasp of an object or insufficient attempted move of an object, the following data is collected (step 4002). This is done until the panel is lowered to its lowest point and all movable objects are moved. The system then records instances of net lines across front faces of retained objects (step 4004), and then records instances of net lines extending horizontally across multiple retained objects (step 4006). The system then records instances of net lines extending vertically across multiple retained objects (step 4008), and then records any image of any portion of a pallet near the floor of the trailer (step 4010). The system then sets a net detection signal responsive to any instances of net lines in connection with a plurality of retained objects (step 4012), and then sets a pallet detection signal responsive to any image of any portion of a pallet near the floor of the trailer in connection with a plurality of retained objects (step 4014). The system then engages the automated pallet removal system responsive to the pallet detection signal (step 4016).
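The signal-setting end of the retention detection routine can be sketched as a small decision function. The input and key names are illustrative assumptions; the real system derives these quantities from the perception data collected in steps 4002-4010.

```python
def retention_signals(net_line_count, pallet_image_seen):
    """Steps 4012-4016: set detection signals from the collected data.

    net_line_count: number of net-line instances recorded across
        retained objects (steps 4004-4008).
    pallet_image_seen: whether any portion of a pallet was imaged near
        the trailer floor (step 4010).
    """
    net_detected = net_line_count > 0       # step 4012: net detection signal
    pallet_detected = pallet_image_seen     # step 4014: pallet detection signal
    return {
        "net": net_detected,
        "pallet": pallet_detected,
        # Step 4016: the automated pallet removal system is engaged
        # responsive to the pallet detection signal.
        "engage_pallet_removal": pallet_detected,
    }
```

A net detection would instead trigger the alarm for manual removal described below, since netting is installed and removed by human personnel.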

[0072] During the removal of objects, therefore, if any object may not be removed (either may not be able to be grasped properly, or may not be movable due to an obstruction), the system will run a retaining detection routine to determine whether any of the objects are retained within the trailer. The system will continue, moving to a next object, until all objects that may be moved are moved onto the panel 20. Each time an object is identified as being not movable (again, either not graspable or blocked), the system will run the retaining detection routine. The retaining detection routine may be run on the one or more computer processing systems 100 with perception data from the perception unit 28, and may analyze image data in combination with object grasp attempt data to identify whether any retention system is inhibiting the removal of objects from the trailer. If a retaining feature is present, the routine will run for each object that is found to be not movable. A combination of the results of the multiple executions of the routine provides duplicative results that should confirm the type of retaining feature present. For example, Figure 30 shows a netting 56 that spans the width and height of the trailer and is attached to mounts 55. Such netting 56 may be installed manually upon loading of the trailer, and may be required to be removed manually when unpacking the trailer. An alarm (light and/or sound) will be triggered if a netting is detected by the system, and the netting may then be removed by human personnel.

[0073] Alternatively, as the movable objects are removed, an image of an exposed end of a pallet at the floor of the trailer may be detected. The objects on the pallet may be wrapped (accounting for the system being unable to move individual objects), and upon detection of the pallet, the system will trigger a pallet removal command. Figure 31, for example, shows a pallet 57 on which objects are provided within a wrapping (e.g., clear plastic) 58. Objects within the wrapping 58 on the pallet will not be movable by the end effector 30, and the system will run the retaining detection routine. Once the bottom of the trailer becomes clear, the pallet 57 will become visible to the perception system 28, and the system will register that a pallet is present. Again, an alarm (light and/or sound) will be triggered if a pallet is detected by the system, and the pallet and its associated objects may be removed by human personnel.

[0074] In accordance with further aspects, when the system detects the presence of a pallet as above, the system may employ an automated pallet removal system. In particular, and with reference to Figure 32, the system may include a pallet removal system 80 that includes a fixed pivot end 82 that rotates with respect to the chassis conveyor 14 by a pivot pin 84, and a rotating swing end 86, both ends 82, 86 being coupled to a swing bar 88. The swing bar 88 is attached to a counterweight portion 110 (shown in Figure 35) that is supported by a plurality of heavy-duty casters 112. The pallet removal system 80 also includes a pair of forks 94, 96 that are mounted to a cross bar 98, and the cross bar 98 may be actively raised or lowered along tracks 102, 104 as controlled by the one or more computer processing systems. Figure 33 shows the forks 94, 96 and the cross bar 98 in the raised position. The pallet removal system 80 may also include one or more perception systems 106, 108 to aid the pallet removal process (the perception system 28 may be blocked by the panel 20).

[0075] With reference to Figures 34 and 35, the pallet removal system 80 may be rotated with respect to the chassis 14 about the pin 84 (e.g., to 45 degrees as shown in Figure 34, and to 90 degrees as shown in Figure 35). Figure 36 shows an underside view of the pallet removal system 80 under the chassis conveyor 14, and Figure 37 shows an underside view of the pallet removal system rotated 90 degrees (as in Figure 35). The counterweight 110 facilitates lifting of a pallet, and the casters 112 (together with wheels under the pivot end 82 and swing end 86) support the weight of the counterweight 110 and the pallet. Figure 38 shows a pallet of objects being removed from the trailer, and Figure 39 shows the pallet of objects rotated 90 degrees by the removal system. Figure 40 shows an opposite side view of the pallet as rotated 90 degrees in Figure 39, and Figure 41 shows the pallet removed and unloaded from the removal system. The removed and unloaded pallet is no longer obstructing the removal of objects from the trailer, and the dynamic engagement system may re-enter the trailer and again begin removing objects. The removed and unloaded pallet may be processed by human personnel.

[0076] Figure 42 shows an output system in accordance with further aspects of the present invention that includes a perception system 120 with a diverting system 122. The perception system may include one or more perception units 124 that provide any of camera images, scan data such as 2D or 3D scan data, or perception information regarding any identification code such as a barcode, QR code, or other unique identification markings. The perception system may further include a weight sensing conveyor section as part of the diverting system 122 (e.g., with rollers or diverter belts mounted on load cells). The diverter 122 may include bi-directional belts that are selectively elevatable between the rollers. The perception and diverting systems sit between the chassis 14 and the warehouse conveyor 16, and permit outlier items such as heavy or large items to be diverted to one or the other of diverter paths 126, 128 for either alternate automated processing or for processing by human personnel. The decision to divert any object is based on the perception information from the perception system (e.g., size, weight, identification, etc.).
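The divert decision can be sketched as a simple threshold check. The threshold values below are purely illustrative assumptions; the disclosure states only that outlier items such as heavy or large items are diverted based on perception information (e.g., size, weight, identification).

```python
def should_divert(weight_kg, longest_dimension_m,
                  max_weight_kg=23.0, max_dimension_m=1.2):
    """Decide whether the diverter 122 routes an item to a diverter
    path (126 or 128) rather than on to the warehouse conveyor 16.

    Inputs come from the weight sensing conveyor section and the
    perception units 124; both thresholds are hypothetical.
    """
    return weight_kg > max_weight_kg or longest_dimension_m > max_dimension_m
```

Identification-based diversion (e.g., by barcode or QR code) would add a lookup against item data, which is omitted here.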

[0077] Those skilled in the art will appreciate that numerous modifications and variations may be made to the above disclosed embodiments without departing from the spirit and scope of the present invention.

[0078] What is claimed is:
