

Title:
AN AUTOMATED CHARACTERIZATION AND REPLICATION SYSTEM AND METHOD
Document Type and Number:
WIPO Patent Application WO/2019/090178
Kind Code:
A1
Abstract:
The invention relates to a system and method for improved characterization and replication of pluralities of features on objects such as aircraft wings, wherein the precise specifications may vary across objects. In one disclosed method, a first 3D data set representing a plurality of object features from a first object is received. The first 3D data set is then converted into one or more 2D data sets including a normal map. A feature search and find operation is then conducted within the normal map. Finally, the resulting data is converted into a second 3D data set and applied to various automated operations such as feature removal, cleaning, replication, etc. In its various embodiments, the invention also includes a computing device for improved object feature characterization and replication speed, a computer readable medium having instructions for performing the above operations, and other apparatus.

Inventors:
HAMBLIN FLINT (US)
CARTER CLAY (US)
MANGUM MARK (US)
SHAW NATE (US)
SLADE CLAY (US)
LAKATOS JANOS (US)
Application Number:
PCT/US2018/059098
Publication Date:
May 09, 2019
Filing Date:
November 03, 2018
Assignee:
CJC HOLDINGS LLC (US)
International Classes:
G06F17/50; G06K9/36; G06T15/00; G06T17/00; G06T17/10
Foreign References:
US20150269282A1, 2015-09-24
US20120148145A1, 2012-06-14
US20170269574A1, 2017-09-21
US20030160970A1, 2003-08-28
US20160325851A1, 2016-11-10
US20160185047A1, 2016-06-30
US20120275689A1, 2012-11-01
US20160005228A1, 2016-01-07
US6208347B1, 2001-03-27
Attorney, Agent or Firm:
FRISCHKNECHT, Preston, P. (US)
Claims:
CLAIMS

I claim:

1. A computer-based method of replicating object features comprising:

receiving a first 3D data set representing a first plurality of first object features from a first object;

converting the first 3D data set into one or more 2D data sets;

performing a search and find operation for one or more first object features using the one or more 2D data sets;

converting the one or more 2D data sets into a second 3D data set; and

applying the second 3D data set to machining means to produce a second plurality of first object features in a second object.

2. The computer-based method of replicating object features of claim 1, wherein the second plurality of first object features in the second object is substantially identical to the first plurality of first object features.

3. The computer-based method of replicating object features of claim 2, wherein the machining means is an automated robot.

4. The computer-based method of replicating object features of claim 1, further comprising the step of scanning a first plurality of first object features from a first object with a 3D scanner.

5. The computer-based method of replicating object features of claim 1, further comprising the step of generating a first 3D data set representing a first plurality of first object features from a first object.

6. The computer-based method of replicating object features of claim 1, wherein the one or more 2D data sets includes a height map.

7. The computer-based method of replicating object features of claim 1, wherein the one or more 2D data sets includes a normal map.

8. The computer-based method of replicating object features of claim 1, wherein the one or more 2D data sets include a height map and a normal map.

9. The computer-based method of replicating object features of claim 1, wherein first and second objects are aircraft wings.

10. The computer-based method of replicating object features of claim 1, wherein the first object features include one or more from the group consisting of: apertures, screws, fasteners, and rivets.

11. A computing device for improved object feature replication speed comprising:

a storage device; and

a processor programmed to

receive a first 3D data set representing a first plurality of first object features from a first object;

generate one or more 2D data sets from the first 3D data set representing a first plurality of first object features;

perform a search and find operation for one or more first object features using the one or more 2D data sets;

generate a second 3D data set from the one or more 2D data sets; and

apply the second 3D data set to machining means to produce a second plurality of first object features in a second object.

12. The computing device of claim 11, wherein the second plurality of first object features in the second object is substantially identical to the first plurality of first object features.

13. The computing device for improved object feature replication speed of claim 11, wherein the one or more 2D data sets includes a height map.

14. The computing device for improved object feature replication speed of claim 11, wherein the one or more 2D data sets includes a normal map.

15. The computing device for improved object feature replication speed of claim 11, wherein the one or more 2D data sets include a height map and a normal map.

16. The computing device for improved object feature replication speed of claim 11, wherein first and second objects are aircraft wings.

17. The computing device for improved object feature replication speed of claim 11, wherein the first object features include one or more from the group consisting of: apertures, screws, fasteners, and rivets.

18. A computer readable medium having computer-executable instructions for performing operations comprising:

receiving a first 3D data set representing a first plurality of first object features from a first object;

generating one or more 2D data sets from the first 3D data set representing a first plurality of first object features;

performing a search and find operation for one or more first object features;

generating a second 3D data set from the one or more 2D data sets; and

applying the second 3D data set to machining means to produce a second plurality of first object features in a second object.

19. The computer readable medium of claim 18, wherein the second plurality of first object features in the second object is substantially identical to the first plurality of first object features.

20. The computer readable medium of claim 18, wherein the one or more 2D data sets includes a height map.

21. The computer readable medium of claim 18, wherein the one or more 2D data sets includes a normal map.

22. The computer readable medium of claim 18, wherein the one or more 2D data sets includes a height map and a normal map.

23. The computer readable medium of claim 18, wherein first and second objects are aircraft wings.

24. The computer readable medium of claim 18, wherein the first object features include one or more from the group consisting of: apertures, screws, fasteners, and rivets.

25. An article of manufacture comprising:

a computer-readable medium having stored thereon a data structure; and

a first field containing one or more 2D data sets derived from one or more 3D data sets, the one or more 2D data sets representing a plurality of features from a first object and including a normal map.

26. The article of manufacture of claim 25, further comprising a second field containing one or more 2D data sets derived from one or more 3D data sets, the one or more 2D data sets representing a plurality of features from a first object and including a height map.

27. The article of manufacture of claim 26, further comprising a third field containing a reference template for potential features on the first object.

28. The article of manufacture of claim 27, further comprising a fourth field containing one or more first 3D data sets representing a plurality of features from a first 3D scanned object.

29. The article of manufacture of claim 28, further comprising a fifth field containing one or more second 3D sets derived from the one or more 2D data sets representing a plurality of features from the first object.

30. A characterization and replication system including:

a 3D scanner;

machining means; and

a computing device for improved object feature replication speed, the computing device in communication with the 3D scanner and machining means, and comprising:

a storage device; and

a processor programmed to

receive a first 3D data set representing a first plurality of first object features from a first object;

generate one or more 2D data sets from the first 3D data set representing a first plurality of first object features;

perform a search and find operation for one or more first object features;

generate a second 3D data set from the one or more 2D data sets; and

apply the second 3D data set to machining means to produce a second plurality of first object features in a second object.

31. The characterization and replication system of claim 30, wherein the 3D scanner is a high definition laser scanner.

Description:
DESCRIPTION

TITLE OF THE INVENTION

An Automated Characterization and Replication System and Method

TECHNICAL FIELD

[0001] The technical fields to which the invention relates are: automated machinery and machining, computers, and computer software.

BACKGROUND

[0002] Industries that operate machine fleets or maintain equipment inventories often require replacement parts as those machines age. Generally, and as a desirable characteristic that may be taken for granted, OEM components and replacement components have the same feature identity and dimensions, particularly for a given machine model or design. However, there may be cases where, despite desired uniform feature identity and dimensions, actual feature dimensions vary, even for the same component, machine model, or design. Variation can introduce substantial problems in repair and maintenance because custom, individual design and replication of replacement parts may be necessary to achieve the desired tolerances and fit of the original OEM parts. Although this dynamic may occur in any number of industries, a specific example is given from the aircraft industry.

[0003] A single aircraft wing may have literally thousands of features (e.g., apertures, screws, fasteners, rivets, etc.). The identity and dimensions of these features (e.g., type, location, size, depth, direction) may be roughly uniform across different aircraft of a particular model. However, to the extent that underlying components were produced prior to modern, precision computer numerical control (CNC) technologies, or were subject to some other variation-producing dynamic, the dimensions of these features lack precise uniformity across aircraft of the same model. In certain observed instances, a given feature common to the wing may in fact vary by as much as 0.25 inch across a number of similar wings. Variation is compounded exponentially by feature types, quantities, and data points in the X, Y, and Z axes along the complex and variable wing curvature or surfacing, and by the number of aircraft in a fleet.

[0004] Given this variation, in order to conduct repairs or replacements, the feature identity and dimensions of each wing must be carefully, individually, and precisely characterized, and a new, tailored wing must be made. Various attempts have been made to grapple with the variation problem in order to characterize and uninstall OEM parts and ultimately achieve a suitable replica. For example, a first method combines human operators and two-dimensional (2D) scanning technology on a large x, y, z gantry. This method uses a template to approximate feature types and positions, scans for and precisely locates each feature, logs its position, removes features, and then ultimately machines a replica of the original. The overall process of characterization and replication for such methods may take between 15 and 18 days per wing to complete.

[0005] A second method uses a three-dimensional (3D) scanner to scan the wing, create a 3D model of features, characterize and map the features in 3D, and then ultimately use that data to machine a replica of the original. This method is more efficient than the first, but still lengthy (up to 5 days to complete, with as much as 8 hours just for computer processing time) because of the complexity of 3D characterization and mapping, along with attendant processing times.

[0006] The variation problem, and current attempted solutions, lead to inefficiencies, large overhead, aircraft downtime, lost profits for businesses, and possibly even national security and/or preparedness issues for armed forces. Accordingly, there is a great need for new technology to make the characterization and replication process more efficient.

SUMMARY OF THE INVENTION

[0007] In accordance with the above, a new characterization and replication system and method is provided, comprising the steps of: receiving a first 3D data set representing a first plurality of first object features from a first object; converting the first 3D data set into one or more 2D data sets; performing a search and find operation for one or more first object features using the one or more 2D data sets; converting the one or more 2D data sets into a second 3D data set; and applying the second 3D data set to machining means to produce a second plurality of first object features in a second object. In its various embodiments, the invention also includes a computing device for improved object feature characterization and replication speed, a computer readable medium having instructions for performing the above operations, and other apparatus. The invention thereby solves the problem of efficiently characterizing and replicating a set of features that is generally shared across components of a single machine design or model but whose precise specifications vary from machine to machine.

BRIEF DESCRIPTION OF THE DRAWINGS

[0008] To further clarify the above and other aspects of the present invention, a more particular description of the invention will be rendered by reference to specific embodiments thereof which are illustrated in the appended drawings. It is appreciated that these drawings depict only typical embodiments of the invention and are therefore not to be considered limiting of its scope. The drawings may not be drawn to scale. The invention will be described and explained with additional specificity and detail through the use of the accompanying drawings in which:

[0009] Figure 1 is a block diagram of an exemplary prior art method.

[0010] Figure 2 is a perspective view of exemplary hardware components in one embodiment of the invention.

[0011] Figure 3 is a block diagram of exemplary hardware components in one embodiment of the invention.

[0012] Figure 4 is a block diagram of a method in one embodiment of the invention.

[0013] Figure 5 is a block diagram of a first method portion in one embodiment of the invention.

[0014] Figure 6 is a 3D data set showing a triangle mesh in one embodiment of the invention.

[0015] Figure 7 is a first 2D data set showing a height map in one embodiment of the invention.

[0016] Figure 8 is a diagram illustrating how 3D data may be partially translated to a first 2D format within one embodiment of the method.

[0017] Figure 9 is a second 2D data set showing a color normal map in one embodiment of the invention.

[0018] Figure 10 is a second 2D data set showing a greyscale of a color normal map in one embodiment of the invention.

[0019] Figure 11 is a set of diagrams illustrating how 3D data may be partially translated to a second 2D format within one embodiment of the method.

[0020] Figure 12 is a second 2D data set showing a greyscale of a color normal map as a composite of red, green, and blue channel components in one embodiment of the invention.

[0021] Figure 13 is a second 2D data set showing a template matching feature in one embodiment of the invention.

[0022] Figure 14 is a block diagram of a second method portion in one embodiment of the invention.

[0023] Figure 15 is a block diagram of a third method portion in one embodiment of the invention.

[0024] Figure 16 is a block diagram of a fourth method portion in one embodiment of the invention.

[0025] Figure 17 is a block diagram of exemplary computer components of the invention.

DESCRIPTION OF EMBODIMENTS

[0026] The present invention in its various embodiments, some of which are depicted in the figures herein, is a characterization and replication system and method. A first example of a prior art characterization method 100 is set forth in Fig. 1. In this first method 100, a physical or electronic template with a plurality of expected features, types and/or locations is used to approximate the same information for actual features on an object 101. For each expected feature, the expected approximate location (e.g., x, y coordinates) is searched with a two-dimensional (2D) vision system, and the feature is then more precisely characterized and located 102. Feature information is then logged and a first operation performed with respect to the feature, e.g., fastener removal 103. A human operator then confirms 104 completion of the first operation 103 with respect to a feature or a plurality of features on an object. If the first operation 103 is confirmed completed 105, a second operation 106, e.g., cleaning and inspection of fastener receptacles and/or the object, may be performed. If the first operation 103 is not confirmed 104 completed 105, then the process 100 may be repeated 107 until a full characterization of object features is achieved as a first step towards replicating the object with its unique features. Significantly, full characterization of a particular object like an aircraft wing may take five days (eight hours each day), per side or surface, to complete using this method. Afterwards, machining a new object to replicate the unique features of the old one may take an additional seven to ten days.

[0027] In a second exemplary prior method (not shown), a three-dimensional (3D) scanner scans the wing and creates a 3D model of features by converting point cloud data to a .stl file format. Features are characterized and mapped in 3D, including by using suitable software such as Polyworks® to identify individual features one at a time from the .stl file. This data is used to perform operations similar to those set forth in the first exemplary method above. This second exemplary method is more efficient than the first, but still lengthy, taking up to three days to complete characterization and the first and second operations, with as much as ten hours just for computer processing time owing to the complexity of 3D characterization and mapping. Machining a new object to replicate the unique features of the old one may take an additional two days.

[0028] Referring now to Figs. 2 and 3, an improved system is shown and described to implement the new methods described in more detail below. In one embodiment, the system 200 may be partially or wholly automated and implemented through computer (not shown) and/or other machine application and may include one or more additional hardware components. For example, the system 200 may include one or more automated machines or robots 201 for scanning features and/or machining replicates. In the illustrated embodiment, scanner and/or machining tools 202 are interchangeable at one end of a robot arm in order to allow the robot 201 to perform different functions at different steps. In the illustrated embodiment of Fig. 2, the system 200 is shown in the environment of an exemplary workspace with a wing or wing skin-type object 203 having a plurality 206 of features 208, 209, 210 that the system 200 is either scanning, characterizing, or replicating pursuant to the method below.

[0029] Referring now to Fig. 3, a block diagram 300 of various hardware and machine components of the system 200 are shown. In various embodiments, the system 200 may include one or more computers 301, including a computer data processing terminal with interface, with a processor coupled to memory and instructions (e.g., software; see also Fig. 17) 302 to perform a variety of tasks to facilitate characterization and/or replication. Such tasks may include, but are not limited to, initiating, conducting, or carrying out any of the method steps set forth below.

[0030] The system 200 may also include one or more automated machines or robots 303, in communication with the computer 301 and operable through memory and/or a programmable logic controller 304. An example of a suitable computer is a Dell Precision series with dual Xeon Gold 6130 processors or better, 192 GB RAM, redundant arrayed 10k hard drives in RAID 10, and an Nvidia P6000-class video card, although many others may also be suitable. Exemplary robots 303 include one or more Fanuc M-900iB/400Ls with Dual Check Safety system, R30iB controller, IR Calibration package, and Adaptive Control package, although other robots may also be suitable. Exemplary controllers 304 include the Allen Bradley ControlLogix series, although again, others may also be suitable.

[0031] The one or more automated machines or robots 303 may be in communication with and/or capable of manipulating a variety of interchangeable tools 305. Such tools 305 may include one or more: 3D scanners 306, drills or drilling heads 307, de-fastening heads 308, or any other tools 309 (such as lathes, rivet guns, screwdrivers, ratchets, wrenches, and the like) necessary for implementing some or all method steps or operations set forth below. Suitable 3D scanners include, but are not limited to: contact/CMM (arm or gantry); non-contact macro (e.g., time-of-flight/laser, triangulation/laser, conoscopic holography, confocal chromatic, structured light, modulated light, stereo 3D, photogrammetry); or non-contact micro (e.g., white light interferometry, confocal scanning microscopy, confocal laser scanning microscopy). Some embodiments utilize a Leica® T-Scan 5 high-definition laser scanner.

[0032] In various embodiments, external position tracking components 310 may also be used in connection with the one or more scanners 306, and further be in communication with and/or operable by computer 301 and/or programmable logic controller 304. One example of a suitable tracking system 310 is a Leica AT960 Class, with T-Scan 5 scanner, T-Probe III, T-Mac Probe, EtherCat communications, T-Scan collect software, Red Ring reflector 1.5", Automation Interface Controller w/ 20m Cable, and robot adaptor for T-Scan 5, although any number of other tracking systems may also be used without departing from the purpose and scope of the invention.

[0033] Referring now to Figs. 2 and 4, as a first step in the method 400 used in connection with the system 200, an object (e.g., one intended to be replaced or replicated) 203 is scanned 401 with a 3D scanner. Typically, the scanned object 203 has a plurality of features 204 (e.g., apertures, screws, fasteners, rivets, etc.) having unique specifications among otherwise generally uniform features, including within a same model or design. For example, Fig. 2 shows a subset 206 of such features on the object 203 as a screw 208, first aperture 209, and second aperture 210. The scanning operation may be user initiated (through a computer or otherwise) or automated, including by robot 201, or any combination of these. Scanner software generates one or more three-dimensional (3D) models 402 of the object, typically as a point cloud or triangle mesh (see Fig. 6). Although any number of different software products may be used for this operation, Polyworks®, as set forth above, is one known example. In certain embodiments, the scanning step 401 is omitted, and the 3D data set is simply received.
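For illustration only, the following Python sketch shows how a received triangle-mesh scan might be loaded from a .stl file and its extents inspected. The numpy-stl package and the file name are assumptions; the patent does not prescribe a particular file-reading library.

```python
# Minimal sketch: load a scanned triangle mesh from a .stl file and inspect
# its extents (used later to size the 2D bitmaps). numpy-stl is an assumed
# stand-in, not a library named by the patent.
import numpy as np
from stl import mesh  # pip install numpy-stl

scan = mesh.Mesh.from_file("wing_scan.stl")  # hypothetical file name

# scan.vectors has shape (n_triangles, 3 vertices, 3 coordinates)
triangles = scan.vectors
print("triangles:", triangles.shape[0])

# x, y, z extents of the mesh in scanner coordinates
verts = triangles.reshape(-1, 3)
mins, maxs = verts.min(axis=0), verts.max(axis=0)
print("extents (mm):", maxs - mins)
```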

[0034] Through a computer, custom software coding converts the 3D model or object data set into one or more two-dimensional (2D) data sets or models 403, packaging the underlying information in a manner that increases the efficiency of manipulation. Data sets may be generated simultaneously or separately. Referring briefly to Figs. 5 and 7, one 2D data set may include a height map 501. The height map 501 is a raster image used to store surface elevation data in 2D. The height map 501 contains one channel, interpreted as a displacement or "height" from the "floor" to a surface, that may be visualized as the luma of a grayscale image. The height map 501 thereby represents or characterizes 3D elevation data (or the Z coordinates associated with features at X, Y coordinates) in a 2D gray (or other) scale model. To create the 2D data set from a 3D data set, it is necessary to transform or map the coordinates of the 3D data set from a 3D space to a 2D space in some fashion. Many types of mapping may be used, including, but not limited to, planar, cylindrical, and spherical mapping. Planar mapping may be accomplished by projecting the 3D coordinates onto a chosen plane. In the case of a wing or wing-like object, a convenient and preferred plane is one parallel to the ground (normal to gravity) positioned some distance below the wing.

[0035] To generate a height map 501, in one embodiment, a list of triangles from a triangle mesh is parsed and the x, y, and z extents of the triangle mesh are determined. At this point, a bitmap resolution must be specified to determine the bitmap size. For instance, if the x, y size of the triangle mesh is 1000 mm by 1000 mm, and a resolution of 0.1 mm is desired, then a 10000 x 10000 bitmap is generated (1000/0.1). Bitmaps are typically addressed starting at 0, 0 for the upper left pixel. Accordingly, the x, y triangle coordinates must be offset to begin at 0, 0 (606) instead of the mesh bounding box coordinates from the scanner coordinate system. Additionally, the y coordinates must be reflected about the x axis to account for the difference between a standard Cartesian coordinate system and bitmap coordinate addressing (lower left y-axis zero for Cartesian versus upper left y-axis zero for a bitmap). The triangles are then rasterized by projecting them onto the preferred image plane, along the z-axis. The grayscale values of the pixels in the projected triangles are assigned as gradients between the normalized z-coordinates at each triangle vertex (normalization scales the range of z values to real numbers between 0 and 1). Briefly referring to Fig. 8, the normalized values 804, 805 are encoded to grayscale 807 as integers in the range of 0 to 255 for 8-bit grayscale images 806 and 0 to 65535 for 16-bit grayscale images. The speed of this process can be increased dramatically by using graphics APIs such as OpenGL or DirectX, which may utilize computer hardware specifically designed for triangle rasterization.
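The following Python sketch walks through the steps of this paragraph (planar projection along z, bounding-box offset, y reflection, z normalization, 8-bit grayscale encoding) under a stated simplification: it splats triangle vertices rather than rasterizing filled triangles, which, as noted above, is better done with GPU graphics APIs. The function name and the 0.1 mm default are illustrative.

```python
# Simplified height-map sketch. Splats vertices instead of rasterizing
# filled triangles; a production implementation would rasterize triangles,
# ideally on the GPU via OpenGL or DirectX as the text notes.
import numpy as np

def height_map(triangles: np.ndarray, resolution_mm: float = 0.1) -> np.ndarray:
    """triangles: (n, 3, 3) array of x, y, z vertex coordinates in mm."""
    verts = triangles.reshape(-1, 3)
    mins, maxs = verts.min(axis=0), verts.max(axis=0)

    # Offset so the mesh bounding box starts at (0, 0), then scale to pixels.
    xy = (verts[:, :2] - mins[:2]) / resolution_mm
    cols = xy[:, 0].astype(int)
    rows = xy[:, 1].astype(int)

    width = int(np.ceil((maxs[0] - mins[0]) / resolution_mm)) + 1
    height = int(np.ceil((maxs[1] - mins[1]) / resolution_mm)) + 1

    # Reflect y about the x axis: Cartesian y-up versus bitmap row 0 at top.
    rows = (height - 1) - rows

    # Normalize z to [0, 1], then encode as 8-bit grayscale (0-255).
    z = (verts[:, 2] - mins[2]) / (maxs[2] - mins[2])
    img = np.zeros((height, width), dtype=np.uint8)
    img[rows, cols] = (z * 255).astype(np.uint8)
    return img
```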

[0036] Referring now to Figs. 5 and 9 through 12, another 2D data set may include a normal map 502, an image that stores a direction or "normal" at each pixel. In a normal map, surface curvature is represented by a field of normal vectors (vectors perpendicular to the surface) stored in an array, which may be encoded as an RGB raster image. The RGB components correspond to the x, y, and z direction vector components, respectively, of the surface normal. Since the normal map is a representation of surface curvature, it reveals details in a surface (e.g., texture, displacement, etc.) that enable a template matching algorithm (such as that explained in more detail below) to be more accurate. More particularly, for small bumps or grooves on a surface, pixel values on a height map will be nearly equal. However, since the slope of the surface curvature changes abruptly at small bumps or grooves, pixel values will be quite different on the normal map in such locations, again enabling a template matching algorithm to be much more accurate. This is particularly useful for locating counter-sunk fasteners (e.g., screws, rivets) whose heads are featureless and mounted flush with the surface, as is common on an aircraft wing or body. This is illustrated by comparing Fig. 7 with Figs. 9 and 10. For fastener 702 (Fig. 7), the outer diameter is nearly invisible in the height map. For the same fastener 902 (Figs. 9 & 10), the outer diameter is clearly defined on the normal map 502. Thus, a template matching algorithm will identify the feature with a higher confidence level and locate the feature more accurately when the normal map is used.

[0037] To generate a normal map 502, in one embodiment, a list of triangles from a triangle mesh (see, e.g., Fig. 6) is parsed and the x, y, and z extents of the triangle mesh are determined. At this point, a bitmap resolution must be specified to determine the bitmap size. For instance, if the x, y size of the triangle mesh is 1000 mm by 1000 mm, and a resolution of 0.1 mm is desired, then a 10000 x 10000 bitmap is generated (1000/0.1). Bitmaps are typically addressed starting at 0, 0 for the upper left pixel. Accordingly, the x, y triangle coordinates must be offset to begin at 0, 0 (606) instead of the mesh bounding box coordinates from the scanner coordinate system. Additionally, the y coordinates must be reflected about the x axis to account for the difference between a standard Cartesian coordinate system and bitmap coordinate addressing (lower left y-axis zero for Cartesian versus upper left y-axis zero for a bitmap). The triangles are then rasterized by projecting them onto the preferred image plane, along the z-axis. The RGB (red, green, blue) color values of the pixels in the projected triangles are assigned using the x, y, and z direction components from the triangle normals. The speed of this process can be increased dramatically by using graphics APIs such as OpenGL or DirectX, which may utilize computer hardware specifically designed for triangle rasterization. The triangle normals may be calculated by the scanning software or computed as the cross product of two vectors defined by the triangle vertices.

[0038] Referencing Fig. 11, the normal vector comprises three components 1101, 1102, 1103 with magnitudes from -1 to 1. The x component 1101, 1109 is encoded on the red channel of the bitmap, with values from -1 to 0 (1110) represented by integers from 0 to 128 (1111) and values from 0 to 1 (1110) represented by integers from 128 to 256, for 24-bit RGB color images 1111. For 48-bit RGB color images, values from -1 to 0 (1110) are represented by integers from 0 to 32768, and values from 0 to 1 by integers from 32768 to 65536. The encoding is similar for the y component 1102, 1116, 1117, which is encoded on the green channel 1118, and the z component 1103, 1123, 1124, which is encoded on the blue channel 1125.
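A minimal sketch of this encoding follows, assuming the mapping floor((c + 1) x 128) clamped to 255, which reproduces the example values in Table 1 below; a 48-bit image would use a factor of 32768 instead of 128. The function names are illustrative.

```python
# Minimal sketch: compute a triangle's unit normal as the cross product of
# two edge vectors, then encode its x, y, z components on the R, G, B
# channels of a 24-bit RGB pixel as described above.
import numpy as np

def triangle_normal(v0, v1, v2):
    """Unit normal of a triangle from the cross product of two edges."""
    n = np.cross(np.asarray(v1, float) - v0, np.asarray(v2, float) - v0)
    return n / np.linalg.norm(n)

def encode_normal_rgb(n):
    """Map components in [-1, 1] to 8-bit integers; c = 1.0 clamps to 255."""
    return tuple(min(int((c + 1.0) * 128), 255) for c in n)

def decode_normal_rgb(rgb):
    """Approximate inverse of encode_normal_rgb."""
    return tuple(v / 128.0 - 1.0 for v in rgb)

# The direction vector (0.19, 0.61, 0.76) from Table 1 encodes to the
# RGB value (152, 206, 225) listed there.
print(encode_normal_rgb((0.19, 0.61, 0.76)))  # (152, 206, 225)
```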

[0039] To further illustrate representative 2D data, example feature point location 605 represents the same location in Figs. 6 (triangle mesh), 7 (height map), 9 & 10 (normal map), and 12 (normal map color channels), but the type of 2D data stored for the feature point varies by data set, as may be referenced in the figures. More specifically, Table 1 below outlines specific values for feature point locations 605 and 606 across the data sets depicted in the figures mentioned:

Table 1: Exemplary Feature Point Information by Data Set

Figs. 9 & 10 (normal map)
Feature point 605: Following the 3D to 2D mapping operation 403, a point in a bitmap 502 with a row, column address of (1265, 555) and a 24-bit RGB value of (152, 206, 225), corresponding to a direction vector of (0.19, 0.61, 0.76) 1104, 1105, encoded as per items 1101, 1102, and 1103 (see Fig. 11).
Feature point 606: The raster image 502 origin (0,0).

Fig. 12, red channel (1201)
Feature point 605: A point in the red channel of a bitmap 502 with a row, column address of (1265, 555) and an 8-bit value of 152, corresponding to the x component, 0.19, of a direction vector, encoded as per item 1101 (see Fig. 11).
Feature point 606: The raster image 502 origin (0,0).

Fig. 12, green channel (1202)
Feature point 605: A point in the green channel of a bitmap with a row, column address of (1265, 555) and an 8-bit value of 206, corresponding to the y component, 0.61, of a direction vector, encoded as per item 1102 (see Fig. 11).
Feature point 606: The raster image 502 origin (0,0).

Fig. 12, blue channel (1203)
Feature point 605: A point in the blue channel of a bitmap with a row, column address of (1265, 555) and an 8-bit value of 225, corresponding to the z component, 0.76, of a direction vector, encoded as per item 1103 (see Fig. 11).
Feature point 606: The raster image 502 origin (0,0).

Feature point location 605 in Figs. 9 and 10 corresponds to a direction vector A encoded in RGB. Referring briefly to Fig. 11, direction vector A is more clearly illustrated in diagram 1104, an isometric view of the specific direction vector, and diagram 1105, a view of the direction vector looking down the z-axis, the same projection used in Figs. 6, 7, 9, 10, and 12.

[0040] Referring now to Figs. 4 and 13, a search and find operation 404 is performed. One or more object features (e.g., apertures 1309/1310, screws 1308, fasteners, rivets, etc., including as further categorized by particular size, shape, or other physical characteristic) are searched for and/or identified by reference to a 2D "key" or template normal map (template matching), e.g., a fastener template 1301 and/or an aperture template 1302. In one embodiment, this operation is accomplished through pattern recognition algorithms such as convolution, correlation, cross-correlation, correlation coefficient, Pearson correlation, sliding dot product, sum of differences, or any other suitable pattern recognition algorithm. This pattern recognition is made more efficient through the use of parallel processing on multi-core CPUs, GPUs with thousands of processing cores, and/or distributed (cloud) computing. The result of the template matching operation is a grayscale image whose brightest points are the regions most likely to be the features of interest. This grayscale image is turned into a binary image by thresholding, where everything below a certain value is black and everything above is white. The remaining white pixels correspond to the upper right corner of the matched template on the image that was searched (this may be inverted, depending on the algorithm used). The result of the search and find operation is a location map of the feature that was searched for. A location map is generated for every feature search; thus, numerous location maps may be generated from each height map and normal map set. As multiple objects of a similar type are scanned, a library of normal maps of similar features generated with the technique outlined in Fig. 13 is recorded. Once the library reaches an appropriate size, the multitude of normal maps may be used to train a machine learning model for more efficient and accurate pattern matching.
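A minimal sketch of this operation follows, using normalized cross-correlation, one of the pattern recognition options named above. OpenCV's matchTemplate and threshold functions are assumed stand-ins (the patent does not name a library), and the file names and 0.8 threshold are illustrative.

```python
# Minimal template-matching sketch: correlate a feature template against a
# normal map, threshold the correlation surface, and report candidates.
import cv2
import numpy as np

search_img = cv2.imread("wing_normal_map.png", cv2.IMREAD_GRAYSCALE)
template = cv2.imread("fastener_template.png", cv2.IMREAD_GRAYSCALE)

# Correlation surface: brightest points are the likeliest feature locations.
scores = cv2.matchTemplate(search_img, template, cv2.TM_CCOEFF_NORMED)

# Threshold to a binary image: candidates above 0.8 (tunable) become white.
_, binary = cv2.threshold(scores, 0.8, 1.0, cv2.THRESH_BINARY)
rows, cols = np.nonzero(binary)

# Each (row, col) marks a corner of the matched template on the search
# image; offset by half the template size to get feature centers in pixels.
centers = [(c + template.shape[1] // 2, r + template.shape[0] // 2)
           for r, c in zip(rows, cols)]
print(f"{len(centers)} candidate feature locations")
```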

[0041] Within the method 400, the one or more 2D feature location maps are translated back into the 3D coordinate system of the original 3D scan 405 and transferred to a computer, machine, and/or robot for further operations described herein. In one embodiment, because the feature finding algorithms operate on a template sliding across the searched image one pixel at a time, multiple points are often identified for one feature on the location map. The actual coordinates of each feature on the location map may be found by averaging (or applying other data fitting methods to) coordinates which lie in close proximity. By using the inverse of the transforms (scaling, offsets, reflections, etc.) described above, the x and y coordinates are transformed from the 2D feature location map coordinate system to the 3D coordinate system of the original 3D scan. The z coordinates of the located instances are found by averaging the z values of the height map surrounding the feature. The normals of the located instances are found by averaging values on the normal map surrounding the feature. The x, y, z coordinates now correspond to physical locations on the object (e.g., wing), and the x, y, z vector components of the located normal now correspond to the direction perpendicular to the object (e.g., wing) surface at those coordinates.
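The following Python sketch illustrates this 2D-to-3D translation: grouping nearby duplicate detections by averaging, then inverting the offset, scaling, and y reflection applied when the bitmaps were generated. The function names, the simple proximity grouping, and the 5-pixel radius are illustrative assumptions.

```python
# Minimal sketch: map located features from 2D bitmap coordinates back
# into the 3D scanner coordinate system.
import numpy as np

def group_detections(points, radius=5.0):
    """Average detections that lie within `radius` pixels of each other."""
    points, clusters = list(points), []
    while points:
        seed = points.pop()
        near = [p for p in points
                if np.hypot(p[0] - seed[0], p[1] - seed[1]) < radius]
        for p in near:
            points.remove(p)
        clusters.append(np.mean([seed] + near, axis=0))
    return clusters

def pixel_to_scanner(row, col, mins, height_px, resolution_mm, height_map):
    # Undo the y reflection, the pixel scaling, and the bounding-box offset.
    x = col * resolution_mm + mins[0]
    y = ((height_px - 1) - row) * resolution_mm + mins[1]
    # z from averaging the height map in a small window around the feature
    # (values still normalized; rescale by the original z extents as needed).
    r0, r1 = max(int(row) - 2, 0), int(row) + 3
    c0, c1 = max(int(col) - 2, 0), int(col) + 3
    z = height_map[r0:r1, c0:c1].mean()
    return x, y, z
```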

[0042] The method 400 may include additional steps, including a second method portion 1400. Referring now to Fig. 14, some or all method steps 400 as described above may be performed with respect to a particular object (e.g., aircraft wing) to achieve a first characterization of object features. Once the object is characterized, corresponding 3D data is transferred to another machine to disassemble 1401 the object and/or features from remaining machinery or componentry. In various embodiments, features such as apertures are cleaned 1402 (including by automated means) to remove any accumulated debris for purposes of obtaining further characterization.

[0043] Referring now to Fig. 15, within a third method portion 1500, some or all method steps 400 are repeated with respect to the same object to obtain a second characterization of features, e.g., apertures without fastening features and/or accessories (screws, rivets, etc.). The resulting 3D characterization data is then transferred to another machine, which then uses the data to machine 1501 a replica of the original object, with its attendant and unique features, in a template or blank object. Referring now to Fig. 16, in a fourth method portion 1600, some or all method steps 400 are repeated with respect to the object in order to obtain a third characterization 1601 for the purpose of confirming feature and/or object replication accuracy and/or performing a quality control function by comparing the third characterization data with the second characterization data.

[0044] Referring now to Fig. 17, the system 200 may include a computer 1700 with processor 1701 and memory 1702 with one or more modules to initiate, conduct, or carry out method steps above, through custom software purposed for the same. For example, memory 1702 may include a module for: (1) communicating or instructing the one or more automated machines or robots 1703; (2) initiating, sending, or receiving information from the 3D scanner 1704; (3) transforming scanned data to 3D and/or 2D models, converting 3D models to 2D models or vice versa, or accomplishing one or more of the steps otherwise set forth in Figs. 4 & 14 - 16 (and attendant processes illustrated in the other figures) 1705; (4) mapping object features 1706; (5) transmitting transformed data sets to one or more automated machines or robots 1707; (6) instructing such automated machines or robots with respect to various tool operations 1708 and/or accomplishing one or more of the steps otherwise set forth above.
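Purely as an illustration of how the modules enumerated above might be organized in software, the following Python skeleton names one method per module; all class and method names are assumptions, not the patent's.

```python
# Illustrative skeleton only: one stub method per memory module described
# in paragraph [0044]. Names are hypothetical.
class ReplicationController:
    def instruct_robot(self, command): ...           # module (1): robot communication
    def run_scanner(self): ...                       # module (2): 3D scanner I/O
    def transform_models(self, scan_3d): ...         # module (3): 3D <-> 2D conversion
    def map_features(self, maps_2d): ...             # module (4): feature mapping
    def transmit_data(self, data_3d): ...            # module (5): send transformed data
    def run_tool_operation(self, tool, params): ...  # module (6): tool instructions
```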

[0045] The system 200 may further include a computer readable medium having computer-executable instructions for performing one or more of the method steps described above. Computer readable media include, but are not limited to, any kind of computer memory such as disks, drives, nonvolatile ROM, and/or RAM.

[0046] The system may also include an article of manufacture comprising a computer-readable medium having stored thereon a data structure with a first field containing one or more 2D data sets derived from one or more first 3D data sets, the one or more 2D data sets representing a plurality of features from a first object and including a normal map. A second field may contain one or more 2D data sets derived from one or more first 3D data sets, the one or more 2D data sets representing a plurality of features from a first object and including a height map. A third field may contain one or more reference templates for potential, expected, and/or predetermined features on the first object. A fourth field may contain one or more first 3D data sets representing a plurality of features from a first 3D scanned object. A fifth field may contain one or more second 3D data sets derived from the one or more 2D data sets representing a plurality of features from the first object. Other data structures specific to the processes set forth above may also be included.
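As an illustration only, this data structure might be organized as in the following Python sketch; the field names are assumptions, keyed to the five fields described above.

```python
# Hedged sketch of the five-field data structure from paragraph [0046].
from dataclasses import dataclass, field
from typing import List
import numpy as np

@dataclass
class CharacterizationRecord:
    normal_maps: List[np.ndarray] = field(default_factory=list)        # first field
    height_maps: List[np.ndarray] = field(default_factory=list)        # second field
    feature_templates: List[np.ndarray] = field(default_factory=list)  # third field
    scanned_3d_sets: List[np.ndarray] = field(default_factory=list)    # fourth field
    derived_3d_sets: List[np.ndarray] = field(default_factory=list)    # fifth field
```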

[0047] Use of the above method and apparatus is vastly superior to the most efficient prior art solutions. For example, depending on the method and/or system deployed, complete characterization and fastener removal operations on aircraft wings may take up to five days per wing surface (two surfaces per wing, top and bottom). Introducing existing 3D scanning methods and technologies may reduce this time to 2-3 days per wing surface, but such methods require up to 10 hours of processing time. With the system and method as set forth above, scan and processing times may be reduced to under two hours per wing surface (a 95%+ improvement in processing time), thereby allowing complete characterization and fastener removal operations to take place in as little as one day, with other machining and/or replication accomplished shortly thereafter. Future improvements in processing and other technology should further enhance the times and efficiencies of the new system and method.

[0048] The present invention may be embodied in other specific forms without departing from its spirit or essential characteristics. The described embodiments are to be considered in all respects only as illustrative and not restrictive. The scope of the invention is, therefore, indicated by the appended claims rather than by the foregoing description. All changes which come within the meaning and range of equivalency of the claims are to be embraced within their scope.