Title:
SYSTEM AND METHOD FOR AI VISUAL INSPECTION
Document Type and Number:
WIPO Patent Application WO/2021/062536
Kind Code:
A1
Abstract:
Provided is a system and method for visual inspection. The system may be used in a quality assurance station at a manufacturing facility or site. The system may evaluate and determine the quality of manufactured or fabricated articles. The system may include a mechanical subsystem for capturing images of the article. The system may include a sensor such as a camera for capturing data, such as images. The system may include an artificial intelligence system to determine if the article suffers from an impermissible defect.

Inventors:
BUFI MARTIN (CA)
SHEHATA RAEF (CA)
Application Number:
PCT/CA2020/051306
Publication Date:
April 08, 2021
Filing Date:
September 30, 2020
Assignee:
MUSASHI AUTO PARTS CANADA INC (CA)
International Classes:
G01N21/88; G01N21/89; G06N3/02
Foreign References:
US20190096135A1 (2019-03-28)
US20170249729A1 (2017-08-31)
US20030164952A1 (2003-09-04)
US20170278374A1 (2017-09-28)
Attorney, Agent or Firm:
HINTON, James W. (CA)
Claims:

1. A system for visual inspection of an article, the system comprising: a camera for acquiring image data of an article under inspection; a node computing device for receiving the image data from the camera and analyzing the image data using a defect detection model trained to detect at least one defect type, the defect detection model comprising a machine-learning based object detection model configured to receive the image data as an input and generate defect data describing a detected defect as an output; and a programmable logic controller (“PLC”) device for receiving the defect data from the node computing device and determining whether the defect data is acceptable or unacceptable by comparing the defect data to tolerance data.

2. The system of claim 1, further comprising a robotic arm for autonomously moving the camera to a first imaging position relative to the article under inspection prior to acquiring the inspection image.

3. The system of claim 2, wherein upon determining the defect data is unacceptable, the robotic arm is configured to autonomously move the camera to an article identifier position at which an article identifier on the article under inspection is within the imaging zone of the camera and the camera is configured to acquire an image of the article identifier.

4. The system of claim 1, further comprising a robotic arm for autonomously moving the camera through a predetermined sequence of imaging positions during inspection of the article.

5. The system of claim 1, further comprising an article manipulator configured to engage and rotate the article during inspection.

6. The system of claim 1, wherein the image data comprises a plurality of images acquired during rotation of the article, wherein the node computing device is configured to construct a stitched image from the plurality of images using an image stitching technique, and wherein the image data is provided to the defect detection model as the stitched image.

7. The system of claim 1, further comprising a user control device for receiving data from at least one of the PLC device and the node device, generating a visualization using the received data, and displaying the visualization in a user interface.

8. The system of claim 7, wherein the user interface includes a PLC mode and a node device mode, wherein the PLC mode displays a visualization of PLC device operations and the node device mode displays a visualization of node device operations.

9. The system of claim 8, wherein the user control device is configured to switch between the PLC mode and the node device mode automatically.

10. The system of claim 1, wherein the node computing device is configured to send the defect data to the PLC device upon confirming the detected defect has appeared in a requisite number of consecutive image frames.

11. The system of claim 1, wherein the PLC device is configured to generate a stop inspection command upon determining the defect data is unacceptable.

12. An artificial intelligence (“AI”) visual inspection system comprising: a node computing device configured to: receive image data of an article under inspection; and analyze the image data using a defect detection model trained to detect at least one defect type, the defect detection model comprising a machine learning based object detection model configured to receive the image data as an input and generate defect data describing a detected defect as an output; and a programmable logic controller (“PLC”) device communicatively connected to the node computing device and configured to: receive the defect data from the node computing device; and determine whether the defect data is acceptable or unacceptable by comparing the defect data to tolerance data.

13. The system of claim 12, wherein the defect data includes a defect class corresponding to a defect type, a defect location, and a confidence level.

14. The system of claim 12, wherein the defect detection model comprises a two-stage object detection model.

15. The system of claim 14, wherein the two-stage object detection model includes a Region Proposal Network to generate regions of interest in a first stage, and wherein the regions of interest are sent down a pipeline for object classification and bounding-box regression.

16. The system of claim 12, wherein the defect detection model is configured to perform multiclass classification for classifying instances into one of three or more classes, wherein the classes include at least two defect types.

17. The system of claim 12, wherein the defect detection model has been modified to run at the highest possible frame rate.

18. The system of claim 12, wherein the defect detection model comprises a neural network, and wherein the neural network has been optimized by fusing layers of the neural network to compress the size of the neural network to run on the node computing device.

19. The system of claim 12, wherein the node computing device includes a dual layer of security wherein the node device is encrypted and files on the node device are encrypted.

20. The system of claim 12, wherein the node computing device includes a defect counter for the article that is updated when the PLC device determines the defect data is unacceptable.

21. The system of claim 12, wherein the node computing device is further configured to determine whether the detected defect is a true detection by tracking the defect across consecutive image frames.

22. The system of claim 12, wherein the image data comprises a plurality of images acquired during rotation of the article, wherein the node computing device is configured to construct a stitched image from the plurality of images using an image stitching technique, and wherein the image data is provided to the defect detection model as the stitched image.

23. A method of automated visual inspection of an article using artificial intelligence (“AI”), the method comprising: acquiring image data of an article under inspection using a camera; providing the image data to a node computing device; analyzing the image data at the node computing device using a defect detection model trained to detect at least one defect type, the defect detection model comprising a machine-learning based object detection model configured to receive the image data as an input and generate defect data describing a detected defect as an output; sending the defect data from the node computing device to a programmable logic controller (“PLC”) device; and determining at the PLC device whether the defect data is acceptable or unacceptable by comparing the defect data to tolerance data.

24. The method of claim 23, further comprising tracking the detected defect across consecutive image frames using the node computing device.

25. The method of claim 23, wherein sending the defect data from the node computing device to the PLC device is performed in response to the detected defect being tracked across a requisite number of consecutive image frames.

26. The method of claim 23, wherein the image data comprises a plurality of images acquired during rotation of the article, wherein the method further comprises: constructing a stitched image from the plurality of images using an image stitching technique; and providing the image data to the defect detection model as the stitched image.

27. The method of claim 23, further comprising generating a stop inspection command at the PLC device upon determining the defect data is unacceptable.

28. The method of claim 23, further comprising: generating an alarm command at the PLC device upon determining the defect data is unacceptable; sending the alarm command to an alarm system configured to generate and output an alarm.

29. The method of claim 23, further comprising continuing inspection of the article upon determining the defect data is unacceptable and updating a defect counter for the article to include the detected defect.

30. The method of claim 23, further comprising: upon determining the defect data is unacceptable, autonomously moving the camera to an article identifier position at which an article identifier on the article under inspection is within the imaging zone of the camera; and acquiring an image of the article identifier.

31. The method of claim 30, further comprising linking and storing the article identifier and the defect data in a database.

Description:
SYSTEM AND METHOD FOR AI VISUAL INSPECTION

Technical Field

[0001] The following relates generally to visual inspection, and more particularly to systems and methods for visual defect inspection of an article using artificial intelligence.

Introduction

[0002] Visual inspection can be an important part of product manufacturing. Defective products can be costly. Discovering defective products at an appropriate stage of the process can be an important step for businesses to prevent the sale and use of defective articles and to determine root causes associated with the defects so that such causes can be remedied.

[0003] Existing visual inspection solutions may be difficult to implement in manufacturing and other similar settings where space is limited. Hardware requirements can increase cost and complexity of operations.

[0004] In some cases, visual inspection may still be carried out by human inspectors. For example, the inspection of camshafts may include a human physically picking up the camshaft and examining it using a magnifying tool. Human visual inspection techniques may be limited in accuracy and speed. Human inspection may be prone to missing defects. Further, human inspection may require manually documenting the inspection, including identified defects and related information. Such documentation processes can further slow inspection.

[0005] In some cases of human visual inspection, secondary and tertiary inspections may be performed on articles deemed defective during a primary inspection to confirm results. Such additional inspection steps may increase inspection time per article significantly.

[0006] In the case of camshaft inspection, inspection by a primary inspector may take approximately 45 seconds to 1 minute per camshaft. Secondary inspection may increase time to approximately 1.5 minutes per camshaft. Tertiary inspection may further increase time to approximately 2 minutes and 15 seconds. These inspection times do not include travel times associated with human inspection, which add further delay.

[0007] In the manufacturing industry, many visual inspection solutions, including those using artificial intelligence (AI), use the cloud. Other solutions may use on-premise servers or standard form factor computer parts. Many users do not want to use the cloud, preferring to keep communication locked behind their firewalls in a data silo. When visual inspection is implemented with servers on-premise, the user must bear a large setup cost to obtain and install specific GPUs for AI inference, which may need to be optimized properly. If the AI ingestion pipeline is not optimized properly, many GPUs may be left running without tasks to complete, drawing huge amounts of power while remaining idle. For example, a regular GPU may draw approximately 250W.

[0008] Further, existing approaches may be difficult to scale up to a user’s needs. If the main device fails or breaks, there may be downtime, which can be costly to users.

[0009] Visual inspection solutions are desired that can perform local inference at a level that is suitable for defect detection of manufactured articles and at a speed that meets industrial visual inspection requirements, without incurring significant costs.

[0010] Accordingly, there is a need for an improved system and method for visual inspection that overcomes at least some of the disadvantages of existing systems and methods.

Summary

[0011] Provided is a system and method for visual inspection. The system may be used in a quality assurance station at a manufacturing facility or site. The system may evaluate and determine the quality of manufactured or fabricated articles using any one or more of the methods described herein.

[0012] Provided is a system for visual inspection of an article. The system includes a camera for acquiring image data of an article under inspection, a node computing device for receiving the image data from the camera and analyzing the image data using a defect detection model trained to detect at least one defect type, the defect detection model comprising a machine-learning based object detection model configured to receive the image data as an input and generate defect data describing a detected defect as an output, and a programmable logic controller (“PLC”) device for receiving the defect data from the node computing device and determining whether the defect data is acceptable or unacceptable by comparing the defect data to tolerance data.

[0013] The system may further include a robotic arm for autonomously moving the camera to a first imaging position relative to the article under inspection prior to acquiring the inspection image.

[0014] Upon determining the defect data is unacceptable, the robotic arm may be configured to autonomously move the camera to an article identifier position at which an article identifier on the article under inspection is within the imaging zone of the camera and the camera is configured to acquire an image of the article identifier.

[0015] The system may further include a robotic arm for autonomously moving the camera through a predetermined sequence of imaging positions during inspection of the article.

[0016] The system may further include an article manipulator configured to engage and rotate the article during inspection.

[0017] The image data may include a plurality of images acquired during rotation of the article, wherein the node computing device is configured to construct a stitched image from the plurality of images using an image stitching technique, and wherein the image data is provided to the defect detection model as the stitched image.
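One plausible implementation of the stitching step described above is sketched below using OpenCV's high-level Stitcher in SCANS mode (suited to frames related mostly by translation, as with a rotating part); the specification does not name a stitching library or algorithm, so this is an illustration rather than the patented method.

```python
# One plausible implementation of the stitching step described above,
# using OpenCV's Stitcher in SCANS mode. The specification does not
# name a stitching library; this sketch is an assumption.
import cv2

def stitch_rotation_images(images):
    """Combine frames captured during article rotation into one image."""
    stitcher = cv2.Stitcher_create(cv2.Stitcher_SCANS)
    status, stitched = stitcher.stitch(images)
    if status != cv2.Stitcher_OK:
        raise RuntimeError(f"stitching failed with status {status}")
    return stitched
```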

[0018] The system may further include a user control device for receiving data from at least one of the PLC device and the node device, generating a visualization using the received data, and displaying the visualization in a user interface.

[0019] The user interface may include a PLC mode and a node device mode, wherein the PLC mode displays a visualization of PLC device operations and the node device mode displays a visualization of node device operations.

[0020] The user control device may be configured to switch between the PLC mode and the node device mode automatically.

[0021] The node computing device may be configured to send the defect data to the PLC device upon confirming the detected defect has appeared in a requisite number of consecutive image frames.

[0022] The PLC device may be configured to generate a stop inspection command upon determining the defect data is unacceptable.

[0023] Provided is an artificial intelligence (“AI”) visual inspection system comprising a node computing device configured to receive image data of an article under inspection, and analyze the image data using a defect detection model trained to detect at least one defect type, the defect detection model comprising a machine-learning based object detection model configured to receive the image data as an input and generate defect data describing a detected defect as an output. The AI visual inspection system includes a programmable logic controller (“PLC”) device communicatively connected to the node computing device and configured to receive the defect data from the node computing device, and determine whether the defect data is acceptable or unacceptable by comparing the defect data to tolerance data.

[0024] The defect data may include a defect class corresponding to a defect type, a defect location, and a confidence level.
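For illustration only, the defect data fields named above could be carried in a simple structure such as the following hypothetical container; the specification does not prescribe a concrete encoding or field types.

```python
# Hypothetical container for the defect data fields named above; the
# specification does not prescribe a concrete encoding or field types.
from dataclasses import dataclass
from typing import Tuple

@dataclass
class DefectData:
    defect_class: str                    # defect type, e.g. "porosity"
    location: Tuple[int, int, int, int]  # bounding box (x, y, w, h) in pixels
    confidence: float                    # model confidence, e.g. 0.0-1.0
```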

[0025] The defect detection model may include a two-stage object detection model.

[0026] The two-stage object detection model may include a Region Proposal Network to generate regions of interest in a first stage, and wherein the regions of interest are sent down a pipeline for object classification and bounding-box regression.

[0027] The defect detection model may be configured to perform multiclass classification for classifying instances into one of three or more classes, wherein the classes include at least two defect types.

[0028] The defect detection model may be modified to run at the highest possible frame rate.

[0029] The defect detection model may include a neural network, and wherein the neural network has been optimized by fusing layers of the neural network to compress the size of the neural network to run on the node computing device.

[0030] The node computing device may include a dual layer of security wherein the node device is encrypted and files on the node device are encrypted.
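A minimal sketch of the layer fusion described in paragraph [0029] follows, using PyTorch's eager-mode fusion utility; the framework, module names, and architecture are assumptions, as the specification does not identify the network actually deployed on the node device.

```python
# Minimal sketch of layer fusion, per paragraph [0029]. Framework and
# architecture are assumptions; the specification does not name them.
import torch
from torch import nn

class TinyBackbone(nn.Module):
    def __init__(self):
        super().__init__()
        self.conv = nn.Conv2d(3, 16, 3, padding=1)
        self.bn = nn.BatchNorm2d(16)
        self.relu = nn.ReLU()

    def forward(self, x):
        return self.relu(self.bn(self.conv(x)))

model = TinyBackbone().eval()  # fusion requires evaluation mode
# Fold conv + batch-norm + relu into a single module, shrinking the
# graph and reducing per-layer overhead at inference time.
fused = torch.ao.quantization.fuse_modules(model, [["conv", "bn", "relu"]])
```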

[0031] The node computing device may include a defect counter for the article that is updated when the PLC device determines the defect data is unacceptable.

[0032] The node computing device may be further configured to determine whether the detected defect is a true detection by tracking the defect across consecutive image frames.
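The consecutive-frame confirmation described above might be implemented with a simple streak counter, as in the following hypothetical sketch; the threshold and the per-frame matching rule are assumptions, not values taken from the specification.

```python
# Hypothetical sketch of consecutive-frame confirmation: a detection
# counts as a true defect only after persisting for a requisite number
# of consecutive frames. The threshold is an assumed value.
REQUISITE_FRAMES = 5  # assumed confirmation threshold

class DefectTracker:
    def __init__(self) -> None:
        self.streak = 0

    def update(self, detected_this_frame: bool) -> bool:
        """Return True once the defect is confirmed as a true detection."""
        self.streak = self.streak + 1 if detected_this_frame else 0
        return self.streak >= REQUISITE_FRAMES
```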

[0033] The image data may include a plurality of images acquired during rotation of the article, wherein the node computing device is configured to construct a stitched image from the plurality of images using an image stitching technique, and wherein the image data is provided to the defect detection model as the stitched image.

[0034] Provided is a method of automated visual inspection of an article using artificial intelligence (“AI”). The method may include acquiring image data of an article under inspection using a camera, providing the image data to a node computing device, analyzing the image data at the node computing device using a defect detection model trained to detect at least one defect type, the defect detection model comprising a machine-learning based object detection model configured to receive the image data as an input and generate defect data describing a detected defect as an output, sending the defect data from the node computing device to a programmable logic controller (“PLC”) device, and determining at the PLC device whether the defect data is acceptable or unacceptable by comparing the defect data to tolerance data.

[0035] The method may further include tracking the detected defect across consecutive image frames using the node computing device.

[0036] Sending the defect data from the node computing device to the PLC device may be performed in response to the detected defect being tracked across a requisite number of consecutive image frames.

[0037] The image data may include a plurality of images acquired during rotation of the article. The method may further include constructing a stitched image from the plurality of images using an image stitching technique, and providing the image data to the defect detection model as the stitched image. The method may further include generating a stop inspection command at the PLC device upon determining the defect data is unacceptable. The method may further include generating an alarm command at the PLC device upon determining the defect data is unacceptable and sending the alarm command to an alarm system configured to generate and output an alarm.

[0038] The method may further include continuing inspection of the article upon determining the defect data is unacceptable and updating a defect counter for the article to include the detected defect.

[0039] The method may further include upon determining the defect data is unacceptable, autonomously moving the camera to an article identifier position at which an article identifier on the article under inspection is within the imaging zone of the camera; and acquiring an image of the article identifier.

[0040] The method may further include linking and storing the article identifier and the defect data in a database.

[0041] Other aspects and features will become apparent, to those ordinarily skilled in the art, upon review of the following description of some exemplary embodiments.

Brief Description of the Drawings

[0042] The drawings included herewith are for illustrating various examples of articles, methods, and apparatuses of the present specification. In the drawings:

[0043] Figure 1 is a schematic diagram of an AI visual inspection system, according to an embodiment;

[0044] Figure 2 is a schematic diagram of an imaging unit of the visual inspection system of Figure 1, according to an embodiment;

[0045] Figure 3 is a flowchart of a visual inspection sequence, according to an embodiment;

[0046] Figure 4 is a flowchart of a method of a visual inspection sequence, according to an embodiment;

[0047] Figure 5 is a flowchart of a method of running the network 156 of the node device 148 of Figure 1 in real-time, according to an embodiment;

[0048] Figure 6 is a flowchart of a method of object detection performed by the visual inspection system of Figure 1, according to an embodiment;

[0049] Figure 7 is an example image of an inspected camshaft generated by the visual inspection system of Figure 1, according to an embodiment;

[0050] Figure 8 is an example image of an inspected camshaft generated by the visual inspection system of Figure 1, according to an embodiment;

[0051] Figure 9 is an example image of an inspected camshaft generated by the visual inspection system of Figure 1, according to an embodiment;

[0052] Figure 10 is an example image of an inspected camshaft generated by the visual inspection system of Figure 1, according to an embodiment;

[0053] Figure 11A is a first perspective view of a mechanical inspection subsystem, according to an embodiment;

[0054] Figure 11B is a second perspective view of the mechanical inspection subsystem of Figure 11A, according to an embodiment;

[0055] Figure 11C is a top view of the mechanical inspection subsystem of Figure 11A, according to an embodiment;

[0056] Figure 11D is a front view of the mechanical inspection subsystem of Figure 11A, according to an embodiment;

[0057] Figure 11E is a back view of the mechanical inspection subsystem of Figure 11A, according to an embodiment;

[0058] Figure 11F is a side view of the mechanical inspection subsystem of Figure 11A, according to an embodiment;

[0059] Figure 11G is a front view of an imaging unit of the mechanical inspection subsystem of Figure 11A, according to an embodiment;

[0060] Figure 11H is a side view of the imaging unit of Figure 11A, according to an embodiment;

[0061] Figure 11I is a front view of an article holder and article manipulator of the mechanical inspection subsystem of Figure 11A, according to an embodiment;

[0062] Figure 11J is an example parts table for the mechanical inspection subsystem of Figure 11A, according to an embodiment;

[0063] Figure 12 is a front view of a mechanical inspection subsystem of the present disclosure, according to an embodiment;

[0064] Figure 13 is a flowchart of a method of real-time streaming video analysis, according to an embodiment;

[0065] Figure 14A is a perspective view of a mechanical inspection subsystem using robotic automation, according to an embodiment;

[0066] Figure 14B is a top view of the mechanical inspection subsystem of Figure 14A;

[0067] Figure 14C is a front view of the mechanical inspection subsystem of Figure 14A;

[0068] Figure 14D is a side view of the mechanical inspection subsystem of Figure 14A; and

[0069] Figure 15 is a block diagram illustrating communication between various components of a visual inspection system including a production machine and camshaft AI visual inspection machine, according to an embodiment.

Detailed Description

[0070] Various apparatuses or processes will be described below to provide an example of each claimed embodiment. No embodiment described below limits any claimed embodiment and any claimed embodiment may cover processes or apparatuses that differ from those described below. The claimed embodiments are not limited to apparatuses or processes having all of the features of any one apparatus or process described below or to features common to multiple or all of the apparatuses described below.

[0071] One or more systems described herein may be implemented in computer programs executing on programmable computers, each comprising at least one processor, a data storage system (including volatile and non-volatile memory and/or storage elements), at least one input device, and at least one output device. For example, and without limitation, the programmable computer may be a programmable logic unit, mainframe computer, server, personal computer, cloud-based program or system, laptop, personal digital assistant, cellular telephone, smartphone, or tablet device.

[0072] Each program is preferably implemented in a high-level procedural or object-oriented programming and/or scripting language to communicate with a computer system. However, the programs can be implemented in assembly or machine language, if desired. In any case, the language may be a compiled or interpreted language. Each such computer program is preferably stored on a storage media or a device readable by a general or special purpose programmable computer for configuring and operating the computer when the storage media or device is read by the computer to perform the procedures described herein.

[0073] A description of an embodiment with several components in communication with each other does not imply that all such components are required. On the contrary, a variety of optional components are described to illustrate the wide variety of possible embodiments of the present invention.

[0074] Further, although process steps, method steps, algorithms or the like may be described (in the disclosure and/or in the claims) in a sequential order, such processes, methods and algorithms may be configured to work in alternate orders. In other words, any sequence or order of steps that may be described does not necessarily indicate a requirement that the steps be performed in that order. The steps of processes described herein may be performed in any order that is practical. Further, some steps may be performed simultaneously.

[0075] When a single device or article is described herein, it will be readily apparent that more than one device/article (whether or not they cooperate) may be used in place of a single device/article. Similarly, where more than one device or article is described herein (whether or not they cooperate), it will be readily apparent that a single device/article may be used in place of the more than one device or article.

[0076] Systems and methods for visual inspection are provided herein. The system can be used to inspect an article. The systems include a nodal architecture. The systems may perform single or two stage object detection. Single stage object detectors may require less compute power but provide reduced accuracy as compared to two stage detectors. Two stage detectors are more costly in compute resources in exchange for much higher accuracy, and deploying them has been one of the biggest challenges facing AI on small, embedded devices. The systems provided herein describe the use of object detectors on low power, small compute edge devices.
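As a hedged illustration of two-stage detection of the kind described above, the following sketch runs inference with torchvision's Faster R-CNN, which pairs a Region Proposal Network first stage with classification and box regression; the specification does not state which model family the system actually uses.

```python
# Hedged sketch of two-stage object detection inference using
# torchvision's Faster R-CNN (RPN first stage, then classification and
# box regression). The model family is an assumption, not the patent's.
import torch
from torchvision.models.detection import fasterrcnn_resnet50_fpn

model = fasterrcnn_resnet50_fpn(weights="DEFAULT").eval()

frame = torch.rand(3, 480, 640)          # stand-in for a camera frame
with torch.inference_mode():
    output = model([frame])[0]           # dict of boxes, labels, scores

for box, label, score in zip(output["boxes"], output["labels"],
                             output["scores"]):
    if score > 0.5:                      # assumed confidence cutoff
        print(label.item(), round(score.item(), 3), box.tolist())
```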

[0077] Referring now to Figure 1, shown therein is a visual inspection system 100, according to an embodiment. The visual inspection system 100 is configured to perform defect detection using artificial intelligence techniques.

[0078] The visual inspection system 100 is located at an inspection station 108.

[0079] The inspection station 108 may be a quality assurance station at a manufacturing facility or site. The quality assurance station may be a station or location at the manufacturing site that is used to evaluate and determine the quality of manufactured or fabricated articles. The quality assurance station may be a separate station designated for evaluating quality or may be integrated within other parts of the manufacturing process (e.g. on a conveyor or the like).

[0080] The inspection station 108 may include an automatic transport mechanism, such as a conveyor belt, for transporting articles to and from the inspection station 108.

[0081] In variations, the system 100 can advantageously be configured to be positioned inline with respect to existing processes at the manufacturing facility. The inspection station 108 may be a retrofitted conveyor-type system adapted to include components of the system 100. In an example, one or more aspects of the system 100 (e.g. mechanical inspection subsystem 114, computing system 116, each described below) may be integrated with the automatic transport mechanism such that the visual inspection feature provided by the system 100 can be integrated into the existing processes at the site with minimal impact to operations.

[0082] While examples described herein may refer to the integration and use of system 100 (and other systems described herein) at a manufacturing site or facility, it is to be understood that the system 100 may be used at any site where visual inspection occurs.

[0083] The visual inspection system 100 is used to inspect an article 110.

[0084] The system 100 inspects the article 110 and determines whether the article 110 has a defect. Articles 110 may be classified as defective or non-defective by the system 100.

[0085] By identifying articles 110 as defective or non-defective, the inspected articles can be differentially treated based on the outcome of the visual inspection. Defective articles 110 may be discarded or otherwise removed from further processing. Non-defective articles 110 may continue with further processing.

[0086] Generally, the article 110 is an article in which defects are undesirable. Defects in the article 110 may lead to reduced functional performance of the article 110 or of a larger article (e.g. system or machine) of which the article 110 is a component. Defects in the article 110 may reduce the visual appeal of the article. Discovering defective products can be an important step for a business to prevent the sale and use of defective articles and to determine root causes associated with the defects so that such causes can be remedied.

[0087] The article 110 may be a fabricated article. The article 110 may be a manufactured article that is prone to developing defects during the manufacturing process. The article 110 may be an article which derives some value from visual appearance and on which certain defects may negatively impact the visual appearance. Defects in the article 110 may develop during manufacturing of the article 110 itself or some other process (e.g. transport, testing).

[0088] The article 110 may be composed of one or more materials, such as metal, steel, plastic, composite, wood, glass, etc.

[0089] The article 110 may be uniform or non-uniform in size and shape.

[0090] The article 110 may include a plurality of sections. Article sections may be further divided into article subsections. The article sections (or subsections) may be determined based on the appearance or function of the article. The article sections may be determined to facilitate better visual inspection of the article 110 and to better identify unacceptably defective articles.

[0091] The article sections may correspond to different parts of the article 110 having different functions. Different sections may have similar or different dimensions. In some cases, the article 110 may include a plurality of different section types, with each section type appearing one or more times in the article 110. The sections may be regularly or irregularly shaped. Different sections may have different defect specifications (i.e. tolerance for certain defects).

[0092] The article 110 may be a stand-alone article that is intended for use on its own or may be a component of a bigger article (e.g. a machine part of a larger machine).

[0093] The article 110 may be prone to multiple types or classes of defects detectable using the system 100. Example defect types may include paint, porosity, dents, scratches, sludge, etc. Defect types may vary depending on the article 110. For example, the defect types may be particular to the article 110 based on the manufacturing process or material composition of the article 110. Defects in the article 110 may be acquired during manufacturing itself or through subsequent processing of the article 110.

[0094] The article 110 may be a power train part for automobiles and/or motorcycles. The power train part may be a camshaft, differential assembly, transmission gear, linkage, suspension part, or a part or component of any of the foregoing.

[0095] In a particular embodiment of system 100, the article 110 is a camshaft. Generally, a camshaft is a shaft to which a cam is fastened or of which a cam forms an integral part. The camshaft can be used as a mechanical component of an internal combustion engine. The camshaft opens and closes inlet and exhaust valves of the engine at an appropriate time and in a precisely defined sequence.

[0096] The camshaft may include a plurality of different components or parts, with each different component providing a particular function and having particular dimensions. Camshaft components may include any one or more of journals, lobes, ends, and bearings.

[0097] The camshaft may be divided into sections to inspect using the system 100. The sections may correspond to the different components of the camshaft (e.g. lobe section, journal section). The camshaft may have 10 to 17 sections to inspect. The 10 to 17 sections may include lobes and journals.

[0098] The system 100 may be configured to acquire in the range of 650 to 1700 images per camshaft. The system 100 may acquire in the range of 50-100 images per section. There may be 13 to 17 sections per camshaft. For a section, images may be acquired at every 7.2 degrees of rotation of the camshaft. This may allow for a 360-degree rotation every 2 seconds while still allowing the AI algorithm enough degrees of freedom to properly track defects across the 50-100 images in each section.
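The figures in paragraph [0098] can be verified with simple arithmetic, as in the short calculation below: one frame every 7.2 degrees yields 50 frames per full rotation, and 50-100 frames across 13-17 sections spans the stated 650-1700 images.

```python
# Worked check of the figures in paragraph [0098].
frames_per_rotation = 360 / 7.2        # 50 frames per 360-degree pass
low_total = 50 * 13                    # 650 images per camshaft
high_total = 100 * 17                  # 1700 images per camshaft
print(frames_per_rotation, low_total, high_total)
```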

[0099] The camshaft may include a cylindrical rod running the length of the cylinder bank with a plurality of oblong lobes protruding from the rod. The number of lobes may correspond to the number of valves (e.g. one lobe for each valve). The main journals keep the camshaft in place as it spins around in the engine bay.

[0100] The camshaft may be composed of any suitable camshaft material. The camshaft may be made of chilled iron casting or billet steel.

[0101] The camshaft may be prone to developing defects whether through the manufacturing process or other processing of the camshaft or its components. Defect types may include paint, porosity, sludge, or the like. Different components, or sections, of the camshaft may develop different types of defects. Defects may affect or impact different components differently. This may be the case because different components perform different functions and thus may have different tolerances for defects. For example, defects to lobes and journals may be treated differently and have different tolerances. In some cases, subsections of a component may have different specifications. For example, lobes (which are egg-shaped) may have different defect specifications based on the location of the defect on the lobe.

[0102] In some cases, the camshaft may have porosity specifications as small as 0.4 mm. Porosity specifications may depend on location. As an example, see Table A below:

TABLE A

[0103] The camshaft sections may have non-uniform shapes. This may produce irregular light reflections. Occasionally, the camshaft may be covered with a clear rust inhibitor. The camshaft may experience random occurrences of sludge or dark oil drops. In some cases, sludge may cover a porosity. The defects (e.g. porosity, sludge) may have different sizes and shapes.

[0104] The visual inspection system 100 includes a mechanical inspection subsystem 114 and a computing system 116. The mechanical inspection subsystem 114 and computing system 116 work together to inspect the article 110.

[0105] Components of the mechanical inspection subsystem 114 (e.g. actuators, motors, imaging unit) may be installed on a conveyor. Some modification may be made to the components to adapt them onto the conveyor. Such a setup may be used, for example, with article lines having out-conveyors.

[0106] The mechanical inspection subsystem 114 includes an article holder 118 for holding the article 110 prior to inspection. During inspection, the article 110 is lifted off the article holder 118.

[0107] The article holder 118 includes first and second article holders 118a, 118b. The article holders 118a, 118b are shown in Figure 1 as being positioned at or near a first end 119a and a second end 119b of the article 110, respectively.

[0108] The number and positioning of the article holder 118 (or article holders 118) may vary depending on the article 110 (e.g. shape, dimensions, weight, etc.) or other design choices. Any suitable number of article holders 118 may be used.

[0109] The article holder 118 may be modified, as necessary, to accommodate articles 110 of different shapes, sizes, weights, materials, etc.

[0110] Other variations of the system 100 may not include an article holder 118, such as where the article 110 is inspected by the system 100 while on a conveyor belt or the like. In such cases, the conveyor belt or other feature may act as the article holder 118.

[0111] The mechanical inspection subsystem 114 includes an article manipulator 120.

[0112] The article manipulator 120 engages the article 110 and manipulates the article 110 (i.e. moves the article) to facilitate inspection. The article manipulator 120 manipulates the article 110 in a predefined manner based on the article being inspected and the imaging setup.

[0113] The article manipulator 120 is adapted to hold the article 110 in position while the article 110 is inspected by the system 100 and in particular as the article 110 is imaged.

[0114] The article manipulator 120 may be configured to permit or restrict certain types of movement of the article 110 during the inspection. For example, the article manipulator 120 may restrict movement of the article 110 in x, y, and z directions, while permitting rotational movement 125 of the article 110 along a central axis of the article 110.

[0115] The manner in which the article manipulator 120 moves the article 110 (e.g. rotationally, horizontally, vertically, etc.) may depend on a variety of factors such as the shape of the article 110 and the positioning of imaging units.

[0116] Generally, the article manipulator 120 manipulates the article 110 to position a portion of the article 110 in an imaging zone of an imaging unit (e.g. imaging zone 124 of imaging unit 122, described below). In doing so, the article manipulator 120 may facilitate imaging and inspection of the entire article 110 (or an entire article section), such as by exposing previously unexposed sections or subsections of the article 110 to the imaging equipment.

[0117] The article manipulator 120 may be configured to manipulate the article 110 in a series of steps, such as by use of a stepper motor.

[0118] The article manipulator 120 includes a first article manipulator 120a and a second article manipulator 120b.

[0119] The first and second article manipulators 120a, 120b may be configured to move along a line of motion 123. Movement of the manipulators 120a, 120b may be driven by a motor.

[0120] The manipulators 120a, 120b move towards the article 110 along line of motion 123 to promote engagement with the article 110 prior to the start of inspection.

[0121] Movement of the manipulators 120a, 120b along line of motion 123 away from the article 110 occurs when inspection of the article 110 is complete and promotes disengagement of the article 110 and engagement of a new article (when loaded).

[0122] The first article manipulator 120a engages with a first end 119a of the article 110. The second article manipulator 120b engages with a second end 119b of the article 110. The first and second article manipulators 120a, 120b are connected to a motor that drives rotational motion of the manipulators 120a, 120b, which causes the rotation of the article 110 along line of motion 125.

[0123] While the embodiment of Figure 1 shows two article manipulators, any number of article manipulators may be used and may be dictated by the article 110 (size, weight, dimensions) and the type of motion to be imparted to the article 110.

[0124] In other embodiments, the article manipulator 120 may include a robot or rotating jig.

[0125] The mechanical inspection system 114 may include an article loader (not shown in Figure 1). The article loader is configured to automatically load and unload the article 110 for inspection. Loading and unloading the article 110 may include loading the article 110 onto the article holder 118 and unloading the article from the article holder 118, respectively.

[0126] In an embodiment, the article loader may be a robotic device (e.g. a packing robot). The robotic device may drop off the article 110 for inspection, pick up the article 110 after inspection, and deliver the article to either a defective article chute (if defects are present) or to a tote or tray (if no defects).

[0127] In some cases, the article loader may load the article 110 automatically and the article 110 is unloaded by a human operator. Such may be the case where the inspection is the last process before packing.

[0128] The type of article loader used by the system 100 may depend on the production line and the type of article 110 being inspected.

[0129] The mechanical inspection subsystem 114 also includes an imaging unit 122. The imaging unit 122 captures images of the article 110 (or a section thereof). The captured images are provided to and analyzed by the computing system 116.

[0130] The imaging unit 122 has an imaging zone 124. The imaging zone 124 corresponds to an area from which image data is acquired by the imaging unit 122 and may correspond to the field of view (FOV) of the imaging unit 122.

[0131] In an embodiment, the article 110 is imaged by article section. In such a case, the imaging unit 122 is positioned first where a first section of the article 110 is in the imaging zone 124 and an image of the first section is captured. A second section of the article 110 is then positioned in the imaging zone 124 (whether by moving the imaging unit 122 or the article 110) and an image of the second section is captured. The process can be repeated for successive sections of the article 110 until the whole article 110 has been imaged. In some cases, multiple images of each section may be captured. The multiple images of a section may compose a sequence of images. For example, multiple images of the first section may be captured followed by the capture of multiple images of the second section. In some cases, the multiple images may represent a 360-degree imaging of the article 110. In such cases, each of the multiple images includes a portion of the article 110 not previously imaged.
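The per-section imaging sequence of paragraph [0131] can be pictured as a nested loop, as in the following illustrative sketch; the helper functions are hypothetical stand-ins for camera and motor operations, not an API from the specification.

```python
# Illustrative sketch of the section-by-section imaging sequence in
# paragraph [0131]. The hardware helpers are hypothetical stubs; the
# specification defines no such API.
def move_to_section(section: int) -> None:
    print(f"positioning section {section} in the imaging zone")

def rotate_one_step() -> None:
    pass  # advance the rotational motor by one increment (e.g. 7.2 degrees)

def capture_frame() -> bytes:
    return b""  # stand-in for image data returned by the camera

def image_article(num_sections: int = 16, frames_per_section: int = 50) -> None:
    # Image each section in turn, rotating the article between frames so
    # that every capture exposes a portion not previously imaged.
    for section in range(num_sections):
        move_to_section(section)
        for _ in range(frames_per_section):
            frame = capture_frame()
            rotate_one_step()
            # frame would be passed to the defect detection model here

image_article()
```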

[0132] The imaging unit 122 includes a lighting mechanism 126, a camera 128, and a communication subsystem/interface 130.

[0133] The lighting mechanism 126 shines light on the article 110. The lighting mechanism 126 may focus or limit the shined light to the imaging zone 124. The lighting mechanism 126 may be controlled by a lighting controller in communication with the lighting mechanism 126.

[0134] The lighting mechanism 126 may be configured to shine homogenous or uniform lighting on the article 110 in the imaging zone 124. The lighting mechanism 126 may also be configured to minimize or eliminate shadowing on the article 110.

[0135] The lighting mechanism 126 may use one or more types of lighting, such as tunnel lighting, coaxial lighting, or dome lighting.

[0136] In an embodiment, the lighting mechanism 126 includes tunnel lighting and coaxial lighting. In another embodiment, the lighting mechanism 126 may include only tunnel lighting.

[0137] The type or types of lighting provided by the lighting mechanism 126 may be selected to provide uniform lighting on the article 110 in the imaging zone 124 and to avoid shadowing.

[0138] The camera 128 captures image data of the imaging zone 124, including the article 110 positioned in the imaging zone 124, and stores the image data in memory.

[0139] The camera 128 may be an area scan camera. The area scan camera may be a BASLER area scan camera (e.g. Basler acA1920-40uc). The area scan camera contains a rectangular sensor with more than one line of pixels, which are exposed simultaneously.

[0140] In an embodiment, the system 100 may have a plurality of cameras 128. Using multiple cameras 128 may reduce inspection time. The cameras 128 may each have their own imaging unit 122. In a particular case, the system 100 may have two cameras with each camera responsible for imaging half the article sections. In a camshaft visual inspection example, the cameras may be positioned on each end of the camshaft and responsible for imaging half the sections (e.g. 8 of 16 total sections). Overall inspection time for the camshaft may be reduced by half using the two-camera inspection. Defect detection may be performed as independent processes for images obtained by each of the cameras.
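As one hypothetical way to acquire frames from a Basler area scan camera such as the acA1920-40uc mentioned in paragraph [0139], the sketch below uses Basler's pypylon package; the specification does not identify the acquisition software actually used.

```python
# Hypothetical single-frame grab from a Basler area scan camera using
# Basler's pypylon package. The acquisition software is an assumption.
from pypylon import pylon

camera = pylon.InstantCamera(
    pylon.TlFactory.GetInstance().CreateFirstDevice())
camera.Open()
camera.StartGrabbingMax(1)  # acquire a single frame

grab = camera.RetrieveResult(5000, pylon.TimeoutHandling_ThrowException)
if grab.GrabSucceeded():
    image = grab.Array      # pixel data as a numpy array
    print(image.shape)
grab.Release()
camera.Close()
```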

[0141] The camera 128 is connected to the computing system 116 via the communication subsystem 130. The communication subsystem/interface 130 may include a communication interface of the camera 128 and a data transfer connector (e.g. USB 3.0 cable).

[0142] The communication subsystem 130 transfers data to the computing system 116. Data transferred to the computing system 116 includes the image data generated by the camera 128.

[0143] Referring now to Figure 2, shown therein is a front view of the imaging unit 122, according to an embodiment.

[0144] The imaging unit 122 includes lighting mechanism 126. The lighting mechanism 126 includes an LED tunnel light 210 and an LED coaxial light 212 positioned above the tunnel light 210. In another embodiment, the tunnel light 210 may be replaced with a dome light. The dome light may be attached to a robotic device, such as a robotic arm (e.g. robotic arm 1484 of Figure 14).

[0145] The tunnel light 210 may be used to provide diffuse lighting for elongated and cylindrical articles 110 (e.g. camshaft). The diffuse lighting may reduce shadows and provide uniform lighting on the article 110.

[0146] The coaxial light 212 may help image a highly reflective article 110 (e.g. composed of steel). The light is generated by a diffuse back light and directed at the article 110 using a half-mirror so the lighting axis is the same as the optical axis of the camera 128. This may produce a shadow-free image and may eliminate reflections.

[0147] The tunnel light 210 is connected to a tunnel light controller 214 via a first illumination cable 220. The tunnel lighting controller 214 controls the operation of the tunnel light 210 including the amount of light provided.

[0148] The coaxial light 212 is connected to a coaxial light controller 220 via a second illumination cable 218. The coaxial lighting controller 220 controls the operation of the coaxial light 212 including the amount of light provided.

[0149] Positioned above the coaxial light 212 is a lens 222. The lens 222 may be a high-resolution lens suitable for installation in a machine vision system. The lens 222 may be a 9-megapixel lens.

[0150] The lens 222 is connected to the camera 128. The camera 128 is an area scan camera. The camera 128 captures image data of the article 110.

[0151] The camera 128 transfers the captured image data to the node device 148 for object detection via a communication cable 224. The communication cable 224 may be a USB 3.0 cable.

[0152] Referring again to Figure 1, the imaging unit 122 is connected to an imaging unit manipulator 132. The imaging unit manipulator 132 moves the imaging unit 122 relative to the article 110 in order to position the article 110 (or a portion thereof) in the imaging zone 124 of the imaging unit 122.

[0153] The imaging unit manipulator 132 includes a frame 134. The frame 134 includes upright supports 136a, 136b and a track 138 fixedly attached to the supports 136a, 136b. As shown in Figure 1, the track may be arranged substantially parallel to the article 110.

[0154] The imaging unit manipulator 132 includes an imaging unit mount 140 that is slidably attached to the track 138. The imaging unit mount 140 is configured to connect to the imaging unit 122 and secure the imaging unit 122 to the frame 134 such that the imaging unit 122 can be moved along the track 138 via the mount 140.

[0155] In the embodiment shown in Figure 1, the imaging unit 122 moves along the track 138 in a horizontal direction along line of motion 142 relative to the article 110. In variations, the imaging unit manipulator 132 may be configured to move the imaging unit 122 in a vertical direction relative to the article 110 or in vertical and horizontal directions relative to the article 110.

[0156] In other embodiments, the imaging unit 122 may be stationary and the article 110 moved (for example, via the article manipulator 120 or a conveyor belt) in order to image different sections of the article 110.

[0157] Arranging the imaging unit manipulator 132 such that the imaging unit 122 can move in the horizontal direction 142 relative to the article 110 may allow the imaging unit 122 to conveniently capture image data of a plurality of sections of the article 110 along a horizontal axis.

[0158] The imaging unit manipulator 132 may operate in coordination with the article manipulator 120 to capture images of the entire article.

[0159] The imaging unit manipulator 132 also includes a motor for moving the imaging unit 122. The motor may be a stepper motor. In an embodiment, the imaging unit manipulator 132 includes an actuator for moving the imaging unit 122 vertically (up and down) and a stepper motor for moving the imaging unit 122 horizontally along the length of the article 110. The actuator may be a pneumatic cylinder.

[0160] In another embodiment of the system 100, the mechanical inspection subsystem 114 includes a robotic system for imaging the article 110. The robotic system may include a robotic arm and a robotic arm controller comprising a computing device. In an embodiment, the robotic system may be the robotic system 1482 of Figure 14. The imaging unit 122 (including camera 128 and lighting mechanism 126) may be mounted or otherwise attached to the robotic arm. The robotic arm may replace any one or more of the imaging unit manipulator 132, the frame 134, the upright supports 136, the track 138, and the imaging unit mount 140. The robotic arm may be capable of moving in three dimensions (i.e. not strictly along line of motion 142). Generally, the robotic arm is used to move the imaging unit 122 (camera 128 and lighting mechanism 126) relative to the article 110 being inspected to capture images of multiple sections of the article, such as along a length of the article 110. The robotic arm may function autonomously (i.e. may perform movements, such as moving the camera/imaging unit, autonomously). For example, movements of the robotic arm performed during imaging of the article 110 may be programmed into the robotic arm controller. Programmed movements may vary for different shapes and sizes of articles. Robotic arm movements for imaging an article 110 may include a sequence of movements that are performed for each article 110. The robotic system may include a communication interface such as an Ethernet/IP module (e.g. Ethernet/IP module 1512 of Figure 15) for communicating with other system 100 components. For example, the robotic system may communicate with a PLC of a production machine or the PLC 146 of the visual inspection system 100 using the Ethernet/IP module.

[0161] The visual inspection system 100 includes a computing system 116 for performing various control, analysis, and visualization functions of the system 100. The computing system 116 is communicatively connected to the mechanical inspection system 114 to facilitate data transfer between the computing system 116 and the mechanical inspection subsystem 114 to perform visual inspection.

[0162] The computing system 116 includes a programmable logic controller (PLC) 146, a node device 148, and a user control device 150. The devices 146, 148, 150 are communicatively connected to each other, which may include wired or wireless connections. The devices 146, 148, 150 may be connected to each other via a network 152, such as a wired or wireless network. The network 152 may include direct wired connections between components as described herein. Communication of aspects of the mechanical inspection subsystem 114 with other components such as the PLC 146 via the network 152 is represented by a solid line from the mechanical inspection system 114 to the network 152.

[0163] The PLC 146 is a digital computer adapted to automate electromechanical processes of the visual inspection system 100. The PLC 146 may provide high reliability control and ease of programming and process fault diagnosis.

[0164] The PLC 146 includes a memory (e.g. memory such as RAM or flash) for storing PLC computer programs configured to control various components and operations of the visual inspection system 100 and a processor for executing the PLC computer programs. The PLC 146 includes a communication interface for transmitting and receiving data (e.g. defect data) to and from other components of the system 100.

[0165] The PLC computer programs include computer-executable instructions for controlling the movement and operation of one or more components of the mechanical inspection subsystem 114, interfacing with the node device 148 and user control device 150, and processing defect data received from the node device 148.

[0166] The PLC 146 is also configured to perform an inspection sequence, such as the inspection sequence 400 of Figure 4.

[0167] The PLC 146 controls a plurality of actuators and motors via the PLC computer programs. The actuators and motors move components of the mechanical inspection subsystem 114. Any suitable number of actuators and motors may be used.

[0168] The PLC 146 may control the actuators and stepper motors in either manual or automatic modes.

[0169] The actuator may be a cylinder, such as a pneumatic cylinder.

[0170] The motor may be a stepper motor.

[0171] System 100 includes three actuators: a first actuator, a second actuator, and a third actuator.

[0172] The first actuator is for horizontal movement. On instruction from the PLC 146, the first actuator moves the article manipulator 120a horizontally towards and away from the first end 119a of the article.

[0173] The second actuator is for horizontal movement. On instruction from the PLC, the second actuator moves the article manipulator 120b (including a stepper motor) horizontally towards and away from the second end 119b of the article 110.

[0174] The third actuator is for vertical movement. On instruction from the PLC 146, the third actuator moves the imaging unit 122 up and down relative to the article 110.

[0175] The mechanical inspection subsystem 114 also includes two stepper motors including a horizontal movement motor and a rotational movement motor.

[0176] On instruction from the PLC 146, the horizontal movement motor moves the imaging unit 122 horizontally along the track 138. The imaging unit 122 may be moved in steps corresponding to imaging positions of the imaging unit 122. The imaging positions may correspond to sections of the article 110. The imaging positions may be predefined. Moving to subsequent steps or positions may be based on the PLC 146 receiving a signal from another component of the system 100 (e.g. from the node device 148) or using event-based movement. For example, a robot or camera may be instructed to move from a first section of the article 110 to a second section of the article 110 based on an event, such as detection for the first section being complete.

[0177] On instruction from the PLC 146, the rotational movement motor rotates the article 110 to facilitate inspection of the full article (e.g. full 360 degrees). The article 110 may be rotated in steps corresponding to subsections of the article 110 (e.g. 4 degrees).

[0178] The imaging unit 122 and article manipulators 120a, 120b each have a return/home position and an advance position. To return to a home position, the article manipulator 120a is moved horizontally away from the article 110 (to the left) by the first actuator. To return to a home position, the article manipulator 120b is moved horizontally away from the article 110 (to the right) by the second actuator. To return to a home position, the imaging unit 122 is moved vertically up and away from the article 110 by the third actuator. To advance to an advance position, the article manipulator 120a is moved horizontally towards the article 110 (to the right) by the first actuator. To advance to an advance position, the article manipulator 120b is moved horizontally towards the article 110 (to the left) by the second actuator. To advance to an advance position, the imaging unit 122 is moved vertically down towards the article 110 by the third actuator. The imaging unit 122 also moves horizontally along the camshaft (e.g. right to left), via the horizontal movement motor, to scan the camshaft.

[0179] The PLC 146 is communicatively connected to a plurality of sensors configured to provide machine status data to the PLC 146. The machine status data provides status information for components of the subsystem 114, such as the state of the actuators and whether an article 110 is loaded or not.

[0180] The PLC 146 interfaces with the node device 148 via the PLC computer programs. The PLC 146 may communicate with the node device 148 through a TCP communication protocol.

[0181] Data received by the PLC 146 from the node device 148 includes code indicating whether a defect has been detected by the node device 148 or not.

[0182] For example, if the node device 148 finds a defect, the PLC 146 may receive an NG code from the node device 148 indicating that a defect has been found in the article 110 and that the article 110 is defective (i.e. “no good”).

[0183] The NG code may include defect data describing the defect. The defect data may include a defect type or class (e.g. porosity, sludge, paint, scratch, dent, etc.) and a defect size.

[0184] If the node device 148 does not find a defect, the PLC may receive an OK code from the node device 148 indicating that no defect has been found.

[0185] The PLC 146 processes defect data received from the node device 148 via the PLC computer programs. The PLC 146 receives motor data from the stepper motors. The motor data may include a horizontal reading (e.g. from the first, horizontal stepper motor) identifying the section of the article 110. The motor data may include a rotational reading (from the second, rotational stepper motor) which identifies a camshaft angle. The camshaft angle may be in reference to a pin/keyway. The PLC 146 uses the motor data to identify the exact x and θ coordinates of the article 110 (camshaft). The motor data includes section data (e.g. an x coordinate value) and angle data (e.g. a θ coordinate value). The section data describes a particular section of the article 110 (which can, for example, be related to a horizontal coordinate value, x). The angle data describes a particular article angle in reference to the article manipulator 120 (e.g. article manipulator 120b, which includes a stepper motor and keeps track of the angle of the article 110). Together, the section data and angle data describe a subsection of the article 110. Each article subsection has particular defect specifications (i.e. for the specific section/angle combination), including a tolerance for particular defect types. Defect specifications including tolerances are stored by the PLC 146 in memory and linked to a particular section and angle combination. Different subsections of the article may have different tolerances.
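
The following is a minimal sketch, in Python, of the tolerance lookup logic described above. An actual PLC 146 would implement this in ladder logic or structured text; the table keys, tolerance values, and function names here are illustrative assumptions only.

```python
# PLC-side logic illustrated in Python for clarity; keys/values are assumed.
# Tolerance table: (section, angle in degrees) -> max allowed defect size
# in mm for each defect type the model is trained to detect.
TOLERANCES = {
    (3, 120): {"porosity": 0.8, "paint": 1.5},
    (3, 124): {"porosity": 0.5, "paint": 1.5},
    # ... one entry per section/angle combination of the article
}

def is_acceptable(section: int, angle: int, defect_type: str,
                  size_mm: float) -> bool:
    """Return True if the defect is within tolerance for this subsection."""
    spec = TOLERANCES.get((section, angle))
    if spec is None or defect_type not in spec:
        return False  # unknown subsection or defect type: treat as NG
    return size_mm <= spec[defect_type]

# A 0.6 mm porosity defect at section 3, angle 120 degrees is acceptable:
assert is_acceptable(3, 120, "porosity", 0.6)
```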

[0186] As described above, the PLC 146 is also configured to send data to the node device 148.

[0187] Data sent from the PLC 146 to the node device 148 may include inspection code. Inspection code instructs the node device 148 to start detection. For example, the article 110 may be inched rotationally in steps by the rotational motor. For each step, the PLC 146 may send inspection code to the node device 148 instructing the node device 148 to start detection.

[0188] In another embodiment, the computing system 116 may use an image stitching technique. For example, the PLC 146 instructs the node device 148 to take an image of the article 110. The PLC 146 then inches the article 110 rotationally (i.e. incrementally rotates the article 110) and instructs the node device 148 to acquire another image of the article 110 (i.e. so that at least a portion of the second image is of a previously unseen portion of the article 110). These actions by the PLC 146 and the node device 148 may be repeated until an entire 360 degrees of the article 110 has been imaged. The plurality of images representing the 360-degree imaging of the article 110 are then stitched together by the node device 148 to construct a stitched image according to an image stitching technique. The node device 148 then performs object detection on the stitched image. In such a case, object detection may be performed by the node device 148 a single time, on the stitched image, once the 360-degree revolution is complete. The PLC 146 may send inspection code to the node device 148 at the completion of the 360-degree revolution of the article 110 instructing the node device 148 to start detection. The node device 148 may construct stitched images and perform object detection on the stitched images for a plurality of adjacent sections of the article 110. Image stitching may advantageously allow inference to be performed only once on the stitched image representing the entire section of the article covered by the stitched image. Cropping the individual images and combining them into a final large (stitched) image may provide the system with the highest quality image.

[0189] Data sent from the PLC 146 to the node device 148 may include a finish code. The finish code instructs the node device 148 to terminate detection. The finish code may instruct the node device 148 to terminate detection until a new article 110 is loaded.
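
The following is a minimal sketch of the crop-and-combine stitching flow described above, using OpenCV and numpy. The strip fraction and function name are illustrative assumptions; OpenCV's feature-based stitcher is noted as an alternative.

```python
import cv2
import numpy as np

def stitch_rotation_images(frames: list, strip_frac: float = 0.2) -> np.ndarray:
    """Crop the central strip of each frame (the band facing the camera) and
    concatenate the strips into one unwrapped image of the section.

    frames: BGR images acquired at successive rotation steps. strip_frac is
    the assumed fraction of the frame width kept per step; it should match
    the rotation step so adjacent strips tile the full 360 degrees.
    """
    strips = []
    for frame in frames:
        w = frame.shape[1]
        strip_w = max(1, int(w * strip_frac))
        x0 = (w - strip_w) // 2
        strips.append(frame[:, x0:x0 + strip_w])
    return np.hstack(strips)

# Alternatively, OpenCV's feature-based stitcher can be used in scan mode:
# stitcher = cv2.Stitcher_create(cv2.Stitcher_SCANS)
# status, stitched = stitcher.stitch(frames)
```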

[0190] The PLC 146 may also send handshaking data to and receive handshaking data from the node device 148. Handshaking data includes certain codes that, when exchanged between the PLC 146 and the node device 148, establish a communication channel between the devices 146, 148.
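
The following is a minimal sketch of the node-device side of this handshaking and code exchange over a plain TCP socket. The specific code values (HELLO, ACK, INSPECT, FINISH) and the port number are illustrative assumptions; the actual codes exchanged between the devices 146, 148 are not specified herein.

```python
import socket

# Assumed code values; real codes are implementation-specific.
HELLO, ACK = b"HELLO", b"ACK"

def node_listen(host: str = "0.0.0.0", port: int = 5000) -> None:
    """Node-device side: accept a PLC connection, complete the handshake,
    then wait for inspection/finish codes."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as srv:
        srv.bind((host, port))
        srv.listen(1)
        conn, _ = srv.accept()
        with conn:
            if conn.recv(16) == HELLO:  # handshake: PLC initiates
                conn.sendall(ACK)       # node acknowledges; channel is up
            while True:
                code = conn.recv(16)
                if code == b"INSPECT":
                    pass                # start the object detection pipeline
                elif code in (b"FINISH", b""):
                    break               # terminate detection / connection lost
```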

[0191] The PLC 146 interfaces with the user control device 150 via the PLC computer programs. The PLC 146 may send data to the user control device 150 indicating status of subsystem 114 operations. Such received data may be visualized by the user control device 150. The PLC 146 may also receive data from the user control device 150. Such data may control operations of the PLC 146 or subsystem 114 according to user input received at the user control device 150.

[0192] Referring now to the node device 148, the node device 148 may be an edge computing device. The node device 148 may be an edge device specifically configured to run AI applications and perform deep learning tasks and may include processing components suited or adapted to performing such tasks.

[0193] In an embodiment, the node device 148 is a JETSON XAVIER device.

[0194] The node device 148 may provide an edge solution that is secure and without a need to connect to devices outside the inspection site or facility. This may allow data to stay on premises.

[0195] The node device 148 may be encrypted (e.g. device-level encryption). Files on the node device 148 may also be encrypted (e.g. file-level encryption). This may provide a dual layer of security.

[0196] In an embodiment, the system 100 may include a plurality of node devices 148 arranged in a nodal architecture. The multiple node devices 148 may run simultaneously and communicate back and forth with the PLC 146. Such an architecture may promote efficient scale-up for users. The nodal architecture may also reduce potential downtime caused by equipment problems: if one node device 148 fails or breaks, the non-functioning node device can be replaced by plugging in a replacement node device.

[0197] The node device 148 may advantageously reduce costs associated with the visual inspection system, particularly over existing edge-based approaches. Such existing edge-based approaches typically use regular Nvidia or comparable graphics cards that require a host computer to run.

[0198] By using the node device 148 to perform inference tasks (i.e. object detection, defect classification), the visual inspection system 100 may reduce power consumption as compared to existing onsite inference approaches that use large servers having GPUs that may have a draw in the range of 250W. For example, the node device 148 may have a significantly lower draw in the range of 30W.

[0199] The node device 148 is an embedded computing device.

[0200] The node device 148 includes processing components and storage components.

[0201] The processing components may include one or more CPUs and GPUs.

[0202] The storage components include memory such as RAM and flash.

[0203] The node device 148 also includes a communication interface for communicating with the PLC 146, user control device 150, and camera 128.

[0204] The node device 148 stores one or more computer programs in memory including computer-executable instructions that, when executed by the processing components of the node device 148, cause the node device 148 to perform the actions and functions described herein.

[0205] The node device 148 is configured to run artificial intelligence at the edge. Advantages of performing the AI at the edge using the node device 148 may include faster inference speed, lower latency between image ingestion and inference result, and data never having to leave the premises.

[0206] The node device 148 is configured to send and receive certain codes to and from the PLC 146 to establish communication therewith.

[0207] The node device 148 receives inspection code from the PLC 146. Upon receiving the inspection code, the node device 148 starts an object detection pipeline.

[0208] The node device 148 is communicatively connected to the camera 128 and instructs the camera 128 to acquire images of the article 110. The node device 148 may communicate with the camera 128 via an API or the like.

[0209] The node device 148 includes a defect detection model 154. The defect detection model 154 is a machine learning-based model configured to analyze image data of the article 110 and identify defects therein. The defect detection model 154 uses object detection techniques to identify defects in the image data. The defect detection model 154 may be configured to perform multiclass classification for classifying instances into one of three or more classes. Classes may correspond to different defect types that the defect detection model 154 has been trained to detect. The classes may include at least two defect types.

[0210] The node device 148 may include software modules configured to train, optimize, and run the defect detection model 154.

[0211] The defect detection model 154 may utilize an Object Detection API such as TensorFlow Object Detection API or the like. The Object Detection API may be an open source framework that facilitates construction, training, and deployment of object detection models.
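The following is a minimal sketch of loading an exported TensorFlow Object Detection API SavedModel and running it on a single image, following the API's documented usage. The model path and score threshold are illustrative assumptions.

```python
import numpy as np
import tensorflow as tf

# Load an exported Object Detection API SavedModel; the path is assumed.
detect_fn = tf.saved_model.load("exported_model/saved_model")

def detect_defects(image_rgb: np.ndarray, score_threshold: float = 0.5):
    """Run the detector on one RGB image; return boxes, classes, scores."""
    input_tensor = tf.convert_to_tensor(image_rgb[np.newaxis, ...],
                                        dtype=tf.uint8)
    outputs = detect_fn(input_tensor)
    scores = outputs["detection_scores"][0].numpy()
    keep = scores >= score_threshold
    return (outputs["detection_boxes"][0].numpy()[keep],    # normalized y0, x0, y1, x1
            outputs["detection_classes"][0].numpy()[keep],  # defect class ids
            scores[keep])
```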

[0212] The defect detection model 154 is configured to perform defect classification. The defect classification includes assigning a detected defect to a defect class. Defect classes are associated with particular defect types (e.g. dent, paint, scratch, etc.) the defect detection model 154 is trained to detect. The defect detection model 154 may be optimized (i.e. via training of the model 154) to learn how to ignore anomalies such as dirt, sludge, oil, and water marks which may be present on the article 110. The ability of the defect detection model 154 to ignore such anomalies may advantageously reduce the rate of false alarms.

[0213] The defect detection model 154 generates defect data from analyzing the image data. The defect data includes information about an identified defect, such as a defect type, a defect size, a defect confidence level, and a defect location.

[0214] The node device 148 stores the defect data in one or more storage components. The node device 148 may send the defect data to the PLC 146 for further processing. The PLC 146 processes the defect data and determines whether the defect constitutes an unacceptable defect (outside tolerance) or an acceptable defect (within tolerance).

[0215] The node device 148 may send the defect data to the user control device 150 for visualization.

[0216] In an embodiment, the defect detection model 154 includes a neural network 156. The neural network 156 is trained on images of defective and non-defective articles 110. Training may be performed according to known training techniques and techniques described herein.

[0217] In an embodiment, the neural network 156 is generated using model training and conversion for inference which may include taking a pretrained model, training the pretrained model with new data, pruning the model, retraining the pruned model to generate an output model, and passing the output model through an engine (e.g. TensorRT engine and/or an optimized CUDA backend).

[0218] The neural network 156 may be a Faster Region-based Convolutional Neural Network (faster R-CNN) or CSPDarknet53.

[0219] The faster R-CNN may have a ResNet or Inception classification backbone. The ResNet may be ResNet 50 or ResNet 101. The Inception may be Inception v2.

[0220] The neural network 156 (e.g. CSPDarknet53) may use state-of-the-art techniques such as Weighted-Residual-Connections (WRC), Cross-Stage-Partial connections (CSP), Cross mini-Batch Normalization (CmBN), Self-Adversarial Training (SAT), and Mish activation to achieve results similar to those of two-stage object detectors while being only a single-stage detector.

[0221] The neural network 156 described above may be optimized to run on the node device 148. Optimization may be performed according to the optimization methods described herein. Optimization may include any one or more of kernel auto-tuning, layer and tensor fusion, precision calibration (FP32 to FP16), CUDA optimization, and utilizing dynamic tensor memory. Optimization may fuse layers of the neural network 156 to compress the network size to run on the node device 148. This may allow the network 156 to run effectively on the node device 148, allowing for local inference at the edge.
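
The following is a minimal sketch of the FP32-to-FP16 precision calibration step using the TensorRT builder API (TensorRT 8 or later), assuming the trained detector has first been exported to ONNX. The file names are illustrative assumptions; layer and tensor fusion and kernel auto-tuning are applied automatically by TensorRT during the build, which is one reason the build is performed on the device that will run inference.

```python
import tensorrt as trt

# Assumed: the trained detector has been exported to ONNX as "model.onnx".
logger = trt.Logger(trt.Logger.WARNING)
builder = trt.Builder(logger)
network = builder.create_network(
    1 << int(trt.NetworkDefinitionCreationFlag.EXPLICIT_BATCH))
parser = trt.OnnxParser(network, logger)

with open("model.onnx", "rb") as f:
    if not parser.parse(f.read()):
        raise RuntimeError("ONNX parse failed")

config = builder.create_builder_config()
config.set_flag(trt.BuilderFlag.FP16)  # precision calibration: FP32 -> FP16

# Layer/tensor fusion and kernel auto-tuning happen inside this build call.
engine_bytes = builder.build_serialized_network(network, config)
with open("model_fp16.engine", "wb") as f:
    f.write(engine_bytes)
```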

[0222] The node device 148 may have per-image processing times that allow for real-time speeds. The node device 148 may have a per-image processing time of approximately 25 ms to 60 ms for an image size of over 1000 (width) x 1000 (height) pixels. In embodiments where a general-purpose GPU is used in place of the node device 148, processing time per image may be in the range of 5 ms to 60 ms.

[0223] The defect detection model 154 may include a two-stage object detection model. The two-stage object detection model may be trained with a training framework such as TensorFlow or the like. The two-stage object detection model may include modifications to any one or more of input image size, batch numbers, and first stage and second stage proposals. Such modifications may improve efficiency of the model while keeping accuracy as high as possible or at an acceptable level.

[0224] The two-stage detection model may use a Region Proposal Network to generate regions of interest in a first stage. The regions of interest are then sent down a pipeline for object classification and bounding-box regression.

[0225] Once the defect detection model 154 is pre-trained, the neural network 156 weights may be fine-tuned using transfer learning. Transfer learning may reduce the required dataset size and decrease total training time while still obtaining acceptable results.

[0226] The defect detection model 154 may include a CNN pretrained on a very large dataset (e.g. ImageNet, which contains 1.2 million images in 1000 categories, or Common Objects in Context (COCO)), and then use the CNN either as an initialization or as a fixed feature extractor for the defect detection task.

[0227] Fine tuning using transfer learning may include fine tuning a CNN. Fine tuning the CNN may include replacing and retraining a classifier on top of the CNN on a new dataset and fine-tuning the weights of the pretrained network by continuing backpropagation. All of the layers of the CNN may be fine-tuned, or some of the earlier layers may be kept fixed (e.g. due to overfitting concerns) while some higher-level portion of the network is fine-tuned. This is motivated by the observation that the earlier layers of a CNN contain more generic features (e.g. edge detectors or color blob detectors) that should be useful for many tasks, while later layers become progressively more specific to the details of the classes contained in the original dataset.
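
The following is a minimal sketch of this fine-tuning strategy in Keras: the earlier, more generic layers of a pretrained backbone are frozen, and a new classifier head is trained on the defect data. The backbone choice, number of frozen layers, class count, and learning rate are illustrative assumptions.

```python
import tensorflow as tf

# Pretrained backbone used as initialization; earlier layers stay frozen.
base = tf.keras.applications.ResNet50(include_top=False, weights="imagenet",
                                      pooling="avg",
                                      input_shape=(224, 224, 3))
for layer in base.layers[:-30]:   # keep earlier, generic layers fixed
    layer.trainable = False

num_defect_classes = 3            # assumed, e.g. paint, porosity, scratch
model = tf.keras.Sequential([
    base,
    tf.keras.layers.Dense(num_defect_classes, activation="softmax"),
])
model.compile(optimizer=tf.keras.optimizers.Adam(1e-4),
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
# model.fit(...) then continues backpropagation through the unfrozen
# later layers and the new classifier head only.
```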

[0228] In an embodiment, the defect detection model 154 (e.g. neural network 156) may be modified to run more efficiently on the node device 148 (e.g. Jetson Xavier).

[0229] For example, the model may be modified to run at the highest possible frame rate (FPS). The frame rate refers to the frame rate of inference per single image. The frame rate may be determined by numerous factors (e.g. image size, number of layers, number of anchor boxes, algorithms within network, operations within network, floating point precision, layer fusion, etc.). A higher FPS inference by the node device 148 may decrease inspection time to levels more suitable to industrial manufacturing inspection processes.

[0230] The defect detection model 154 may be optimized for inference on the node device 148.

[0231] The defect detection model 154 may be optimized with TensorRT or OpenCV (CUDA). Optimization may occur after an optimal number of training iterations to ensure the model is not overfitted and training with augmented images to reduce overfitting.

[0232] Training of the defect detection model 154 may include image augmentation. Data augmentation may be performed to increase the variability of the input images, so that the object detection model has higher robustness to images obtained from different environments. Two augmentation strategies that may be implemented to aid in training object detection models are photometric distortion and geometric distortion. For photometric distortion, the brightness, contrast, hue, saturation, and noise of an image may be adjusted. For geometric distortion, random scaling, cropping, flipping, and rotating may be added. Other special augmentations may include, for example: random erase, CutOut, Dropout/DropConnect/DropBlock in the feature map, MixUp (with two different images), CutMix, and Mosaic augmentation.

[0233] By augmenting images during training, the training set can be grown artificially, which may increase the robustness and generalization capability of the defect detection model 154 (object detector).
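
The following is a minimal sketch of an on-the-fly photometric and geometric augmentation function using tf.image, assuming float32 RGB images in [0, 1]; the distortion ranges and crop size are illustrative assumptions. For object detection, geometric distortions must also be applied to the bounding boxes, which is omitted here for brevity.

```python
import tensorflow as tf

def augment(image: tf.Tensor) -> tf.Tensor:
    """Photometric and geometric distortions; assumes a float32 RGB image
    in [0, 1] at least 960x960 pixels. Ranges are illustrative."""
    # Photometric distortion: brightness, contrast, hue, saturation.
    image = tf.image.random_brightness(image, max_delta=0.2)
    image = tf.image.random_contrast(image, lower=0.8, upper=1.2)
    image = tf.image.random_hue(image, max_delta=0.05)
    image = tf.image.random_saturation(image, lower=0.8, upper=1.2)
    # Geometric distortion: flipping and random cropping. (Scaling and
    # rotation would be handled similarly with additional ops.)
    image = tf.image.random_flip_left_right(image)
    image = tf.image.random_crop(image, size=(960, 960, 3))
    return tf.clip_by_value(image, 0.0, 1.0)

# Applied on the fly so each epoch sees different distortions:
# dataset = dataset.map(augment)
```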

[0234] The network may then run on the node device 148 at FP16 precision in Python. The network optimization is run on the node device 148 itself, because optimization utilizes the hardware available and the kernels optimized with CUDA at the time of execution.

[0235] The user control device 150 includes a processor, a memory, and an output interface. The output interface includes a display 158. The user control device 150 includes a communication interface for transmitting and receiving data to and from other components of the system 100, such as the PLC 146 and the node device 148.

[0236] The user control device 150 also includes an input interface for receiving input from an operator. The input interface may be a touchscreen on the display 158.

[0237] In an embodiment, the display 158 of the user control device 150 may operate both as a touch display and a regular monitor with high resolution.

[0238] The memory stores data received from the PLC 146 and the node device 148, including defect data.

[0239] The memory also stores one or more computer programs for generating and displaying a user interface 160. The user interface 160 includes a visualization 162 of results of the visual inspection.

[0240] The processor may generate the visualization 162 based on defect data received from the node device 148.

[0241] In an embodiment, the user interface 160 may include a PLC mode and a node device mode. In the PLC mode, the user interface 160 generates and displays a visualization of PLC operations (e.g. mechanical inspection subsystem 114 in operation). The visualization in the PLC mode is generated using data from the PLC 146. In the node device mode, the user interface 160 generates and displays a visualization of the node device operations (e.g. object detection). The visualization in the node device mode is generated using data from the node device 148. Switching between PLC mode and node device mode may be programmed to occur automatically (e.g. to node device mode when detection is being performed) or may occur manually such as through receiving commands from the operator via the input interface.

[0242] In an embodiment, the visual inspection system 100 may be configured to stop inspection of the article 110 once a defect is detected (and confirmed). Such an embodiment may provide speed advantages by limiting unnecessary inspection of a defective article 110. For example, if the article 110 has 10 sections and a defect is identified by the computing system 116 in the third section of the article 110, the visual inspection system 100 may stop inspection of the article and not proceed with inspecting the remaining 7 sections. The article 110 is unloaded and a new article is loaded and inspected. In some cases, upon stopping inspection of the article 110, the system 100 may read an article identifier on the article 110 (e.g. by moving the imaging unit 122 and acquiring an image of the article identifier) and associate the identifier with the defective article before unloading. Stopping inspection may be initiated by the PLC 146 (e.g. after determining a detected defect is outside tolerance). The PLC 146 may send a stop inspection command to one or more components of the mechanical inspection subsystem 114. The stop inspection command, when received and processed by the receiving component, may cause the receiving component to perform one or more actions facilitating the stopping of inspection (e.g. moving the imaging unit 122 to a different position, unloading the article 110 from the article manipulator 120, moving the article 110 out of the inspection area, loading a new article onto the article holder 118, etc.).

[0243] In another embodiment, the visual inspection system 100 may continue after a defect has been found (i.e. not stop after a first defect is identified and the article ruled defective). In such cases, the computing system 116 may include a defect counter for the article 110 that is updated each time a defect is found on the article 110 (i.e. detected in the image data). The defect counter is configured to keep a record of the defects detected for the article under inspection. Updating the defect counter may include adding defect data for the new detected defect to the record. The defect counter may comprise one or more software modules and may be implemented by the node device 148. In some cases, the defect counter may be updated with a new defect only when the defect is confirmed as a true detection using object tracking. The defect counter may be updated with a new defect only when the defect is determined unacceptable by the PLC 146 (e.g. by comparing to a tolerance).
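
The following is a minimal sketch of such a defect counter. The field names and the confirmation flags are illustrative assumptions.

```python
from dataclasses import dataclass, field

@dataclass
class Defect:
    defect_type: str   # e.g. "porosity"
    size_mm: float
    section: int
    angle: int

@dataclass
class DefectCounter:
    """Running record of defects found on the article under inspection."""
    defects: list = field(default_factory=list)

    def update(self, defect: Defect, confirmed: bool,
               unacceptable: bool) -> None:
        # Optionally count only detections confirmed as true via object
        # tracking and ruled outside tolerance by the PLC.
        if confirmed and unacceptable:
            self.defects.append(defect)

    @property
    def count(self) -> int:
        return len(self.defects)
```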

[0244] The visual inspection system 100 may provide faster and/or more precise visual inspection of the article 110 as compared to existing techniques.

[0245] In some cases, the visual inspection system 100 may be able to detect a defect (e.g. porosity in a camshaft) in the presence of other miscellaneous objects (e.g. sludge) that may typically obscure such defects (e.g. when examined by a human). This may be particularly advantageous in manufacturing and other environments where articles 110 are not always clean and may have materials on the articles 110 that may hinder detection under existing approaches.

[0246] Embodiments may inspect subsections within each section at 4 degrees each, with a 360-degree rotation for each section (rotation provided by the article manipulator 120). For each section of the article 110 (e.g. camshaft), the article 110 may be imaged, rotated a certain number of degrees, and imaged again. This is performed for a full revolution of the article 110 for each section.

[0247] In an embodiment, defect data generated by the computing system 116 may be stored in a defect analytics database. Defect data stored in the analytics database may be provided from the node device 148. The defect data in the defect analytics database may be analyzed for analytics purposes to provide plant managers or other operators with a bigger picture of defect detection in the articles. For example, defect data for a number of articles 110 may be provided to a computing device configured to analyze the defect data and identify trends, sizes, and the like. Analysis may include the use of statistical libraries and machine learning (e.g. clustering) to identify defect types and other trends. A report including such information may be generated and displayed to the operator, for example via the display 158.

[0248] Referring now to Figure 3, shown therein is a visual inspection sequence 300, according to an embodiment. The sequence 300 may be performed by the system 100 of Figure 1.

[0249] At 310 the sequence starts.

[0250] At 312, the article 110 is loaded onto the article holder 118.

[0251] At 314, the article manipulator 120 engages with the article 110. This may allow the article manipulator 120 to move (e.g. rotate) the article 110 during inspection.

[0252] At 316, the imaging unit 122 is moved to a first imaging position. The imaging unit 122 is moved via the imaging unit manipulator 132.

[0253] At 318, the camera 128 of the imaging unit 122 captures images of the article 110 at the current imaging position.

[0254] At 320, the images are sent from the camera 128 to the node device 148.

[0255] At 322, the node device 148 performs object detection on the received images to detect defects in the article 110. Object detection is performed using the defect detection model 154. The images are analyzed to generate defect data.

[0256] At 324, the defect data for the images is sent to the PLC 146.

[0257] At 326, the PLC determines whether the defect data is within a predefined tolerance. The predefined tolerance is stored as tolerance data at the PLC 146. The determination at 326 may include comparing or referencing some or all of the defect data with the tolerance data.

[0258] At 328, if the defect data is not within the predefined tolerance, the article 110 is identified as defective and inspection ends.

[0259] At 330, the imaging unit 122 is moved back to the start or home position via the imaging unit manipulator 132. Once back at the home position, the sequence can return to start the process at 310 for another article 110.

[0260] At 332, if the defect data is within the predefined tolerance at 326, the PLC 146 determines whether there is another section of the article 110 to be imaged.

[0261] At 334, if there is another section to be imaged, the imaging unit 122 is moved via the imaging unit manipulator 132 to a second imaging position corresponding to the unimaged section.

[0262] Once the imaging unit 122 is moved to the second imaging position, the process 300 returns to 318 and images of the article 110 at the current imaging position (i.e. the second imaging position) are captured and the process resumes.

[0263] If there is no other section of the article 110 to be imaged, the inspection ends at 328. At 330, the imaging unit 122 is moved back to the home position and a new inspection can start at 310.

[0264] Referring now to Figure 4, shown therein is a visual inspection sequence 400, according to an embodiment. The inspection sequence 400 may be performed by the visual inspection system 100 of Figure 1.

[0265] At 410, the inspection sequence 400 starts.

[0266] At 412, the PLC 146 establishes communication with a TCP port. This may occur upon starting up the inspection system 100.

[0267] At 414, handshaking is performed between the PLC 146 and the node device 148 through sending and receiving certain codes. The handshaking initiates communications between the PLC 146 and the node device 148. The handshaking begins when one device sends a message to another device indicating that it wants to establish a communications channel. The two devices 146, 148 then send messages back and forth that enable them to agree on a communications protocol.

[0268] At 416, after handshaking is complete, the node device 148 keeps listening to the PLC 146 to receive an inspection code. The inspection code instructs the node device 148 to perform inspection on the article 110 in the imaging zone 124 of the camera 128.

[0269] At 418, the article 110 is loaded onto the article holder 118. The article 110 may be loaded manually or automatically. Automatic loading may include integration with a conveyor or the like.

[0270] At 420, the first article manipulator 120a advances, moving towards the article 110. The first article manipulator 120a is moved via a first actuator (center 1).

[0271] At 422, the second article manipulator 120b advances, moving towards the article 110. The second article manipulator 120b is moved via a second actuator (center 2).

[0272] At 424, a rotational movement motor (motor 2) of the second article manipulator 120b rotates to engage a pin in the article 110 into a keyway (e.g. item 14 of Figure 11I).

[0273] At 426, the camera 128 is moved down then left until the camera 128 reaches a first inspection position.

[0274] At 428, the rotational movement motor (motor 2) of the second article manipulator 120b starts to inch the article 110 in steps. For each step, inspection code is sent to the node device 148 to start object detection. In other embodiments, the inspection code may be sent to the node device 148 at the completion of a predetermined number of rotation steps (e.g. at completion of a 360-degree revolution). In such an embodiment, object detection may be performed on a stitched image constructed from a plurality of images corresponding to the steps.

[0275] At 430, images are captured of the article 110. The images are analyzed by the node device 148 using AI algorithms to detect defects in the images (e.g. via defect detection model 154).

[0276] At 432, the rotational movement motor (motor 2) of the second article manipulator 120b keeps rotating the article 110 through 360 degrees while the article 110 is being imaged by the camera 128.

[0277] At 434, the camera 128 is moved to a second inspection position and the rotational movement motor (motor 2) of the second article manipulator 120b rotates the article 110 another 360 degrees. In an embodiment, the camera 128 (imaging unit 122) is moved via a stepper motor. In another embodiment, the camera 128 (imaging unit 122) is moved using a robotic arm (e.g. robotic arm 1484 of Figure 14).

[0278] At 436, inspection continues, moving to successive inspection positions and performing inspection until inspection of each section of the article 110 is completed.

[0279] At 438, the PLC 146 sends a finish code to the node device 148. The finish code instructs the node device 148 to terminate detection until a new article 110 is loaded onto the article holder 118.

[0280] At 440, each time the node device 148 finds a defect, the node device 148 sends defect data including a defect type (e.g. porosity, sludge, etc.) and a defect size to the PLC 146.

[0281] If a defect is found, an NG code is sent from the node device 148 to the PLC 146.

[0282] If no defect is found, an OK code is sent from the node device 148 to the PLC 146. The OK code indicates that no defect has been identified in the section of the article 110 being inspected.

[0283] At 442, the PLC 146 determines whether the defect is an NG (no good, confirm defect) or OK (reject defect) using the received defect data and defect specifications stored at the PLC 146. The defect specifications include tolerances for specific sections of the article 110 linked to a horizontal section value and an angle value.

[0284] The PLC 146 makes the determination based on a horizontal reading identifying the section, which may be provided from a horizontal movement motor (e.g. stepper motor 1 for moving the imaging unit 122 including the camera 128), and a rotational reading from the rotational movement motor (stepper motor 2) of the second article manipulator 120b, which identifies an article angle (in reference to the pin/keyway). The PLC 146 determines if the defect is NG or OK according to the defect specification for the specific section/angle combination.

[0285] At 444, if the PLC 146 determines that the article 110 is NG, the system 100 stops and alarms. The stopping and/or alarming is performed according to a stop command and an alarm command generated by the PLC 146. For example, the PLC 146 may generate an alarm command and transmit the alarm command to an alarm system configured to generate or output an alarm upon receipt of the alarm command.

[0286] At 446, if the PLC 146 determines the article 110 is OK, either by receiving an OK code from the node device 148 or by determining a defect identified by the node device 148 is within an allowed range/tolerance, the inspection continues.

[0287] At 448, the imaging unit 122 and article manipulators 120a, 120b return to a home position in reverse order. This may include the first actuator (center 1) moving to the left, the second actuator (center 2) moving to the right, and the imaging unit 122 moving to the right and up.

[0288] At 450 the inspected article 110 is unloaded from the article holder 118.

[0289] At 452, a new article 110 is loaded onto the article holder 118.

[0290] In an embodiment, when the PLC 146 determines that the article 110 is NG, the imaging unit manipulator 132 (or robotic arm) moves the camera 128 to an article identifier position wherein an article identifier on the camshaft is within the imaging zone 124 of the camera 128. An image of the article identifier can be captured by the camera 128. The article identifier is a unique identifier on the article 110 (e.g. etched onto the article 110).

[0291] In an embodiment, the article identifier is a 2D data matrix. The 2D data matrix may be similar to a QR code. The article 110 is rotated to bring the 2D matrix parallel to the camera 128 so that the detected defect (e.g. defect data) can be associated with a serial number (article identifier) of the article 110. The defect and serial number can be saved to a database. The defect and serial number may be linked in the database. Storage of such data may advantageously provide a record of what specific articles were found defective and why. Such data may also be used as input in a defect detection analytics process to determine trends or root causes of defects.

[0292] Referring now to Figure 5, shown therein is a method 500 of running the network 156 of the node device 148 of Figure 1 in real-time, according to an embodiment. The method 500 may be implemented by the node device 148.

[0293] At 508, the method starts.

[0294] At 510, the node device 148 creates and loads an instance of the Basler camera class (a software API provided by Basler through their open-source GitHub repositories).

[0295] At 512, the object detection model of choice is loaded into the GPU memory utilizing specific TensorRT libraries and/or optimized OpenCV CUDA backends.

[0296] At 514, object detection is started.

[0297] Referring now to Figure 6, shown therein is a method of object detection 600 performed by the visual inspection system 100, according to an embodiment. The method 600 may be performed by the node device 148 of Figure 1.

[0298] The method 600 may be implemented at step 514 of the method 500 of Figure 5.

[0299] At 610, the method starts.

[0300] At 612, an image frame is grabbed from the camera 128.

[0301] At 614, the grabbed image is preprocessed. Preprocessing is done to get the image ready for the network 156.

[0302] Preprocessing the image may include any one or more of cropping, converting to a numpy array, changing pixel values to 8-bit (uint8) format, changing BGR format to RGB format, and resizing using bilinear interpolation.
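
The following is a minimal sketch of this preprocessing step using OpenCV and numpy; the network input size and the crop parameter convention are illustrative assumptions.

```python
import cv2
import numpy as np

def preprocess(frame_bgr: np.ndarray, net_size=(1000, 1000),
               crop=None) -> np.ndarray:
    """Prepare a grabbed frame for the network 156: optional crop, BGR to
    RGB conversion, 8-bit pixel values, and bilinear resize."""
    if crop is not None:               # crop = (x0, y0, x1, y1), assumed
        x0, y0, x1, y1 = crop
        frame_bgr = frame_bgr[y0:y1, x0:x1]
    rgb = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2RGB)
    rgb = rgb.astype(np.uint8)         # ensure 8-bit pixel values
    return cv2.resize(rgb, net_size, interpolation=cv2.INTER_LINEAR)
```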

[0303] At 616, the preprocessed image is passed through the neural network 156. Passing the image through the network 156 generates bounding boxes with classes and confidence scores. The bounding box encloses an object (defect) located in the image. The class corresponds to a defect type.

[0304] At 618, the bounding boxes obtained at 616 are post-processed to ensure they map correctly to the original image dimensions when the image is scaled back up to its original size after the resizing in the preprocess function.
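
The following is a minimal sketch of this post-processing step, mapping a box from the resized network input back to the original image coordinates. The tuple conventions are illustrative assumptions.

```python
def rescale_box(box, net_size, orig_size):
    """Map a bounding box from network-input coordinates back to the
    original image.

    box is (x0, y0, x1, y1) in the resized image; net_size and orig_size
    are (width, height) tuples.
    """
    sx = orig_size[0] / net_size[0]
    sy = orig_size[1] / net_size[1]
    x0, y0, x1, y1 = box
    return (x0 * sx, y0 * sy, x1 * sx, y1 * sy)

# A box found in a 1000x1000 network input maps back to the full frame:
# rescale_box((100, 200, 150, 260), (1000, 1000), (2448, 2048))
```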

[0305] At 620, the node device 148 keeps track of each object detected. This includes tracking a detected defect across a plurality of image frames. Tracking the defect in this manner may provide greater certainty that the detected defect is in fact a defect and not a one-off incorrect detection.

[0306] Tracking may allow the algorithms to reduce false positive detections and random one-off detections.

[0307] At 622, while tracking each individual defect, size information for the tracked defect is stored over every frame in which the defect is seen.

[0308] At 624, the node device 148 determines whether the tracked object appears for a minimum number of N consecutive frames without dropping a single frame.

[0309] At 626, if the tracked object has been seen for a minimum number of N consecutive frames without dropping a single frame, the detected object is counted as a true detection.

[0310] Once the detection is considered a true detection, an average size for the defect is calculated using the size information across all the frames in which the defect appeared. This technique may reduce the variance from the bounding box sizing.
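
The following is a minimal sketch of the N-consecutive-frame confirmation and size averaging described in steps 620 to 626. The value of N and the class interface are illustrative assumptions.

```python
class TrackedDefect:
    """Confirm a detection only after it appears in N consecutive frames,
    then report its average size across all frames in which it was seen."""

    def __init__(self, n_confirm: int = 5):
        self.n_confirm = n_confirm  # assumed value of N
        self.sizes = []             # size recorded for every frame seen
        self.streak = 0             # consecutive frames seen so far

    def update(self, seen_this_frame: bool, size_mm: float = 0.0) -> None:
        if seen_this_frame:
            self.streak += 1
            self.sizes.append(size_mm)
        else:
            self.streak = 0         # dropping a single frame resets the count

    @property
    def confirmed(self) -> bool:
        return self.streak >= self.n_confirm

    @property
    def average_size(self) -> float:
        return sum(self.sizes) / len(self.sizes) if self.sizes else 0.0
```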

[0311] At 628, the defect data for the true detection is sent to the PLC 146. Defect data may include any one or more of defect size, defect location, and defect class. The defect data may be sent from the node device 148 to the PLC 146 via TCP/IP over a socket.
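
The following is a minimal sketch of sending the defect data over the established TCP socket. The JSON wire format is an illustrative assumption; the actual framing used between the devices 146, 148 is not specified herein.

```python
import json
import socket

def send_defect_data(sock: socket.socket, defect_type: str, size_mm: float,
                     box: tuple) -> None:
    """Send confirmed defect data to the PLC over an open TCP socket."""
    payload = {
        "code": "NG",
        "type": defect_type,   # e.g. "porosity"
        "size_mm": size_mm,    # averaged size across tracked frames
        "location": box,       # bounding box (x0, y0, x1, y1)
    }
    # Newline-delimited JSON as an assumed framing convention.
    sock.sendall(json.dumps(payload).encode("utf-8") + b"\n")
```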

[0312] When the defect data is received by the PLC 146, the PLC 146 makes note of the stepper motor angle and relates the defect size to the tolerance for that particular section (with regard to rotation) and location of the article 110.

[0313] The defect location is transmitted to the PLC 146 as bounding box coordinates (x0, y0, x1, y1). The PLC 146 can use the defect location information to pinpoint where on that specific section (e.g. a specific lobe or journal of a camshaft being inspected) the defect was found.

[0314] At 630, if the tracked object has not been seen for the minimum number of N consecutive frames, the detection is counted as a false detection.

[0315] At 632, the object that is subject to the false detection is disregarded (i.e. the defect data is not sent to the PLC).

[0316] At 634, the process ends. Steps 612 to 632 may be repeated until inspection of the article 110 is complete.

[0317] Referring now to Figures 7 to 10, shown therein are example visual inspection images generated by an embodiment of the visual inspection system 100 of Figure 1. The embodiment of system 100 has been configured to image camshafts and detect paint and porosity defect types therein.

[0318] The images represent images that have been acquired by the imaging unit 122, processed by the node device 148 using object detection techniques to detect and classify defects, and displayed via the user control device 150. The images may be stored by the node device 148 and/or user control device 150 for further review and analysis.

[0319] Images 700, 800, 900, and 1000 may be displayed to an operator of the system 100 via the user control device 150.

[0320] Image 700 of Figure 7 shows a camshaft 710 (i.e. article 110 of Figure 1) that has been inspected by the system 100 of Figure 1. In particular, image 700 shows results of a visual inspection operation performed on a section 712 of the camshaft 710. The section 712 corresponds to an article section as described in reference to article 110 of Figure 1. The system 100 is configured to detect paint and porosity defects.

[0321] The system 100 has identified three defects in the camshaft 710 based on the image data captured thereof. The defects are enclosed by bounding boxes 714, 716, and 718. The defects include a first paint defect contained in bounding box 714, a second paint defect contained in bounding box 716, and a porosity defect contained in bounding box 718.

[0322] The bounding boxes 714, 716, 718 are generated by the node device 148 during object detection.

[0323] As shown, the user control device 150 may be configured to associate a unique colour indicator with each different defect type. For example, bounding boxes enclosing a particular defect type may be given a particular colour. This may allow a user to more easily identify and distinguish different types of defects present in the image 700. In the example of image 700, a green colour is associated with displayed information relating to a paint defect type and a red colour is associated with displayed information relating to a porosity defect type. Other embodiments may utilize other types of unique indicators for distinguishing between defect types.

[0324] Each defect in the image 700 has defect data 720 associated therewith.

[0325] The defect data 720 is generated and stored by the node device 148 during the object detection process. The defect data 720 may be passed from the node device 148 to the user control device 150. The user control device 150 stores the defect data 720 and is configured to generate a visualization displaying the defect data 720.

[0326] In image 700, the user control device 150 displays the defect data 720 such that the defect data 720 is linked with the bounding box of the same defect, making it easy for the user to identify the defect data 720 relating to a particular identified defect.

[0327] The defect data 720 includes a defect type (or class) 722, a defect confidence level 724, a defect size 726, and a defect location 728. Variations of the system 100 may include more or fewer types of defect data 720. This information may be used by the PLC 146 to understand how to compare each respective defect to the predefined camshaft defect tolerances.

[0328] The defect type 722 includes the type of defect detected. Defect types in image 700 include paint defect type and porosity defect type.

[0329] The defect confidence level 724 represents a confidence level for the object detection and classification (i.e. for the assignment of the defect type 722 to the defect).

[0330] The defect size 726 indicates the size of the defect in a particular unit of measurement. The defect sizes 726 in image 700 are in millimeters.

[0331] The defect location 728 indicates the location of the defect. The defect location includes (x,y) coordinates.

[0332] It may be important to have the operator view the defects in real time during the early stages of inspection, when the AI is still gaining confidence in detecting the defects. As the AI inspects more parts, it collects more data and learns how to identify more defects with more confidence. There may come a point where the AI no longer needs to be watched by the human, because the AI performance far surpasses human capability.

[0333] Image 800 of Figure 8 shows another camshaft 810 that has been inspected by the embodiment of system 100 used to generate image 700 of Figure 7. In particular, image 800 shows results of a visual inspection operation performed on a section 812 of the camshaft 810.

[0334] Image 800 shows a paint defect and a porosity defect in the section 812. The paint defect is outlined by a first bounding box 814. The porosity defect is outlined by a second bounding box 818.

[0335] The paint defect and porosity defect each have defect data 820 associated with them. The defect data 820 is generated by the node device 148 during object detection.

[0336] Image 900 of Figure 9 shows yet another camshaft 910 that has been inspected by the system 100 of Figure 1. In particular, image 900 shows results of a visual inspection operation performed on a section 912 of the camshaft 910.

[0337] Image 900 shows a paint defect and a porosity defect in the section 912. The paint defect is outlined by a first bounding box 914. The porosity defect is outlined by a second bounding box 918.

[0338] The paint defect and porosity defect each have defect data 920 associated therewith. The defect data 920 is generated by the node device 148 during object detection.

[0339] Image 1000 of Figure 10 shows yet another camshaft 1010 that has been inspected by the system 100 of Figure 1. In particular, image 1000 shows results of a visual inspection operation performed on a section 1012 of the camshaft 1010.

[0340] Image 1000 shows a paint defect and a porosity defect in the section 1012. The paint defect is outlined by a first bounding box 1014. The porosity defect is outlined by a second bounding box 1016.

[0341] The paint defect and porosity defect each have defect data 1020 associated with them. The defect data 1020 is generated by the node device 148 during object detection.

[0342] Referring now to Figures 11A to 11I, shown therein are multiple views of a mechanical inspection subsystem of a visual inspection system of the present disclosure, according to an embodiment. The visual inspection system is adapted to inspect camshafts and may be an embodiment of the visual inspection system 100 of Figure 1. The mechanical inspection subsystem may be the mechanical inspection subsystem 114 of Figure 1. The mechanical inspection subsystem of Figures 11A to 11I can be communicatively connected to a computing system, such as the computing system 116 of Figure 1, that is configured to perform visual inspection according to the techniques described herein.

[0343] Reference numerals provided in Figures 11A to 11I correspond to part numbers found in the parts table 1100j of Figure 11J.

[0344] Figure 11A shows a first perspective view 1100a of a mechanical inspection subsystem, according to an embodiment.

[0345] Figure 11B shows a second perspective view 1100b of the mechanical inspection subsystem of Figure 11A, according to an embodiment.

[0346] Figure 11C shows a top view 1100c of the mechanical inspection subsystem of Figure 11A, according to an embodiment.

[0347] Figure 11D shows a front view 1100d of the mechanical inspection subsystem of Figure 11A, according to an embodiment.

[0348] Figure 11E shows a back view 1100e of the mechanical inspection subsystem of Figure 11A, according to an embodiment.

[0349] Figure 11F shows a side view 1100f of the mechanical inspection subsystem of Figure 11A, according to an embodiment.

[0350] Figure 11G shows a front view 1100g of an imaging unit of the mechanical inspection subsystem of Figure 11A, according to an embodiment. The imaging unit may be the imaging unit 122 of Figure 1.

[0351] Figure 11H shows a side view 1100h of the imaging unit of Figure 11A, according to an embodiment.

[0352] Figure 11I shows a front view 1100i of an article holder and article manipulator of the mechanical inspection subsystem of Figure 11A, according to an embodiment. The article holder may be the article holder 118 of Figure 1. The article manipulator may be the article manipulator 120 of Figure 1.

[0353] Figure 11J shows a parts table 1100j for the mechanical inspection subsystem of Figure 11A, according to an embodiment.

[0354] Referring now to Figure 12, shown therein is a front view 1200 of a mechanical inspection subsystem of the present disclosure, according to an embodiment. The mechanical inspection subsystem may be the mechanical inspection subsystem 114 of Figure 1.

[0355] Referring now to Figure 13, shown therein is a method 1300 of real-time streaming video analysis, according to an embodiment. The method 1300 may be performed by the visual inspection system 100 of Figure 1.

[0356] At 1308, the method 1300 starts.

[0357] At 1310, the node device 148 connects to the first available camera 128. This may include scanning an available USB 3.0 (or other) interface to see if any cameras are connected.

[0358] The node device 148 may connect to the camera 128 via a camera API. The camera API is a set of specifications allowing application software on the node device 148 to communicate with the camera 128.

[0359] In an embodiment wherein the camera 128 is a Basler camera, the node device 148 connects to the first available Basler camera via the Pylon API. The Basler Pylon camera software suite is a collection of drivers and tools for operating the Basler camera.

[0360] At 1312, the node device 148 grabs images continuously from the camera 128. The images grabbed from the camera 128 by the node device 148 are in a bitmap format, which is then converted into a numpy array with the BGR channel ordering utilized by OpenCV.

[0361] At 1314, the grabbed images are converted from the camera format to another format suitable for use with OpenCV or the like. In an embodiment, the grabbed images are converted from a Basler camera format to OpenCV BGR format. [0362] At 1316, the OpenCV BGR formatted image data is converted to a numpy array. The numpy array may provide smaller memory consumption (numpy data structures may take up less space) and better runtime behavior.
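
The following is a minimal sketch of the camera connection, grabbing, and format conversion described above, using Basler's open-source pypylon package: it connects to the first available camera, grabs frames continuously, and converts each frame to a BGR numpy array for OpenCV. The timeout value is an illustrative assumption.

```python
from pypylon import pylon

# Connect to the first available Basler camera (step 1310).
camera = pylon.InstantCamera(
    pylon.TlFactory.GetInstance().CreateFirstDevice())
camera.Open()
camera.StartGrabbing(pylon.GrabStrategy_LatestImageOnly)

# Converter producing the 8-bit packed BGR format OpenCV expects.
converter = pylon.ImageFormatConverter()
converter.OutputPixelFormat = pylon.PixelType_BGR8packed
converter.OutputBitAlignment = pylon.OutputBitAlignment_MsbAligned

while camera.IsGrabbing():
    grab = camera.RetrieveResult(5000, pylon.TimeoutHandling_ThrowException)
    if grab.GrabSucceeded():
        frame = converter.Convert(grab).GetArray()  # numpy array, BGR order
        # frame is now ready for the preprocessing function (step 1318)
    grab.Release()
```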

[0363] At 1318, the numpy array is sent to a preprocessing function for preprocessing. For example, the numpy array may be sent to the preprocessing function 614 of Figure 6 and further processed according to the method 600 of Figure 6.

[0364] The pre-processing function preprocesses the image to prepare the image for the neural network 156. The preprocessing may include any one or more of cropping, converting to a numpy array, changing pixel values to 8-bit (uint8) format, changing BGR format to RGB format, and resizing using bilinear interpolation.

[0365] Referring now to Figures 14A to 14D, shown therein are perspective, top, front, and side views 1400a, 1400b, 1400c, 1400d, respectively, of a mechanical inspection subsystem 1414 for use in a visual inspection system, according to an embodiment. The mechanical inspection subsystem 1414 uses robotic automation.

[0366] The mechanical inspection subsystem 1414 may be used as the mechanical inspection subsystem 114 of Figure 1.

[0367] Elements of Figures 14A to 14D may have counterpart elements described in Figure 1. If an element of the mechanical inspection subsystem 114 is referred to by reference number 1XX and this element has a counterpart in the mechanical inspection subsystem 1414, the counterpart element in the mechanical inspection subsystem 1414 is referred to by reference number 14XX (i.e. having the same last two digits as its counterpart in Figure 1). Counterpart elements may perform the same or similar functions.

[0368] The mechanical inspection subsystem 1414 is communicatively connected to a computing system (e.g. the computing system 116 of Figure 1) to facilitate data transfer between the computing system and the mechanical inspection subsystem 1414 to perform visual inspection. The computing system 116 performs various control, analysis, and visualization functions, such as described herein.

[0369] The mechanical inspection subsystem 1414 includes a robotic subsystem 1482. The robotic subsystem 1482 includes a robotic arm 1484 and a robotic arm controller (not shown) for controlling movement of the robotic arm 1484. The robotic arm 1484 is configured to move according to commands from the robotic arm controller. The robotic subsystem 1482 includes a base 1486 for mounting the robotic arm 1484 to a surface. The surface may be on a stationary or mobile object.

[0370] The mechanical inspection subsystem 1414 includes an imaging unit 1422 attached to the robotic arm 1484. The imaging unit 1422 includes a camera 1428 and a lighting mechanism 1426.

[0371] The robotic arm 1484 is configured to move the imaging unit 1422 to capture images of an article 1410 under inspection. The robotic arm 1484 may be configured to move in three dimensions. During inspection, the article 1410 is engaged by article manipulators 1420a and 1420b. The article manipulators 1420a, 1420b facilitate the rotation of the article 1410 during inspection (e.g. along a line of motion similar to line of motion 125 of Figure 1). By rotating the article 1410, the imaging unit 1422 can capture images of the entire article 1410.

[0372] In a particular case, during an inspection operation the robotic arm 1484 moves the imaging unit 1422 to a section of the article 1410 proximal to the article manipulator 1420a and captures a plurality of images representing a 360-degree imaging of that section of the article 1410 at that position. The robotic arm 1484 then moves the imaging unit 1422 to a second section of the article 1410 (i.e. the robotic arm 1484 moves the imaging unit 1422 further from article manipulator 1420a to image the second section) for similar 360-degree imaging at the second position. This process can be repeated along the length of the article 1410 (e.g. from an end proximal to article manipulator 1420a, to an end proximal to article manipulator 1420b).

[0373] Referring now to Figure 15, shown therein is block diagram 1500 illustrating communication between components in a visual inspection system of the present disclosure, according to an embodiment. In Figure 15, solid lines between components indicate data communication between the connected components. Such communication may be wired or wireless depending on the type of connection.

[0374] The visual inspection system includes an AI visual inspection machine 1502 and a production machine 1504. The production machine 1504 may be the last process performed on the article to be inspected. The production machine 1504 generates an output that is provided as input to the AI visual inspection machine 1502. The AI visual inspection machine 1502 may be the visual inspection system 100 of Figure 1.

[0375] The production machine 1504 includes a PLC 1506 (denoted “PLC 2”) and an Ethernet/IP module 1508. The Ethernet/IP module 1508 facilitates communication between the PLC 1506 and components of the AI visual inspection machine 1502. The PLC 1506 is a controller of the production machine 1504. The PLC 1506 communicates with the PLC 1514 (described below) of the inspection machine 1502 for integration purposes (e.g. interface I/O signals) that facilitate proper operation of the production machine 1504 and the visual inspection machine 1502.

[0376] The AI visual inspection machine 1502 includes a robot 1510 and an Ethernet/IP module 1512. The robot 1510 may be the robotic subsystem 1482 of Figure 14 and may include a robotic arm (e.g. robotic arm 1484) for manipulating an imaging unit of the AI visual inspection machine 1502. The Ethernet/IP module 1512 facilitates communication between the robot 1510 and components of the AI visual inspection machine 1502 and the production machine 1504. In particular, the Ethernet/IP module 1512 may communicate with the Ethernet/IP module 1508 to facilitate communication between the robot 1510 and the PLC 1506 of the production machine 1504. In some cases, the PLC 1506 may not need to communicate with the robot 1510. However, in some other cases, Ethernet/IP communications may not be feasible directly between the PLC 1506 of the production machine 1504 and the PLC 1514 of the visual inspection machine 1502 (e.g. between a Mitsubishi PLC and a Keyence PLC) and may thus be achieved through the robot 1510. In such cases, the robot 1510 may act as a medium through which signals can be transferred between the PLC 1506 and the PLC 1514.

[0377] The AI visual inspection machine 1502 also includes a PLC 1514 (denoted “PLC 1”). The PLC 1514 may be the PLC 146 of Figure 1. The PLC 1514 communicates with other components of the AI visual inspection machine 1502 and production machine 1504 via an Ethernet/IP module 1516 and an Ethernet module 1518. In particular, the PLC 1514 communicates with the robot 1510 via communication between Ethernet/IP module 1516 and Ethernet/IP module 1512. The PLC 1514 may communicate with the PLC 1506 of the production machine 1504 via communication between Ethernet/IP module 1516 and Ethernet/IP module 1508.

[0378] The AI visual inspection machine 1502 also includes an automation component 1520. The automation component 1520 communicates directly with the PLC 1514 and includes components responsible for automating the inspection process (e.g. components of a mechanical inspection subsystem, such as subsystem 114 of Figure 1 or 1414 of Figure 14). The automation components 1520 may include, for example, stepper motors, actuators (e.g. cylinders), sensors, etc. The automation components 1520 communicate with the PLC 1514 through direct hard-wired connections so the PLC 1514 can perform the desired inspection sequence through the I/O signals. For example, the PLC 1514 may initiate an output signal to advance a cylinder (e.g. the first actuator as described above). When the cylinder is advanced, an input is provided back to the PLC 1514 to confirm the cylinder “Advanced” status, and so on.

[0379] The AI visual inspection machine 1502 includes an optics component 1522 and a lighting component 1524. The optics component 1522 may be the camera 128 of Figure 1. The lighting component 1524 may be the lighting mechanism 126 of Figure 1. The optics component 1522 and the lighting component 1524 compose an imaging unit of the AI visual inspection machine 1502.

[0380] The lighting component 1524 communicates directly with the PLC 1514.

[0381] The AI visual inspection machine 1502 includes an AI device 1526. The AI device 1526 may be the node device 148 of Figure 1. The AI device 1526 communicates directly with the optics component 1522.

[0382] The AI device 1526 communicates with the PLC 1514 via the Ethernet module 1518.

[0383] The AI visual inspection machine 1502 also includes a display/human-machine interface (HMI) 1528 and a human interface 1530. The display/HMI 1528 may be the display 158 or user control device 150 of Figure 1. The human interface 1530 may be the user interface 160 of Figure 1. The display/HMI 1528 communicates directly with the human interface 1530.

[0384] The display/HMI 1528 communicates with the PLC 1514 via the Ethernet module 1518.

[0385] The human interface 1530 communicates directly with the AI device 1526.

[0386] While the above description provides examples of one or more apparatus, methods, or systems, it will be appreciated that other apparatus, methods, or systems may be within the scope of the claims as interpreted by one of skill in the art.