Title:
SYSTEM AND METHOD FOR INSPECTING CONTAINERS USING MULTIPLE IMAGES OF THE CONTAINERS
Document Type and Number:
WIPO Patent Application WO/2017/117566
Kind Code:
A1
Abstract:
A system and method for inspecting containers by detecting reflected light from defects are described. A light source illuminates the contents of a container. One or more cameras are oriented to detect the light reflected by one or more defects contained within the container, which is reflected from two or more mirrors towards the camera or cameras.

Inventors:
CHEN JAMES (US)
GHORISHI WILLIAM (US)
FAIRBAIRN KEVIN (US)
DONG RUI-TAO (US)
HUDSON ALEXANDER (US)
Application Number:
PCT/US2016/069578
Publication Date:
July 06, 2017
Filing Date:
December 30, 2016
Assignee:
INDUSTRIAL DYNAMICS CO (US)
International Classes:
G01N21/896; G01N21/88; G01N21/94; G01N21/954; G01N21/958
Domestic Patent References:
WO2010090605A1 (2010-08-12)
Foreign References:
US6028302A (2000-02-22)
US4959537A (1990-09-25)
US7232233B2 (2007-06-19)
US5896195A (1999-04-20)
US8958064B2 (2015-02-17)
US5256871A (1993-10-26)
US20140324520A1 (2014-10-30)
US6448549B1 (2002-09-10)
Attorney, Agent or Firm:
ACHARYA, Nigamnarayan (US)
Claims:
CLAIMS

What is claimed is:

1. A system for detecting a defect in a transparent or translucent container, comprising:

a. a light source emanating light rays towards a region of interest of the container such that the light can reflect off the defect inside the container,

b. a sensor array to receive the reflected light rays,

c. a first mirror and a second mirror to reflect the reflected light rays to the sensor array, wherein the first mirror and the second mirror reflect the reflected light to the sensor array at different angles; and

d. a processor to receive at least two images from the sensor array and able to detect the defects in the container by analyzing the light pattern on the images.

2. The system of claim 1, wherein the light source is situated below or above the container.

3. The system of claim 1, further comprising a third mirror.

4. The system of claim 1, wherein the light source is collimated light.

5. The system of claim 1, wherein the light source is polarized light.

6. The system of claim 1, wherein the region of interest is a bottom half of the container.

7. The system of claim 1, wherein the region of interest is a bottom third of the container.

8. The system of claim 1, wherein the first mirror and the second mirror create an image of the container differing by between 2 degrees and 45 degrees.

9. The system of claim 1, further comprising a base above the light source.

10. The system of claim 1, wherein the processing is a deterministic analysis.

11. The system of claim 9, wherein a gap exists between the bottom of the container and the base.

12. The system of claim 1, wherein the sensor array is a camera.

13. The system of claim 1, wherein more than 100 bottles per minute are inspected by the system.

14. The system of claim 1, wherein a base axis and the first mirror axis are at an angle of about 20 to 25 degrees.

15. The system of claim 1, wherein a base axis and the second mirror are at an angle of about 10 to 30 degrees.

16. The system of claim 1, wherein the average viewing angle is between 10 and 35 degrees.

17. A method for detecting defects in a partially transparent container, comprising:

a. transporting the container for inspection;

b. directing light toward a region of interest of the container from the light source at different reflection angles;

c. receiving the reflected light rays to capture two images of the region; and

d. differentiating between different types of defects in the container by analyzing the light pattern from the two images.

18. The method of claim 17, further comprising transporting the container above a base having a light source under the base.

19. The method of claim 18, wherein the light is collimated light.

20. The method as claimed in claim 16, wherein by using two cameras disposed at different angles, it is determined whether a foreign substance is situated inside or outside each container.

21. The method of claim 17, wherein the analyzing is a deterministic analysis.

22. The method of claim 17, wherein the region is a bottom half of the container.

23. The method of claim 17, wherein the analyzing comprises multi-threading.

24. The method of claim 23, wherein the multi-threading comprises simultaneous analysis of multiple stereoscopic images, wherein the multiple stereoscopic images include the two stereoscopic images.

25. The method of claim 17, wherein the analyzing comprises pipelining.

26. The method of claim 25, wherein the pipelining comprises simultaneously analyzing the light pattern from the two stereoscopic images while acquiring stereoscopic images of a subsequent container.

27. A system for detecting a defect in a transparent or translucent container, comprising:

a. a light source, situated below or above the container, emanating light rays towards a region of interest of the container such that the light can reflect off a potential foreign defect inside the container,

b. a first sensor array receives the reflected light rays to capture a first image of the region of interest at a first angle,

c. a second sensor array receives the reflected light rays to capture a second image of the region of interest at a second angle, wherein the first angle and the second angle are different;

d. a processor to receive at least two images from the sensor array and able to detect the defect in the container by analyzing the light pattern on the images.

28. The system of claim 27, wherein the light source is situated below or above the container.

Description:
SYSTEM AND METHOD FOR INSPECTING CONTAINERS USING MULTIPLE

IMAGES OF THE CONTAINERS

CROSS-REFERENCE TO RELATED APPLICATION

This application claims the benefit of U.S. Provisional Application No. 62/273,644, filed December 31, 2015.

FIELD OF TECHNOLOGY

This disclosure relates generally to inspection of bottles and containers. More particularly, the disclosure relates to bottle or container inspection involving illumination and imaging to identify potential foreign objects and defects inside the bottle or container.

BACKGROUND

Beverages that are contained within bottles are produced, purchased, and consumed daily. Since these beverages are consumer products, they are subject to rigorous quality control and inspection requirements, which are often performed directly on the containers while on production lines. The production process includes various functions, such as washing the bottle, inspecting the bottle for defects, filling the bottle with a beverage, e.g., soda or beer, applying a closure and labeling the bottle. Quality inspection of the filled containers occurs when the bottles run single-file, at the out-feed of the filler or the in-feed or out-feed of the labeling machine.

Cost-effective inspection solutions for detecting small defects (e.g., 1 to 3 mm) have remained elusive. Fragments of glass and other physical contaminates can lie at the bottom of a container and avoid detection. Failure to detect contaminates has led to expensive product recalls and damage to brands. To date, in spite of extensive development programs by multiple inspection companies, few effective inspection solutions have materialized to reliably detect, e.g., small, randomly shaped physical contaminates; many of the existing methods have suffered from false "positive" detections, which add cost.

US Patent No. 7,982,868 discloses a method and apparatus for detecting foreign substances in a container using two or more different orientations of two or more cameras, in which the cameras and light sources are mutually connected such that in a short time two or more images of a container filled with liquid can be recorded with mutually differing illumination. In this patent, a probability is determined that detected substances in fact represent undesirable foreign substances. In other words, decisions to reject containers are based upon probability distributions. The method and system in that patent are likely to cause the bottle to drag and fall over, rely on multiple images taken from cameras at different positions and times, and require high data processing and camera imaging rates.

There is always a need for improved methods and systems for inspecting bottles and containers. It is to this need, among others, that this disclosure is directed.

SUMMARY

This disclosure describes a system and method for inspecting bottles or containers to detect defects by utilizing a light source (e.g., directed light, collimated light or scattered light) aimed along the axis of the bottle and reflected or refracted by a physical contaminate into one or more mirrors, which then reflect that image into a camera or sensor array. The system and method disclosed herein can provide a bottle/container inspection component that performs multiple inspection functions in a smaller footprint than that of current systems.

An aspect of the present disclosure relates to a system for inspecting a bottle or container. The inspection component includes a light source that generates a light beam and a camera or cameras (sensor array or sensor arrays) that detect a portion of the light that is reflected or refracted by a contaminant or defect within the bottle. The system can include multiple mirrors, which allow for at least two images of a potential or identified defect per camera frame.

Another aspect includes a system for detecting a defect in a transparent or translucent container having a light source emanating light rays towards a region of interest of the container such that the light can reflect off the defect inside the container, a sensor array to receive the reflected light rays, and a first mirror and a second mirror to reflect the reflected light rays to the sensor array. The first mirror and the second mirror reflect the reflected light to the sensor array at different angles. A processor can receive at least two images from the sensor array and can detect the defect (singular or plural) in the container by analyzing the light pattern or combined pattern of light on the images. Additional cameras, mirrors, and images can be used to provide more complete data for analysis.

Another aspect includes a system for detecting a defect in a transparent or translucent container, having a light source emanating light rays towards a region of interest of the container such that the light can reflect off the defect inside the container. A first sensor array receives the reflected light rays to capture a first image of the region of interest at a first angle, and a second sensor array receives the reflected light rays to capture a second image of the region of interest at a second angle. The first angle and the second angle are different. A processor receives the images from the sensor arrays and is able to detect the defect in the container by analyzing the light pattern on the images. The light source can be situated below or above the container.

Another aspect includes a method for detecting defects in a partially transparent container including transporting the container for inspection; directing light toward a region of interest of the container from the light source such that the light reflects off the region of interest as reflected light rays extending at different reflection angles; receiving the reflected light rays to capture two images of the region; and differentiating between different types of defects in the container by analyzing the light patterns from the two images, each taken at a different viewing angle. The method can include transporting the containers above a base having a light source under the base. The light can be collimated light. The two or more sensor arrays may be disposed at different angles. The foreign substance or defect can be situated inside or outside each container. The analyzing can be or use a deterministic analysis.

BRIEF DESCRIPTION OF THE DRAWINGS

FIG. 1 shows an inspection system for detection of contaminants during inspection of a container, under an embodiment.

FIG. 2 shows a bottle positioned over a light source situated in an aperture plate, under an embodiment.

FIG. 3 shows a side view of the system with a container moving through the system and to a conveyer, under an embodiment.

FIG. 4 shows an inspection system for detection of contaminants during inspection of a container, under an embodiment.

FIG. 5 shows an inspection system for detection of contaminants during inspection of a container, under an embodiment.

FIG. 6 shows a contamination detection effectiveness response curve, under an embodiment.

FIG. 7 shows two images of the same bottle taken at different angles, under an embodiment.

FIG. 8 shows mapped images resulting from calibration map transformation, under an embodiment.

FIG. 9 shows a transaction line, under an embodiment.

FIG. 10 shows a region of interest, under an embodiment.

FIG. 11 shows the edge of a transaction line, under an embodiment.

FIG. 12 defines a cylindrical shape of a bottle, under an embodiment.

DETAILED DESCRIPTION

Specific embodiments relate to a system and a method for detecting defects (singular or plural) within bottles or containers. For example, a defect can be a glass fragment generated at various production line locations such as at the filler and the crowner of a bottle. Such defects include induced glass fragments that may be introduced during the filling process and may settle at the lowest part of the base of the bottle. False alarm features produced by mold marks, ASCII text and other embossing, glass texture, the discontinuous 'smiley face' artifact, water droplets, exterior foam bubbles, and interior glass inclusions such as bubbles or cracks can be reduced by certain embodiments. The false alarm features can occur outside of the bottle or within the container walls. In specific embodiments, the defects or objects of interest lie inside of the bottle.

Referring in more detail to the drawings, FIG. 1 illustrates a system or inspection component 100 for detection of defects during inspection of a container 200, which may be formed of a transparent or partially transparent material, for example, glass. In general, the inspection system or inspection component 100, capable of detecting artifacts and defects (e.g., sinking objects) through reflection or refraction of light inside containers, can distinguish the location of the object in the images in 3D space, allowing false alarm objects outside of the bottle and within the bottle walls to be reduced. The term "reflection" is used in this description to include both the reflection and refraction of light by a defect. The camera or sensor 10 is operatively connected to a processor, which determines whether there is a defect in the container 200.

As can be seen from FIG. 1, the inspection system or component 100 for detecting a defect in a bottle 200 has a light source 45, a camera or sensor array 10, and one or more mirrors 20, 25 in a platform. In one embodiment, the two mirrors 20, 25 are placed to maintain the same path length from the object to the camera 10. When the bottle 200 does not contain a defect, the emitted light generated by the light source 45 can pass substantially un-reflected through the bottle 200. However, when the bottle 200 contains a defect (e.g., a fragment) located within the path of the light 30, 35, the defect F reflects at least a portion of the light to mirror 20 and mirror 25, which is captured and/or measured by the camera 10. The light angles (from the mirrors) are about 10-20 degrees from the bottle 200 in some examples. In some examples, there can be a light source 45 below the container 200 and a gap between the bottom of the container 200 and the light source 45. The camera may comprise an imaging sensor or a processor using machine vision or vision software. Most containers or bottles 200 have mold marks on the wall of the bottle, such as mold dots, mold lines and alphanumeric marks, which may interfere with the imaging of contaminates inside the container. By combining information from these different images of the container 200, a three-dimensional (3D) position of the possible defect can be identified by the system 100. Multiple images of the same scene or view of the region of interest are used to create a 3D representation or stereoscopic view of the container 200, including the defect.

FIG. 2 shows an embodiment of the system 100 showing the bottle 200 over a light source 45 (e.g., collimated light) situated in a base or aperture plate 40. Collimated light 45 can pass through an aperture plate 40, can be reflected off certain defects in the bottle 200 in paths 30, 35 to mirrors 20, 25, and to the lens 15 of camera 10. The bottle 200 can be held (e.g., by belts 85) above the aperture plate 40 at a distance D. (Note that distance D is also shown in FIG. 3).

The distance D from the bottom of the bottle to the lower edge of the base or aperture plate 40 can depend on the physical dimensions of the bottle and can be, e.g., about 0.5 to 2 mm for a 65 mm diameter bottle. The diameter of the aperture can be sized to be approximately 90% of the diameter of the bottle 200. This enables the wall of the container to act as a light pipe, illuminating any physical contaminate in the path of the light while minimizing the illumination of physical discontinuities on the wall of the bottle, such as random mold marks (dots, alphanumeric marks, and lines), which can potentially confuse the machine vision algorithms used to detect contamination inside a bottle. The gap/distance D between the bottom of the container 200 and the plate 40 reduces mirroring artifacts created by the bottom of the bottle and the transport surface being in contact. Furthermore, it is common to use water and detergents as lubrication on bottling line conveyors, which can cause false imaging artifacts due to foam and/or water droplets. The gap can enable the use of an air knife to dry the bottom of a bottle to improve ease of contamination or defect detection. Furthermore, the gap can help prevent dragging of the bottle bottom, which may cause bottle tipping during the inspection process.

FIG. 3 shows a side view of the system 100 with the container 200 moving through the system and to the conveyer 88. In this example, moving belts 80 are positioned above the base of the container 200 to enable the viewing of the region of interest at the bottom of the bottle, as shown by the intersection of camera views/light paths 30 and 35 in the configuration shown in FIG. 2. The system includes two compliant belts 80 that are constrained by a backing plate 85. The moving belts 80 can pull the bottle 200 through the system before depositing the bottles back on a conveyor track 88. An alternative embodiment (not shown) is to hold the bottle by a mechanical means such as a clamp and use a mechanized transport system such as a starwheel or linear robotic type system.

In one example, a processor may be part of a glass container inspection computer including memory coupled to the processor, and one or more interfaces coupled to the processor and coupled to one or more input devices (e.g., image sensors, position sensors, user interfaces, etc.) and/or one or more output devices (e.g., light sources, material handlers, displays, etc.). The computer further may include any ancillary devices, for example, clocks, internal power supplies, and the like (not shown). The processor may process data and execute instructions that provide at least some of the functionality for the presently disclosed apparatus. As used herein, the term "instructions" may include, for example, control logic, computer software and/or firmware, programmable instructions, or other suitable instructions. The memory may include any computer readable medium or media configured to provide at least temporary storage of at least some data, data structures, an operating system, application programs, program modules or data, and/or other computer software or computer-readable instructions that provide at least some of the functionality of the presently disclosed apparatus and that may be executed by the processor. The data, instructions, and the like may be stored, for example, as look-up tables, formulas, algorithms, maps, models, and/or any other suitable format.

FIGs. 4 and 5 show additional embodiments of the inspection system 100. FIG. 4 shows an inspection system 100 in which the light 45 is reflected to a camera 10, with its associated lens 15 mounted adjacent to the bottle 200, to produce a single image which contains two views 30 and 35 of the region of interest at the bottom of the bottle. The two views are obtained from two mirrors 20 and 25, which are arranged to produce angularly offset images of contaminates or defects inside the bottle. Contaminates or defects can be distinguished from bottle artifacts that exist on the exterior of the bottle, such as mold markings and water droplets. Stereoscopic views or multiple views of the defect can be processed to determine whether the defect is a commercially significant defect.

FIG. 4 also shows an additional camera 11 and lens 16 used in conjunction with a beam splitter 27 to capture another view/light path 37 through the lower portion of the bottle 200. The inspection component includes a directed light source that emits light 45 and at least two cameras 10, 11 positioned to detect a portion of the light that is reflected by a defect (e.g., a fragment) in a bottle. This arrangement allows additional detection of certain types of defects that produce stronger reflection or shadowing of light as seen from beneath the bottle compared to that imaged by views 30 and 35. Combining views 30, 35 and 37 can further increase the detection capability of a system. An additional embodiment can use a strobe light to side-light the bottle 200 and create view 37 to capture a dark field image of possible contamination in the bottle.

FIG. 5 shows another embodiment of the system 100 with a third mirror 29 that is introduced at an orientation off axis to mirrors 20 and 25 to produce additional images. The information in view 39 can be compared to the stereoscopic view produced by views/light paths 30 and 35 to provide more accurate detection of defects. The information can provide further definition as to the size and shape of the detected contaminant.

FIG. 6 presents an exemplary contamination detection effectiveness response curve, which summarizes results from experiments to determine the optimum angles at which to image the contamination defects in the presence of mold marks and lines. At shallow angles, the mold marks (dots and alphanumeric marks) have less effect as they are less visible; however, the mold lines are very visible and the overall contamination detection rate is subdued. At high angles of imaging, which would be ideal for imaging contaminates or defects, the mold marks are very visible, while the mold line is significantly less visible. The optimized viewing angles are thus chosen to be approximately 10 to 35 or 20 to 25 degrees to minimize the effect of random mold marks, with the mirror angle differential being approximately 10 degrees.

FIG. 7 shows two images of the same scene or bottle 200 taken at different angles. In this example, two images at about 10 and about 20 degree angles are shown in the same camera frame (image), acquired using the same light source and taken at the same time (no relative motion). The pair of images are combined and analyzed to determine whether the bottle contains a defect. The idea is to discriminate foreign defects from other bottle features and artifacts based on 3D positions and 3D morphological attributes using stereo vision with a set of mirrors, a set of cameras, or a combination thereof. The spatial position and direction of the cameras is determined using a prior imaging calibration and fitting procedure. For each pair of images of the object under inspection, an object fitting algorithm is run to identify and correlate the objects in the two scenes and derive the 3D position of the objects (e.g., using the prior calibration model). For example, the walls of a glass bottle can be approximated using the curvature present in a semicircular artifact that comes from the heel and wall of the bottle (the "smiley face" artifact). This artifact describes the curved surface of the glass wall, specific to each bottle imaged. It is important to note that bottles vary in their dimensions, especially in the thickness of the glass base and the diameter (ovalization) of the bottle, as they are formed from different molds, and the mold is loaded with varying quantities of glass material. Therefore, utilizing prior information on glass bottle dimensions is only approximate. In certain examples, it may be optimal to compensate for bottle-to-bottle variation.
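
The 3D-position step described above can be sketched in code. The fragment below is a minimal illustration only, not the disclosed object fitting algorithm: the projection matrices, the matched pixel coordinates, and the wall radius are assumed values, and OpenCV's standard triangulation stands in for the correlation step.

```python
# Minimal sketch (not the patented implementation): triangulate a candidate
# defect matched in two calibrated views and test whether it lies inside an
# approximated cylindrical bottle wall.  P1 and P2 are 3x4 projection
# matrices from a prior calibration; the pixel coordinates are hypothetical.
import numpy as np
import cv2

P1 = np.loadtxt("view1_projection.txt")   # assumed 3x4 matrix from calibration
P2 = np.loadtxt("view2_projection.txt")   # assumed 3x4 matrix from calibration

pt1 = np.array([[412.0], [233.0]])        # candidate location in view 1 (px, assumed)
pt2 = np.array([[398.0], [241.0]])        # same candidate matched in view 2 (px, assumed)

# Triangulate to homogeneous coordinates, then normalize to a 3D point.
X_h = cv2.triangulatePoints(P1, P2, pt1, pt2)
X = (X_h[:3] / X_h[3]).ravel()            # (x, y, z) in the calibration frame

# Approximate the bottle as a vertical cylinder of radius r_wall centered on
# the light-source axis (an assumption made purely for illustration).
r_wall = 32.5                             # mm, hypothetical inner radius
inside_walls = np.hypot(X[0], X[1]) < r_wall
print("3D position:", X, "inside bottle walls:", inside_walls)
```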

It is desirable to minimize the separation between camera and object and to image at a large aperture to increase the angular coverage of the reflected illumination from the fragment. Two or more independent cameras can be used to form the images, placed at different angles, but the physical size of the camera and lens can limit the angle of separation, or standoff distance, between the images to be somewhat larger than may be desirable. A small angle (1-5 degrees) has been shown to have insufficient depth resolution for the inspection application. Too large a separation (e.g., 25-30 degrees in the vertical plane) results in losing reflections from an object in one of the views, complicating the object fitting. The desired angle for certain embodiments is different in the horizontal plane, where there is more symmetry in the optical illumination. This has only been initially investigated by generating 3D reconstructions from two cameras separated by approximately 5 and about 25 degrees.

The optical distortion characteristics of the lens are compensated for in the calculation, as well as the spatial position and direction of the cameras. An imaging calibration phantom is placed in the inspection area. The phantom contains high-contrast features at known positions. These are used to accurately calculate the position and direction of the imaging cameras relative to the inspection object. This is used as reference data to process subsequent images and to produce 3D data. The calibration can be re-taken when the cameras or mirrors are physically moved. The inspection component 100 can include two or more mirrors. While two mirrors are shown in FIG. 1, two or more mirrors or cameras may be used to generate images of the inside of the bottle 200.
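
A hedged sketch of that calibration step follows, assuming a phantom whose high-contrast features have known 3D coordinates, a pinhole camera model with standard lens distortion, and data files that are stand-ins rather than part of the disclosure. It also writes out the projection matrix consumed by the triangulation sketch above.

```python
# Sketch of the calibration step: recover a camera's position and orientation
# relative to the inspection area from a phantom with high-contrast features
# at known positions.  Not the disclosed procedure; the data files and
# distortion model are assumptions.
import numpy as np
import cv2

object_pts = np.loadtxt("phantom_points_3d.txt", dtype=np.float32)   # Nx3, known positions (mm)
image_pts = np.loadtxt("phantom_points_2d.txt", dtype=np.float32)    # Nx2, detected in the image (px)

# Intrinsics and distortion from a prior lens calibration (assumed available).
K = np.loadtxt("camera_matrix.txt", dtype=np.float64)                # 3x3 camera matrix
dist = np.loadtxt("distortion_coeffs.txt", dtype=np.float64)         # k1, k2, p1, p2, k3

# Solve for the camera pose (rotation + translation) relative to the phantom.
ok, rvec, tvec = cv2.solvePnP(object_pts, image_pts, K, dist)
R, _ = cv2.Rodrigues(rvec)

# Build the 3x4 projection matrix used later for triangulation / 3D mapping.
P = K @ np.hstack([R, tvec.reshape(3, 1)])
np.savetxt("view1_projection.txt", P)
print("camera position in phantom frame:", (-R.T @ tvec).ravel())
```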

In yet another embodiment, an extension of this technique includes utilizing data from 4 cameras, in addition to the 2 scenes per camera. The 4 cameras generate 'smiley face' artifacts from different and overlapping angles. This permits a more accurate assignment of bottle wall position to be made, when compared to data from one camera alone. Additional mirrors in the horizontal plane create additional stereoscopic image data that can be used to improve 3D accuracy.

In yet another embodiment, two cameras can be used instead of two images generated using two mirrors and one camera.

In yet another embodiment, optimization of the inspection component 100 may involve ensuring that the defect of the bottle 200 is capable of reflecting at least a portion of the light at a camera 10. Therefore, more than one camera 10 may be utilized. As depicted, these cameras 10 may be located on a single plane of a bottle 200 so as to create stereoscopic images. The cameras 10 may be oriented at different angles with respect to each other and the bottle 200 without departing from the scope of the present disclosure.

Further, camera positioning, illumination, and stereo matching may be adjusted and optimized. Camera positioning (x, y, z) and the camera lens model calibration may be adjusted to capture the reflected light. The level and direction of illumination may be adjusted to produce an adequate signal. For example, back-lit or side-lit scenes are an extension of this embodiment. Sufficiently robust and accurate stereo matching may produce improved detection, as well as permitting discrimination of false alarms.

In yet another embodiment, the inspection component 100 may include reflective structures that reflect and concentrate the reflected portions of the light. The reflected portion of the light may engage two reflective structures prior to reaching a camera. The inspection component may also include a means to convey the bottle over the light source that may be about 300 mm long. Furthermore, the inspection component may have one or many illumination sources, either continuous or strobed, or a combination thereof. Moreover, the inspection component may include support belts that guide the bottle into an inspection position. Additionally, the inspection component may include a conveyor belt system having a length less than about 1200 mm.

Additional cameras may be oriented within the inspection component to perform other detections, such as fill level detection, floating object and sinking object inspection, and bubble detection. In certain examples, the camera of the inspection component may be offset about 20 or about 10 degrees from horizontal and/or about 70 or 80 degrees from an axis of the light. In other examples, the camera of the inspection component may be offset from about 10 to about 20 degrees from horizontal and/or from about 70 to about 80 degrees from an axis of the light. The light may have a diameter substantially equal to that of an inner diameter of the bottle.

In another embodiment, the system can use images obtained by using a position trigger on the bottle transport line to strobe the light source for a short duration and take a single image with the camera. In certain examples, the camera 10 has a frame rate no higher than 50 frames per second for typical high speed bottling lines, which reduces the cost of the overall system.
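
A minimal sketch of such a trigger-driven acquisition loop is shown below; the PositionTrigger, Strobe, and Camera wrappers and the timing values are hypothetical stand-ins for plant hardware, not interfaces described in this disclosure.

```python
# Sketch of strobe-triggered single-frame acquisition.  The trigger, strobe
# and camera objects are hypothetical hardware wrappers; the timing values
# are assumptions for illustration only.
import time

STROBE_DURATION_S = 0.0002     # short flash to freeze bottle motion (assumed)

def acquisition_loop(trigger, strobe, camera, process_image):
    """Fire the strobe and grab one frame each time a bottle reaches the
    inspection zone, keeping the camera frame rate at or below ~50 fps."""
    last_capture = 0.0
    while True:
        trigger.wait_for_bottle()                  # blocks until the position trigger fires
        # Enforce a minimum inter-frame period of 20 ms (50 fps ceiling).
        wait = 0.02 - (time.monotonic() - last_capture)
        if wait > 0:
            time.sleep(wait)
        strobe.fire(duration_s=STROBE_DURATION_S)  # illuminate for a short duration
        frame = camera.grab_single_frame()         # one image per bottle
        last_capture = time.monotonic()
        process_image(frame)
```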

In another embodiment, the region of interest or the area where the defect resides is the bottom half of the container or the bottom third of the container. While the invention utilizes one camera in its simplest implementation, additional camera and mirror sets can be used to cover larger regions of interest at the base of the bottle and in some bottle types, increase the reliability of contamination detection. An alternative to using two mirrors is to use two direct view cameras with the higher costs and data handling needs. A further embodiment of this design (with reference to FIGs. 1-5) uses two or more collimated light sources 45 in plate 40, with corresponding additional camera 10, lens 15 and mirrors 20, 25 mounted coincident with each light source 45, in conjunction with moving the two belts 80 at different speeds to rotate the bottle as it moves over plate 40. The orientation of the bottle at each light source 45 will be different enabling additional inspection coverage of the area within the bottle by the imaging systems.

Another embodiment includes the use of optical elements such as mirrors and lenses to produce a collimated light beam.

Another embodiment includes the use of the light passing through a bottle to image contaminates (partially or fully floating), determine the fill level, and measure the degree of foaming. Another embodiment includes a system for detecting a defect in a transparent or translucent container, having a light source emanating light rays towards a region of interest of the container such that the light rays can reflect off the defect inside the container, a first sensor array that receives the reflected light rays to capture a first image of the region of interest at a first angle, and a second sensor array that receives the reflected light rays to capture a second image of the region of interest at a second angle. The first angle and the second angle are different. A processor receives at least two images from the sensor arrays and is able to detect the defect in the container by analyzing the light pattern on the images. The light source can be situated below or above the container.

The term "bottle" or "container" can include a transparent or translucent container. In some instances, the bottle can be tinted bottles and or dark liquids, and the wavelength of illumination light would have to be in the near infrared portion of the spectrum. Furthermore, polarized light can be employed to detect certain types of visible contaminants such as cellophane or to deal with very reflective containers.

The inspection component 100 may further include reflective structures that reflect and concentrate the reflected portion of the light as it moves from the bottle to a camera. The inspection component also includes reflective structures positioned to reflect and concentrate the reflected portion of the light prior to the portion of the light reaching a camera lens. As depicted, the reflective structures are planar structures having planar surfaces. However, reflective structures with other geometric structures and non-planar, i.e., convex and concave, surfaces may be utilized. The reflective structures may be configured into a dual image mirror system wherein each reflected portion of the light engages two reflective structures prior to being measured by a camera. However, the light may engage more or less than two reflective surfaces prior to being measured by a camera without departing from the scope of this disclosure.

The inspection component disclosed herein may be further utilized in other applications. Illuminating the base of a bottle allows for fill level inspection to be observed. Moreover, the inspection component may further be used to conduct floating and sinking foreign object inspection.

One embodiment includes a method for inspecting a bottle. This method for detecting defects in a partially transparent container can include transporting the container for inspection; directing light toward a region of interest of the container from the light source at different reflection angles; receiving the reflected light rays to capture two stereoscopic images of the region; and differentiating between different types of defects in the container by analyzing the light pattern from the two, e.g., stereoscopic images. A conveyor belt and support belts can maneuver the bottle/container into an inspection position proximate to or above a light source. The light source may be a laser diode, an infrared source, or another type of light source (e.g., Xenon strobe, Tungsten, Quartz Halogen, laser, visible, UV, and IR). Using a camera, a portion of the light reflected by a defect within the bottle is detected. It is also possible to utilize electronic sensors instead of, or in supplementation of, cameras. The light may be transmitted through a base of the bottle toward a neck portion of the bottle.

Various embodiments described above leverage a complex distributed computing system including a Human Machine Interface (HMI), used for setting up and configuring the inspection system; a Line Control Module (LCM), used for bottle-tracking, camera and strobe triggering, and rejecter control; and a Vision Engine, supporting the execution of real-time computer vision algorithms on the acquired images.

To support real-time inspection requirements in excess of 1500 bottles per minute, the Vision Engine component can be realized as a real-time, multi-threaded software application. The real-time aspect of the system is a requirement to ensure an inspected bottle can be categorized as a defective or non-defective product within milliseconds of the bottle reaching its inspection zone(s). The optional multi-threaded realization enables performance optimizations by allowing multiple captured images of the bottle to be processed concurrently. Additionally, multi-threading can be used to support pipelining of the various processes, including frame-grabbing (acquiring the image) for the next bottle while the images of the current bottle are being processed.
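
A minimal sketch of this pipelining idea follows, assuming a Python realization with one acquisition thread feeding a bounded queue and a pool of worker threads analyzing the views of each bottle; the grab_images, analyze_view, and reject callables are hypothetical stand-ins, not the Vision Engine's actual interfaces.

```python
# Sketch of the pipelining concept: one thread acquires images for the next
# bottle while worker threads analyze the views of the current bottle.
import queue
import threading
from concurrent.futures import ThreadPoolExecutor

frames = queue.Queue(maxsize=4)            # bounded buffer between the two stages

def grabber(grab_images, n_bottles):
    """Acquisition stage: keep grabbing images of subsequent bottles."""
    for bottle_id in range(n_bottles):
        frames.put((bottle_id, grab_images(bottle_id)))
    frames.put(None)                       # sentinel: no more bottles

def analyzer(analyze_view, reject):
    """Processing stage: analyze all views of each bottle concurrently."""
    with ThreadPoolExecutor(max_workers=4) as pool:
        while True:
            item = frames.get()
            if item is None:
                break
            bottle_id, views = item
            # One thread per view; reject the bottle if any view finds a defect.
            if any(pool.map(analyze_view, views)):
                reject(bottle_id)

def run_pipeline(grab_images, analyze_view, reject, n_bottles):
    t = threading.Thread(target=grabber, args=(grab_images, n_bottles))
    t.start()
    analyzer(analyze_view, reject)
    t.join()
```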

While the multi-threading software infrastructure is a critical element of the systems and methods described herein, the real-time algorithms used to concurrently process the captured images are key. It is important to note that the system and method of contaminant analysis described herein departs from a probabilistic model and moves towards a deterministic model. Under a probabilistic model, a probability is assigned to detected objects, i.e., a probability distribution determines the probability that an object is a contaminant. This means that an acceptable object (e.g., a mold-mark on the bottle) may be identified as a contaminant. Further, a contaminant may be identified as safe. In contrast, a deterministic model produces the same output from a given starting condition or initial state. In other words, given the same input images of the bottle and the same sensitivity threshold settings in the software, the algorithms always produce the same results. The advantage of this system is that, through offline sweeps of thousands of known good and bad bottle images, the software can be trained so that the presence of a contaminant always results in the positive detection/identification of the contaminant.

Closely related to the multi-threaded, deterministic infrastructure of this inspection system is the correlation of data processed by the various threads. These correlations can be used to discriminate various exterior features of the bottles, such as water droplets or dirt, from defects lying inside the bottle. Additionally, these correlations can be used to minimize the run-time of the algorithms for a given bottle by labeling the bottle as defective as soon as any one of the inspection threads finds a defect inside the bottle.

The computer vision algorithms supporting such an application include a combination of 2D and 3D algorithms, where 2D algorithms are used to filter the images using various techniques including, but not limited to, grayscale, texture and pixel concentration thresholding. If the result of these low-cost 2D algorithms cannot absolutely declare the bottle free of defects, then a more extensive 3D triage of the images is invoked. The sophistication of the 2D algorithms used can be configured based on various bottle types and their content.
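
A hedged sketch of that two-stage triage is shown below; the thresholds, kernel size, and the run_3d_triage callable are illustrative assumptions rather than the disclosed algorithms, with OpenCV standing in for the low-cost 2D filtering stage.

```python
# Sketch of the 2D-then-3D triage: cheap 2D filters run first, and the
# expensive 3D analysis only runs when the 2D stage cannot clear the bottle.
import cv2
import numpy as np

GRAY_THRESH = 60          # grayscale threshold for dark candidate pixels (assumed)
TEXTURE_THRESH = 18.0     # local-variance threshold for "textured" regions (assumed)
PIXEL_COUNT_THRESH = 25   # minimum concentration of suspect pixels in the ROI (assumed)

def bottle_needs_3d_triage(roi_gray: np.ndarray) -> bool:
    """Return True if the 2D filters cannot absolutely declare the 8-bit
    grayscale ROI free of defects."""
    # Grayscale thresholding: dark blobs may be fragments or shadows.
    _, dark = cv2.threshold(roi_gray, GRAY_THRESH, 255, cv2.THRESH_BINARY_INV)

    # Texture thresholding via local variance (mean of squares minus squared mean).
    f = roi_gray.astype(np.float32)
    mean = cv2.blur(f, (7, 7))
    var = cv2.blur(f * f, (7, 7)) - mean * mean
    textured = (var > TEXTURE_THRESH).astype(np.uint8) * 255

    # Pixel-concentration thresholding: count pixels that are both dark and textured.
    suspect = cv2.bitwise_and(dark, textured)
    return int(cv2.countNonZero(suspect)) >= PIXEL_COUNT_THRESH

def inspect(roi_gray, run_3d_triage) -> str:
    if not bottle_needs_3d_triage(roi_gray):
        return "Good"                       # the 2D stage clears the bottle
    return "Defective" if run_3d_triage(roi_gray) else "Good"
```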

The following sequence can be one possible realization of the inspection algorithm (a code sketch summarizing it follows the list):

1. Calibrate camera(s) with a calibration target to assess relative positioning of the camera(s), mirror(s) and the inspection station.

2. Grab images from camera(s), and dissect the stereoscopic image. Apply the calibration map transformation to both images, in a stereoscopic scene, to get mapped images. FIG. 8.

3. In 2D, leverage the gap between the aperture where the light emanates and the base of the bottle to define the transaction line 910, a.k.a. the "smiley". This smiley defines the inner base of the bottle. FIG. 9.

4. Find the bottle base contour and define the Region Of Interest (ROI) 1010. FIG. 10.

5. Using various 2D segmentation methods, define the edges 1110 of the "smiley"/transaction line. FIG. 11.

6. Compensate for scenarios where the transaction line is not fully pronounced in the image.

7. Use 2D filters to smooth out the transaction line.

8. Based on the revised smileys, allocate ROIs for each view.

9. Scan the previously defined ROI using various 2D segmentation techniques for potential defects. If none exist, the bottle can be declared a "Good" bottle and the algorithm can terminate at this point.

10. Use binocular disparity to generate a 3D model of the bottle, based on the above ROIs.

11. Filter out the noise generated by air bubbles or small particles in the liquid content.

12. Use 3D point thresholding to determine the potential presence of defects in the bottle.

13. If no points violate the set thresholds, the bottle can be declared a "Good" bottle and the algorithm can terminate at this point.

14. Once again, binocular disparity information on the transaction lines can be leveraged to define the cylindrical shape of the bottle and hence determine which 3D points lie inside and outside of these walls. FIG. 12.

15. Position, density, contour and other thresholded attributes can then be used to declare whether there are any defects inside the bottle.
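
The sketch below strings the steps above together as a control-flow skeleton. It is illustrative only: every helper on the hypothetical `steps` object is an assumed stand-in for the corresponding numbered step, and only the ordering and early-exit logic follow the list.

```python
# Skeleton of the inspection sequence above (steps 2-15).  Each helper named
# here is a hypothetical stand-in for the corresponding step; only the control
# flow (early "Good" exits, 2D stage before 3D stage) mirrors the list.
def inspect_bottle(raw_frame, calib, steps):
    # Step 2: split the stereoscopic frame and apply the calibration map.
    left, right = steps.dissect_stereo(raw_frame)
    left = steps.apply_calibration_map(left, calib)
    right = steps.apply_calibration_map(right, calib)

    # Steps 3-8: find the transaction line ("smiley"), compensate for weak
    # transaction lines, smooth it, and allocate an ROI for each view.
    rois = []
    for view in (left, right):
        smiley = steps.find_transaction_line(view)
        smiley = steps.smooth_transaction_line(steps.compensate_transaction_line(smiley))
        rois.append(steps.allocate_roi(view, smiley))

    # Step 9: cheap 2D scan of the ROIs; terminate early if nothing is suspicious.
    if not any(steps.scan_roi_2d(roi) for roi in rois):
        return "Good"

    # Steps 10-13: 3D model from binocular disparity, noise filtering, 3D thresholding.
    cloud = steps.filter_noise(steps.disparity_to_3d(rois[0], rois[1], calib))
    if not steps.threshold_3d_points(cloud):
        return "Good"

    # Steps 14-15: keep only points inside the cylindrical bottle walls and
    # apply position/density/contour thresholds for the final decision.
    inside = steps.points_inside_cylinder(cloud, steps.fit_cylinder(rois, calib))
    return "Defective" if steps.final_defect_decision(inside) else "Good"
```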

Although the present disclosure and its advantages have been described in detail, it should be understood that various changes, substitutions, and alterations can be made herein without departing from the spirit and scope of the disclosure as defined by the appended claims. Moreover, the scope of the present application is not intended to be limited to the particular configurations of the process, machine, manufacture, composition of matter, means, methods, and steps described in the specification. As one of ordinary skill in the art will readily appreciate from the present disclosure, processes, machines, manufacture, compositions of matter, means, methods, or steps presently existing or later to be developed that perform substantially the same functions or achieve substantially the same result as the corresponding configurations described herein may be utilized according to the present disclosure. Accordingly, the appended claims are intended to include within their scope such processes, machines, manufacture, compositions of matter, means, methods, or steps.