
Title:
BELT INSPECTION SYSTEM AND METHOD
Document Type and Number:
WIPO Patent Application WO/2018/076053
Kind Code:
A1
Abstract:
A system (10) is provided for inspecting and determining the state of a belt (12) of a particular conveyor system (28). In particular, the system (10) permits determination of particular characteristics of the belt (12) such as the thickness of the belt (12), the location of particular features of interest (referred to as singularities) along the belt (12), the rate of change of the thickness of the belt (12) and the rate of change of the particular singularities. For this, the system (10) comprises data capturing means (14) for capturing data representative of parts of the conveyor system (28) such as the belt (12) and at least one pulley (16). The data capturing means (14) may comprise one or more digital cameras (18); in particular arrangements, the data capturing means (14) may also include lasers (26) for emitting laser beams (see figure 10) providing supplemental information to complement the information provided by the camera(s) (18), with the intention of improving the accuracy of the output of the process for determining particular characteristics of the belt (12).

Inventors:
COLLEY BENJAMIN DONALD (AU)
WEBB CALLUM MICHAEL (AU)
WALE CRAIG JEFFREY (AU)
BLAIR LINDEN ARTHUR (AU)
THOMAS QUENTEN OWEN LUKE (AU)
MCCLEERY THOMAS MICHAEL (AU)
Application Number:
PCT/AU2017/051168
Publication Date:
May 03, 2018
Filing Date:
October 24, 2017
Assignee:
WEARHAWK PTY LTD (AU)
International Classes:
B65G43/02; B65G15/00; B65G17/00; B65G43/06; G01B11/00; G01B11/06
Foreign References:
KR101466637B1 (2014-12-10)
US20130058560A1 (2013-03-07)
US6988610B2 (2006-01-24)
US6831566B1 (2004-12-14)
Attorney, Agent or Firm:
WRAYS PTY LTD (AU)
Claims:

1. A system for inspecting a belt of a conveyor system, the conveyor system comprising at least one end section having a pulley having an outer surface for the belt to move around the pulley abutting the outer surface, the system comprising processing means for receiving image data captured by at least one camera of an edge of the belt while surrounding the pulley, and of first and second outer edges of the pulley; wherein the processing means is adapted to measure the distance between (1) first and second outer edges of the edge of the belt and (2) first and second outer edges of the pulley.

2. A system according to claim 1 wherein the camera is arranged overhead of an end section of the conveyor system such that the camera takes a top view of the edge of the belt and the outer edges of the pulley.

3. A system according to claim 1 or 2 wherein the processing means are adapted to infer an edge of the pulley, extending from the outer edges of the pulley, that represents the outer surface of the pulley abutted by the belt.

4. A system according to any one of the preceding claims wherein the configuration of the edge of the pulley extending from the outer edges of the pulley is provided to the processing means by a source external to the system.

5. A system according to any one of the preceding claims wherein the processing means are adapted to extract a crest profile of the edge of the belt and a crest profile of the edge of the pulley.

6. A system according to any one of the preceding claims wherein the processing means are adapted to determine the distance between the outer surface of the edge of the belt and the outer surface of the pulley along the entire width of the belt by relating the crest profile of the edge of the belt and the crest profile of the edge of the pulley.

7. A system according to any one of the preceding claims wherein the processing means are adapted to measure the distance between each point of the outer surface of the belt and each counterpart point of the outer surface of the pulley to provide an indication of the thickness of the belt at each point of the belt extending from one side of the belt to the other side of the belt, thereby providing the profile of the upper surface of the edge of the belt.

8. A system according to any one of the preceding claims wherein the processing means are adapted to identify prominent edges in the image taken by the camera.

9. A system according to claim 8 wherein the prominent edges in the image comprise the edge of the belt and the outer edges of the pulley.

10. A system according to any one of the preceding claims wherein the processing means are adapted to calculate the thickness of the belt along the width of the edge of the belt by:

1. detecting the prominent edges to produce an edge image;

2. filtering the edge image;

3. identifying all of the pixels that represent the edge of the belt;

4. identifying all of the pixels that represent the outer edges of the pulley;

5. converting the pixel coordinates to real world coordinates;

6. inferring the line representing the edge of the pulley surrounded by the belt and located between the two outer edges of the pulley; and

7. calculating the thickness along the width of the edge of the belt by comparing the edge of the belt to the edge of the pulley.

11. A system according to claim 10 wherein the prominent edges in the image taken by the camera are identified by an operator.

12. A system according to any one of the preceding claims wherein the processing means are adapted to process images that depict the edge of the belt and the outer edge of the pulley in a non-orthogonal orientation with respect to the frame of the image.

13. A system according to claim 12 wherein the processing means correct the non-orthogonal orientation of the edges of the belt and the edges of the pulley prior to the step of detecting the edges of the belt and the pulley.

14. A system according to any one of the preceding claims wherein the system is adapted to collect a plurality of images of the edges of the belt and the edges of the pulley collected by the camera while the belt is moving around the pulley.

15. A system according to any one of the preceding claims wherein the processing means is adapted to store the thickness of the belt along the width of each edge of the belt for each of the images of the plurality of images, relating the thickness of each edge of the belt to (1) the time that the image was taken and (2) the position of each edge along the length of the belt.

16. A system according to claim 15 wherein the processing means are adapted to calculate the position of each edge of the belt.

17. A system according to any one of claims 15 and 16 wherein the position of each edge of the belt is determined using the speed of the belt as provided by any one of (1) a system controlling the speed of the belt, (2) RFID chip sensors detecting passing of the belt, (3) integration of the speed of the belt including appropriate error correction, (4) the time interval between images captured by the camera, (5) techniques of optical flow over subsequent images, and (6) use of reference points such as splices joining together the belt.

18. A system according to any one of the preceding claims wherein the processing means are adapted to generate a 3D model of the belt comprising an indication of the thickness of the belt at each point of the belt via concatenating the thickness of each edge of the belt.

19. A system according to any one of the preceding claims wherein the processing means are adapted to generate a pictorial representation of the profile of the belt using the 3D model.

20. A system according to claim 19 wherein the pictorial representation comprises any singularities existing on the belt including the thickness of each point of the singularity.

21. A system according to any one of the preceding claims wherein the system is adapted to store the data collected and generated by the processing means during a particular inspection of the belt in storage means.

22. A system according to claim 21 wherein the processing means is adapted to update storage means with data collected and generated by the processing means during one or more subsequent inspections of the belt.

23. A system according to claim 22 wherein the processing means will update the 3D model of the belt stored in the storage means in real time with each thickness profile extracted, and in conjunction with the position of each thickness profile extracted.

24. A system according to claim 22 wherein the process of updating the 3D model with a new profile and the position of that new profile involves overwriting the data in the 3D model that represents the profile nearest, along the length of the belt, to the position of the new profile.

25. A system according to claim 22 wherein the process of updating the 3D model with a new profile and the position of that new profile involves the application of an appropriate data-fusion algorithm, which may preferably include the incorporation of any other relevant information available to the system.

26. A system according to any one of claims 18 to 25 wherein the system is adapted to retain data of the 3D model representing the state of the belt taken over a particular period of time, providing a historical record of the inspection processes conducted on the belt.

27. A system according to claim 26 wherein the processing means are adapted to provide the rate of change of belt thickness over particular periods of time.

28. A system according to any one of claims 26 and 27 wherein the processing means are adapted to provide the rate of change of a particular singularity of the belt.

29. A system according to claim 28 wherein the system is adapted to store in storage means any data related to singularities that are present in the belt.

30. A system according to any one of claims 28 and 29 wherein the processing means are adapted to identify and select singularities in the belt using techniques of object recognition.

31. A system according to claim 30 wherein the processing means may include machine-learning software in order to determine whether the identified singularities represent damage.

32. A system according to any one of claims 30 and 31 wherein in the event that the processing means identifies a feature of the belt from the 3D model and categorises the feature as damage, then the footage of the section of the damaged belt is stored in storage means.

33. A system according to any one of claims 1 to 32 wherein the system is adapted to provide alternative belt thickness measurements for generating the 3D model of the belt.

34. A system according to any one of claims 28 and 29 wherein the alternative belt thickness measurements comprise lasers for shining laser beams onto the belt to define a laser line adjacent the edge of the belt.

35. A computer implemented method for inspecting a belt of a conveyor system, the conveyor system comprising at least one end section having a pulley having an outer surface for the belt to move around the pulley abutting the outer surface, the method comprising the steps of: receiving image data captured by at least one camera of an edge of the belt while surrounding the pulley, and of first and second outer edges of the pulley; and executing a program for measuring a distance between (1) first and second outer edges of the edge of the belt and (2) first and second outer edges of the pulley.

36. A method according to claim 35 wherein the processing means infer an edge of the pulley, extending from the outer edges of the pulley, that represents the outer surface of the pulley abutted by the belt.

37. A method according to claim 35 or 36 comprising using a configuration of the edge of the pulley extending from the outer edges of the pulley provided to the processing means by a source external to the system.

38. A method according to any one of claims 35 to 37 wherein the processing means extract a crest profile of the edge of the belt and a crest profile of the edge of the pulley.

39. A method according to any one of claims 35 to 38 wherein the processing means determine the distance between the outer surface of the edge of the belt and the outer surface of the pulley along the entire width of the belt by relating the crest profile of the edge of the belt and the crest profile of the edge of the pulley.

40. A method according to any one of claims 35 to 39 wherein the processing means measure the distance between each point of the outer surface of the belt and each counterpart point of the outer surface of the pulley to provide an indication of the thickness of the belt at each point of the belt extending from one side of the belt to the other side of the belt, thereby providing the profile of the upper surface of the edge of the belt.

41. A method according to any one of claims 35 to 40 wherein the processing means identify prominent edges in the image taken by the camera.

42. A method according to claim 41 wherein the prominent edges in the image comprise the edge of the belt and the outer edges of the pulley.

43. A method according to any one of claims 35 to 42 wherein the processing means are adapted to calculate the thickness of the belt along the width of the edge of the belt by:

1. detecting the prominent edges to produce an edge image;

2. filtering the edge image;

3. identifying all of the pixels that represent the edge of the belt;

4. identifying all of the pixels that represent the outer edges of the pulley;

5. converting the pixel coordinates to real world coordinates;

6. inferring the line representing the edge of the pulley surrounded by the belt and located between the two outer edges of the pulley; and

7. calculating the thickness along the width of the edge of the belt by comparing the edge of the belt to the edge of the pulley.

44. A method according to claim 43 wherein the prominent edges in the image taken by the camera are identified by an operator.

45. A method according to any one of claims 35 to 44 wherein the processing means process images that depict the edge of the belt and the outer edge of the pulley in a non-orthogonal orientation with respect to the frame of the image.

46. A method according to any one of claims 35 to 45 wherein the processing means correct the non-orthogonal orientation of the edges of the belt and the edges of the pulley prior to the step of detecting the edges of the belt and the pulley.

47. A method according to any one of claims 35 to 46 wherein the system collects a plurality of images of the edges of the belt and the edges of the pulley collected by the camera while the belt is moving around the pulley.

48. A method according to any one of claims 35 to 47 wherein the processing means store the thickness of the belt along the width of each edge of the belt for each of the images of the plurality of images, relating the thickness of each edge of the belt to (1) the time that the image was taken and (2) the position of each edge along the length of the belt.

49. A method according to any one of claims 35 to 48 wherein the processing means calculate the position of each edge of the belt.

50. A method according to any one of claims 35 to 49 wherein the position of each edge of the belt is determined using the speed of the belt as provided by any one of (1) a system controlling the speed of the belt, (2) RFID chip sensors detecting passing of the belt, (3) integration of the speed of the belt including appropriate error correction, (4) the time interval between images captured by the camera, (5) techniques of optical flow over subsequent images, and (6) use of reference points such as splices joining together the belt.

51. A method according to any one of claims 35 to 50 wherein the processing means generate a 3D model of the belt comprising an indication of the thickness of the belt at each point of the belt by concatenating the thickness of each edge of the belt.

52. A method according to any one of claims 35 to 51 wherein the processing means generate a pictorial representation of the profile of the belt using the 3D model.

53. A method according to claim 52 wherein the pictorial representation comprises any singularities existing on the belt including the thickness of each point of the singularity.

54. A method according to any one of claims 35 to 53 wherein the system stores the data collected and generated by the processing means during a particular inspection of the belt in storage means.

55. A method according to claim 54 wherein the processing means update storage means with data collected and generated by the processing means during one or more subsequent inspections of the belt.

56. A method according to claim 54 wherein the processing means update the 3D model of the belt stored in the storage means in real time with each thickness profile extracted, and in conjunction with the position of each thickness profile extracted.

57. A method according to claim 54 wherein the process of updating the 3D model with a new profile and the position of that new profile involves overwriting the data in the 3D model that represents the profile nearest, along the length of the belt, to the position of the new profile.

58. A method according to claim 54 wherein the process of updating the 3D model with a new profile and the position of that new profile involves the application of an appropriate data-fusion algorithm, which may preferably include the incorporation of any other relevant information available to the system.

59. A method according to any one of claims 35 to 58 wherein the system retains data of the 3D model representing the state of the belt taken over a particular period of time, providing a historical record of the inspection processes conducted on the belt.

60. A method according to claim 59 wherein the processing means provide the rate of change of belt thickness over particular periods of time.

61. A method according to claim 59 wherein the processing means provide the rate of change of a particular singularity of the belt.

62. A method according to claim 61 wherein the system is adapted to store in storage means any data related to singularities that are present in the belt.

63. A method according to any one of claims 35 to 62 wherein the processing means are adapted to identify and select singularities in the belt using techniques of object recognition.

64. A method according to claim 63 wherein the processing means apply machine-learning software in order to determine whether the identified singularities represent damage.

65. A method according to any one of claims 63 and 64 wherein in the event that the processing means identifies a feature of the belt from the 3D model and categorises the feature as damage, then the footage of the section of the damaged belt is stored in storage means.

Description:
Belt Inspection System and Method

CROSS REFERENCE TO RELATED APPLICATIONS

The entire disclosure of Australian Provisional patent application 2016904328 is incorporated herein by reference.

TECHNICAL FIELD

[0001] The present invention relates to a system for monitoring systems such as conveyors.

[0002] The invention has been devised particularly, although not necessarily solely, in relation to a system for monitoring conveyor belts.

BACKGROUND ART

[0003] The following discussion of the background art is intended to facilitate an understanding of the present invention only. The discussion is not an acknowledgement or admission that any of the material referred to is or was part of the common general knowledge as at the priority date of the application.

[0004] The present invention relates to conveyor systems for moving materials over a distance. Conveyor systems normally comprise one or more conveyor belts arranged such that bulk materials are moved from location to location via a belt which is laid over a number of driving pulleys to provide locomotion. Such conveyor belts are commonly used in the mining industry to move mineral bearing ore to ships, from where it is transported for smelting and mineral extraction.

[0005] Conveyor belts are commonly damaged in their normal operation, and are a constant cause of breakdown in operations involving movement of materials from one location to another. Breakdowns caused through the failure of the belt in the conveyor system are particularly difficult to deal with, as commonly the material being transported is still on the belt when the failure occurs. This makes it extremely difficult to make repairs or to replace the belt.

[0006] Presently and typically, to determine the health of a conveyor belt the belt is stopped in an unloaded state and a series of measurements are then taken at discrete points along the belt to give an indication of its remaining thickness. Such methods are time consuming and produce very little data, resulting in a low level of confidence in the actual belt condition. Furthermore, this method is unlikely to identify instances of isolated damage to the belt.

[0007] One alternative method uses a single camera positioned over a flat surface of the belt, and uses contrast analysis to estimate a surface profile and identify defects. This method cannot determine the absolute thickness of the belt.

[0008] Another alternative method uses a line laser and camera positioned over a flat section of the belt and determines the top surface profile by triangulation. This method does not allow for determination of absolute thickness unless a special idler pulley or similar reference surface is installed, requiring costly modifications to the conveyor design.

[0009] Another alternative method uses non-contact ultrasonic sensors to measure the thickness of the belt. This method cannot currently produce sufficient resolution to identify instances of isolated damage on the belt.

[0010] Another alternative method uses a plurality of discrete laser triangulation distance sensors on each side of a free-hanging section of belt to determine the thickness of the belt in narrow lines across its width. Due to the constrained number of sensors along the belt width, this method cannot currently produce sufficient resolution to identify instances of isolated damage on the belt. Furthermore this method does not capture natural images of the belt surface for visual confirmation and analysis of suspected defects.

[0011] Another alternative method uses a line laser and camera positioned over the conveyor belt tail pulley. This method cannot work without the line laser, which is unsuitable for some applications, and also increases complexity and cost when compared to the present invention.

[0012] It is against this background that the present invention has been developed.

SUMMARY OF INVENTION

[0013] According to a first aspect of the invention there is provided a system for inspecting a belt of a conveyor system, the conveyor system comprising at least one end section having a pulley having an outer surface for the belt to move around the pulley abutting the outer surface, the system comprising processing means for receiving image data captured by at least one camera of an edge of the belt while surrounding the pulley, and of first and second outer edges of the pulley; wherein the processing means is adapted to measure the distance between (1) first and second outer edges of the edge of the belt and (2) first and second outer edges of the pulley.

[0014] Preferably, the camera is arranged overhead of an end section of the conveyor system such that the camera takes a top view of the edge of the belt and the outer edges of the pulley.

[0015] Preferably, the processing means are adapted to infer an edge of the pulley, extending from the outer edges of the pulley, that represents the outer surface of the pulley abutted by the belt.

[0016] Preferably, the configuration of the edge of the pulley extending from the outer edges of the pulley is provided to the processing means by a source external to the system.

[0017] Preferably, the processing means are adapted to extract a crest profile of the edge of the belt and a crest profile of the edge of the pulley.

[0018] Preferably, the processing means are adapted to determine the distance between the outer surface of the edge of the belt and the outer surface of the pulley along the entire width of the belt by relating the crest profile of the edge of the belt and the crest profile of the edge of the pulley.

[0019] Preferably, the processing means are adapted to measure the distance between each point of the outer surface of the belt and each counterpart point of the outer surface of the pulley to provide an indication of the thickness of the belt at each point of the belt extending from one side of the belt to the other side of the belt, thereby providing the profile of the upper surface of the edge of the belt.

[0020] Preferably, the processing means are adapted to identify prominent edges in the image taken by the camera.

[0021] Preferably, the prominent edges in the image comprise the edge of the belt and the outer edges of the pulley.

[0022] Preferably, the processing means are adapted to calculate the thickness of the belt along the width of the edge of the belt by:

1. detecting the prominent edges to produce an edge image;

2. filtering the edge image;

3. identifying all of the pixels that represent the edge of the belt;

4. identifying all of the pixels that represent the outer edges of the pulley;

5. converting the pixel coordinates to real world coordinates;

6. inferring the line representing the edge of the pulley surrounded by the belt and located between the two outer edges of the pulley; and

7. calculating the thickness along the width of the edge of the belt by comparing the edge of the belt 12 to the edge of the pulley.

[0023] Alternatively, the prominent edges in the image taken by the camera are identified by an operator.

[0024] Preferably, the processing means are adapted to process images that depict the edge of the belt and the outer edge of the pulley in a non-orthogonal orientation with respect to the frame of the image.

[0025] Preferably, the processing means correct the non-orthogonal orientation of the edges of the belt and the edges of the pulley prior to the step of detecting the edges of the belt and the pulley.

[0026] Preferably, the system is adapted to collect a plurality of images of the edges of the belt and the edges of the pulley collected by the camera while the belt is moving around the pulley.

[0027] Preferably, the processing system is adapted to store the thickness of the belt along the width of each edge of the belt for each of the images of the plurality of images, relating the thickness of each edge of the belt to (1) the time that the image was taken and (2) the position of each edge along the length of the belt.

[0028] Preferably, the processing means are adapted to calculate the location of each edge of the belt.

[0029] Preferably, the location of each edge of the belt is determined using the speed of the belt as provided by any one of (1) a system controlling the speed of the belt, (2) RFID chip sensors detecting passing of the belt, (3) integration of the speed of the belt including appropriate error correction, (4) the time interval between images captured by the camera 18, (5) techniques of optical flow over subsequent images, and (6) use of reference points such as splices joining together the belt.

[0030] Preferably, the processing means are adapted to generate a 3D model of the belt 12 comprising an indication of the thickness of the belt at each point of the belt by concatenating the thickness of each edge of the belt.

[0031] Preferably, the processing means are adapted to generate a pictorial representation of the profile of the belt using the 3D model.

[0032] Preferably, the pictorial representation comprises any singularities existing on the belt including the thickness of each point of the singularity.

[0033] Preferably, the system is adapted to store the data collected and generated by the processing means during a particular inspection of the belt in storage means.

[0034] Preferably, the processing means are adapted to update storage means with data collected and generated by the processing means during one or more subsequent inspections of the belt.

[0035] Preferably, the processing means will update the 3D model of the belt stored in the storage means in real time with each thickness profile extracted, and in conjunction with the position of each thickness profile extracted.

[0036] Preferably, the process of updating the 3D model with a new profile and the position of that new profile involves overwriting the data in the 3D model that represents the profile nearest, along the length of the belt, to the position of the new profile.

[0037] Preferably, the process of updating the 3D model with a new profile and the position of that new profile involves the application of an appropriate data-fusion algorithm, which may preferably include the incorporation of any other relevant information available to the system.

[0038] Preferably, the system is adapted to retain data of the 3D model representing the state of the belt taken over a particular period of time, providing a historical record of the inspection processes conducted on the belt.
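By way of illustration only, the following is a minimal Python sketch of the kind of model update described in paragraphs [0036] and [0037]: locating the stored profile nearest to the position of a newly extracted profile and either overwriting it or blending it with the new measurement. The array layout, the belt_length parameter and the blending weight are assumptions made for this example and are not details taken from the specification.

```python
import numpy as np

def update_model(model, positions, new_profile, new_position, belt_length, alpha=None):
    """Update a stored belt model with a newly extracted thickness profile.

    model        : 2D array, one row per stored cross-belt thickness profile
    positions    : 1D array of belt positions associated with each row of the model
    new_profile  : 1D array of thickness values across the belt width
    new_position : position along the belt at which new_profile was measured
    belt_length  : total length of the endless belt, used to wrap positions
    alpha        : if None, overwrite the nearest stored profile (cf. [0036]);
                   otherwise blend old and new values with weight alpha, a simple
                   stand-in for a data-fusion step (cf. [0037])
    """
    # Distance along the belt, accounting for the belt being a closed loop.
    d = np.abs(positions - (new_position % belt_length))
    d = np.minimum(d, belt_length - d)
    i = int(np.argmin(d))          # row representing the nearest stored profile

    if alpha is None:
        model[i, :] = new_profile                                  # plain overwrite
    else:
        model[i, :] = alpha * new_profile + (1.0 - alpha) * model[i, :]
    return i
```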

[0039] Preferably, the processing means are adapted to provide the rate of change of belt thickness over particular periods of time.

[0040] Preferably, the processing means are adapted to provide the rate of change of a particular singularity of the belt.

[0041] Preferably, the system is adapted to store in storage means any data related to singularities that are present in the belt.

[0042] Preferably, the processing means are adapted to identify and select singularities in the belt using techniques of object recognition.

[0043] Preferably, the processing means may apply machine-learning techniques in order to determine whether the identified features represent damage.

[0044] Preferably, in the event that the processing means identifies a feature of the belt from the 3D model and categorises the feature as damage, then the footage of the section of the damaged belt is stored in storage means.

[0045] Preferably, the system is adapted to provide alternative belt thickness measurements for generating the 3D model of the belt.

[0046] Preferably, the alternative belt thickness measurements comprise lasers for shining laser beams onto the belt to define a laser line adjacent the edge of the belt.

[0047] According to a second aspect of the invention there is provided a computer implemented method for inspecting a belt of a conveyor system, the conveyor system comprising at least one end section having a pulley having an outer surface for the belt to move around the pulley abutting the outer surface, the method comprising the steps of: receiving image data captured by at least one camera of an edge of the belt while surrounding the pulley, and of first and second outer edges of the pulley; and executing a program for measuring a distance between (1) first and second outer edges of the edge of the belt and (2) first and second outer edges of the pulley.

[0048] Preferably, the processing means infer an edge of the pulley, extending from the outer edges of the pulley, that represents the outer surface of the pulley abutted by the belt.

[0049] Preferably, the method comprises using a configuration of the edge of the pulley extending from the outer edges of the pulley, provided to the processing means by a source external to the system.

[0050] Preferably, the processing means extract a crest profile of the edge of the belt and a crest profile of the edge of the pulley.

[0051] Preferably, the processing means determine the distance between the outer surface of the edge of the belt and the outer surface of the pulley along the entire width of the belt by relating the crest profile of the edge of the belt and the crest profile of the edge of the pulley.

[0052] Preferably, the processing means measure the distance between each point of the outer surface of the belt and each counterpart point of the outer surface of the pulley to provide an indication of the thickness of the belt at each point of the belt extending from one side of the belt to the other side of the belt, thereby providing the profile of the upper surface of the edge of the belt.

[0053] Preferably, the processing means identify prominent edges in the image taken by the camera.

[0054] Preferably, the prominent edges in the image comprise the edge of the belt and the outer edges of the pulley.

[0055] Preferably, the processing means calculate the thickness of the belt along the width of the edge of the belt by:

1. detecting the prominent edges to produce an edge image;

2. filtering the edge image;

3. identifying all of the pixels that represent the edge of the belt;

4. identifying all of the pixels that represent the outer edges of the pulley;

5. converting the pixel coordinates to real world coordinates;

6. inferring the line representing the edge of the pulley surrounded by the belt and located between the two outer edges of the pulley; and

7. calculating the thickness along the width of the edge of the belt by comparing the edge of the belt 12 to the edge of the pulley.

[0056] Alternatively, the prominent edges in the image taken by the camera are identified by an operator.

[0057] Preferably, the processing means process images that depict the edge of the belt and the outer edge of the pulley in a non-orthogonal orientation with respect to the frame of the image.

[0058] Preferably, the processing means correct any non-orthogonal orientation of the edges of the belt and the edges of the pulley prior to the step of detecting the edges of the belt and the pulley.

[0059] Preferably, the system collects a plurality of images of the edges of the belt and the edges of the pulley collected by the camera while the belt is moving around the pulley.

[0060] Preferably, the processing system stores the thickness of the belt along the width of each edge of the belt for each of the images of the plurality of images, relating the thickness of each edge of the belt to (1) the time that the image was taken and (2) the position of each edge along the length of the belt.

[0061] Preferably, the processing means calculate the location of each edge of the belt.

[0062] Preferably, the location of each edge of the belt is determined using the speed of the belt as provided by any one of (1) a system controlling the speed of the belt, (2) RFID chip sensors detecting passing of the belt, (3) integration of the speed of the belt including appropriate error correction, (4) the time interval between images captured by the camera 18, (5) techniques of optical flow over subsequent images, and (6) use of reference points such as splices joining together the belt.
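As a purely illustrative sketch of how options (3) and (4) above could be combined, the Python snippet below integrates sampled belt speed over the intervals between captured images to estimate the position, along the endless belt, of the edge captured in each image. The variable names, the trapezoidal integration and the example values are assumptions, not details from the specification.

```python
def edge_positions(capture_times, speed_samples, belt_length, start_position=0.0):
    """Estimate the belt position of the edge captured in each image.

    capture_times  : times (s) at which the camera captured each image
    speed_samples  : belt speed (m/s) sampled at the same times, e.g. from the
                     system controlling the speed of the belt
    belt_length    : total length of the endless belt (m)
    start_position : assumed position (m) of the edge in the first image
    """
    positions = [start_position % belt_length]
    for k in range(1, len(capture_times)):
        dt = capture_times[k] - capture_times[k - 1]
        # Trapezoidal integration of speed over the interval between images.
        advance = 0.5 * (speed_samples[k] + speed_samples[k - 1]) * dt
        positions.append((positions[-1] + advance) % belt_length)
    return positions

# Example: images every 0.1 s on a 500 m belt running at roughly 4 m/s.
times = [0.0, 0.1, 0.2, 0.3]
speeds = [4.0, 4.0, 4.1, 4.1]
print(edge_positions(times, speeds, belt_length=500.0))
```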

[0063] Preferably, the processing means generate a 3D model of the belt 12 comprising an indication of the thickness of the belt at each point of the belt by concatenating the thickness of each edge of the belt.

[0064] Preferably, the processing means generate a pictorial representation of the profile of the belt using the 3D model.

[0065] Preferably, the system stores the data collected and generated by the processing means during a particular inspection of the belt in storage means.

[0066] Preferably, the processing means update storage means with data collected and generated by the processing means during one or more subsequent inspections of the belt.

[0067] Preferably, the processing means updates the 3D model of the belt stored in the storage means in real time with each thickness profile extracted, and in conjunction with the position of each thickness profile extracted.

[0068] Preferably, the process of updating the 3D model with a new profile and the position of that new profile involves overwriting the data in the 3D model that represents the profile nearest, along the length of the belt, to the position of the new profile.

[0069] Preferably, the process of updating the 3D model with a new profile and the position of that new profile involves the application of an appropriate data-fusion algorithm, which may preferably include the incorporation of any other relevant information available to the system.

[0070] Preferably, the system is adapted to retain data of the 3D model representing the state of the belt taken over a particular period of time, providing a historical record of the inspection processes conducted on the belt.

[0071] Preferably, the processing means are adapted to provide the rate of change of belt thickness over particular periods of time.

[0072] Preferably, the processing means are adapted to provide the rate of change of a particular singularity of the belt.

[0073] Preferably, the system is adapted to store in storage means any data related to singularities that are present in the belt.

[0074] Preferably, the processing means are adapted to identify and select singularities in the belt using techniques of object recognition.

[0075] Preferably, the processing means may apply machine-learning techniques in order to determine whether the identified features represent damage.

[0076] Preferably, in the event that the processing means identifies a feature of the belt from the 3D model and categorises the feature as damage, then the footage of the section of the damaged belt is stored in storage means.

[0077] Preferably, the system is adapted to provide alternative belt thickness measurements for generating the 3D model of the belt.

[0078] Preferably, the alternative belt thickness measurements comprise lasers for shining laser beams onto the belt to define a laser line adjacent the edge of the belt.

BRIEF DESCRIPTION OF THE DRAWINGS

[0079] Further features of the present invention are more fully described in the following description of several non-limiting embodiments thereof. This description is included solely for the purposes of exemplifying the present invention. It should not be understood as a restriction on the broad summary, disclosure or description of the invention as set out above. The description will be made with reference to the accompanying drawings in which:

Figure 1 illustrates a particular arrangement of a system, for inspecting a conveyor system in accordance with the present embodiment of the invention;

Figure 2 shows a schematic side view of an assembly of a particular arrangement for inspecting a belt of the conveyor system;

Figure 3 shows a schematic top view of the particular arrangement for the belt of the conveyor system shown in figure 2;

Figure 4 shows a schematic top view of three successive images taken at consecutive times;

Figure 5 shows an enlarged view of detail A shown in figure 4;

Figure 6 is a flowchart describing the method for measuring the thickness of the belt using the images shown in figure 4;

Figure 7 illustrates an arrangement of a register recording the profile of edges of the belt obtained by processing the images shown in figure 4;

Figure 8a shows a pictorial representation of a 3D model of a particular section of the belt being inspected in figure 2;

Figure 8b shows a cross section of the pictorial representation shown in figure 8a along the line 8b-8b';

Figure 9 is a flowchart summarizing the method for generating the 3D model shown in figures 8a and 8b;

Figure 10 shows a schematic front side perspective of an assembly of another arrangement for inspecting the belt of the conveyor system; and

Figure 11 shows a schematic top perspective of the belt of the assembly shown in figure 10.

DESCRIPTION OF EMBODIMENTS

[0080] Figure 1 shows a particular arrangement of a system 10 in accordance with the present embodiment of the invention for inspecting and determining the state of a belt 12 of a particular conveyor system 28. In particular, the system 10 permits determination of particular characteristics of the belt 12 such as the thickness of the belt 12, the location of particular features of interest (referred to as singularities) along the belt 12, the rate of change of the thickness of the belt 12 and the rate of change of the particular singularities. For this, the system 10 comprises data capturing means 14 for capturing data representative of parts of the conveyor system 28 such as the belt 12 and at least one pulley 16. The data capturing means 14 may comprise one or more digital cameras 18; in particular arrangements, the data capturing means 14 may also include lasers 26 for emitting laser beams (see figure 10) providing supplemental information to complement the information provided by the camera(s) 18, with the intention of improving the accuracy of the output of the process for determining particular characteristics of the belt 12.

[0081] Further, the system 10 also comprises processing means 22 operatively connected to the data capturing means 14 for processing the captured data to determine, for example, the status of the belt 12 and providing 3D models of the belt 12 permitting, for example, generation of a pictorial representation of the belt 12 as shown in figures 8a and 8b.

[0082] The processing means 22 are adapted to communicate with communication and computing devices 24a to 24c to permit, for example, operators in charge of the conveyor system 28 to view the data captured by the data capturing means 14 (the images) and the information generated by the processing means 22. The operators of the conveyor system 28 may also assist in processing the images, as well as in further processing of the data resulting from the processing of the images by the processing means 22.

[0083] The processing means 22 comprises computing means for running software for processing the images taken by the data capturing means 14 as will be described at a later stage. The software, for example, includes particular algorithms known in the art used for (1) processing images (such as, for example, algorithms used for identifying pixels representative of particular features in images) and (2) processing data, such as data-fusion techniques.

[0084] The computing means is coupled to a communication device configured to communicate via a communication network, either wired or wireless, for example via the internet. The communication device may be used to communicate, for example, with one or more of the communication and computing devices 24a to 24c and the storage means 25.

[0085] Further, database farms may be incorporated in the processing means. The database farms store information required for processing the images taken by the data capturing means 14 using the software running in the computing means of the processing means. The processing means 22 may also store all of the information generated during capturing and processing of the images.

[0086] The communication and computing device 24a corresponds to the control room of the conveyor system 28 and thus is connected to the conveyor system 28 as shown in figure 1. This permits operators at the control room to control the conveyor system 28.

[0087] The communication and computing device 24a is also connected to the processing means 22 as shown in figure 1. This permits provision of the data processed by the processing means 22 to the communication and computing device 24a to allow viewing of, for example, the images taken by the camera 18, permitting the operators to assist in the data processing process; as will be described at a later stage, the operators may identify the edges 38 and 50 of the belt 12 and pulley 16 during the process of determining the profile of the upper surface of the belt 12. Also, the operators may view pictorial representations of the belt 12 in real time as the belt 12 is being inspected. This permits the operators in real time to (1) identify the particular singularities of the belt 12 and (2) take action such as selecting the particular area of the pictorial representation where the particular singularities of the belt 12 are located, with the intention of storing it in storage means for further analysis of the particular singularities.

[0088] The communication and computing device 24a comprises server means containing database farms for storing the data that is being processed by the processing means 22 as well as human-machine interfaces (HMI) to permit interaction with either the conveyor system or the processing means. For example, the HMI may comprise means for selecting particular regions of the images taken by the data capturing means 14 as well as for manipulating the images and the pictorial representations of the belt 12 resulting from the data processing conducted in the processing means 22.

[0089] Further, the system 10 is adapted to permit other communication and computing devices 24b and 24c to connect to the conveyor system and/or to the processing means 22. In one particular arrangement, one of the communication and computing devices may comprise a mobile phone 24c permitting connection with the processing means 22 to retrieve information from the inspection process of the belt 12. This permits an operator located proximal to the conveyor system 28 to view the results of the inspection process of the belt 12.

[0090] Furthermore, the system 10 comprises storage means such as a circular buffer 25 that permits storage of the data generated by the processing means 22 as well as data generated by the data capturing means 14. Any of the communication and computing devices 24 may access the circular buffer 25 for retrieval of the information stored therein.
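As an illustration of the kind of storage means described, the minimal Python sketch below uses a fixed-capacity double-ended queue as a circular buffer that retains the most recent inspection records, discarding the oldest once full. The record layout and the capacity are assumptions made for the example only.

```python
from collections import deque

class CircularBuffer:
    """Fixed-capacity store that discards the oldest entry once full."""

    def __init__(self, capacity=10000):
        self._entries = deque(maxlen=capacity)

    def push(self, record):
        """Store one inspection record, e.g. a dict with a timestamp,
        a belt position and the extracted thickness profile."""
        self._entries.append(record)

    def latest(self, n=1):
        """Return the n most recent records for retrieval by a
        communication and computing device."""
        return list(self._entries)[-n:]

# Example usage with a hypothetical record layout.
buffer = CircularBuffer(capacity=5000)
buffer.push({"time": 12.3, "position_m": 41.7, "profile_mm": [18.2, 18.1, 17.9]})
print(buffer.latest())
```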

[0091] Referring now to figure 2, figure 2 shows an end section of a conveyor system 28 including data capturing means 14 with one camera 18. In alternative arrangements there may be more than one camera 18; also, one or more laser emitters 26 may be incorporated into the data capturing means 14 as shown in figure 10.

[0092] As shown in figure 2, the camera 18 is arranged overhead of an end section 30 of the conveyor system 28, in particular, at one of the locations where the belt 12 changes direction by moving around the pulley 16 to return to the other end section (not shown) that is located spaced apart from the end section 30 shown in figure 2.

[0093] In the particular arrangement shown in figure 2, the camera 18 is located at an angle with respect to the vertical. In this manner, the angle of view of the camera 18 covers a section extending from a particular location 32 of the belt 12 adjacent to another particular location 34 at the forefront of the curved end (referred to as the edge 38 of the belt 12) of the end section 30 of the conveyor system 28. In particular, in this arrangement, the camera 18 may take a top view of the edge 38 of the belt 12 shown in figure 2. As will be described with reference to the method of processing the images taken by the camera 18, the image of the top view of the edge 38 of the belt 12 permits determination of the particular profile of the edge 38.

[0094] Further, this particular arrangement permits the camera 18 to capture an image of a particular section 48 comprising an upper section 36 of the belt 12 and the section of the belt 12 abutting the curved surface of the pulley located at the forefront of the end section 30 of the conveyor system 28. The image taken by camera 18 includes the edge 38 of the belt 12 defining the outermost location of the conveyor system 28 and the edges 40a and 40b (of the pulley 16) that extend beyond the sides of the belt 12 and are best shown in figure 3. The edge 38 of the belt 12 abuts the edge (of pulley 16) that is represented by the dotted line 50 and that extends between the outer edges 40a and 40b of the pulley 16.

[0095] Figure 3 shows a top view of the end section 30 of the conveyor system 28. As shown in figure 3, the edges 40a and 40b of the pulley 16 reach from under the lower surface 42 (see figure 2) of the belt 12 extending beyond the belt 12. This is due to the fact that the belt 12 has a width that is smaller than the width of the pulley 16.

[0096] As mentioned earlier when referring to figure 2, the belt 12 surrounds the pulley 16 as the belt 12 changes direction returning to the other end of the conveyor system 28 not shown in the figures. Thus, the upper surface 44 of the belt 12 is spaced apart a particular distance 46 from the outer surface of the pulley 16. The distance 46 may be seen best in figure 2.

[0097] In accordance with the present embodiment of the invention, the distance 46 may be determined by analysing the images taken by the camera 18. As will be described below, determination of the distance 46 for each point of the belt 12 permits generating a pictorial representation of the belt 12 showing the upper surface of the belt 12 incorporating any singularities of the belt 12 - see figures 8a and 8b. The pictorial representation (or data associated with the pictorial representation) may also provide the thickness of the belt 12 at each point of the belt 12.

[0098] In a particular arrangement, the system 10 at a first stage determines the distance 46 of the belt 12 at the sides of the belt 12 abutting the outer edges 40a and 40b of the pulley 16. This is done by relating the edges 40 of the pulley 16 with the upper surfaces at the sides of the belt 12 to determine the distance 46 as shown in figure 3.

[0099] As is expected, the sides being subjected to less wear than the centre section of the belt 12, the distance 46 as calculated at the edges 40a and 40b only provides information on the thickness of the belt 12 at the sides of the belt 12 and not on the thickness of the edge 38 of the belt 12 extending between the sides of the belt 12 (although, if a new belt is inspected using the present system 10, it is possible to assume that the thickness of the belt at its sides is substantially the same as at each point of the belt 12, except perhaps at locations where the splices of the belt 12 are located).

[00100] In accordance with the present embodiment of the invention, the distance 46 for each point of the edge 38 extending between the sides of the belt 12 is determined by calculating the spacing between each point of the edge 38 and its corresponding counterpart point on the edge 50 of the pulley 16. In accordance with the present embodiment of the invention, the processing means 22 extracts the crest profile of the edge 38 of the belt 12 as well as the crest profile of the edge 50 of the pulley 16 on which the edge 38 was resting when the image was taken by the camera 18. For this, the image taken by the camera is analysed using the processing means 22 to determine the crest profile of the edge 38 and edge 50.

[00101] Further, as shown in figures 3 to 5, the edge 38 of the belt 12 is spaced apart from the edge 50 of the pulley 16. The dotted lines represent the edge 50 of the pulley 16 on which the edge 38 of the belt 12 was resting when the image was taken. By measuring the distance at a multitude of locations 52 that extend from one side of the belt 12 to the other side of the belt 12, it is possible to obtain the thickness of the belt 12 along the edge 38. This is illustrated in figure 5.

[00102] As shown in figure 5, the edge 38 is spaced apart from the edge 50 by a distance X that may vary for each point along the edge 38 of the belt 12 due to the fact that the profile of the edge 38 is irregular as shown in figure 5. Measuring the distance X for each point of the edge 38 of the belt 12 permits generating the crest profile of the edge 38.

[00103] Figure 5 shows three locations 52a to 52c as discrete measurements for clarity, but during processing of the image to generate the crest profile the measurement of the distance X is conducted for each point along the width of the edge 38.
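The following sketch illustrates, in simplified form, measuring the distance X for every point across the width of the edge 38: given, for each image column, the pixel row of the crest profile of the belt edge and the corresponding row of the inferred pulley edge, the per-column spacing is converted to millimetres with an assumed pixel-to-millimetre scale. The scale factor, the array layout and the example values are illustrative assumptions only.

```python
import numpy as np

def crest_distances(belt_rows, pulley_rows, mm_per_pixel):
    """Distance X between the belt edge crest and the pulley edge, per column.

    belt_rows    : 1D array, pixel row of the belt edge 38 crest in each image column
    pulley_rows  : 1D array, pixel row of the inferred pulley edge 50 in each column
    mm_per_pixel : assumed scale relating image pixels to real-world millimetres
    """
    # The belt edge lies above the pulley edge in the image, so the row
    # difference at each column is proportional to the spacing at that column.
    return (pulley_rows - belt_rows) * mm_per_pixel

# Example with three columns, mirroring the discrete locations 52a to 52c.
belt = np.array([120.0, 122.5, 121.0])
pulley = np.array([180.0, 180.0, 180.0])
print(crest_distances(belt, pulley, mm_per_pixel=0.35))
```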

[00104] Further, in the above described method, it is assumed that the edge 50 of the pulley 16 on which the edge 38 rests defines a straight edge. In alternative arrangements, the edge 50 of the pulley 16 may not follow a straight line, but may be configured as a convex, concave or even irregular line, given that the pulley 16 comprises a concave, convex or irregular outer surface.

[00105] To account for the pulley 16 comprising a concave, convex or irregular outer surface, the processing means 22 extracts the crest profile of the edge 50 of the pulley 16 based on the particular configuration of the outer surface of the pulley 16. The crest profile of the edge 50 is then related to the crest profile of the edge 38 of the belt 12.

[00106] For example, the particular configuration (for example convex, concave or irregular) of the outer surface of the pulley 16 may be known by the operators of the conveyor system 28 and provided to the processing means 22 in order to, instead of using the crest profile of a straight line, use the crest profile of the edge 50 defined by, for example, the convex, concave or irregular surface of the outer surface of the pulley 16 on which the belt 12 rests.

[00107] The above mentioned method determines the thickness of a particular edge 38 of the belt 12 as is shown in figures 3 to 5. By using this method, the processing means 22 provide a two-dimensional model (2D model) of the edge 38 of the belt 12. This 2D model provides the thickness of the belt 12 at each point of the belt 12 extending from one side of the belt 12 to the other side of the belt 12 at the particular location along the belt 12 where the edge 38 is located. Thus, the 2D model provides the profile of the upper surface of a two-dimensional slice of belt 12 representing the edge 38.

[00108] The processing means 22 processes the images to generate the 2D crest profile of the edge 38 of the belt 12. The crest profile refers to the set of pixels identified as representing the edge 38 (also referred to as the tangential portion of the belt 12 as seen by camera 18). The crest profile of the edge 38 permits measuring the thickness of the belt 12 at the edge 38 from one side of the belt 12 to the other side of the belt 12 by comparing it with the crest profile of the edge 50 of the pulley 16.

[00109] The processing means 22 uses object recognition techniques known in the art of image processing to analyse the image taken by the camera 18 to identify the edges 38 and 50. In an arrangement, the processing means 22 identify the regions of interest of the image, such as the most prominent edges, which are for example the edge 38 of the belt 12 and the outer edges 40a and 40b of the pulley 16.

[00110] In alternative arrangements, an operator may visually identify the regions of interest for identifying the edge 38 of the belt 12 and the edges 40a and 40b of the pulley 16 by viewing the image and (1) identifying the most prominent edges (which are for example the edge 38 of the belt 12 and the outer edges 40a and 40b of the pulley 16), and (2) selecting the edges via known methods using user interface devices such as a mouse or touch screen to highlight the areas of interest to be used by the processing means 22 to determine the distance 46 between the edge 38 of the belt 12 and the edge 50 of the pulley 16.

[00111] In one arrangement, the method for calculating the distance 46 is described in figure 6. As shown in figure 6, the digital processing means preferably applies seven stages of processing to the image taken by the camera 18 (a simplified illustrative sketch of these stages is given after the list), being:

1. detecting the edges to produce an edge image: this is the process of applying an appropriate known edge detection algorithm to the image taken by camera 18.

2. filtering the edge image: this is the process of applying an appropriate known algorithm to remove non-edge pixels from the edge image produced in step 1 above.

3. identifying all of the pixels that represent the crest profile of the belt;

4. identifying all of the pixels that represent the outer edges 40a and 40b of the pulley 16;

5. converting the pixel coordinates to real world coordinates;

6. inferring the line representing the edge 50 between the two pulley edges 40a and 40b: this process includes obtaining information about the configuration of the outer surface of the pulley 16; for example, the outer surface of the pulley may be configured as a convex, concave, irregular or a straight surface; and

7. calculating the thickness along the width of the edge 38 by comparing the edge 38 of the belt 12 to the pulley surface determined in step 6 above.
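The sketch below is a simplified, illustrative implementation of the seven stages listed above using commonly available image-processing routines (OpenCV and NumPy). The choice of the Canny detector, the connected-component filtering rule, the way crest pixels are selected and the pixel-to-millimetre scale are all assumptions made for this example; they are not prescribed by the specification.

```python
import cv2
import numpy as np

def belt_thickness_profile(image_gray, pulley_edge_rows, mm_per_pixel):
    """Estimate belt thickness across the width of the edge 38 from one image.

    image_gray       : greyscale image of the region taken by camera 18
    pulley_edge_rows : per-column rows of the inferred pulley edge 50 (stage 6),
                       e.g. a straight or convex line fitted between the outer
                       edges 40a and 40b found in stage 4
    mm_per_pixel     : assumed scale from pixels to real-world millimetres,
                       a simple stand-in for stage 5
    """
    # Stage 1: detect prominent edges to produce an edge image.
    edges = cv2.Canny(image_gray, 50, 150)

    # Stage 2: filter the edge image by discarding very small connected
    # components, assumed here to be noise rather than the edges of interest.
    num, labels, stats, _ = cv2.connectedComponentsWithStats(edges, connectivity=8)
    filtered = np.zeros_like(edges)
    for label in range(1, num):
        if stats[label, cv2.CC_STAT_AREA] >= 30:
            filtered[labels == label] = 255
    edges = filtered

    # Stage 3: identify, per column, the pixel taken to represent the crest
    # profile of the belt edge 38 (here assumed to be the uppermost edge pixel).
    w = edges.shape[1]
    belt_rows = np.full(w, np.nan)
    for col in range(w):
        rows = np.flatnonzero(edges[:, col])
        if rows.size:
            belt_rows[col] = rows[0]

    # Stages 5 to 7: convert the per-column spacing between the belt edge and
    # the inferred pulley edge to millimetres, giving the thickness profile
    # (columns with no detected crest pixel remain NaN).
    return (np.asarray(pulley_edge_rows, dtype=float) - belt_rows) * mm_per_pixel
```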

[00112] On occasions, the particular section of the belt 12 and the pulley 16 captured by the camera 18 may not be exactly orthogonal in the image. In this case the belt 12 and the pulley 16 appear at an angle. Processing of this type of image requires the processing means 22 to apply particular image processing algorithms known in the art prior to conducting step 1 above, related to detecting the edges to produce an edge image.
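One way such a correction could be carried out, sketched below for illustration only, is to rotate the image so that the imaged pulley and belt edges lie roughly horizontal before step 1 is applied; the rotation-based approach and its parameters are assumptions, and a full perspective correction could equally be used.

```python
import cv2

def deskew(image, pulley_edge_angle_deg):
    """Rotate the image so that the pulley and belt edges are roughly
    horizontal (orthogonal to the image frame) before edge detection.

    pulley_edge_angle_deg : angle of the imaged pulley edge with respect to the
                            horizontal axis of the image frame, e.g. estimated
                            beforehand with a Hough line transform.
    """
    h, w = image.shape[:2]
    centre = (w / 2.0, h / 2.0)
    rotation = cv2.getRotationMatrix2D(centre, pulley_edge_angle_deg, 1.0)
    return cv2.warpAffine(image, rotation, (w, h), flags=cv2.INTER_LINEAR)
```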

[00113] Further, in accordance with a particular arrangement of the invention, the system 10 permits determination of the thickness of the belt 12 by taking images while the belt 12 is moving; with this arrangement it is possible to obtain the profile of the upper surface of the belt 12 along the length of the belt 12.

[00114] In this particular arrangement, the camera 18 takes images of the belt 12 at particular times (Ti) as the belt 12 passes over the pulley 16. In this manner, a plurality of images of particular regions 48 of the belt 12 are captured to determine the profile of each edge 38 located at the particular region 48. As will be described at a later stage, determining the profile of each edge 38 along the entire length of the belt 12 makes it possible to create a 3D model that will allow generation of, for example, a pictorial representation of the profile of the belt 12 as shown in figures 8a and 8b.

[00115] For illustration purposes, figure 3 shows three regions 48a to 48c. For each of these regions 48 the camera 18 takes one image of the belt 12 moving around the pulley 16 for determining the crest profile of each edge 38 located at the regions 48. An indication of the profile of the upper surface of the belt 12 is obtained by determining, for a multitude of regions 48 that extend along the entire length of the belt 12, the crest profile of each of the edges 38 located in each region 48. This permits generation of a 3D model of the belt 12 as will be described below.

[00116] In further arrangements of the present invention, there is provided a method for generating a three-dimensional model (3D model) of particular sections of the belt 12 or of the entire belt 12. A 3D model provides the thickness of the belt 12 along the entire length and width of the belt 12. The 3D model can also be used to provide a pictorial representation of the belt 12 showing the profile of the upper surface of the belt 12 - see figures 8a and 8b. Providing a representation of the upper surface of the belt 12 permits detection of the locations of particular singularities (such as damage or imperfections, among other features) along the length of the belt 12.

[00117] The 3D model is generated by measuring the thickness of each of the two-dimensional slices of the belt 12 mentioned above with reference to determining the thickness of the belt at the edge 38. This is done by taking successive images of the belt 12 as the belt 12 passes around the pulley 16. Figures 3 and 4 illustrate this process for a particular section 54 comprising the three regions 48a to 48c mentioned earlier.

[00118] As shown in figure 3, successive images of the regions 48 are captured as the particular section 54 of the belt 12 passes around the pulley 16. The images of each region 48 are shown in figure 4.

[00119] Further, the processing means 22 processes the images in the order that the images were taken to determine the thickness of each region 48. The array of numerical values representing the thickness of each region 48 is stored, in the order that the images are processed, in a register as shown in figure 7. Each region is identified with the particular moment that the image was taken. For example, the image of region 48a was taken at time T1; the image of region 48b was taken at time T2; and the image of region 48c was taken at time T3.
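
One possible in-memory representation of the register of figure 7 is sketched below; the field names, the profile width of 500 samples and the example values are assumptions for illustration only.

    # Possible representation of the register shown in figure 7.
    from dataclasses import dataclass
    import numpy as np

    @dataclass
    class EdgeRecord:
        time_s: float             # time Ti at which the image of region 48 was taken
        position_m: float         # position of edge 38 relative to a reference point
        thickness_mm: np.ndarray  # thickness at each point across the belt width

    register = [
        EdgeRecord(time_s=0.00, position_m=0.00, thickness_mm=np.full(500, 25.0)),
        EdgeRecord(time_s=0.04, position_m=0.12, thickness_mm=np.full(500, 24.8)),
        EdgeRecord(time_s=0.08, position_m=0.24, thickness_mm=np.full(500, 24.9)),
    ]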

[00120] The 3D model of the particular section 54 of the belt 12 is generated by concatenating the 2D models of each region 48 together as shown in figure 3.

[00121] The above description was limited to three discrete regions 48a to 48c to define the 3D model of a particular section 54 of the belt 12.

[00122] However, in order to obtain a 3D model of the entire belt 12 that will permit identifying the particular location of singularities along the length of the belt 12, a multitude of 2D models of individual slices of the belt 12 need to be generated and concatenated. In this case each region 48 corresponds to one of the individual slices (representing an edge 38 of the belt 12) described earlier with reference to the generation of the 2D model. By concatenating the individual slices it is possible to generate the 3D model. The data representing the 3D model is stored in storage means as shown in figure 7. Figure 7 shows a table relating each image that has been taken to (1) the time Ti the particular image was taken, (2) the position of the edge 38 with respect to a particular reference point and (3) the particular profile of the edge 38.
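
A minimal sketch of the concatenation step, under the same assumed EdgeRecord structure as above, is to order the stored profiles by position along the belt and stack them into a two-dimensional thickness map (length by width):

    # Sketch of concatenating the stored 2D slices into a 3D model.
    import numpy as np

    def build_3d_model(register):
        ordered = sorted(register, key=lambda r: r.position_m)
        thickness_map = np.vstack([r.thickness_mm for r in ordered])  # length x width
        positions = np.array([r.position_m for r in ordered])
        return positions, thickness_map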

[00123] The profile of the edge 38 stored in the register shown in figure 7 indicates the thickness of each point along the width of the particular edge 38; it also indicates the position of each point along the width of the particular edge 38 having a particular thickness.

[00124] With this information it is possible to generate a 3D model of the belt 12 showing the thickness of the belt 12 at each point of the belt 12. This, together with the particular location of each point along the belt 12, permits mapping the data forming the 3D model to generate a pictorial representation of the belt 12 (1) indicating the belt thickness at each point of the belt as well as (2) showing each of the singularities of the belt 12 together with the location and particular configuration of each singularity.

[00125] Several options for determining the location of each edge 38 of the belt 12 are described below.

[00126] Further, the camera 18 is adapted to capture images at particular periods of time in order to generate the 3D model.

[00127] In accordance with a particular arrangement, the camera 18 is adapted to capture images based on a command generated by the processing means 22. Because the processing means 22 generates the command that triggers capture of the image, it is able to relate each image taken by the camera 18 with a particular moment in time. In this manner, the processing means 22 can relate any image of a particular edge 38 of the belt 12 with the particular time that the image of the particular edge 38 was captured. The crest profile of the edge 38 of the belt 12 generated by the processing means 22 may also be related to the particular time that the image of the particular region 48 was taken.

[00128] Further, in a particular arrangement, the 3D model can provide information about the location along the belt of particular singularities; this is particularly advantageous because, as will be described at a later stage, the data (of the 3D model) related to a particular singularity permits further analysis of the singularity to, for example, determine whether the singularity represents potential damage to the belt 12 that may require immediate attention. In order to further analyse a particular singularity, its location along the belt 12 needs to be known so that the singularity may be found, surveyed and, if applicable, repaired.

[00129] In an arrangement, the position of the region 48 captured in a particular image can be calculated by knowing the speed at which the belt 12 is travelling. In an arrangement, the speed of the belt 12 may be provided by an external source such as, for example, a control room controlling the conveyor system 28. For this, the processing means 22 and the control room 24a are adapted to communicate with each other.

[00130] Another option is to use signals from sensors elsewhere on the plant, which can be used to deduce the speed of the belt 12.

[00131] In another arrangement, the processing means 22 may apply optical flow techniques over successive frames to determine the speed of the belt 12.
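
A minimal sketch of such an optical flow estimate, assuming OpenCV's dense Farneback algorithm and a known metres-per-pixel calibration and frame interval (both assumptions, not stated in the specification), is shown below.

    # Sketch of estimating belt speed from two successive frames with dense
    # optical flow; metres_per_pixel and dt_s are assumed calibration inputs.
    import cv2
    import numpy as np

    def belt_speed(prev_frame, next_frame, metres_per_pixel, dt_s):
        prev_gray = cv2.cvtColor(prev_frame, cv2.COLOR_BGR2GRAY)
        next_gray = cv2.cvtColor(next_frame, cv2.COLOR_BGR2GRAY)
        flow = cv2.calcOpticalFlowFarneback(prev_gray, next_gray, None,
                                            0.5, 3, 15, 3, 5, 1.2, 0)
        # Median displacement along the direction of belt travel (here the x axis).
        pixels_moved = np.median(flow[..., 0])
        return pixels_moved * metres_per_pixel / dt_s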

[00132] Alternatively, the position of the belt in each frame can be determined by integrating the speed of the belt 12. Given that integrating a speed signal to calculate position can lead to increasing error in the calculated position over time, the processing means 22 are adapted to determine the precise location of the particular singularity along the belt at least once per cycle of the belt 12.
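
The following sketch illustrates this idea under assumed names: belt position is accumulated by integrating speed between frames, wrapped at the belt length, and reset against a known reference (such as a splice) at least once per cycle to cancel the accumulated drift.

    # Sketch of tracking belt position by integrating speed, with the drift
    # reset whenever a known reference point (e.g. a splice) is observed.
    class BeltPositionTracker:
        def __init__(self, belt_length_m):
            self.belt_length_m = belt_length_m
            self.position_m = 0.0

        def update(self, speed_m_s, dt_s):
            self.position_m = (self.position_m + speed_m_s * dt_s) % self.belt_length_m
            return self.position_m

        def reset_to_reference(self, reference_position_m):
            # Called at least once per belt cycle to cancel integration error.
            self.position_m = reference_position_m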

[00133] In a further arrangement, the splices that join the belt 12 together may be used as reference points for determining the position of any point of the belt 12. The processing means 22 may apply techniques of object recognition to identify one or more splices as they are captured in images. As the relative position of each splice remains constant along the belt length, the location of the belt in any particular image within which a belt splice is identified will determine the location of that particular image along the belt.

[00134] In accordance with one embodiment of the invention, the position of the section of the belt 12 captured in an image may be provided via RFID chips embedded in the belt 12 and a plurality of sensors to detect the RFID chips as they pass close to the sensors. An external device could use the timestamps of the RFID chip measurements and the position of the RFID sensors relative to the pulley 16 to determine the position of the belt in real time, which can then be used to provide the position of the captured images.
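
One simple way splice recognition could be sketched, assuming OpenCV template matching (the specification only refers generally to object recognition techniques), is shown below; the template image and the match threshold are assumptions.

    # Sketch of locating a splice in a frame with template matching so that the
    # frame's position along the belt can be tied to the splice's known position.
    import cv2

    def find_splice(frame_gray, splice_template_gray, threshold=0.8):
        result = cv2.matchTemplate(frame_gray, splice_template_gray, cv2.TM_CCOEFF_NORMED)
        _, max_val, _, max_loc = cv2.minMaxLoc(result)
        return max_loc if max_val >= threshold else None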

[00135] As mentioned before, the system 10 provides a 3D model of the entire belt 12. For this, the belt 12 is moved one entire cycle around the pulleys 16. During this one cycle, the 3D model is generated, with its data being stored in storage means as mentioned before. This data is useful for further analysis to determine the status of the belt 12 at the time the data was collected by, for example, analysing particular singularities detected either by the processing means 22 or by an operator viewing the pictorial representation of the belt 12 on a display. These particular singularities may be stored in other storage means such as a circular buffer for further review at a later stage.

[00136] Further, the belt 12 may be periodically inspected using the system 10. This is done to ensure, for example, that no new damage has appeared in the belt 12 since completion of the earlier analysis and to detect whether any singularity has deteriorated, becoming potential damage to the belt 12. The data collected and generated by the processing means 22 during a later analysis may be stored in the same storage means used for storing the data of the earlier analysis.

[00137] In an arrangement, the processing means will update the 3D model of the belt 12 stored in the storage means in real time with each thickness profile extracted, in conjunction with the position of each thickness profile extracted. In one arrangement, the process of updating the 3D model with a new profile and the position of that new profile involves overwriting the data in the 3D model that represents the profile nearest, along the length of the belt, to the position of the new profile.
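
A minimal sketch of this nearest-profile overwrite, reusing the assumed positions/thickness_map arrays introduced earlier, could look as follows.

    # Sketch of the real-time update: the new profile overwrites the stored
    # profile whose position along the belt is nearest to the new position.
    import numpy as np

    def update_3d_model(positions, thickness_map, new_position_m, new_profile_mm):
        idx = int(np.argmin(np.abs(positions - new_position_m)))
        thickness_map[idx, :] = new_profile_mm
        positions[idx] = new_position_m
        return positions, thickness_map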

[00138] In another arrangement, the process of updating the 3D model with a new profile and the position of that new profile involves the application of an appropriate data-fusion algorithm, which may preferably include the incorporation of any other relevant information available to the system.

[00139] Moreover, the system 10 is able to retain data of the 3D model representing the state of the belt 12 taken over a particular period of time; in this way, the system 10 provides a historical record of the status of the belt 12, for example, starting when the belt 12 was first installed and ending when the belt needed to be replaced. This historical record of the 3D model permits determining the rate of change of belt thickness over particular periods of time. It also permits determining the rate of change of a particular singularity that has been detected and is being monitored due to being a potential hazard for the operation of the belt 12.

[00140] Furthermore, it is expected that the historical data of the 3D model will occupy relatively large areas of the storage means. For this, in an arrangement, the historical data of the 3D models is digitally stored by the system at different resolutions through time, such that the data of a recently generated 3D model will be recorded at a higher resolution than 3D models generated during earlier analyses. Also, the historical records of the 3D models may be digitally compressed via an appropriate compression algorithm in order to reduce the data storage required by the system 10.
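
The multi-resolution archiving could be sketched as below, under the assumption that older models are progressively down-sampled along the belt length before being archived; the decimation factor and the decision to decimate rather than average are assumptions.

    # Sketch of storing historical 3D models at decreasing resolution with age.
    def downsample_model(thickness_map, factor):
        # Keep every 'factor'-th profile; a real system might average instead.
        return thickness_map[::factor, :]

    def archive(history, new_model, old_resolution_factor=4):
        # Reduce the resolution of previously archived models, then append the
        # newest model at full resolution.
        history = [downsample_model(m, old_resolution_factor) for m in history]
        history.append(new_model)
        return history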

[00141] Moreover, as mentioned earlier, the system 10 is adapted to store in storage means any data related to singularities that are present in the belt 12. For example, an operator may be watching a pictorial representation of the 3D model to identify singularities present in the belt 12. Once a singularity that may be relevant to the operator is identified, it may be selected and stored in storage means such as a circular buffer for further analysis.

[00142] Alternatively, the process of recognising and selecting singularities in the belt may be conducted by the processing means 22. For this the processing means 22 may apply techniques of object recognition to identify any relevant singularity. Also, the processing means 22 may apply machine-learning techniques in order to determine whether the identified features represent damage, and if so, the type of damage.

[00143] In one arrangement, in the event that the digital processing means identifies a feature of the belt from the 3D model and categorises the feature as damage, the footage of the section of the damaged belt is stored in, for example, a circular buffer 25 for future retrieval such that it can be presented to operators via a human machine interface (HMI).
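
The circular buffer 25 could be sketched as a fixed-length store in which the oldest footage is discarded when new footage arrives; the capacity and the record fields below are assumptions for illustration only.

    # Sketch of the circular buffer 25 for damage footage.
    from collections import deque

    circular_buffer = deque(maxlen=100)  # capacity is an assumed parameter

    def store_damage_footage(footage_frames, position_m):
        # Appending beyond maxlen silently drops the oldest entry.
        circular_buffer.append({"position_m": position_m, "frames": footage_frames})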

[00144] In accordance with a particular arrangement, belt thickness measurements that are provided via external sources or determined via techniques other than using the camera 18 may also be incorporated in the 3D belt thickness model.

[00145] For example, referring to Figure 5, the system further comprises a laser assembly 26 arranged to irradiate laser light onto the belt 12 at a location adjacent to where the camera 18 views the belt 12 as it passes around the pulley 16. In this manner, a laser line 56 produced by the reflected laser light is generated extending along the width of the belt 12. As the belt 12 moves around the pulley 16, particular relevant singularities may deform the line 56 produced by the laser. The position and distortion of the laser line 56 in the frame provides information on the thickness of the belt 12. The laser line(s) 56 captured in the frame are analysed by the processing means 22 to infer a belt surface geometry, which is then incorporated into the 3D model of the belt using known techniques of data fusion. The distortion of the line is depicted in Figures 10 and 11.
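
A simple way of reading the line distortion could be sketched as follows, assuming the laser appears as the brightest red feature in each column and that the millimetres-per-pixel scale and the undeformed baseline row come from an assumed calibration; none of these details are specified in the patent.

    # Sketch of extracting the laser line 56 and converting its vertical
    # displacement into a height estimate across the belt width.
    import numpy as np

    def laser_line_heights(frame, mm_per_pixel, baseline_row):
        # Isolate the (assumed red) laser line: brightest red row in each column.
        red = frame[:, :, 2].astype(np.float32)
        line_rows = np.argmax(red, axis=0)
        # Displacement of the line from its undeformed position maps to belt height.
        return (baseline_row - line_rows) * mm_per_pixel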

[00146] In an arrangement, a plurality of laser lines 56 may be emitted onto the belt 12, spaced apart with respect to each other and to the digital imaging camera. The laser lines 56 are generated by a plurality of laser emitters oriented at different angles relative to each other. The different relative angles between the laser(s) and the digital imaging camera will provide varying degrees of line distortion, thereby providing the digital processing means 22 with more information that permits inferring belt surface geometries of greater accuracy when compared to using only the images taken by the camera 18.

[00147] In the above-mentioned arrangement including the laser assembly, as shown in figures 10 and 11, the belt thickness measurements obtained via the laser lines will be used by the processing means 22 during the process of generating the 2D model of each region 48 of the belt 12. Alternatively, with particular surface profile identification systems using lasers and optical cameras, the information collected by these systems may be used during the 3D model generation process.

[00148] In a particular arrangement, the system further includes a human machine interface (HMI). The HMI is arranged to display information that is generated by the system of the present invention. The HMI includes a plurality of displays arranged to allow an operator of the conveyor system 28 to see visualisations depicting the measured state of the conveyor belt. The HMI may display images taken from the digital imaging device.

[00149] Further, in accordance with a particular arrangement, a subset of the data recorded or generated by the system may be made available to other systems, including but not limited to the plant control system(s) or a cloud data storage system.

[00150] In an arrangement, the system 10 further comprises a means of controlling the speed or throughput of the conveyor system 28 in response to a deterioration of the belt 12. Adjusting the overall speed of the conveyor system 28 reduces the load being placed on the belts in the conveyor and thus decreases the rate of deterioration of the belt 12.

[00151] Further, if deterioration of the belt 12 is such that maintenance or repair of the belt 12 is essential, the system of the present invention is able to control other associated systems which feed the affected conveyor system 28 to stop the through flow of materials. In this manner, the damaged belt can be cleared of materials prior to being shut down for repairs.

[00152] Modifications and variations as would be apparent to a skilled addressee are deemed to be within the scope of the present invention. For example, in a particular arrangement, the data capturing means 14 may comprise a plurality of cameras 18 arranged side by side overhead of the end section 30e of the conveyor system 28. In this arrangement, each of the cameras 18 will take an image of a particular section of the belt 12 as the belt 12 passes around the pulley 16. The processing means 22 are adapted to use the plurality of images of the particular section of the belt 12 for generating the crest profile of the edge 38 of the belt 12 and the crest profile of the edge 50 of the pulley 16 to determine the profile of the edge 38.

[00153] Further, in accordance with an alternative arrangement of the invention, the data capturing means 14 need not necessarily be located at a pulley at the end of the conveyor system; instead, the data capturing means 14 may be located adjacent to any pulley that provides at least a small deflection to the belt 12 such that the data capturing means 14 may take a tangential image of the edge 38 (formed by the at least small deflection) of the belt 12.

[00154] Throughout this specification, unless the context requires otherwise, the word "comprise" or variations such as "comprises" or "comprising", will be understood to imply the inclusion of a stated integer or group of integers but not the exclusion of any other integer or group of integers.




 