
Title:
SMART CAMERA SYSTEM FOR MONITORING REMOTE ASSETS
Document Type and Number:
WIPO Patent Application WO/2023/205837
Kind Code:
A1
Abstract:
A camera system for monitoring remote assets comprises: a sealed housing including an electrical housing, a bottom part, and a top part, the top part and bottom part located at opposite ends of the electrical housing; one or more cameras positioned within the top part to have a field of view external of the top part; one or more processors within the sealed housing; a communications unit within the sealed housing for enabling wireless data communication with an external gateway device; a communications antenna; and a first power source receiving chamber. The one or more cameras are configured to capture one or more images. The one or more processors may include an artificial intelligence (AI) processor configured to apply an AI model to determine a likelihood of a hazard to be present in the one or more images, and to generate a signal indicating detection of the hazard.

Inventors:
TATA NARDINI FLAVIA (AU)
PEARSON MATTHEW JAMES (AU)
PEREIRA NICOLLAS ALEXANDRE VIEIRA DE FREITAS (AU)
ESMATI ZAHRA (AU)
RALTCHEVA MAGDALENA STANKOVA (AU)
LIAO CHUN-YI (AU)
Application Number:
PCT/AU2022/050469
Publication Date:
November 02, 2023
Filing Date:
May 16, 2022
Assignee:
FLEET SPACE TECH PTY LTD (AU)
International Classes:
G08B13/196; G06V10/70; G06V20/40; G08B21/18; G08B23/00; H01Q7/00
Foreign References:
US20200088341A12020-03-19
US20190313020A12019-10-10
US20190113826A12019-04-18
US20050207487A12005-09-22
Attorney, Agent or Firm:
FB RICE (AU)
Claims:
CLAIMS:

1. A camera system for monitoring remote assets, comprising: a sealed housing including an electrical housing, the electrical housing being substantially cylindrically shaped, the sealed housing also including a bottom part, and a top part, the top part having a diameter larger than the electrical housing, the top part and bottom part located at opposite ends of the electrical housing; one or more cameras positioned within the top part to have a field of view external of the top part; one or more processors within the sealed housing; a communications unit within the sealed housing for enabling wireless data communication with an external gateway device, the communications unit communicatively coupled to the processor; a communications antenna communicatively coupled to the communications unit, the communications antenna coupled to the sealed housing; and a first power source receiving chamber in the sealed housing, the first power source receiving chamber formed to receive a first power source to power components of the smart camera system; and wherein the one or more cameras are configured to capture one or more images, at least one of the one or more processors is configured to apply an artificial intelligence (AI) model to determine a likelihood of an object of an object type from among a predetermined set of object types being present in the one or more images, and the at least one of the one or more processors is configured to determine whether the likelihood of a presence of the object is above a predetermined threshold, and upon determining that the likelihood is above the predetermined threshold, the one or more processors is configured to generate a first signal indicating detection of the object, and the communications module is configured to transmit a second signal indicating detection of the object based on the first signal through the communications antenna to a gateway device.

2. The smart camera system of claim 1, wherein the one or more processors is configured to determine a likelihood of a vehicle or work machine to be present in the one or more images.

3. The system of claim 1 or 2, wherein the one or more processors comprises a first AI processor and a second processor, the AI processor configured to determine a likelihood of a hazard in the one or more images, and the second processor is configured to wake up the AI processor at predetermined times.

4. The system of claim 3, wherein the AI processor, upon waking up, is configured to activate the one or more cameras to capture the one or more images before determining the likelihood of the hazard from the one or more images.

5. The system of claim 3 or claim 4, wherein the predetermined times occur between 1 and 12 times per hour.

6. The system of any one of claims 1 to 5, wherein the one or more cameras include a first camera and a second camera, wherein the one or more images includes a first one or more images captured by the first camera and/or a second one or more images captured by the second camera.

7. The system of any one of claims 1 to 6, wherein the one or more cameras include a first camera and a second camera that are each configurable to be oriented to face in a respective different direction.

8. The system of claim 6 or claim 7, wherein the one or more processors are configured to activate the first camera to capture the first one or more images at first activation times, and activate the second camera to capture the second one or more images at second activation times.

9. The system of any one of claims 3 to 5, further comprising: a printed circuit board (PCB) receiving chamber contained within the electrical housing; a first PCB on which is mounted the first AI processor; and a second PCB on which is mounted the second processor; and wherein the electrical housing comprises a PCB housing and a first power source housing, the PCB housing including the PCB receiving chamber, the first power source housing including the first power source receiving chamber, the first PCB and second PCB mounted within the PCB receiving chamber.

10. The system of claim 9, wherein the PCB housing and first power source housing are cylindrically shaped.

11. The system of claim 9 or claim 10, further comprising a third PCB, the third PCB mounting interfacing components between the one or more cameras and one or more components mounted on the first PCB and/or the second PCB.

12. The system of any one of claims 1 to 11, wherein the electrical housing and bottom part are sized to be receivable in a pole having an inner diameter of at least 53 mm.

13. The system of any one of claims 1 to 12, wherein the electrical housing and bottom part have an outer diameter selected to be less than a selected hollow pole inner diameter, and the top part has an outer diameter that is selected to be the same as or greater than the hollow pole outer diameter, so that the top part may be mounted to sit on top of the hollow pole and the electrical housing and bottom part may be received inside the pole.

14. The system of any one of claims 9 to 11, wherein the first PCB and second PCB are shaped and mounted within the electrical housing to extend generally in separate planes that are substantially perpendicular to a longitudinal central axis of the electrical housing.

15. The system of any one of claims 1 to 14, further comprising an antenna board mounted on the top part, the antenna board mounting the communications antenna.

16. The system of any one of claims 1 to 15, wherein the communications antenna is an electrically small antenna that comprises a first capacitive loaded loop (CLL) and a first elongated member.

17. The system of claim 16, wherein the first CLL overhangs or overlies the first elongated member.

18. The system of claim 16 or claim 17, wherein the first CLL forms a rectangular arch, and the first elongated member forms a circular arch.

19. The system of any one of claims 16 to 18, further comprising: a GNSS unit in the housing for processing a time synchronisation signal; and a GNSS antenna communicatively coupled to the GNSS unit, the GNSS antenna coupled to the housing.

20. The system of claim 19, wherein the GNSS antenna is an electrically small antenna that comprises a second CLL and a second elongated member.

21. The system of claim 20, wherein the first CLL and the second CLL each comprise a dielectric strip disposed in the middle of its respective CLL.

22. The system of any one of claims 19 to 21, wherein the GNSS antenna is mounted to the antenna board.

23. The system of claim 22, wherein the antenna board comprises a plastic cover, the plastic cover forming an antenna housing, the antenna housing covering the communications antenna and GNSS antenna.

24. The system of claim 23, wherein the plastic cover is substantially opaque to light transmission.

25. The system of any one of claims 1 to 24, wherein when two or more images of the one or more images are detected to contain the object, the processor is configured to generate a signal indicating detection of the object and a high confidence of the object.

26. The system of any one of claims 1 to 25, wherein when two or more objects are detected in the one or more images, the processor is configured to generate a signal indicating detection of the objects and a high confidence of the objects.

27. The system of any one of claims 1 to 26, wherein the first power source receiving chamber is located at a portion below the one or more processors, the one or more cameras, and the communications antenna.

28. The system of claim 27, further comprising the first power source contained in the first power source receiving chamber, the first power source configured to supply power to the one or more processors and/or other components of the system.

29. The system of claim 28, further comprising a second power source housing coupled to a bottom of the first power source housing, the second power source housing including a second power source receiving chamber to receive a second power source.

30. The system of claim 28 or claim 29, further comprising a further one or more power source housing coupled to a bottom of the second power source housing, the further one or more power source housing each including a respective power source receiving chamber to receive respective power sources.

31. The system of any one of claims 28 to 30, wherein each power source housing comprises a respective PCB.

32. A hazard detection system comprising: one or more camera systems of any one of claims 1 to 31; one or more gateway devices in communication with a remote server system via one or more satellites; and wherein each of the smart camera systems is communicatively coupled to a gateway device of the one or more gateway devices, and each gateway device is configured to transmit data to the server system based on signals sent to the gateway device from one or more of the smart camera systems.

33. A camera system for monitoring remote assets, comprising: a sealed housing including an electrical housing, a bottom part, and a top part, the top part and bottom part located at opposite ends of the electrical housing; one or more cameras positioned within the top part to have a field of view external of the top part; one or more processors within the sealed housing; a communications unit within the sealed housing for enabling wireless data communication with an external gateway device, the communications unit communicatively coupled to the processor; a communications antenna communicatively coupled to the communications unit, the communications antenna coupled to the sealed housing; and a first power source receiving chamber in the sealed housing, the first power source receiving chamber formed to receive a first power source to power components of the smart camera system; and wherein the one or more cameras are configured to capture one or more images, at least one of the one or more processors is configured to apply an artificial intelligence (AI) model to determine a likelihood of a hazard to be present in the one or more images, and the at least one of the one or more processors is configured to determine that the likelihood of a presence of the hazard is above a predetermined threshold, and upon determining that the likelihood is above the predetermined threshold, the one or more processors is configured to generate a first signal indicating detection of the hazard, and the communications module is configured to transmit a second signal indicating detection of the hazard based on the first signal through the communications antenna to a gateway device.

Description:
"Smart camera system for monitoring remote assets"

Technical Field

[1] Embodiments relate generally to camera systems for monitoring remote assets. Such systems may be “smart” in that they may apply artificial intelligence (AI) to object detection in images captured by cameras. Embodiments also relate to hazard or object detection devices. Some embodiments relate to a method for hazard or object detection, and systems employing one or more hazard or object detection devices. Some embodiments relate to camera systems for detecting hazards or objects. Some embodiments relate to systems comprising one or more gateway devices in communication with one or more hazard or object detection devices for high-latency data backhaul communication to a server system. Some embodiments relate to a server system and client device for visualising and actioning hazard or object detection.

Background

[2] Assets and infrastructure, such as sensitive sites, gas pipelines and fibre cable lines, may be susceptible to tampering and/or damage induced by objects, such as people or vehicles. The consequences of damage to high pressure pipelines may be catastrophic. The assets may be damaged by prolonged and/or repeated application of heavy vehicle loads, and/or may be at risk of being damaged by construction activities, for example. Signage at the assets may be inadequate and/or have scalability problems to ward away the hazardous or potentially hazardous objects. The assets may be located in remote locations with power and/or communication constraints.

[3] Embodiments disclosed below may address or ameliorate one or more of the aforementioned shortcomings, or at least to provide a useful alternative.

[4] Throughout this specification the word "comprise", or variations such as "comprises" or "comprising", will be understood to imply the inclusion of a stated element, integer or step, or group of elements, integers or steps, but not the exclusion of any other element, integer or step, or group of elements, integers or steps.

[5] Any discussion of documents, acts, materials, devices, articles or the like which has been included in the present specification is not to be taken as an admission that any or all of these matters form part of the prior art base or were common general knowledge in the field relevant to the present disclosure as it existed before the priority date of each of the appended claims.

Summary

[6] Some embodiments relate to a camera system for monitoring remote assets, comprising: a sealed housing including an electrical housing, the electrical housing optionally being substantially cylindrically shaped, the sealed housing also including a bottom part, and a top part, the top part optionally having a diameter larger than the electrical housing, the top part and bottom part located at opposite ends of the electrical housing; one or more cameras positioned within the top part to have a field of view external of the top part; one or more processors within the sealed housing; a communications unit within the sealed housing for enabling wireless data communication with an external gateway device, the communications unit communicatively coupled to the processor; a communications antenna communicatively coupled to the communications unit, the communications antenna coupled to the sealed housing; a first power source receiving chamber in the sealed housing, the first power source receiving chamber formed to receive a first power source to power components of the smart camera system; and wherein the one or more cameras are configured to capture one or more images, at least one of the one or more processors is configured to apply an artificial intelligence (AI) model to determine a likelihood of a hazard to be present in the one or more images, and the at least one of the one or more processors is configured to determine that the likelihood of a presence of the hazard is above a predetermined threshold, and upon determining that the likelihood is above the predetermined threshold, the one or more processors is configured to generate a first signal indicating detection of the hazard, and the communications module is configured to transmit a second signal indicating detection of the hazard based on the first signal through the communications antenna to a gateway device.

[7] Some embodiments relate to a camera system for monitoring remote assets, comprising: a sealed housing including an electrical housing, the electrical housing optionally being substantially cylindrically shaped, the sealed housing also including a bottom part, and a top part, the top part optionally having a diameter larger than the electrical housing, the top part and bottom part located at opposite ends of the electrical housing; one or more cameras positioned within the top part to have a field of view external of the top part; one or more processors within the sealed housing; a communications unit within the sealed housing for enabling wireless data communication with an external gateway device, the communications unit communicatively coupled to the processor; a communications antenna communicatively coupled to the communications unit, the communications antenna coupled to the sealed housing; and a first power source receiving chamber in the sealed housing, the first power source receiving chamber formed to receive a first power source to power components of the smart camera system; and wherein the one or more cameras are configured to capture one or more images, at least one of the one or more processors is configured to apply an artificial intelligence (AI) model to determine a likelihood of an object of an object type selected from among a predetermined set of object types being present in the one or more images, and the at least one of the one or more processors is configured to determine whether the likelihood of a presence of the object is above a predetermined threshold, and upon determining that the likelihood is above the predetermined threshold, the one or more processors is configured to generate a first signal indicating detection of the object, and the communications module is configured to transmit a second signal indicating detection of the object based on the first signal through the communications antenna to a gateway device.

[8] According to some embodiments, the one or more processors is configured to determine a likelihood of a vehicle, such as a heavy vehicle, truck, excavator or other work machine, to be present in the one or more images.

[9] According to some embodiments, the one or more processors comprises a first AI processor and a second processor, the AI processor configured to determine a likelihood of a hazard or object being contained in the one or more images, and the second processor is configured to wake up the AI processor at predetermined times.

[10] According to some embodiments, the AI processor, upon waking up, is configured to activate the one or more cameras to capture the one or more images before determining the likelihood of the hazard or object from the one or more images.

[11] According to some embodiments, the one or more cameras include a first camera and a second camera, wherein the one or more images includes a first one or more images captured by the first camera and/or a second one or more images captured by the second camera. The one or more cameras may include a first camera and a second camera that are each configurable to be oriented to face in a respective different direction.

[12] According to some embodiments, the one or more processors are configured to activate the first camera to capture the first one or more images at first activation times, and activate the second camera to capture the second one or more images at second activation times. The first activation times may be different from the second activation times. The first activation times may be the same as the second activation times.

[13] According to some embodiments, the camera system further comprises: a printed circuit board (PCB) receiving chamber contained within the electrical housing; a first PCB on which is mounted the first AI processor; and a second PCB on which is mounted the second processor; and wherein the electrical housing comprises a PCB housing and a first power source housing, the PCB housing including the PCB receiving chamber, the first power source housing including the first power source receiving chamber, the first PCB and second PCB mounted within the PCB receiving chamber.

[14] According to some embodiments, the PCB housing and first power source housing are cylindrically shaped.

[15] According to some embodiments, the camera system further comprises a third PCB, the third PCB mounting interfacing components between the one or more cameras and one or more components mounted on the first PCB and/or the second PCB.

[16] According to some embodiments, the top part includes a plastic wall. According to some embodiments, the electrical housing and bottom part are sized to be receivable in a pole having an inner diameter of at least 53 mm.

[17] According to some embodiments, the electrical housing and bottom part have an outer diameter selected to be less than a selected hollow pole inner diameter, and the top part has an outer diameter that is selected to be the same as or greater than the hollow pole outer diameter, so that the top part may be mounted to sit on top of the hollow pole and the electrical housing and bottom part may be received inside the pole.

[18] According to some embodiments, the first PCB and second PCB are shaped and mounted within the electrical housing to extend generally in separate planes that are substantially perpendicular to a longitudinal central axis of the electrical housing.

[19] According to some embodiments, the camera system further comprises an antenna board mounted on the top part, the antenna board mounting the communications antenna.

[20] According to some embodiments, the communications antenna is an electrically small antenna that comprises a first capacitive loaded loop (CLL) and a first elongated member. According to some embodiments, the first CLL overhangs the first elongated member. According to some embodiments, the first CLL forms a rectangular arch, and the first elongated member forms a circular arch.

[21] According to some embodiments, the camera system further comprises: a GNSS unit in the housing for processing a time synchronisation signal; and a GNSS antenna communicatively coupled to the GNSS unit, the GNSS antenna coupled to the housing. According to some embodiments, the GNSS antenna is an electrically small antenna that comprises a second CLL and a second elongated member, the GNSS antenna similarly shaped to the communications antenna. According to some embodiments, the first CLL and the second CLL each comprise a dielectric strip disposed in the middle of its respective CLL.

[22] According to some embodiments, the GNSS antenna is mounted to the antenna board. According to some embodiments, the antenna board comprises a plastic cover, the plastic cover forming an antenna housing, the antenna housing covering the communications antenna and GNSS antenna.

[23] According to some embodiments, the plastic cover is substantially opaque to light transmission. According to some embodiments, the one or more cameras are each configurable to be oriented in a respective direction. According to some embodiments, the predetermined times occur between 1 and 12 times per hour.

[24] According to some embodiments, when two or more images of the one or more images are detected to contain the hazard or object, the processor generates a signal indicating detection of the hazard or object and a high prevalence or confidence of the hazard or object. According to some embodiments, when two or more hazards or objects are detected in the one or more images, the processor generates a signal indicating detection of the hazards or objects and a high prevalence of the hazards or objects.

[25] According to some embodiments, the first power source receiving chamber is located at a portion below the one or more processors, the one or more cameras, and the communications antenna.

[26] According to some embodiments, the camera system further comprises the first power source contained in the first power source receiving chamber, the first power source configured to supply power to the one or more processors and/or other components of the system. According to some embodiments, the camera system further comprises a second power source housing coupled to the bottom of the first power source housing, the second power source housing including a second power source receiving chamber, the second power source receiving chamber containing a second power source, the second power source configured to supply power to components of the system.

[27] The camera system may further comprise a further one or more power source housing coupled to the bottom of the second power source housing, the further one or more power source housing each including a respective power source receiving chamber, the respective power source receiving chambers containing respective power sources, the respective power sources configured to supply power to components of the system. According to some embodiments, each power source is coupled to a respective PCB.

[28] Some embodiments relate to a hazard detection system comprising: one or more camera systems as described above or herein; and one or more gateway devices. One or more satellites may be communicatively coupled to each of the one or more gateways. One or more ground stations may be communicatively coupled to the one or more satellites. A server system may be communicatively coupled to the one or more ground stations. Each of the smart camera systems may be communicatively coupled to a gateway device of the one or more gateway devices. The server system is configured to receive data of the signals sent from the smart camera systems via the one or more gateway devices, one or more satellites, and the one or more ground stations.

[29] According to some embodiments, the one or more camera systems are deployed to have their respective cameras facing along and/or above one or more pipeline (or other sensitive infrastructure) trajectories.

[30] Some embodiments relate to a camera system for monitoring remote assets, comprising: a sealed housing including an electrical housing, a bottom part, and a top part, the top part and bottom part located at opposite ends of the electrical housing; one or more cameras positioned within the top part to have a field of view external of the top part; one or more processors within the sealed housing; a communications unit within the sealed housing for enabling wireless data communication with an external gateway device, the communications unit communicatively coupled to the processor; a communications antenna communicatively coupled to the communications unit, the communications antenna coupled to the sealed housing; a first power source receiving chamber in the sealed housing, the first power source receiving chamber formed to receive a first power source to power components of the smart camera system. The one or more cameras are configured to capture one or more images. The one or more processors include an artificial intelligence (AI) processor configured to apply an AI model to determine a likelihood of a hazard or object (of an object type selected from among a predetermined set of object types) to be present in the one or more images, and the AI processor is configured to determine whether the likelihood of a presence of the hazard or object is above a predetermined threshold, and upon determining that the likelihood is above the predetermined threshold, the AI processor is configured to generate a first signal indicating detection of the hazard or object. The communications module is configured to transmit a second signal indicating detection of the hazard or object based on the first signal through the communications antenna to a gateway device.

Brief Description of Drawings

[31] Figure 1 shows a block diagram of a hazard detection system according to some embodiments.

[32] Figure 2 is an example block diagram of a camera system for remote asset monitoring according to some embodiments.

[33] Figures 3A and 3B show external views of an example camera system for remote asset monitoring according to some embodiments.

[34] Figure 3C shows an internal view of the camera system according to some embodiments.

[35] Figure 3D shows an exploded view of the camera system according to some embodiments.

[36] Figure 4 is a perspective view of the printed circuit boards, top seal, and camera mounts according to some embodiments.

[37] Figure 5 shows views of a stack of PCBs according to some embodiments.

[38] Figures 6A and 6B show top and underside views of a PCB according to some embodiments.

[39] Figures 7A and 7B show top and underside views of a PCB according to some embodiments.

[40] Figures 8A and 8B show top and underside views of a PCB according to some embodiments.

[41] Figures 9A and 9B show external and internal views, respectively, of PCBs mounted by PCB structure columns within the PCB housing.

[42] Figure 10A shows a top view of a power source secured by a power source clamp within the power source housing.

[43] Figure 10B shows an internal side view of a power source secured by a power source clamp within the power source housing.

[44] Figures 11A, 11B, and 11C show views of two camera mounts attached to the top seal.

[45] Figure 12 shows two antennas and respective capacitive loaded loops (CLLs) mounted upon an antenna platform.

[46] Figures 13A and 13B show external and internal views, respectively, of a camera system according to some other embodiments.

[47] Figure 14 shows an external view of a camera system according to some other embodiments.

[48] Figure 15 shows components of a camera system interfacing with a processor.

[49] Figure 16 is a flowchart of a low-power operation method of a camera system according to some embodiments.

[50] Figure 17 is a flowchart of a method of AI-based image processing according to some embodiments.

[51] Figure 18 is a flowchart of a method of a processor executing a wake-up process to wake up from a low-power mode, according to some other embodiments.

[52] Figure 19 is a flowchart of a method of alert determination according to some embodiments.

[53] Figure 20 shows an example deployment view of a camera system at a site according to some embodiments.

Detailed Description

[54] External hazard threats, such as trespassers or construction activities performed by work machines, may pose a significant risk to remote assets or sites, such as sensitive sites, artistic works, high pressure transmission pipelines, or communication lines, for example. Preventative controls, such as signage, may fail to prevent external interference threats, and therefore may result in potentially costly and/or catastrophic consequences.

[55] Some embodiments relate to a device, system, and/or method utilising detection of external interference threats to assist avoiding or minimising the damage which may be caused to remote assets. The devices, systems and/or methods of the present disclosure may also utilise detection of external hazards for other purposes.

[56] Some embodiments relate to a camera system able to locally process the images captured by a camera to establish the level of threat associated with a remote asset encroachment and alert the remote asset owner/operator in near real-time. The camera system may send signals based on the processed images via a gateway, which communicates with one or more satellites or satellite networks to a ground station, which forwards the signal via a network to a server system. The server system may provide a data visualisation application and alert system to allow one or more client devices to visualise the status and/or location of the camera system and/or receive alert notifications/messages concerning the remote asset encroachment detected by the camera system.

[57] The camera system has an artificial intelligence (AI) processor that may apply a trained artificial intelligence model to images captured by the camera(s) to detect a hazard. The artificial intelligence model may be an object detection model trained to detect a specific set of hazards in the captured images, for example. The AI processor may apply an object detection model to images captured by the camera in order to detect a work machine, such as a dump truck, hydraulic excavator, or bulldozer, for example. Upon positive detections of work machines through processing the images using the object detection model, the camera system may signal or send an encroachment/hazard notification to the server system. The AI processor applies the AI model to determine a likelihood of a hazard being present in the one or more images. The AI processor is configured to determine whether the likelihood of a presence of the hazard is above a predetermined threshold. Upon determining that the likelihood is above the predetermined threshold, the AI processor is configured to generate a first signal indicating detection of the hazard. This first signal may be communicated to a separate processor for storage into memory or stored directly into memory, for example. The communications module is configured to transmit a second signal indicating detection of the hazard based on the first signal through the communications antenna to a gateway device.
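As a rough illustration of the likelihood-threshold logic described above, the following Python sketch applies an assumed threshold to per-hazard likelihoods; the function name, output format, and threshold value are hypothetical assumptions, not taken from the source.

```python
# Hypothetical sketch only: HAZARD_THRESHOLD and the signal format are assumptions.
HAZARD_THRESHOLD = 0.90  # assumed predetermined threshold

def first_signal(likelihoods, threshold=HAZARD_THRESHOLD):
    """Return a first signal for hazards whose likelihood exceeds the
    threshold, or None when no hazard is detected."""
    detected = {label: p for label, p in likelihoods.items() if p > threshold}
    if detected:
        return {"event": "hazard_detected", "hazards": detected}
    return None

# Example model output for one image: only the excavator passes the threshold.
signal = first_signal({"excavator": 0.97, "person": 0.42})
```

A second, smaller signal derived from this first signal would then be handed to the communications module for transmission.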

[58] In some embodiments, the camera system sleeps and “wakes up” at predetermined times or at a predetermined interval of time to capture images, then process the captured images, and then perform any signalling. For example, the camera system may wake up at predetermined times or intervals, such as every 15, 30, or 60 minutes or at certain times of a day, to perform the aforementioned functions, and then go to sleep after the aforementioned functions are complete. The camera system may capture a predetermined number of images (or a video sequence) per wake-up, such as one or multiple images. For example, the camera system may capture and then process between 1 and 500 images upon each wake-up event. Where still (single frame) images are captured, then around 1 to 12 images may be captured, for example. Where a video sequence is captured, then it may be captured for a predetermined number of seconds. For video image sequence capture, the total number of images captured is the product of the frame rate and the length (in seconds) of capture; for example, 10 seconds of video image capture at a frame rate of 20 frames per second would result in 200 images being captured.
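The frame-count arithmetic and the fixed-interval wake-up schedule described above can be sketched as follows; the function names and the interval handling are illustrative assumptions.

```python
def frames_captured(frame_rate_hz, duration_s):
    """Total images in a video capture: frame rate multiplied by capture length."""
    return frame_rate_hz * duration_s

def next_wake(now_s, interval_min):
    """Next scheduled wake-up time (in seconds) on a fixed-interval duty cycle.
    Name and interval handling are assumptions for illustration."""
    interval_s = interval_min * 60
    return ((now_s // interval_s) + 1) * interval_s

# The worked example from the text: 10 s of video at 20 fps.
total = frames_captured(20, 10)  # 200 images
```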

[59] In some other embodiments, the camera system may include a sensor, such as an acoustic sensor, light sensor or an accelerometer, to trigger a wake-up of the system operations, including performing image capture, AI processing, and data transmission functions of the camera system.

[60] In some embodiments, the hazards being detected by the AI processor may not be limited to work machines. In some embodiments, the hazards being detected by the AI processor may include animated or movable object types, such as livestock, a person, an animal, a bike, a car, a clay delver, a truck, a cable plough, a tractor, an excavator, a Bobcat, a Ditch Witch, a horizontal drill, a boring rig, an auger, a bulldozer and/or a post driver, for example. In some embodiments, an artificial intelligence model may be applied to detect one, a subset, or all of the hazards listed.

[61] In some embodiments, the processor executing the object detection model on the images may be able to detect one, a subset, or all of the hazards with a predetermined mean average precision (mAP). An example predetermined mAP may be in the order of 0.90 to 0.95 (i.e. 90-95% confidence) for each image. In some embodiments, the processor executing the object detection model may detect the hazard whether the hazard captured in an image is stationary or moving.
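For context, mAP can be illustrated as the mean of precomputed per-class average precision scores; the class names and AP values below are illustrative assumptions, not figures from the source.

```python
def mean_average_precision(per_class_ap):
    """mAP as the mean of per-class average precision (AP) values.
    The AP values are assumed to be precomputed by an evaluation step."""
    return sum(per_class_ap.values()) / len(per_class_ap)

# Illustrative per-class AP scores in the 0.90-0.95 range mentioned above.
map_score = mean_average_precision({"excavator": 0.94, "truck": 0.92, "person": 0.90})
```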

[62] The camera system may comprise a housing, which contains components such as the processor(s) of the camera system. The housing may be at least partially in the shape of a pole, such as a pole complying with standard dimensions of the location, to make the camera system appear similar to other poles for signage, for example. The housing may be at least partially narrower than a pole complying with standard dimensions of the location, to allow the housing to be at least partially inserted into the pole. The housing may comprise a power receiving chamber which is configured to house one or more power sources. The number of power sources configurable to be housed by the camera system housing and chamber is proportional to the length of the power receiving chamber.

[63] The camera system may comprise low-power wide-area network (LPWAN) communications to the gateway, such as LoRa modulation using LoRaWAN protocols and architecture. In some other embodiments, the camera system may comprise a different form of communications than LPWAN communications to send signals to the server system. Therefore, the camera system may contain components for communicating via a mobile network, such as via LTE or NR networks, for example. The camera system may not transmit raw image data, but rather transmit a signal indicating a status determined from processing the raw image data.

[64] Accordingly, with regard to the use of sleep and wake-up, LPWAN communications and signalling, and the power receiving chamber, the camera system may be configured to operate for at least about a year of operation without a further external power source while using six power modules.
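To illustrate why a small status signal suits LPWAN better than raw image data, here is a hypothetical five-byte uplink payload; the field layout and codes are assumptions for illustration only, not part of the source.

```python
import struct

# Hypothetical compact status payload (all field choices are assumptions):
# >HBBB = device id (uint16), detected object code, likelihood %, battery %.
def encode_status(device_id, object_code, likelihood_pct, battery_pct):
    return struct.pack(">HBBB", device_id, object_code, likelihood_pct, battery_pct)

def decode_status(payload):
    return struct.unpack(">HBBB", payload)

payload = encode_status(42, 3, 95, 80)  # five bytes versus kilobytes of image data
```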

[65] The camera system may have an easily attachable/detachable serial connection, such as a magnetic micro universal serial bus (USB) connection at the centre of the top seal, for connecting external devices. The external devices may include laptops, tablet devices, mobile computing devices, for example. External devices may be able to be communicatively coupled to the one or more cameras and/or processors of the camera system to view the one or more camera images via connecting to the easily attachable/detachable connection. Therefore, an installer may be able to see the camera angles and adjust the angle or position of one or more mounted cameras accordingly. The external device may also be communicatively coupled to the one or more processors to perform software updates and read data of the software/system.

[66] When the external device connects to the easily attachable/detachable connection, this may be automatically detected by the processor and may cause the processor to automatically change from an operational mode to a diagnostic mode, for example at the hardware level.

[67] Figure 1 shows a block diagram of a hazard detection system 100 according to some embodiments. Hazard detection system 100 may also be referred to as system 100, detection system 100, or pipeline detection system 100, for example.

[68] Hazard detection system 100 includes a camera system 110. Camera system 110 may also be referred to as a detection unit 110, hazard detection device 110, hazard detection unit 110, smart camera unit 110, smart camera apparatus 110, smart camera system 110, detection device 110, pipeline monitoring device 110, pipeline monitoring unit 110, unit 110, and device 110.

[69] Camera system 110 is designed to be suitable for deployment at a remote site which may have power supply and/or communication constraints. In some other embodiments, camera system 110 is deployed at a local site which may not have power supply and/or communication constraints.

[70] Deployment of camera system 110 may include embedding into or installing to a ground surface. Camera system 110 may be mounted on a structure which is embedded into or installed to a ground surface.

[71] Camera system 110 may communicate via communication link 118 to gateway device 120. Communication over communication link 118 may be wireless communications. Communication over communication link 118 may be wireless wide-area network (WWAN) communications. Communication over communication link 118 may be LPWAN communications, such as Narrowband Internet of Things (NB-IoT), LoRaWAN or SigFox LPWAN.

[72] The camera system 110 comprises an LPWAN antenna that is configured to communicate over 8 or 16 radio channels. The camera system 110 may communicate with gateway device 120 within a range of 20 km, for example. The LPWAN antenna of the camera system 110 may be configured to communicate using the LoRa™ technology over the frequency bands 902-928 MHz, 863-870 MHz, or 433-434 MHz, for example. The camera system 110 may also be configured to communicate over Bluetooth (or other short-range) technology or over WiFi™ with devices located in its immediate vicinity, for example within a range of 15 m.
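The frequency bands listed above can be captured in a small configuration table; the regional plan labels and the lookup helper are illustrative assumptions, not part of the source.

```python
# Assumed mapping of the listed LoRa bands to regional plan names.
LORA_BANDS_MHZ = {
    "US915": (902, 928),
    "EU868": (863, 870),
    "EU433": (433, 434),
}

def plan_for(freq_mhz):
    """Return the regional plan whose band contains the given frequency,
    or None if the frequency falls outside every listed band."""
    for plan, (lo, hi) in LORA_BANDS_MHZ.items():
        if lo <= freq_mhz <= hi:
            return plan
    return None
```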

[73] Camera system 110 and gateway device 120 may utilise spread-spectrum techniques for more resilient communication. Such techniques may include Direct Sequence Spread Spectrum (DSSS), Chirp Spread Spectrum (CSS), Random Phase Multiple Access (RPMA) and Listen-Before-Talk (LBT), for example.

[74] In some other embodiments, communication over communication link 118 may be wired communications, such as Ethernet over copper or optical fibre.

[75] In some other embodiments, communication over communication link 118 may be wireless local-area network (WLAN) communications, such as WiFi.

[76] Data sent from camera system 110 may include an alert message, alert indicator, camera system 110 status, and/or detected object, for example.

[77] There may be one or more camera systems 110, which may form a camera system array 115. Each camera system 110 of the camera system array 115 may be communicatively coupled to gateway device 120.

[78] The hazard detection system 100 also comprises a satellite constellation 135. Satellite constellation 135 may comprise one or more satellites 130. Gateway device 120 may be communicatively coupled to satellite constellation 135 via communication link 128. In embodiments with more than one satellite 130, the communication link 128 may extend to the more than one satellite 130. Communication link 128 uses radio links to satellites 130 orbiting the earth to communicate data received at a gateway device 120 from the camera system array 115, and to receive instructions, configuration information or firmware updates for camera system 110 or gateway device 120.

[79] The communication link 128 may not be a persistent communication link and if satellite 130 is not accessible to the gateway 120, the gateway 120 may await the resumption of the radio communication link 128 to continue communication of information.

[80] The hazard detection system 100 also comprises one or more ground stations 140. The ground stations 140 receive communication from one or more satellites 130 of the satellite constellation 135 over a communication link 138. The communication link 138 may be facilitated by radio waves of suitable frequency according to the region where the ground station 140 is located.

[81] The satellite 130 may be a low earth orbit (LEO) satellite that circles the earth approximately every 90-110 minutes, for example. With such orbiting satellites, a relatively small number of satellite ground stations 140 may be used to receive, via downlinks from satellite 130, all the data transmitted by gateway device 120.

[82] In some other embodiments, one or more satellites 130 are geostationary (GEO) satellites.

[83] In some embodiments, satellites 130 in a near polar orbit may be used and ground stations 140 may be located near each of the Earth’s poles. This arrangement allows each satellite 130 to connect to a ground station 140 on almost every orbit, leaving the throughput latency no higher than 45 minutes (half the time required to complete an orbit), for example. In some embodiments, ground stations 140 may instead be located at lower latitudes with less harsh weather, easier transport, and easier access to power and communication links to the ground station 140.

[84] The ground station 140 may comprise radio communication equipment necessary to communicate with the satellite 130 and a communication interface to relay received information to a server system 150 over a communications link 148. The communication link 148 may be a wired or wireless communication link to the internet available to the ground station 140 and to the server system 150. The server system 150 may be accessible over the internet through an application or platform on a client computing device 160 over a conventional internet connection over the communication link 157. The client device 160 may be an end user computing device such as a desktop, laptop, mobile device, tablet, for example.

[85] The server system 150 may be configured to decode, decrypt and/or decompress communication originating from a gateway device 120 and received over the communication links 128, 138 and 148.

[86] Server system 150 may store the raw, decoded, decrypted and/or decompressed data originating from gateway device 120 in data storage 152.

[87] In some embodiments, server system 150 comprises an alert module 154 (which includes program code) executable by a processor of the server system 150, or the alert module 154 may be located separately and communicatively coupled to the server system 150 via link 153. Alert module 154 may read data stored in data storage 152, the data originating from camera system 110, and if an alert threshold is met, send an alert message via communication link 157 to one or more client computing devices 160.
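A minimal sketch of the alert module's read-and-threshold behaviour follows; the record fields, message format, and threshold value are hypothetical assumptions for illustration.

```python
def check_alerts(records, alert_threshold=0.90):
    """Scan stored detection records and build alert messages for those
    meeting the threshold. Field names and threshold are assumptions."""
    return [
        f"Alert: {r['object']} detected by unit {r['unit_id']}"
        for r in records
        if r["likelihood"] >= alert_threshold
    ]

# Example: only the first (high-likelihood) record produces an alert.
alerts = check_alerts([
    {"object": "excavator", "unit_id": 7, "likelihood": 0.96},
    {"object": "person", "unit_id": 7, "likelihood": 0.50},
])
```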

[88] Server system 150 may comprise or have access to code for executing a data visualisation module 156. The data visualisation module 156 may be a platform accessible to client device 160 via communication link 157. The data visualisation module 156 may read object detection data from the data storage 152.

In some other embodiments, gateway device 120 does not communicate via satellite constellation 135, satellites 130, nor ground station 140, but instead gateway device 120 communicates with server system 150 on networked mobile and/or wired communications via communication link 168.

[89] In some other embodiments, camera system 110 does not communicate via gateway 120, satellite constellation 135, satellites 130, nor ground station 140, but instead communication link 118 is for networked mobile and/or wired communications between camera system 110 and server system 150.

[90] In some other embodiments, camera system 110 does not communicate via gateway 120, but instead camera system 110 includes a satellite modem and communication link 118 is for communications between camera system 110 and one or more satellites 130 of satellite constellation 135.

[91] The hazard detection system 100 enables high-latency communication of data between the camera system array 115 and the client device 160. High-latency communication may be inherently suitable for transmitting small messages to and from the camera system array 115 deployed in remote locations and the server system 150. High-latency communication may comprise latency of greater than about 1 second, 2 seconds, 15 seconds, 30 seconds, or 1, 2, 3, 4 or 5 minutes, for example. Two high-latency communication methods are store and forward communication and short burst data communication.

[92] Store and forward communication may be implemented by the satellite constellation 135 that periodically passes into a range where communication may be received from the gateway device 120 positioned in a remote location. Satellite 130 may gather data from the gateway device 120 and deliver it back to ground stations 140 that are connected to a network backbone or a network generally accessible over the internet. In some embodiments, the store and forward communication could be implemented by satellites or any type of air, ground or sea vehicles (carrying suitable communication and storage equipment) that intermittently travel within communications range of the gateway device 120. The transfers of data by the store and forward method may be bi-directional. The vehicles or satellites used to implement store and forward communication can be far less numerous than the number of statically deployed terrestrial devices that would be needed to cover a designated remote area. Further, vehicles or satellites used to implement store and forward communication can be more rapidly deployed, which can save time during the implementation of the hazard detection system 100, reduce the duration of blackouts resulting from failure of statically deployed terrestrial devices, and permit maintenance operations and system upgrades to be carried out using the server system 150 rather than on site in the field.
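The store and forward behaviour can be sketched as a simple buffer that holds messages while no satellite is in range and delivers them in order on the next pass; the class and method names are assumptions, not taken from the source.

```python
from collections import deque

class StoreAndForward:
    """Minimal store-and-forward sketch (names assumed): messages are
    buffered while the link is down and flushed, in order, on a pass."""
    def __init__(self):
        self._buffer = deque()

    def send(self, message, link_up):
        """Queue a message; return the list of messages delivered this pass."""
        self._buffer.append(message)
        if not link_up:
            return []                 # stored until the next satellite pass
        delivered = list(self._buffer)
        self._buffer.clear()
        return delivered
```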

[93] Short Burst Data (SBD) is another technique for communicating short data messages between gateway 120 and a centralised host computing system such as the server system 150. SBD satellite messaging systems work by waiting for a suitable slot in a satellite network that has voice as its primary application. Examples include Orbcomm™, Iridium™ and Globalstar™. The voice traffic in such systems is prioritised and requires latencies typically less than 500 ms, for example. However, due to the fluctuating demands for voice traffic, there are windows in which shorter messages can be sent. This is analogous to the Short Messaging System (SMS) technique/standard used in terrestrial communications networks designed for mobile telephony. The typical latencies of the SBD traffic in such systems can be in the range of 5 seconds to 10 minutes or greater, for example.

[94] Figure 2 is an example block diagram of a camera system 110 according to some embodiments.

[95] As shown in Figure 2, camera system 110 may include first, second and third printed circuit boards 203, 206, and 209. Printed circuit boards (PCBs) 203, 206, and 209 may bear electronic components shown in Figures 2 and 6 to each form a printed circuit board assembly (PCBA).

[96] The electronic components of PCB 209 may include a processor 214, a geolocation unit (e.g. including a GNSS module) 274, inertial measurement unit (IMU) 280, temperature sensor 282, humidity sensor 284, volatile memory 230, non-volatile memory 220, and communications unit 218.

[97] Inertial measurement unit 280 may be or include a MEMS accelerometer, for example.

[98] Electronic components of PCB 209 may also comprise any other connections or circuit elements, such as diodes, capacitors, inductors, resistors, and transistors.

[99] In some embodiments, processor 214 is or includes a microcontroller unit (MCU) and/or system on chip (SoC). Processor 214 forms part of processing unit 210. Processing unit 210 may comprise printed circuit board 209. Processing unit 210 or processor 214 may comprise volatile memory 230 and non-volatile memory 220 so that memory 230 and 220 are accessible to the microcontroller of processor 214. Processing unit 210 is responsible for controlling operation of the camera system 110, most of the work of which is performed by processor 214.

[100] Non-volatile memory 220 may comprise operating system code 222. Non-volatile memory 220 may also comprise pre-determined or periodically determined operational parameters and device operation management (e.g. executed as a wake-up module) code 226, the functions of which are described in relation to Figures 16 and 18. Non-volatile memory 220 may also comprise alert module code 228, the functions of which are described below in relation to Figures 16 and 19. Non-volatile memory 220 may store alarm parameters 224.

[101] Volatile memory 230 may comprise a payload queue 238, the functions of which are further described in relation to Figures 13, 15, and 16.
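A payload queue of this kind can be sketched with a bounded double-ended queue; the queue depth and behaviour shown here are assumptions for illustration, not taken from the source.

```python
from collections import deque

# Sketch of a bounded payload queue (depth of 8 is an assumption).
# With maxlen set, appending to a full deque silently drops the oldest payload,
# which suits a memory-constrained device that favours recent detections.
payload_queue = deque(maxlen=8)

for payload_id in range(10):
    payload_queue.append(payload_id)  # payloads 0 and 1 are displaced
```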

[102] In some embodiments, processor 214 is packaged with a data communications unit 218. Data communications unit 218 may include a chip for long-range wireless communications. In some other embodiments, the data communications unit 218 including the chip for long-range wireless communications is not packaged with processor 214, but instead is another electronic component which interfaces with processor 214 outside of the processor’s package. In some embodiments, the data communications unit 218 is a LPWAN unit and the chip for long range wireless communications is an LPWAN chip. In some embodiments, the LPWAN chip is a LoRaWAN chip which utilizes a LoRaWAN protocol. The LoRaWAN chip may be used for low-power consumption during transmission, as well as utilizing communication range capabilities. The data communications unit 218 including the chip for long range wireless communications enables processor 214 to communicate with gateway 120. Communications unit 218 is communicatively coupled to an antenna 216 via cabled connections, in order to transmit and/or receive wireless signals from gateway 120.

[103] Geolocation unit 274 may enable processor 214 to communicate with a GNSS satellite 270 for receiving positioning and timing data. The GNSS satellite 270 may be a global positioning system (GPS) satellite, for example. Geolocation unit 274 is communicatively coupled to an antenna 216 via cabled connections, in order to receive wireless signals from GNSS satellite 270.

[104] The electronic components of PCB 206 may include an artificial intelligence (AI) module 250, expandable memory 254, volatile memory 252, and non-volatile memory 256.

[105] AI module 250 may be or include a processor, system on chip and/or microcontroller. AI module 250 may also be referred to as image processor 250, AI processor 250, and machine learning (ML) processor 250.

[106] AI module 250 may comprise expandable memory 254, volatile memory 252, and non-volatile memory 256, so that memory 254, 252, and 256 are accessible to the microcontroller of AI module 250.

[107] AI module 250 may be communicatively coupled to processing unit 210 and/or processor 214. The connection between processing unit 210/processor 214 and AI module 250 may be a wired connection.

[108] The electronic components of PCB 203 may include a camera connector 265, and a debug connector 268. Debug connector 268 may be communicatively coupled to AI module 250. Debug connector 268 is also connected to programmer port 204 on camera system 110. The programmer port 204 is exposed so that an external device may connect to programmer port 204 and camera system 110.

[109] Camera connector 265 is communicatively coupled to one or more cameras 260 on camera system 110. Camera connector 265 is also communicatively coupled to AI module 250.

[110] Camera system 110 also includes power supply 240. Power supply 240 may include one or more cells 242. Power supply 240 may include 1 to 12 cells, for example. In some embodiments, power supply 240 includes 1 to 6 cells, 4 to 8 cells, or 6 to 10 cells, for example. As shown in Figure 2, power supply 240 includes 6 cells 242a, 242b, 242c, 242d, 242e, 242f, configured in three parallel-connected groups of two cells in series.

[111] Power source 240 may supply power to processor 214 and other components on printed circuit boards 209 and 206, and the one or more cameras 260. According to some embodiments, power source 240 may also be able to supply power to another external device, or other connected device. For example, the other device may be connected through a serial connection such as programmer port 204.
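The series/parallel arithmetic for the described cell arrangement can be checked with a short sketch; the cell voltage and capacity values below are illustrative assumptions (e.g. typical of lithium primary cells), not figures from the source.

```python
def pack_characteristics(cell_voltage_v, cell_capacity_ah, series=2, parallel=3):
    """For three parallel-connected groups of two series cells (2s3p):
    voltage adds across series cells, capacity adds across parallel groups.
    The specific cell values passed in are illustrative assumptions."""
    return cell_voltage_v * series, cell_capacity_ah * parallel

# Example with assumed 3.6 V / 13 Ah cells: 7.2 V pack, 39 Ah total capacity.
voltage, capacity = pack_characteristics(3.6, 13.0)
```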

[112] In some embodiments, the electronic components are chosen for their low-power consumption and capability. Components may be chosen to be optimal for the design rather than being limited to commercial modules. In addition, the PCBA may be designed to consume as little power as possible to extend its battery life in the field.

[113] Figures 3A and 3B show external views of camera system 110 according to some embodiments. Camera system 110 may comprise an enclosed housing 305. Enclosed housing 305 comprises a PCB housing 304, a power source housing 302, a housing top seal 309, and a bottom part 310. Enclosed housing 305 may comprise coupling part 303, which can also be seen in further detail in Figure 8. Enclosed housing 305 may also be referred to as sealed housing 305.

[114] Power source housing 302 may comprise a cylindrically shaped wall forming a hollowed structure, and a power source receiving chamber. The cylindrically shaped wall of power source housing 302 may comprise multiple curved or flat panels. Power source housing 302 may have two open ends, which may be attachable and detachable to other components of camera system 110.

[115] PCB housing 304 may comprise a cylindrically shaped wall forming a hollowed structure, and a PCB receiving chamber. The cylindrically shaped wall of PCB housing 304 may comprise multiple curved or flat panels. PCB housing 304 may have two open ends, which may be attachable and detachable to other components of camera system 110.

[116] Power source housing 302 is coupled to and detachable from the PCB housing 304 at respective open ends of the power source housing 302 and PCB housing 304. Power source housing 302 and PCB housing 304 may be attached to the coupling part 303, thereby assisting the coupling and detaching of the power source housing 302 and PCB housing 304 from each other.

[117] The other open end of PCB housing 304 may be sealed via housing top seal 309. Housing top seal 309 may be cylindrically, annular, and/or circular shaped according to some embodiments.

[118] When PCB housing 304, power source housing 302, coupling part 303, and/or bottom part 310, are attached together as shown in Figures 3A, 3B, 13A, and 13B, they may form an electrical housing. In some other embodiments, the electrical housing is defined by the structure formed by the coupling of power source housing 302 to PCB housing 304. In some embodiments, coupling part 303 is not required to couple power source housing 302 to PCB housing 304.

[119] When PCB housing 304, power source housing 302, coupling part 303, and bottom part 310, are attached together as shown in Figures 3A, 3B, 13A, and 13B, their respective outer edges/walls may align to form a wall of the enclosed housing 305.

[120] The enclosed housing 305 may be at least partially in the shape of a pole, such as a pole complying with standard dimensions of the location, to make the camera system appear similar to other poles for signage, for example. In particular, PCB housing 304, power source housing 302, coupling part 303, and bottom part 310 may be in the shape of a pole when attached together.

[121] The enclosed housing 305 may be at least partially narrower than a hollow pole complying with standard dimensions of the location, to allow the housing to be at least partially inserted into the hollow pole.

[122] In particular, PCB housing 304, power source housing 302, coupling part 303, and bottom part 310, when they are attached together as shown in Figures 3A, 3B, 13A, and 13B, may have a diameter narrower than a hollow pole, to permit PCB housing 304, power source housing 302, coupling part 303, and/or bottom part 310 to be received inside the hollow pole.

[123] Top seal 309 and top part may have a larger diameter than PCB housing 304 and/or power source housing 302. Top seal 309 and top part may have a larger diameter than the hollow pole, so that the top seal 309 and top part may be mounted on the hollow pole.

[124] The other open end of power source housing 302 may be sealed via bottom part 310. Bottom part 310 may be cylindrically, annular, and/or circular shaped according to some embodiments.

[125] Camera system 110 may be deployed so that the bottom part 310, power source housing 302, and/or PCB housing 304 is mounted to a surface or structure at a site, so that the camera system 110 is in an upright orientation, protruding upward to oppose the gravitational force vector. In this orientation, bottom part 310 is the lowermost part of enclosed housing 305, followed by power source housing 302, coupling part 303, PCB housing 304, then top seal 309.

[127] In some other embodiments, PCB housing 304 is configured to instead attach to bottom part 310, and therefore power source housing 302 is configured to instead attach to top seal 309. In this alternative orientation, bottom part 310 is the lowermost part of enclosed housing 305, followed by PCB housing 304, coupling part 303, power source housing 302, then top seal 309.

[128] Camera system 110 also comprises the one or more cameras 260, which may be coupled to the housing top seal 309, as shown in Figure 3B. Camera system 110 may comprise one, two, three, four, five, or six cameras 260, for example. As shown in Figures 3B, 3C, and 3E, camera system 110 may comprise two cameras 260.

[129] The PCB housing 304, power source housing 302, coupling part 303, and bottom part 310, when they are attached together as shown in Figures 3A, 3B, 13A, and 13B, may form the shape of a pole and have a diameter narrower than a pole, to permit those components of camera system 110 to fit inside the pole. In contrast, top seal 309 may have a diameter larger than PCB housing 304, power source housing 302, coupling part 303, and bottom part 310. As top seal 309 has a larger diameter, this permits top seal 309 to seal over the pole when the PCB housing 304, power source housing 302, coupling part 303, and bottom part 310 are inserted inside the pole, and allows PCB housing 304, power source housing 302, coupling part 303, and bottom part 310 to hang within the interior of the pole and remain attached to the other components of camera system 110. This arrangement also permits cameras 260 to have visibility over the edge of the pole.

[130] Camera system 110 may also comprise a camera housing wall 306. Camera housing wall 306 may be a cylindrically shaped wall. Camera housing wall 306 may be a hollow structure with two opened ends. Camera housing wall 306 may be mounted and/or coupled to top seal 309 at one opened end, thereby sealing or partially sealing the opened end. The camera housing wall 306 may surround the one or more cameras 260. Camera housing wall 306 may be transparent or somewhat translucent, to not significantly compromise the quality of the images captured by the one or more cameras 260. For example, the camera housing wall may be formed of a clear acrylic material. In some embodiments, part of camera housing wall 306 may be tinted so as to obscure or conceal the appearance of the one or more cameras 260.

[131] Camera system 110 may also comprise a camera housing seal 318. Camera housing seal 318 may seal or partially seal the other opened end of the camera housing wall 306. When camera housing seal 318 and top seal 309 seals and/or partially seals camera housing wall 306, an inner camera chamber is formed, the inner camera chamber containing the one or more cameras 260. When camera housing seal 318 and top seal 309 seals and/or partially seals camera housing wall 306, this may collectively be referred to as top part 320. Enclosed housing 305 may also comprise top part 320.

[132] Camera system 110 may also comprise an antenna base 322. The antenna base 322 may comprise a circular structure. Antenna base 322 may comprise a printed circuit board. Antenna base 322 may comprise and/or be connected to camera housing seal 318.

[133] Camera system 110 also comprises antenna platform 324. The antenna platform 324 may comprise a circular structure. The antenna platform 324 may have the same or similar diameter as antenna base 322. The antenna platform 324 may be less thick than antenna base 322. The antenna platform 324 may comprise a substrate. The antenna platform 324 may comprise an aluminium substrate. The antenna platform 324 may also be referred to as ground plate 324 or ground plane 324.

[134] Antennas 216 may be mounted on antenna platform 324. Camera system 110 may include two antennas 216. One antenna 216 may be communicatively coupled to communications module 218, and the other antenna 216 may be communicatively coupled to geolocation unit 274. In some other embodiments, one of antennas 216 may be communicatively coupled to processor 214. Antennas 216 may be diametrically opposed to one another upon antenna platform 324.

[135] Each antenna 216 may be an elongated member. Each antenna 216 may be cylindrically formed and have each end attached to antenna platform 324 to form a loop. The loop may be semi-circular shaped, as shown in Figures 3A, 3C, 3D, 3E, and 10. In some other embodiments, the loop is rectangular, triangular, square or omega shaped, for example. Antennas 216 may comprise copper.

[136] Camera system 110 may also include capacitive loaded loops (CLL) 316. CLL 316 may also be referred to as overhead structure 316. CLLs 316 may be elongated structures which connect and/or extend from each end to the antenna platform 324 to form a loop. The CLL 316 loops may be rectangular or square wave shaped. In some other embodiments, the CLLs may be semi-circular, or omega shaped. CLLs 316 may have a substantially thin rectangular cross-section. The CLLs 316 may comprise aluminium. Each CLL 316 may include two s-shaped brackets, which together form the shape of the CLL 316.

[137] Each CLL 316 may have its respective ends connect to and/or extend from points on antenna platform 324 on an axis aligned with the points of connection of the ends of a respective antenna 216. The respective antenna 216 may be referred to as the proximate antenna 216, and the CLL 316 may be referred to as the proximate CLL 316 to the proximate antenna 216. In some embodiments, the respective axes may be parallel to each other. The ends of CLL 316 may be more widely spaced than the ends of its respective antenna 216. The CLL 316 may also be longer than its respective antenna 216 so that the CLL 316 forms an outer loop overhanging its respective antenna 216.

[138] Camera system 110 may also comprise a communications covering 308. Communications covering 308 may be cylindrically shaped with a flat closed circular top and an open end. Communications covering 308 may be coupleable to and detachable from antenna platform 324 at the rim of the open end. In some embodiments, communications covering 308 is attached to antenna platform 324. In some other embodiments, communications covering 308 may not have a flat top, but instead have a somewhat domed and/or rounded shaped top. Communications covering 308 may attach with the antenna platform 324 to form an inner chamber which contains the antennas 216 and the CLLs 316. Communications covering 308 may comprise a non-transparent plastic, for example. Communications covering 308 may be substantially opaque to light transmission but not to radio transmission.

Communications covering 308 may comprise a dark paint, tint, or hue. Communications covering 308 may therefore make it difficult for external observers to see antennas 216, and may reduce the likelihood of tampering and/or vandalism of camera system 110.

[139] Figure 3C shows an internal view of the camera system 110 according to some embodiments.

[140] As shown in Figure 3C, PCBs 203, 206, and 209 may be housed in a PCB receiving chamber within PCB housing 304. Camera system 110 may include one or more PCB structure columns 319. PCB structure columns 319 may be attachable to and detachable from top seal 309 and/or PCB housing 304, and extend within the chamber of PCB housing 304. In some other embodiments, PCB structure columns 319 are fixed or integrally moulded to top seal 309 and/or PCB housing 304. PCBs 203, 206, and 209 may attach to PCB structure columns 319, allowing the PCBs 203, 206, and 209 to hang from top seal 309 within the chamber of PCB housing 304.

[141] As shown in Figure 3C, 4, and 9, PCB 203 may be mounted above PCB 206 and PCB 209; and PCB 206 mounted above PCB 209. In some other embodiments, PCB 209 may be mounted above PCB 203 and/or PCB 206. In some other embodiments, PCB 206 may be mounted above PCB 203. In some other embodiments, the components of PCB 203 may be included on PCB 206, and therefore camera system 110 may not include PCB 203.

[142] As shown in Figure 3C, power source 240 may be housed in a power source receiving chamber within power source housing 302. Camera system 110 may include a power source PCB 342. Camera system 110 may also include a power source clamp 340. Power source PCB 342 and power source clamp 340 may be housed or partially housed in the power source receiving chamber within power source housing 302. As shown in Figures 3C and 10B, power source clamp 340 may be attached to power source PCB 342. In some other embodiments, power source clamp 340 is attached to power source PCB 342, coupling part 303, and/or the chamber wall of power source housing 302. In some embodiments, power source PCB 342 is attachable to or optionally detachable from coupling part 303, and/or the chamber wall of power source housing 302. Power source 240 may be coupled to and optionally decoupled from power source PCB 342 and/or power source clamp 340. Power source clamp 340 may be fixed to flat cross-sectional ends of a power source to secure the power source within the chamber of the power source housing 302. Power source PCB 342 may be communicatively coupled to one or more of PCBs 203, 206, and/or 209 or their respective components, to permit power from power source 240 to be supplied to other components in camera system 110.

[143] Antenna base 322, antenna platform 324, and/or top seal 309 may each comprise one or more conduits for cabling and/or one or more electrical connections between antennas 216 and printed circuit board 209, processor 214, geolocation unit 274, and/or communications unit 218.

[144] As shown in Figure 3C, camera system 110 comprises a programmer port 204. Programmer port 204 may be a magnetic port mounted on the PCB 203, extending upwards to face a conduit of top seal 309, so that when the antenna base 322 and antenna platform 324 are removed, an operator may readily attach an external device to the magnetic programmer port 204. Programmer port 204 is communicatively coupled via debug connector 268 to AI module 250 and/or processor 214. Therefore, external devices may connect to AI module 250 and initiate bootloader mode 1750 as described herein with reference to Figure 17.

[145] As shown in Figures 3B, 3C, and 13B, the power source housing 302 has a similar or the same diameter as the PCB housing 304. PCB housing 304 and power source housing 302 may have a diameter between 40 and 100 millimetres, for example. PCB housing 304 and power source housing 302 may have a diameter of 40 to 70 millimetres, 45 to 75 millimetres, or 50 to 80 millimetres, for example. PCB housing 304 and power source housing 302 may have a diameter of about 50, 55, 60, 65, 70, 75, or 80 millimetres, for example. In some embodiments, as shown in Figures 3B, 3C, and 13B, PCB housing 304 and power source housing 302 have a diameter of about 50 millimetres.

[146] Power source housing 302 may have a length of about 70 to 130 millimetres (mm). In some embodiments, power source housing 302 may have a length of about 70 to 100 mm, 90 to 110 mm, or 100 to 120 mm. Power source housing 302 in Figures 3B, 3C, and 13B has a length of 100 mm.

[147] PCB housing 304 may have a length of about 45 to 70 mm. In some embodiments, PCB housing 304 may have a length of about 50 to 60 mm, 55 to 70 mm, or 60 to 70 mm. PCB housing 304 in Figures 3B, 3C, and 13B has a length of 58 mm.

[148] Bottom part 310 may have a length of about 6 to 13 mm. In the Figures, bottom part 310 has a length of 10 mm. Bottom part 310 may have a diameter slightly larger than power source housing 302, for example, in the Figures shown, bottom part 310 has a diameter of about 52 millimetres. Power source housing 302, PCB housing 304, coupling part 303, bottom part 310, and/or electrical housing are sized to be receivable in a pole having an inner diameter of at least 53 mm. In some embodiments, the pole may have an inner diameter of 53 mm and an outer diameter of 60.5 mm.

[149] As shown in Figures 3B, 3C, and 13B, the camera housing wall 306 may have a slightly larger diameter than the power source housing 302 and/or PCB housing 304. Camera housing wall 306 may have a diameter between 50 and 100 millimetres, for example. Camera housing wall 306 may have a diameter of 50 to 70 millimetres, 55 to 75 millimetres, or 60 to 80 millimetres, for example. Camera housing wall 306 may have a diameter of about 50, 55, 60, 65, 70, 75, or 80 millimetres, for example. In some embodiments, as shown in Figures 3B, 3C, and 13B, camera housing wall 306 has a diameter of about 60 millimetres.

[150] As shown in Figures 3B, 3C, and 13B, the top seal 309 may have a larger diameter than the power source housing 302 and/or PCB housing 304. Top seal 309 may have a diameter between 50 and 120 millimetres, for example. Top seal 309 may have a diameter of 50 to 80 millimetres, 60 to 85 millimetres, or 70 to 100 millimetres, for example. Top seal 309 may have a diameter of about 60, 65, 70, 75, 80, 85, or 90 millimetres, for example. In some embodiments, as shown in Figures 3B, 3C, and 13B, top seal 309 has a diameter of about 70 millimetres.

[151] As shown in Figures 3B, 3C, and 13B, the antenna base 322, antenna platform 324, and communications covering 308 may have a larger diameter than the power source housing 302 and/or PCB housing 304. Antenna base 322, antenna platform 324, and communications covering 308 may have a diameter between 70 and 150 millimetres, for example. Antenna base 322, antenna platform 324, and communications covering 308 may have a diameter of 80 to 120 millimetres, 90 to 130 millimetres, or 100 to 140 millimetres, for example. Antenna base 322, antenna platform 324, and communications covering 308 may have a diameter of about 100, 105, 110, 115, 120, 125, or 130 millimetres, for example. In some embodiments, as shown in Figures 3B, 3C, and 13B, communications covering 308 has a diameter of about 120 millimetres.

[152] The antenna base 322, antenna platform 324, and communications covering 308 may have a combined length of about 30 to 40 millimetres. In some embodiments, the combined length of the antenna base 322, antenna platform 324, and communications covering 308 is about 30, 32, 35, 38, or 40 millimetres. In the Figures 3B, 3C, and 13B, the combined length of the antenna base 322, antenna platform 324, and communications covering 308 is about 35 millimetres.

[153] Figure 3D shows an exploded view of the camera system 110 according to some embodiments. As shown in Figure 3D bottom part 310 may include an annular protrusion which assists securing the bottom part 310 to power source housing 302.

[154] Figure 4 shows a view of the printed circuit boards 203, 206, and 209, top seal 309, and camera mounts 460.

[155] PCBs 203, 206, and 209 may be stacked to form a PCB structure 402. PCBs 203, 206, and/or 209 may include one or more male and female headers 403. PCBs 203, 206, and/or 209 may include one or more column receiving structures 419 for receiving PCB structure columns 319.

[156] Camera system 110 may include one or more camera mounts 460. Each of the one or more cameras 260 may be mounted to a respective camera mount 460. The one or more camera mounts 460 may be attachable to and optionally detachable from top seal 309. Each of the one or more camera mounts 460 may include a mount front piece 462, a mount rear piece 463, one or more mount springs 468, and a mount fitting piece 464, which are described in further detail with reference to Figures 11A, 11B, and 11C.

[157] Top seal 309 may include a top seal threaded protrusion 409 to better engage and/or secure top seal 309 to a threading of PCB housing 304 and/or power source housing 302.

[158] Figure 5 shows views of a stack of PCBs 203, 206, and 209 according to some embodiments. PCB 203 is, in use in a vertical orientation of the camera system 110, positioned as an upper PCB. PCB 206 is, in use in a vertical orientation of the camera system 110, positioned as a middle PCB. PCB 209 is, in use in a vertical orientation of the camera system 110, positioned as a lower PCB. As shown in Figure 5, PCB 203 includes debug connector 268, and one or more camera connectors 265. PCB 203, 206 and/or 209 may also include one or more PCB ridges 512 at the edge of the PCB. Each of the one or more PCB ridges 512 may include a respective ridge hole 514 for receiving a respective PCB structure column 319. PCB 203, 206, and/or 209 may include one or more edge recesses 518 to permit passage of an antenna cable from lower PCB 209 up to the antenna. PCBs 203, 206, and/or 209 may include one or more column receiving recesses for receiving one or more PCB structure columns 319.

[159] Figures 6A, 7A, and 8A show top views of PCBs 209, 206, and 203 respectively. The top side of PCB 209 may mount GNSS unit 274, processor 214, and a processor case 614, which may be in the form of a metal shield, for housing processor 214. The top side of PCB 206 may mount non-volatile memory 256, volatile memory 252, and/or expandable memory 254. The top side of PCB 203 may mount two camera connectors 265, and a debug connector 268.

[160] Figures 6B, 7B, and 8B show underside views of PCBs 209, 206, and 203 respectively. The underside of PCB 206 may mount the AI module 250.

[161] Figures 9A and 9B show external and internal views respectively of PCBs 203, 206, and 209 mounted by PCB structure columns 319 within PCB housing 304. PCBs 203, 206, and 209 may be mounted in PCB housing 304 to form respective sections within the PCB housing 304. Figures 9A and 9B also show upper sub-housing recess 903 located near the top of the PCB housing 304 for locating screws to fix with other modules, and lower sub-housing recess 906 located near the bottom of the PCB housing 304 for locating screws to fix with other modules.

[162] Figure 10A shows a top view of power source 240 secured by power source clamp 340 within power source housing 302.

[163] Figure 10B shows an internal side view of power source 240 secured by power source clamp 340 within power source housing 302. Coupling part 303 may be generally annular shaped, partially sealing the top of the power source housing 302. Bottom part 310 is shown in Figure 10B to seal the bottom of power source housing 302. Bottom part 310 may comprise a bottom part recess 1002. Figure 10B also shows power source clamp 340 attached to power source PCB 342. Power source PCB 342 is attached to the power source receiving chamber wall of power source housing 302 by fasteners, such as screws (not shown). In some other embodiments, a flat mounting structure 343 is attached to the power source receiving chamber wall of power source housing 302 (for example, by screws received through the chamber wall), and power source PCB 342 is attached to the flat mounting structure 343, for example by fasteners, such as screws.

[164] Figures 11A, 11B, and 11C shows views of two camera mounts 460 attached to top seal 309. Each camera mount 460 may include a platform attachment 1160 for securing the camera mount to the top seal 309. The one or more mount springs 468 may attach from mount rear piece 463 to platform attachment 1160 to better secure components of camera mount 460 to the top seal 309.

[165] Figures 11A and 11B show a view of the top seal’s 309 platform surface 1109. Platform surface 1109 may be annular shaped and also includes a circular recess and a plurality of screw holes. As shown in Figure 11C, an integration ring 1119 may be fitted over the platform surface 1109. The integration ring 1119 may include a circular protrusion to be received in the circular recess of the platform surface, and also may include a respective plurality of recesses to fix screws through both the platform surface 1109 and the integration ring 1119.

[166] Figure 12 shows two antennas 216 (indicated by 216A and 216B) and respective CLLs 316 (indicated by 316A and 316B) mounted upon antenna platform 324. Each CLL 316 may comprise an antenna strip 1216 (indicated by 1216A and 1216B for the two antennas shown) disposed in the middle of its respective CLL 316. The antenna strip 1216 may be a dielectric element, for example. The antenna strip may be a low-loss dielectric, such as a Rogers RO4000 hydrocarbon ceramic, for example. Antenna strip 1216A may be wider or otherwise sized differently than antenna strip 1216B.

[167] Each CLL 316 is driven by the proximate antenna 216 which is fed through the ground plane. Camera system 110 may comprise one or more antenna systems 1232. An antenna system 1232 may comprise an antenna 216 and a proximate CLL 316. Antenna system 1232 may also comprise the antenna strip 1216 of the respective CLL 316 in antenna system 1232. Antenna system 1232 may also comprise a driven feed through ground plane. As shown in Figure 12, camera system 110 may comprise two antenna systems 1232, however camera system 110 is not limited thereto. In some other embodiments, camera system 110 comprises zero, one, two, three, or four antenna systems 1232. Two antenna systems 1232A and 1232B are shown in Figure 12. According to some embodiments, the two antenna systems 1232A and 1232B may be distanced about 60 to 100 millimetres from each other. According to some embodiments, the two antenna systems 1232A and 1232B may be distanced about 60 to 75 millimetres, 70 to 80 millimetres, or 75 to 90 millimetres from each other.

[168] Antennas 216 may have a low profile. The antennas 216 may be electrically small. The antennas 216 may be coax-fed. The antennas 216 may be three-dimensional (3D) magnetic EZ antennas. The antennas 216 may match or nearly match a respective 50 ohm source.

[169] Antenna 216A of antenna system 1232A may be configured to operate between 850 MHz and 950 MHz. In some embodiments the antenna 216A is configured to operate between 860 to 875 MHz, or between 900 to 930 MHz, for example. In some embodiments, the antenna 216A is configured to operate at about 915 to 928 MHz. In some other embodiments, the antenna 216A is configured to operate at about 2.4 GHz. Antenna 216A may be communicatively coupled to communications unit 218. Antenna 216A may be referred to as communications antenna 216A. Antenna system 1232A may be referred to as communications antenna system 1232A. Antenna 216A may be proximate to CLL 316A. Antenna strip 1216A may be referred to as communications antenna strip 1216A. Antenna strip 1216B may be referred to as GNSS antenna strip 1216B. Communications antenna 216A and communications antenna system 1232A may be configured for wireless communications appropriate for LPWAN communications, such as LoRaWAN or SigFox. Communications antenna 216A and communications antenna system 1232A may be configured for LoRa™ communications. GNSS antenna 216B and GNSS antenna system 1232B may be configured for wireless communications appropriate for GNSS communications, such as signals for GPS, Galileo, or BeiDou.

[170] The antenna 216B of antenna system 1232B may be configured to operate at about 1500 to 1600 MHz, for example. The antenna 216B may be configured to operate at about 1555 to 1575 MHz, or 1560 to 1580 MHz, for example. The antenna 216B may be communicatively coupled to GNSS unit 274. Antenna 216B may be referred to as GNSS antenna 216B. Antenna system 1232B may be referred to as GNSS antenna system 1232B. Antenna 216B may be proximate to CLL 316B. GNSS antenna 216B and GNSS antenna system 1232B may be configured for receiving wireless GNSS signals, such as GPS signals.
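By way of illustration only, the example operating bands described above for the two antenna systems can be summarised in a short sketch. The dictionary keys, the helper function, and the specific band endpoints chosen are hypothetical illustrations drawn from the example ranges in the text, not part of the disclosure.

```python
# Illustrative only: example operating bands for the two antenna systems,
# taken from the example ranges described in the text (915-928 MHz for the
# LPWAN communications antenna, 1555-1575 MHz for the GNSS antenna).
ANTENNA_BANDS_MHZ = {
    "communications_antenna_216A": (915, 928),   # LPWAN, e.g. LoRaWAN
    "gnss_antenna_216B": (1555, 1575),           # GNSS, e.g. GPS L1
}

def antenna_for_frequency(freq_mhz):
    """Return the antenna whose example band covers freq_mhz, if any."""
    for name, (lo, hi) in ANTENNA_BANDS_MHZ.items():
        if lo <= freq_mhz <= hi:
            return name
    return None
```

A caller can use this mapping to check which antenna system an example carrier frequency would fall within, e.g. `antenna_for_frequency(920)` selects the communications antenna band.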

[171] Each CLL 316 may be a parasitic element located in the near field of its proximate antenna 216. Each CLL 316 facilitates its respective antenna system 1232 being resonant. Each CLL 316 helps simplify the antenna system 1232 so that it does not rely on an external matching network for reactive and resistive matching to a source. Each CLL 316 may enhance the overall radiation efficiency of its respective antenna system. Antenna systems 1232 employing electrically small antennas 216 and CLLs 316 may be compact in size, easy to manufacture, inexpensive to manufacture, possess high efficiency, and may also be linearly scalable for a wide range of frequencies at least between the low VHF band through to X-band. However, such antenna systems 1232 may have narrow bandwidth.

[172] According to some embodiments, CLLs 316 may have a width of about 8 to 30 mm. CLLs 316 may have a width of about 10 to 18 mm, 14 to 20 mm, or 15 to 25 mm for example. In some embodiments CLLs 316 have a width of about 8, 10, 14, 16, 18, 20, 24, or 28 mm, for example. In some embodiments, CLL 316A has a width of about 37 mm. In some embodiments, CLL 316B has a width of about 14.7 mm.

[173] According to some embodiments, CLLs 316 may have a length of about 25 to 60 mm. CLLs 316 may have a length of about 30 to 40 mm, 35 to 45 mm, or 40 to 55 mm for example. In some embodiments CLLs 316 have a length of about 25, 30, 34, 38, 42, 46, 50, or 54 mm, for example. In some embodiments, CLL 316A has a length of about 18 mm. In some embodiments, CLL 316B has a length of about 44 mm.

[174] According to some embodiments, from an overhead perspective, each bracket of CLLs 316 may have a length of about 18 to 28 mm. From an overhead perspective, each bracket of CLLs 316 may have a length of about 18 to 22 mm, 20 to 25 mm, or 23 to 28 mm for example. In some embodiments, from an overhead perspective, each bracket of CLLs 316 have a length of about 18, 20, 21, 22, 23, 24, 26, or 28 mm, for example. In some embodiments, from an overhead perspective, each bracket of CLL 316A has a length of about 21 mm. In some embodiments, from an overhead perspective, each bracket of CLL 316B has a length of about 23.4 mm.

[175] The aluminium of CLLs 316 may have a thickness of about 0.5 to 1.4 mm. The aluminium of CLLs 316 may have a thickness of about 0.6 to 0.9 mm, 0.7 to 1.1 mm, or 0.8 to 1.3 mm for example. In some embodiments, the aluminium of CLL 316A has a thickness of about 0.8 mm. In some embodiments, the aluminium of CLL 316B has a thickness of about 0.8 mm.

[176] The antenna 216 may have a diameter of about 1.8 to 3.2 mm. The antenna 216 may have a diameter of about 1.8 to 2.4 mm, 2.1 to 2.6 mm, or 2.3 to 2.9 mm for example. In some embodiments, the antenna 216A has a diameter of about 2.55 mm. In some embodiments, the antenna 216B has a diameter of about 2.55 mm.

[177] According to some embodiments, antenna 216 may have a length from each of its ends of about 15 to 35 mm. Antenna 216 may have a length from each of its ends of about 18 to 22 mm, 20 to 28 mm, or 24 to 32 mm for example. In some embodiments, antenna 216 may have a length from each of its ends of about 15, 18, 22, 24, 26, 28, 30, or 32 mm, for example. In some embodiments, antenna 216A may have a length from each of its ends of about 26.36 mm. In some embodiments, antenna 216B may have a length from each of its ends of about 20.36 mm.

[178] According to some embodiments, antenna 216 may have a radius from the inside of its curved length to a centre point between antenna 216’s ends of about 6 to 10 mm. Antenna 216 may have a radius from the inside of its curved length to a centre point between antenna 216’s ends of about 6 to 8 mm, 7 to 9 mm, or 8 to 10 mm for example. In some embodiments, antenna 216 may have a radius from the inside of its curved length to a centre point between antenna 216’s ends of about 6, 6.5, 7, 7.5, 8, 8.5, 9, or 9.5 mm, for example. In some embodiments, antenna 216A may have a radius from the inside of its curved length to a centre point between antenna 216A’s ends of about 8.12 mm. In some embodiments, antenna 216B may have a radius from the inside of its curved length to a centre point between antenna 216B’s ends of about 7.62 mm.

[179] According to some embodiments, antenna 216 may further extend from one end into ground plate 324, to connect to the feed and/or wires communicatively coupled to GNSS module 274 and/or communications module 218. The further extension may comprise a larger diameter than the diameter of antenna 216.

[180] Figures 13A and 13B show external and internal views respectively of camera system 110 according to some other embodiments.

[181] As shown in Figures 13A and 13B, camera system 110 includes a first power source housing 302A and a second power source housing 302B. Power source housing 302A is coupled to PCB housing 304, and to power source housing 302B and/or bottom part 310A. Power source housing 302B is coupled to power source housing 302A and/or bottom part 310A and to bottom part 310B. In some embodiments, a bottom part, such as bottom part 310A, is not required to couple two power source housings 302 to each other, as both housings may readily connect and/or engage with each other.

[182] Power source housing 302A may house power source clamp 340A, power source PCB 342A, and/or power source 240A. Power source housing 302B may house power source clamp 340B, power source PCB 342B, and/or power source 240B.

[183] In some other embodiments, camera system 110 is not limited to one or two power source housings 302, power source clamps 340, power source PCBs 342, and/or power sources 240.

[184] In some other embodiments, a further one or more power source housings 302 may be coupled to the bottom of the second power source housing, the further one or more power source housings each including a respective power source receiving chamber, the respective power source receiving chambers each containing a respective power source 240, the respective power sources 240 configured to supply power to components of the smart camera system.

[185] Therefore, in some other embodiments, camera system 110 may have zero, three, four, five, six, seven, eight, nine, ten, eleven, or twelve power source housings 302, power source clamps 340, power source PCBs 342, and/or power sources 240, for example. For example, camera system 110 may have six power sources 240; each of the six power sources 240 may be a single cell and have a respective power source housing 302, power source clamp 340, and power source PCB 342. The power sources 240 may be configured in three parallel-connected groups of two cells in series. Each respective power source PCB 342 may be coloured to match another power source PCB 342 if the power sources are connected in series, and differentiated in colour from other power source PCBs 342 whose power sources 240 are connected in parallel.
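The example six-cell arrangement above (three parallel-connected groups of two series cells, i.e. a 2s3p pack) can be illustrated with a short sketch. The per-cell nominal voltage and capacity values below are assumptions for illustration only; the disclosure does not specify cell chemistry or ratings.

```python
# Illustrative sketch only: nominal characteristics of a series/parallel
# cell pack. The per-cell values are assumed, not taken from the disclosure.
CELL_NOMINAL_VOLTAGE_V = 3.6   # assumed per-cell nominal voltage
CELL_CAPACITY_AH = 3.5         # assumed per-cell capacity

def pack_characteristics(series, parallel):
    """Nominal voltage and capacity of a (series x parallel) cell pack.

    Series cells add voltage; parallel groups add capacity.
    """
    voltage = series * CELL_NOMINAL_VOLTAGE_V
    capacity = parallel * CELL_CAPACITY_AH
    return voltage, capacity

# The example 2s3p arrangement: two cells in series, three groups in parallel.
voltage_v, capacity_ah = pack_characteristics(series=2, parallel=3)
```

With the assumed cell values, the example pack would have twice the cell voltage and three times the cell capacity.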

[186] Figure 14 shows an external view of camera system 110 according to some other embodiments. According to some other embodiments, camera system 110 does not comprise antenna systems 1232, but instead comprises communications antenna 1417 and GNSS antenna 1416. Communications antenna 1417 and GNSS antenna 1416 may each comprise a whip antenna, patch antenna, dipole antenna, and/or other antenna device.

[187] Figure 15 shows components of camera system 110 interfacing with processor 214. According to some embodiments, as shown in Figure 15, communications unit 218 and GNSS unit 274 are communicatively coupled to processor 214. Also as shown in Figure 15, power supply 240 may power processor 214. Camera system 110 may comprise a debugging unit 1510, which may be communicatively coupled with processor 214 as shown in Figure 15 for allowing debugging of functions of processor 214. Camera system 110 may include processor interfaces 1520. Processor interfaces 1520 may be communication components which facilitate communications between processor 214 and AI module 250.

[188] Figure 16 shows a low-power operation method 1600 of camera system 110 according to some embodiments. Figure 17 shows a flowchart of a method of AI-based image processing according to some embodiments. Figure 19 shows a flowchart of a method of alert determination according to some embodiments.

[189] According to some embodiments, method 1600 begins with processor 214 executing OS code 222, then executing wake-up module code 226 and automatically initialising GNSS unit 274 at step 1610. After step 1610, the GNSS unit 274 may receive GNSS coordinates and/or a timestamp from one or more satellites 270 at step 1615. The timestamp may be UTC time or another time standard such as GPS time or TAI, for example. GNSS unit 274 may communicate the GNSS coordinates and/or timestamp to processor 214. Processor 214 and/or GNSS unit 274 may then store the GNSS coordinates and/or timestamp in buffer 234 in volatile memory 230 at step 1615.

[190] Processor 214 executing wake-up module code 226 may then disable the GNSS module 274 at step 1620.

[191] Processor 214 executing wake-up module code 226 may then determine if the timestamp acquired at step 1615 is within an operation time range stored in volatile or non-volatile memory, at step 1625.

[192] When processor 214 determines the timestamp acquired at step 1615 is not within an operation time range at step 1625, processor 214 may then operate in a low-power mode until commencement of the operation time range at step 1630. When the commencement of the operation time range occurs, processor 214 executing wake-up module code 226 may then initialise GNSS unit 274 at step 1610.

[193] In some other embodiments, when processor 214 determines the timestamp acquired at step 1615 is not within an operation time range at step 1625, processor 214 operates in a low-power mode until a predetermined period of time has elapsed before commencing step 1610. The predetermined period of time may be about 5 minutes, 15 minutes, 30 minutes, an hour, two hours, four hours, or eight hours, for example.

Therefore, the predetermined times for waking up may occur at predetermined times of day and/or within a predetermined time window, such as at set times during normal working hours (where hazards may be more likely to be present), for example. The frequency of the wake-up times may be between 1 and 12 times per hour, 1 and 6 times per hour, or once every 1 to 8 hours, for example. Therefore, in one example, wake-up times may be set to occur multiple times per hour, optionally during a fixed time window of say 8 to 16 hours duration, or alternatively without reference to a time window.

[194] When processor 214 determines the timestamp acquired at step 1615 is within an operation time range at step 1625, processor 214 executing wake-up module code 226 may automatically power on AI module 250, at step 1635.

[195] As shown in Figure 16, once powered, AI module 250 may then automatically trigger one or more cameras to take images, receive the captured images, and then perform processing on the captured images at step 1640.
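The wake-up decision of method 1600 (steps 1610 to 1635) can be sketched as follows. The operation time range, function names, and return values are hypothetical illustrations, not part of the disclosure; on the device, the timestamp would be acquired via GNSS unit 274 rather than passed in.

```python
# A minimal sketch of the decision made after each wake-up in method 1600.
# The operation time range and all names here are assumed for illustration.
from datetime import datetime, time

OPERATION_START = time(8, 0)    # example operation time range start
OPERATION_END = time(16, 0)     # example operation time range end

def within_operation_range(timestamp):
    """Step 1625: check whether the GNSS timestamp falls within the range."""
    return OPERATION_START <= timestamp.time() <= OPERATION_END

def wake_cycle(timestamp):
    """Sketch of steps 1625 to 1635: decide the next action after wake-up.

    Steps 1615/1620 (acquiring the timestamp via GNSS, then disabling the
    GNSS unit) are assumed to have already run.
    """
    if within_operation_range(timestamp):
        return "power_on_ai_module"   # step 1635
    return "low_power_sleep"          # step 1630
```

For example, a timestamp inside the window selects powering on the AI module, while one outside it selects the low-power sleep of step 1630.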

[196] Step 1640 is shown in further detail in Figure 17 according to some embodiments. According to some embodiments, step 1640 commences with AI module 250 determining whether an external device is connected to programmer port 204, at step 1710.

[197] If an external device is determined to be connected to programmer port 204 at step 1710, AI module 250 will then operate in a bootloader mode at step 1750.

[198] AI module 250 operating in bootloader mode will then determine whether a camera application starts at a particular address location on AI module 250 at step 1755.

[199] If the application is determined to start at the particular address location, the AI module 250 will then initialise the one or more cameras 260 to capture a stream of images at step 1765, to perform camera viewing. The AI module 250 may then automatically send the stream of images to the external device connected to programmer port 204 for user viewing. The images may then be received by the external computing device and automatically viewed in an application running on the external device. Therefore, the operator of the external computing device may perform real time viewing of the stream of images and make live adjustments to the operational parameters, position or view angle of one or more cameras 260. Adjusting the one or more cameras 260 may comprise adjusting the focus, position, and/or orientation of the one or more cameras 260. In some other embodiments, cameras 260 may only take one or more images and then send the one or more images automatically or after a delay to the external device. In some embodiments, the external device may be client computing device 160, or a different device.

[200] If the application is determined to not start in the particular address location, the Al module 250 will operate in firmware update mode, and permit the external device to initiate and execute firmware updates on Al module 250.

[201] If an external device is determined to not be connected to programmer port 204 at step 1710, Al module 250 will then activate one or more cameras 260 to capture one or more images at step 1715. There may be one, two, three, four, five, six, seven, eight, nine, ten, or eleven images captured at step 1715 by each of the one or more cameras 260, for example. In some other embodiments, a video stream of images is captured over a few seconds or part of a second. In some embodiments, there is a predetermined delay or delays (in the order of one or two seconds, or fractions of a second) between images or groups of images captured by the one or more cameras 260. In some embodiments when there is a plurality of cameras 260, each of the cameras 260 takes images at different timings to the other cameras 260. In some other embodiments, when there is a plurality of cameras 260, each of the cameras 260 takes images at the same or nearly the same timings as the other cameras 260.

[202] After one or more images are captured at step 1715, the Al module 250 will then calibrate the images for processing at step 1720. Calibration of images for processing at step 1720 may include performing brightness control, linearisation (i.e. flattening) and/or contrast stabilising on the captured one or more images, for example. In some other embodiments, calibration of images at step 1720 is not performed within step 1640.

[203] After the one or more images have been captured and optionally calibrated, Al module 250 may then perform object recognition on the captured and optionally calibrated images using Al model 251 at step 1725. Object recognition may be performed using a trained Al model 251, such as a K-nearest neighbour classifier, a decision tree, a support vector machine, an artificial neural network and/or a convolutional neural network, or multiple such machines or networks, for example. The Al model 251 may be trained with known machine learning techniques. The Al model 251 may be loaded by Al module 250 from non-volatile memory 256 into volatile memory 252 when or before performing step 1725, or Al model 251 may be stored in a local non-volatile memory of Al module 250.

[204] The Al model 251 may be used to determine whether a work machine is detected (within a predetermined confidence level) in each of the one or more images. In some embodiments, the object being determined or identified in a captured image by the Al module 250 using the Al model 251 may include livestock, a person, a bike, a car, a clay delver, a truck, a cable plough, a tractor, an excavator, a Bobcat, a Ditch Witch, a horizontal drill, a boring rig, an auger, a bulldozer and/or a post driver, for example. These are examples of multiple different object types identifiable by the Al model 251. In some embodiments, the Al model 251 may be applied to detect one, a subset, or all of the object types listed or other object types.

[205] Upon performing step 1725, the Al model 251 may produce a result for the processed one or more images which indicates a probability (likelihood) that a predetermined object has been detected and, optionally, what the predetermined object is determined as likely to be.

[206] After performing object recognition at step 1725, the Al module 250 may optionally receive one or more further captured images from the cameras 260 at step 1715 (or alternatively retrieve the further captured images from temporary storage in volatile memory) and subsequently perform steps 1720 and 1725 again, as shown in Figure 17.

[207] In some embodiments, Al module 250 then transmits the result(s) to processor 214 at step 1730. In some embodiments, the transmitted result data indicating what the predetermined object is (or is likely to be) may be an indicator variable, an integer, or a string, for example, while the probability may be represented by an integer, percentage and/or decimal number.

[208] Therefore, the one or more images processed by Al module 250 in method 1600 are not transmitted externally of the camera system 110, such as to gateway 120 and/or server system 150. Abstaining from external transmission of images can be advantageous for power saving purposes and/or information security purposes, for example.

[209] Processor 214 may then receive the object determination result(s) from Al module 250 and store the result(s) as object determination data in buffer 236.

[210] Al module 250 then determines whether the expandable memory 254 is full at step 1735.

[211] If expandable memory 254 is determined to not be full at step 1735, Al module 250 stores images in expandable memory 254 at step 1745.

[212] If expandable memory 254 is determined to be full at step 1735, Al module 250 deletes the oldest image(s) at step 1740 before then storing the images in expandable memory 254 at step 1745, as shown in Figure 17.

[213] Step 1745 concludes method 1600 step 1640, as shown in Figure 17.
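The eviction behaviour of steps 1735 to 1745, in which the oldest image(s) are deleted when expandable memory 254 is full, may be sketched as a bounded first-in-first-out store; the class name and capacity value below are illustrative assumptions:

```python
from collections import deque

class ImageStore:
    """Sketch of steps 1735-1745: retain the newest images, evicting
    the oldest when capacity (standing in for a full expandable
    memory 254) is reached."""
    def __init__(self, capacity: int):
        self.capacity = capacity
        self.images = deque()

    def store(self, image):
        if len(self.images) >= self.capacity:  # memory "full" (step 1735)
            self.images.popleft()              # delete oldest image (step 1740)
        self.images.append(image)              # store new image (step 1745)

store = ImageStore(capacity=3)
for name in ["img1", "img2", "img3", "img4"]:
    store.store(name)
print(list(store.images))  # ['img2', 'img3', 'img4']
```

On a real device the store would wrap filesystem operations on the expandable memory rather than an in-memory queue.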

[214] After results have been received by processor 214 at step 1730 and/or images have been stored in expandable memory at step 1745, processor 214 executing wake-up module code 226 may then put Al module 250 into a low power mode and/or turn off at step 1645.

[215] After performing step 1645, processor 214 executing wake-up module code 226 may then filter out detection result(s) which have a confidence and/or probability, that a predetermined object is detected, lower than a predetermined threshold. For example, the predetermined threshold may be about 55, 60, 70, 80, 90, 95, or 99 percent confidence, or selected from 90 to 99 percent confidence.

[216] In some embodiments, when the one or more images captured by the one or more cameras 260 include a plurality of images, processor 214 executing wake-up module code 226 at step 1650 may keep the results only if a predetermined number of the results have a probability/confidence determined to be above a predetermined confidence threshold. For example, processor 214 may keep the results if at least two results, from two respective images pertaining to a detected object out of three images captured by the one or more cameras 260, have a determined probability/confidence above the predetermined confidence threshold. The predetermined confidence threshold may be a selected number between 70 and 99 percent, for example. Therefore, in some embodiments, the one or more cameras 260 may take an odd number of images at step 1640, to assist setting of the predetermined number of results threshold, so that a majority of detections in the one or more images is more readily indicated.
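The two-of-three agreement example for the step 1650 filter may be sketched as follows; the function name and the default threshold of 90 percent are illustrative assumptions chosen from the ranges given above:

```python
def passes_detection_filter(confidences, threshold=0.9, min_agreeing=2):
    """Sketch of the step 1650 filter: retain a detection only when at
    least `min_agreeing` of the per-image confidences reach the
    predetermined threshold (defaults are illustrative)."""
    return sum(c >= threshold for c in confidences) >= min_agreeing

# Two of three images agree above 90% confidence: detection retained
print(passes_detection_filter([0.95, 0.92, 0.40]))  # True
# Only one image agrees: detection filtered out
print(passes_detection_filter([0.95, 0.60, 0.40]))  # False
```

Taking an odd number of images, as suggested above, makes a strict majority vote unambiguous.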

[217] After filtering detections with low confidence at step 1650, processor 214 may then read the alert level for the detected object (e.g. as a variable or from a lookup table that maps objects or object types to alert levels) and update the alert level at step 1655. Step 1655 is shown in more detail in Figure 19.

[218] As shown in Figure 19, step 1655 commences with processor 214 executing alarm parameters code 224 to read prior detections in non-volatile memory 220 at step 1910. The prior detections may be prior detections of particular object types.

[219] Processor 214 executing alarm parameters code 224 then determines whether any prior detections are not in the recently received result(s) of the one or more images at step 1915.

[220] When processor 214 executing alarm parameters code 224 determines at step 1915 that any prior detections are not in the current results, processor 214 then deletes from non-volatile memory those prior detections which are not found in the current results, and leaves the prior detections that were found, at step 1920. In some embodiments, processor 214 only deletes prior detections if there is an intersection of the current results data and the prior detection data at step 1920. In some embodiments, processor 214 only deletes each prior detection if that prior detection has been detected on a number of previous consecutive detections. For example, a prior detection may be deleted if it no longer appears in the current results and it had previously been detected in the previous four result(s) received from four respective step 1640 executions. In some embodiments, the number of previous consecutive detections may be represented by an incremented or decremented number for the detected object. The incremented or decremented number is an example of a variable that may be referred to as an alert level for the detected object. For the rest of the disclosure regarding the method steps of 1655 in Figure 19, the alert level is assumed to be an incrementable number.
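The pruning rule of steps 1915 and 1920, combined with the consecutive-detection count described above, may be sketched as follows; the function name, dictionary representation and the streak length of four are illustrative assumptions:

```python
def prune_prior_detections(prior, current, min_streak=4):
    """Sketch of steps 1915-1920: delete a prior detection only when it
    is absent from the current results AND had been seen on at least
    `min_streak` consecutive prior runs. `prior` maps an object type to
    its consecutive-detection count (the "alert level")."""
    return {
        obj: streak
        for obj, streak in prior.items()
        if obj in current or streak < min_streak
    }

prior = {"truck": 4, "person": 2}
current = {"person"}  # the truck is no longer seen in the current results
print(prune_prior_detections(prior, current))  # {'person': 2}
```

The truck entry is deleted because it is absent and had a full streak; the person entry survives because it is still being detected.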

[221] In some embodiments, processor 214 also executes alert module 228 and additionally sends a cancel emergency message via communications unit 218 to gateway 120 and/or server system 150. The cancel emergency message may indicate that a previous object that was detected at a site viewed by the one or more cameras 260 of camera system 110, is no longer present at the site. The cancel emergency message may indicate the message, coordinates, and reporting time 2030. The cancel emergency message may be represented by data containing an integer and/or string.

[222] When processor 214 executing alarm parameters code 224 determines that all prior detections are in the current results at step 1915, or if processor 214 has performed step 1920, processor 214 then selects a result which has yet to be processed by processor 214 executing the alarm parameters code 224 at step 1930.

[223] Processor 214 executing alarm parameters code 224 may then determine whether the adjusted alert level of the unprocessed result is at or above the maximum alert level at step 1935. The maximum alert level may be a predetermined number such as one, two, three, four, five, six, seven, eight, nine, or ten, for example.

[224] When the alert level of the unprocessed result is determined by processor 214 at step 1935 to not be at or above maximum, processor 214 executing alarm parameters code 224 may then increment the existing alert level of the unprocessed result at step 1940. In some embodiments, there may be a threshold alert level for an emergency set at an integer below the maximum alert level; this may be referred to as the emergency alert level.

[225] After determining the alert level is at or above maximum at step 1935 or after incrementing the alert level at step 1940, processor 214 may then determine if the alert level is below the emergency alert level threshold at step 1945.

[226] In some embodiments, when the alert level of the unprocessed result is determined by processor 214 at step 1945 to be below the emergency level threshold, processor 214 may set a result message to a string or integer that indicates that the unprocessed result is “information” or a “warning” at step 1960.

[227] When the alert level is not below the emergency alert level threshold at step 1945, processor 214 executing alarm parameters code 224 may then determine whether the alert level is at the emergency level threshold at step 1955.

[228] In some embodiments, when the alert level of the unprocessed result is determined by processor 214 at step 1955 to be at the emergency level threshold, processor 214 may set a result message to a string or integer that indicates that the unprocessed result is an “emergency” at step 1965.

[229] When the alert level is not at emergency alert level threshold at step 1955, processor 214 executing alarm parameters code 224 may set a result message to a string or integer that indicates that the unprocessed result is “information” or a “warning”. This may occur when the alert level is at or above maximum.

[230] In some embodiments, when the alert level of the unprocessed result is determined by processor 214 at step 1935 to be at or above maximum, processor 214 may set a result message to a string or integer that indicates that the unprocessed result is “information” or a “warning” without executing steps 1945 or 1955.

[231] After steps 1960 or 1965 are executed, processor 214 may buffer the result, alert level and message at step 1970. The previously unprocessed result may now be considered processed.
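The alert-level flow of Figure 19 described above may be summarised in the following sketch; the maximum and emergency values, function name, and the simplification of steps 1935 to 1965 into one function are illustrative assumptions:

```python
MAX_ALERT = 5
EMERGENCY_LEVEL = 4  # an integer below the maximum, per the description above

def update_alert(level: int):
    """Sketch of the Figure 19 flow: increment the alert level while it
    is below maximum (step 1940), then classify the result message. An
    "emergency" message is produced only on the run where the level
    reaches the emergency threshold (step 1955 branch)."""
    if level < MAX_ALERT:
        level += 1
    if level == EMERGENCY_LEVEL:
        return level, "emergency"
    return level, "warning"

level = 0
history = []
for _ in range(6):
    level, msg = update_alert(level)
    history.append(msg)
print(history)  # ['warning', 'warning', 'warning', 'emergency', 'warning', 'warning']
```

Note how the emergency message fires exactly once as the level crosses the threshold, after which a persisting object only generates information/warning messages.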

[232] After step 1970 is executed, processor 214 may determine if there are any other unprocessed results at step 1925.

[233] When processor 214 determines there is an unprocessed result at step 1925, processor 214 then selects the unprocessed result at step 1930.

[234] When processor 214 determines there is no unprocessed result at step 1925, processor 214 executing alert module 228 then creates one or more communications payloads indicating detection of the hazard and sends a first signal including the one or more communications payloads at step 1660 to the communications unit 218, as shown in Figures 16 and 19. The communications payload may comprise and/or indicate one or more of: detected objects/results, the object type, the detected object’s respective confidence level/probability, GPS coordinates of the camera system 110 and the image capture timestamp buffered at step 1615, alert level and/or result message, for example.
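Purely for illustration, a payload carrying the fields listed above might be assembled as follows; the field names, JSON encoding and example values are assumptions, not the wire format of the described system (a constrained satellite IoT link would more likely use a compact binary encoding):

```python
import json

def build_payload(obj_type, confidence, lat, lon, timestamp, alert_level, message):
    """Illustrative sketch of a step 1660 communications payload
    carrying the detection result fields described above."""
    return json.dumps({
        "object": obj_type,          # detected object type
        "confidence": confidence,    # probability/confidence of detection
        "gps": [lat, lon],           # coordinates of the camera system
        "timestamp": timestamp,      # image capture timestamp (step 1615)
        "alert_level": alert_level,  # incremented alert level
        "message": message,          # result message
    })

payload = build_payload("excavator", 0.97, -34.93, 138.60,
                        "2022-05-16T10:30:00Z", 4, "emergency")
print(payload)
```

The same dictionary could equally be serialised with a schema-based binary format to minimise airtime on the uplink.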

[235] Communications unit 218 may then send a second signal (based on the received first signal) indicating detection of the hazard, including and/or indicating the one or more communications payloads, at step 1660 to gateway 120 and/or server 150.

[236] In some embodiments, as shown in Figure 16, after performing step 1660, processor 214 then operates in a low-power (sleep) mode at 1662 until a predetermined period of time has elapsed and a wake-up alarm triggers at 1665 to cause processor 214 to re-commence method 1600 at step 1610. In the low power (sleep) mode, the processor 214 shuts down all other power consuming components of camera system 110 and transitions its own functions to only minimal functions required to execute the wake-up module 226. Optionally, in the low power (sleep) mode, the processor 214 monitors input from IMU 280 in order to detect movement of the camera system 110. The predetermined period of time of the low power mode may be about 5 minutes, 15 minutes, 30 minutes, an hour, two hours, four hours, or eight hours, for example.

Therefore, the predetermined times for waking up may occur between 1 and 12 times per hour, between 1 and 6 times per hour, or once every 1 to 8 hours, for example.

[237] In alternative embodiments, the processor 214 may be configured to selectively operate in an always-on mode. For example, processor 214 may be configured to selectively operate in an always-on mode for multiple hours or days at a time. In another example, processor 214 may be configured to operate in an always-on mode until the battery level falls below a certain energy storage threshold or the battery becomes too depleted to perform further image capture and processing functions.

[238] Method 1600 was earlier described wherein processor operation in low-power mode occurred for a predetermined period of time. Figure 18 shows a method 1800 for processor 214 executing wake-up module code 226 to wake up from being in a low-power mode, according to some other embodiments.

[239] In some embodiments, method 1800 begins when processor 214 is executing wake-up module code 226 while in a low-power mode, which may also be referred to as “sleep mode”, at step 1810.

[240] Following 1810, a sensor on camera system 110 such as IMU 280, humidity sensor 284, and/or temperature sensor 282 may send a signal to processor 214 indicating a sensor reading above a sensor reading threshold, at step 1820. This signal may indicate the camera system 110 moved, that a part of camera system 110 is getting too hot, or there is excessive humidity in the housing of camera system 110, for example.

[241] Processor 214 executing wake-up module code 226 may then determine if there is a sensor event at step 1820. Processor 214 may compare the sensor signal with a predetermined threshold value in memory in performing step 1820. In some embodiments, if the sensor signal exceeds the predetermined threshold value, processor 214 determines that a sensor event has occurred.
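The sensor event determination at step 1820 may be sketched as a comparison of each reading against its predetermined threshold; the sensor names and the threshold values below are illustrative assumptions:

```python
def sensor_event(readings, thresholds):
    """Sketch of the step 1820 check: a sensor event occurs when any
    reading exceeds its predetermined threshold value."""
    return any(readings[name] > thresholds[name] for name in thresholds)

# Illustrative thresholds for movement (IMU), overheating and humidity
THRESHOLDS = {"imu_g": 1.5, "temp_c": 60.0, "humidity_pct": 80.0}

# Over-temperature reading triggers a sensor event
print(sensor_event({"imu_g": 0.1, "temp_c": 72.0, "humidity_pct": 40.0}, THRESHOLDS))  # True
# All readings nominal: no event, processor stays in low-power mode
print(sensor_event({"imu_g": 0.1, "temp_c": 25.0, "humidity_pct": 40.0}, THRESHOLDS))  # False
```

In practice such comparisons would typically be done by hardware interrupt lines from the sensors themselves, so that the processor need not poll while asleep.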

[242] When processor 214 determines that a sensor event has not occurred, processor 214 will continue operation in low power mode at step 1810.

[243] When processor 214 determines a sensor event has occurred, processor 214 will no longer operate in low-power mode, which is also referred to as “waking up”, at step 1830.

[244] After waking up at step 1830, processor 214 executing wake-up module code 226 may then enable GNSS module 274, to enable GNSS module 274 to receive position and time information from satellite 270, and store the GNSS position and time information acquired from GNSS module 274 in volatile memory at step 1835.

[245] After performing step 1835, processor 214 then may format a communications payload at step 1840. The communications payload at step 1840 may contain the GNSS position coordinates, GNSS time information, and/or an emergency level message or alert level. The communications payload may then be transmitted to gateway 120 via communications module 118.

[246] After sending the communications payload at step 1840, processor 214 may then continue monitoring the sensors and after a predetermined period of time, for example between 15 seconds and five minutes, processor 214 determines if there is still sensor signal(s) exceeding the predetermined threshold.

[247] If processor 214 determines the sensor signal(s) continue to exceed the predetermined threshold at step 1845, processor 214 will then execute step 1840 and/or step 1835 again.

[248] If processor 214 determines the sensor signal(s) do not exceed the predetermined threshold at step 1845, processor 214 will then resume normal operation at step 1850 and return to sleep mode at step 1810.

[249] Figure 20 shows a deployment view of camera system 110 at site 2000, according to some embodiments. In Figure 20, camera system 110 includes two cameras 260 which are oriented to have a centre of line of sight indicated by 2010 and 2020 at site 2000. The two cameras 260 may be oriented to have a line of sight of remote assets, such as along sections of piping, and camera system 110 may be configured to detect hazards, such as heavy vehicles and/or people, for example according to method 1600.

[250] The two cameras 260 may have different fields of view. For example, in some embodiments, camera 260 may have a lens which narrows to a 10 degree field of view to permit camera system 110 to detect objects about 100 to 200 metres away, for example. In some other embodiments, camera 260 may have a 130 degree lens to permit camera system 110 to detect objects about 15 to 30 metres away, for example. In some other embodiments, camera 260 may have a 136 degree lens to permit camera system 110 to detect objects about 10 to 20 metres away, for example. The two cameras 260 may face in different directions. For example, each of cameras 260 may face in generally opposite directions, or directions separated by between about 135 and 180 degrees.

[251] Camera system 110, after performing methods 1600 and/or 1800, may transmit the message data/payload to gateway 120. Gateway 120 may store and then forward the data via a passing satellite 130, which may transmit the payload data to a ground station 140, which in turn transmits the payload data to a server system 150. A user may be able to visualise the object type, message, coordinates, confidence, and reporting time 2030 regarding camera system 110 in near-real time on their client device 160, by connecting client device 160 to server system’s data visualisation module 156 via data connection 155. In some embodiments, a user may be able to visualise the object type, message, coordinates, confidence, and reporting time 2030 regarding camera system 110 from not just the latest payload but also prior payloads, by connecting client device 160 to server system’s data visualisation module 156. The user may also be able to visualise camera system 110 and a representation of site 2000, and visualise the location of camera system 110 within site 2000, on their client device 160, by connecting client device 160 to server system’s data visualisation module 156. The user may also be able to visualise one or more other camera systems 110 of camera system array 115 and their locations within site 2000, on the client device 160, by connecting client device 160 to server system’s data visualisation module 156.

[252] The alert module 154 of the server system 150 may be configured to send a notification to a communicatively coupled external device, such as client device 160, or another device, upon reception of a predetermined number of consecutive signals indicating detection of the hazard.

[253] According to some embodiments, method 1600 may be considered a power efficient implementation. Camera system 110 operating using method 1600 may continue to operate for a year or more when utilising six power sources 240.

[254] It will be appreciated by persons skilled in the art that numerous variations and/or modifications may be made to the above-described embodiments, without departing from the broad general scope of the present disclosure. The present embodiments are, therefore, to be considered in all respects as illustrative and not restrictive.