

Title:
A SYSTEM, APPARATUS AND METHOD SUITABLE FOR INSPECTION OF A STRUCTURE
Document Type and Number:
WIPO Patent Application WO/2022/265490
Kind Code:
A1
Abstract:
There is provided an apparatus (102) suitable for inspection of a structure (101). Specifically, there is provided an apparatus (102) such as a drone which can be capable of flight. Moreover, the apparatus (102) can be suitable for performing flight-based inspection of a structure (101). The apparatus (102) can include at least one arm (202a) (e.g., corresponding to a manipulator arm which can, for example, have three degrees of freedom of movement). The arm (202a) can be capable of carrying one or both of a sensor (202b) and a localization system (202d). The sensor (202b) and/or the localization system (202d) can be configured for performing inspection of the structure (101). Based on the inspection of the structure (101), at least one input signal can be generated and communicated from the apparatus (102). The input signal can be communicated from the apparatus (102) for further processing.

Inventors:
MOHAMED ABDULLAH ZAWAWI BIN (MY)
ABDUL RAHMAN MUHAMMAD ARIF BIN (MY)
Application Number:
PCT/MY2022/050045
Publication Date:
December 22, 2022
Filing Date:
June 13, 2022
Assignee:
PETROLIAM NASIONAL BERHAD PETRONAS (MY)
International Classes:
G01M5/00; B64C39/02; B64D47/00; G01B21/32; G01S17/894
Foreign References:
US20200207488A1 (2020-07-02)
KR102123983B1 (2020-06-18)
KR20200090428A (2020-07-29)
US20200377233A1 (2020-12-03)
US20200174503A1 (2020-06-04)
Attorney, Agent or Firm:
KUA, Han Chun (MY)
Claims:

1. An apparatus (102) which is capable of flight and which is suitable for performing flight-based inspection of a structure (101), the apparatus (102) comprising: at least one arm (202a) capable of carrying at least one of a sensor (202b) and a localization system (202d) for performing inspection of the structure (101), at least one input signal capable of being generated based on inspection of the structure (101), the input signal being communicable from the apparatus (102), wherein at least one input signal is communicable from the apparatus (102) for further processing.

2. The apparatus (102) of claim 1, wherein the arm (202a) corresponds to a manipulator arm having multiple degrees of freedom of movement.

3. The apparatus (102) of claim 2, wherein the apparatus (102) corresponds to a drone (250), and wherein the manipulator arm has three degrees of freedom (3 DoF) of movement.

4. The apparatus (102) of claim 1, wherein inspection performed on the structure (101) by at least one of the sensor (202b) and the localization system (202d) corresponds to contact-based inspection where contact is established with a surface of the structure (101).

5. The apparatus (102) of claim 1, wherein the sensor (202b) is capable of generating at least one sensor signal indicative of data associated with the structure (101), and wherein data associated with the structure (101) includes at least one of: data concerning structural condition of the structure (101) and data related to the physical profile of the structure (101).

6. The apparatus (102) of claim 1, wherein the localization system (202d) includes a light detection and ranging (Lidar) based sensor such as a 3-Dimensional Lidar (3D Lidar), and wherein the 3D Lidar is configurable to generate a 3-Dimensional (3D) map of the structure (101).

7. An apparatus (102) which is capable of flight and which is suitable for performing flight-based inspection of a structure (101), the apparatus (102) comprising: a localization system (202d) for performing inspection of the structure (101), at least one input signal capable of being generated based on inspection of the structure (101), the input signal being communicable from the apparatus (102), wherein at least one input signal is communicable from the apparatus (102) for further processing.

8. The apparatus (102) of claim 7, wherein the localization system (202d) includes a light detection and ranging (Lidar) based sensor such as a 3-Dimensional Lidar (3D Lidar), and wherein the 3D Lidar is configurable to generate a 3-Dimensional (3D) map of the structure (101), the input signal corresponding to the 3D map.

9. The apparatus (102) of claim 7, wherein the localization system (202d) is associable with ARTag (Augmented Reality Tag) to facilitate augmented reality based inspection of the structure (101).

Description:
A SYSTEM, APPARATUS AND METHOD SUITABLE FOR INSPECTION OF A STRUCTURE

Field Of Invention

The present disclosure generally relates to a system suitable for inspecting a structure. The structure can, for example, correspond to a physical asset such as a building or a storage container (e.g., a brick-and-mortar container). Specifically, the present disclosure can generally relate to, for example, performing inspection using a flight-capable apparatus (e.g., a drone). The present disclosure further relates to one or both of an apparatus and an inspection method associable with the system.

Background

Current practice for inspection of a structure (e.g., a physical asset such as a building or a storage container) can involve the use of scaffolding in order to access point(s) of interest and/or hard-to-reach area(s).

For example, scaffolding would need to be erected in order for inspector(s) (e.g., one or more persons carrying one or more instruments/equipment for the purpose of conducting inspection) to access an area of, for example, a building which can be at a height taller than the inspector(s).

Appreciably, such practice could require significant resources (e.g., cost and time to erect scaffolding), and could thus be cost inefficient and/or time inefficient.

Moreover, such practice could potentially be associated with various hazards. For example, undesirable site accidents could be possible (e.g., inspector(s) could fall from the scaffolding).

The present disclosure contemplates that it is generally useful to address (at least partially) one or more of the foregoing issue(s) associated with current practice for inspection of a structure.

Summary of the Invention

In accordance with an aspect of the disclosure, there is provided an apparatus suitable for inspection of a structure. Specifically, there is provided an apparatus (e.g., a drone) which can be capable of flight. Moreover, the apparatus can be suitable for performing flight-based inspection of a structure. The apparatus can include at least one arm (e.g., corresponding to a manipulator arm which can, for example, have three degrees of freedom (3 DoF) of movement). The arm can be capable of carrying one or both of a sensor and a localization system. The sensor and/or the localization system can be configured for performing inspection of the structure. Based on the inspection of the structure, at least one input signal can be generated and communicated from the apparatus. The input signal can be communicated from the apparatus for further processing.

In accordance with another aspect of the disclosure, there is provided an apparatus suitable for inspection of a structure. Specifically, there is an apparatus (e.g., a drone) which can be capable of flight. Moreover, the apparatus can be suitable for performing flight-based inspection of a structure. The apparatus can include a localization system for performing inspection of the structure. At least one input signal can be generated based on inspection of the structure. The input signal can be communicated from the apparatus. Specifically, the input signal can be communicated from the apparatus for further processing.

In accordance with yet another aspect of the disclosure, there is provided a method (e.g., referable to as an inspection method and/or a processing method) suitable for inspection of a structure.

Brief Description of the Drawings

Embodiments of the disclosure are described hereinafter with reference to the following drawings, in which:

Fig. 1 shows a system which can include at least one apparatus and, optionally, at least one device, according to an embodiment of the disclosure;

Fig. 2 shows the apparatus of Fig. 1 in further detail, according to an embodiment of the disclosure;

Fig. 3 shows the device of Fig. 1 in further detail, according to an embodiment of the disclosure; and

Fig. 4 shows a method in association with the system of Fig. 1, according to an embodiment of the disclosure.

Detailed Description

The present disclosure contemplates that, in one embodiment, inspection of a structure (e.g., a physical asset) can be facilitated by manner of a flight-capable apparatus such as, for example, a drone. The drone can, for example, be configured to carry at least one multi-dimensional (e.g., three-dimensional, 3D) localization type system, in accordance with an embodiment of the disclosure. The drone can, for example, be configured to carry at least one manipulator arm (associable with multiple axes of freedom), in accordance with an embodiment of the disclosure. The drone can, for example, be configured to carry both at least one multi-dimensional localization type system and at least one manipulator arm, in accordance with an embodiment of the disclosure.

Generally, the drone can, for example, be controlled (e.g., by a user such as a drone pilot) for the purpose of inspecting a structure, in accordance with an embodiment of the disclosure. Appreciably, one or more hard-to-reach areas/portions of the structure (e.g., at a certain height above ground) can be inspected without the need for, for example, erecting scaffolding to facilitate inspection.

The foregoing will be discussed in further detail with reference to Fig. 1 to Fig. 4 hereinafter.

Referring to Fig. 1, a system 100 is shown, according to an embodiment of the disclosure. The system 100 can be in association with the inspection of a structure 101 such as a physical asset (e.g., a building or a brick-and-mortar container). The system 100 can include one or more apparatuses 102 and, optionally, one or both of at least one device 104 and a communication network 106. The system 100 can, for example, further include at least one control unit 108 as an option, in accordance with an embodiment of the disclosure. The apparatus(es) 102 can be coupled to the device(s) 104. Specifically, the apparatus(es) 102 can, for example, be coupled to the device(s) 104 via the communication network 106. The control unit 108 can be coupled to the apparatus(es) 102. In one embodiment, the apparatus(es) 102 can be coupled to the communication network 106 and the device(s) 104 can be coupled to the communication network 106. Coupling can be by manner of one or both of wired coupling and wireless coupling. The apparatus(es) 102 can, in general, be configured to communicate with the device(s) 104 via the communication network 106, according to an embodiment of the disclosure.

The apparatus(es) 102 can, for example, be flight-capable apparatus(es) which can be controlled. The apparatus(es) 102 can, for example, correspond to autonomous-type machine(s) (e.g., controllable using artificial intelligence and/or by manner of pre-programming, such that human-based control will not be required), in accordance with an embodiment of the disclosure. The apparatus(es) 102 can, for example, correspond to semiautonomous-type machine(s) (e.g., partial control using artificial intelligence and partial human-based control), in accordance with an embodiment of the disclosure. The apparatus(es) 102 can, for example, correspond to nonautonomous-type machine(s) (e.g., full human-based control will be required). An example of an apparatus 102 can be a drone. The apparatus(es) 102 can, for example, be controlled by one or both of at least one user (e.g., at least one drone pilot) and artificial intelligence (AI) for the purpose of inspection. Moreover, the apparatus(es) 102 can be configured to generate and communicate one or more input signals. The input signal(s) can, for example, be communicated to the device(s) 104. Furthermore, the input signal(s) can, for example, be associated with one or more characteristics associable with the structure 101, in accordance with an embodiment of the disclosure. The apparatus(es) 102 will be discussed later in further detail with reference to Fig. 2, in accordance with an embodiment of the disclosure.

The device(s) 104 can, for example, correspond to one or more computers (e.g., laptops, desktop computers and/or electronic mobile devices having computing capabilities, such as Smartphones and electronic tablets). The device(s) 104 can, in one embodiment, include one or more processors (not shown) which can be configured to perform one or more processing tasks. Generally, the device(s) 104 can be configured to receive one or more input signals (i.e., communicated from the apparatus(es) 102) and process the input signal(s) in a manner so as to produce one or more output signals. Moreover, the device(s) 104 can, in one embodiment, be configured to generate and communicate one or more control signals. Specifically, one or more users (e.g., drone pilot(s)) can, using the device(s) 104, generate and communicate one or more control signals. The control signal(s) can be communicated from the device(s) 104 to the apparatus(es) 102 for controlling the apparatus(es) 102. For example, flight path/movement of the apparatus(es) 102 can be controlled via the control signal(s). The device(s) 104 will be discussed later in further detail with reference to Fig. 3, according to an embodiment of the disclosure.

The communication network 106 can, for example, correspond to an Internet communication network. Communication (i.e., between the apparatus(es) 102 and the device(s) 104) via the communication network 106 can be by manner of one or both of wired communication and wireless communication. The control unit 108 can, for example, correspond to a controller apparatus such as a joystick or a remote control usable by one or more users (e.g., drone pilot(s)) for generating and communicating one or more control signals for controlling the apparatus(es) 102. Specifically, one or more users (e.g., drone pilot(s)) can, using the control unit 108, generate and communicate one or more control signals which can be communicated from the control unit 108 to the apparatus(es) 102 for controlling, for example, flight path/movement of the apparatus(es) 102.

In one embodiment, the control signal(s) can primarily be generated and communicated from the device(s) 104 and, secondarily, from the control unit 108. In this regard, the device(s) 104 can be considered to be a primary controller apparatus whereas the control unit 108 can be considered to be a secondary controller apparatus. In another embodiment, the control signal(s) can primarily be generated and communicated from the control unit 108 and, secondarily, from the device(s) 104. In this regard, the control unit 108 can be considered to be a primary controller apparatus whereas the device(s) 104 can be considered to be a secondary controller apparatus.

It is generally contemplated that a secondary controller apparatus may be useful in the event of primary controller apparatus failure. In this regard, the secondary controller apparatus can function as a fail-safe backup controller. Hence, with a fail-safe backup controller, the system 100 can be considered to be more robust. A secondary controller apparatus (e.g., where the control unit 108 is the secondary controller apparatus) can possibly also be useful in reducing the processing burden of the primary controller apparatus (e.g., where the device 104 is the primary controller apparatus), in that the primary controller apparatus (e.g., a device 104) need only be concerned with, for example, generating and communicating one or more control signals in association with the type of input signal(s) to be obtained/communicated from the apparatus(es) 102, whereas the secondary controller apparatus (e.g., the control unit 108) can generate control signals concerned with movement/flight path of the apparatus(es) 102.

In yet another embodiment, the control signal(s) can be generated and communicated by one or both of the device(s) 104 and the control unit 108, with the apparatus(es) 102 being primarily pre-programmed (e.g., pre-programmed flight path and/or type of input signals to be generated and communicated). In this regard, the device(s) 104 and/or the control unit 108 can function as a secondary controller apparatus (e.g., as a fail-safe backup controller).

In yet a further embodiment, the apparatus(es) 102 can be fully pre-programmed (e.g., pre-programmed flight path and/or type of input signals to be generated and communicated), and it is not necessary for one or more control signals to be generated and communicated from one or both of the device(s) 104 and the control unit 108.
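By way of illustration only (the disclosure does not prescribe an implementation), the primary/secondary controller arrangement described above could be sketched as follows; all names (e.g., ControlSignal, select_control) are hypothetical:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class ControlSignal:
    source: str   # e.g., "device_104" or "control_unit_108"
    command: str  # e.g., "hover", "fly_to_waypoint", "extend_arm"

def select_control(primary_ok: bool,
                   primary_cmd: Optional[ControlSignal],
                   secondary_cmd: Optional[ControlSignal]) -> Optional[ControlSignal]:
    """Prefer the primary controller; fall back to the secondary on failure.

    Mirrors the fail-safe backup behaviour contemplated above: the secondary
    controller apparatus only takes over when the primary is unavailable.
    """
    if primary_ok and primary_cmd is not None:
        return primary_cmd
    return secondary_cmd

# Example: the device 104 (primary) fails, so the control unit 108 takes over.
backup = ControlSignal(source="control_unit_108", command="hover")
print(select_control(False, None, backup).command)  # -> "hover"
```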

The aforementioned apparatus(es) 102 will be discussed in further detail with reference to Fig. 2 hereinafter. Referring to Fig. 2, an apparatus 102 is shown in further detail in the context of an example implementation 200, according to an embodiment of the disclosure.

The apparatus 102 can be configured to carry any one of at least one arm 202a, at least one sensor 202b, at least one probe 202c and at least one localization system 202d, or any combination thereof. Specifically, the apparatus 102 can be configured to carry at least one arm 202a, at least one sensor 202b, at least one probe 202c and/or a localization system 202d. In one embodiment, the apparatus 102 can include any one of the arm(s) 202a, the sensor(s) 202b, the probe(s) 202c and the localization system 202d, or any combination thereof. Specifically, the apparatus 102 can include at least one arm 202a, at least one sensor 202b, at least one probe 202c and/or a localization system 202d, according to an embodiment of the disclosure.

In one embodiment, the arm(s) 202a can be coupled to any one of the sensor(s) 202b, the probe(s) 202c and the localization system 202d, or any combination thereof. The sensor(s) 202b can be coupled to any one of the arm(s) 202a, the probe(s) 202c and the localization system 202d, or any combination thereof. The probe(s) 202c can be coupled to any one of the arm(s) 202a, the sensor(s) 202b and the localization system 202d, or any combination thereof. The localization system 202d can be coupled to any one of the arm(s) 202a, the sensor(s) 202b and the probe(s) 202c, or any combination thereof. Coupling can be by manner of one or both of wired coupling and wireless coupling.

An arm 202a can, for example, correspond to a manipulator arm which can be associated with multiple axes of freedom of movement, in accordance with an embodiment of the disclosure. For example, the arm 202a can be associated with three degrees of freedom (3 DoF) of movement. In this regard, the arm 202a can, for example, be referred to as a 3 DoF arm. In this regard, the arm 202a can be a flexible arm (e.g., with at least 3 DoF) capable of carrying one or both of at least one sensor 202b and the localization system 202d, in accordance with an embodiment of the disclosure.
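As a hedged illustration of what 3 DoF can mean in practice (the disclosure does not specify the arm's kinematic structure), a planar arm with three revolute joints can be modelled with simple forward kinematics; the link lengths and function name below are assumptions:

```python
import math

def arm_tip_position(l1: float, l2: float, l3: float,
                     t1: float, t2: float, t3: float) -> tuple:
    """Forward kinematics of a planar 3-DoF arm: (x, y) of the tool tip.

    l1..l3 are link lengths in metres; t1..t3 are joint angles in radians.
    """
    x = l1 * math.cos(t1) + l2 * math.cos(t1 + t2) + l3 * math.cos(t1 + t2 + t3)
    y = l1 * math.sin(t1) + l2 * math.sin(t1 + t2) + l3 * math.sin(t1 + t2 + t3)
    return x, y

# Example: where the carried sensor 202b would sit with all joints at 30 degrees.
a = math.radians(30)
print(arm_tip_position(0.3, 0.2, 0.1, a, a, a))
```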

A sensor 202b can, for example, be a contact-based sensor, in accordance with an embodiment of the disclosure. The sensor 202b can be configured to substantially make/establish contact with one or more surfaces of the structure 101. The sensor 202b can be further configured to obtain one or more readings associated with the contacted surface(s) of the structure 101. It is contemplated that by substantially making/establishing contact, it is not necessary to make full physical contact. The present disclosure contemplates that substantially making/establishing contact can relate to the sensor 202b being brought near enough to the surface(s) (e.g., hovering near a surface, but without making/establishing actual physical contact) in a manner so as to obtain the aforementioned sensor reading(s).

A probe 202c can, for example, correspond to a rigid support structure suitable for carrying one or more items such as, for example, one or both of one or more sensors 202b and the localization system 202d, in accordance with an embodiment of the disclosure. The probe 202c can, for example, be a contact-based probe, in accordance with an embodiment of the disclosure.
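A minimal sketch of the "substantial contact" notion above, assuming the apparatus can estimate its distance to the surface; the threshold value and names are illustrative, not taken from the disclosure:

```python
def reading_is_trustworthy(distance_to_surface_m: float,
                           contact_threshold_m: float = 0.005) -> bool:
    """Accept a sensor 202b reading only when near enough to the surface.

    Treats "substantially making contact" as being within a small threshold
    (here 5 mm, an assumed value) of the surface, with or without touch.
    """
    return 0.0 <= distance_to_surface_m <= contact_threshold_m

print(reading_is_trustworthy(0.003))  # hovering 3 mm away -> True
print(reading_is_trustworthy(0.050))  # 5 cm away -> False
```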

A localization system 202d can, for example, include a light detection and ranging (Lidar) based sensor (e.g., 3-Dimensional Lidar, 3D Lidar), in accordance with an embodiment of the disclosure. In this regard, the localization system 202d can be referable to as a 3D localization system. Optionally, the localization system 202d can, for example, further include one or both of a global positioning system (GPS) and a camera, in accordance with an embodiment of the disclosure.
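The disclosure refers to a 3D map without fixing a representation; one common choice, assumed here purely for illustration, is a voxel occupancy grid built from Lidar returns:

```python
from typing import Iterable, Set, Tuple

Point = Tuple[float, float, float]

def voxelize(points: Iterable[Point],
             voxel_size: float = 0.1) -> Set[Tuple[int, int, int]]:
    """Build a coarse 3D map: the set of voxels containing at least one Lidar return."""
    occupied: Set[Tuple[int, int, int]] = set()
    for x, y, z in points:
        occupied.add((int(x // voxel_size),
                      int(y // voxel_size),
                      int(z // voxel_size)))
    return occupied

# Example: two returns a few centimetres apart fall into the same 10 cm voxel.
print(voxelize([(1.03, 0.55, 2.04), (1.07, 0.52, 2.06)]))
```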

As mentioned earlier, the apparatus 102 can be a flight-capable apparatus.

In the context of the example implementation 200, the apparatus 102 can, for example, be a drone 250, and is referred to hereinafter as the drone 250, in accordance with an embodiment of the disclosure.

The drone 250 will be discussed hereinafter in the context of an example operation, in accordance with an embodiment of the disclosure. In the example operation, flight of the drone 250 can be controlled (e.g., by a drone pilot) in a manner such that the drone 250 can be positioned (e.g., in mid-air) near a desired general position relative to the structure 101. For example, the drone 250 can be flown near to (i.e., suitably positioned relative to) one or more surfaces of the structure 101 desired for inspection. After being suitably positioned, one or both of the sensor(s) 202b and the localization system 202d can be used for inspection (e.g., of one or more surfaces of the structure 101). Inspection via the use of the sensor(s) 202b and/or the localization system 202d can be by manner of contacting the surface(s) of the structure 101. In this regard, inspection via the use of the sensor(s) 202b and/or the localization system 202d can be by manner of, for example, contact-based inspection.

In one embodiment, contact-based inspection can be based only on the sensor(s) 202b. In another embodiment, contact-based inspection can be based only on the localization system 202d. In yet another embodiment, contact-based inspection can be based on a combination of the sensor(s) 202b and the localization system 202d.

The present disclosure contemplates that contact-based inspection can be facilitated by one or both of the arm(s) 202a and the probe(s) 202c (i.e., by the arm(s) 202a and/or the probe(s) 202c). Specifically, in general, the sensor(s) 202b and/or the localization system 202d can be carried by one or both of the arm(s) 202a and the probe(s) 202c. In one embodiment, the sensor(s) 202b can be carried by the arm(s) 202a and the localization system 202d can be carried by the probe(s) 202c. In another embodiment, the localization system 202d can be carried by the arm(s) 202a.

In one embodiment, the probe(s) 202c and the localization system 202d can be omitted. In this regard, the drone 250 can include the arm(s) 202a and the sensor(s) 202b. The arm(s) 202a can be configured to carry (or hold) the sensor(s) 202b. After the drone 250 has been suitably positioned (e.g., hovering in mid-air proximate to the desired surface(s) of the structure 101 for inspection) for contact-based inspection of the structure 101, the arm(s) 202a can be controlled in a manner so as to flexibly move/position the sensor(s) 202b along the surface(s) of the structure 101 to be inspected. As mentioned earlier, the arm 202a can, for example, be a 3 DoF arm which allows for/facilitates flexibility in movement/positioning of the sensor(s) 202b relative to the structure 101.

The arm(s) 202a can, for example, be controlled (e.g., via use of the aforementioned control signal(s)) either by a drone pilot or another (i.e., not the drone pilot). Control of the arm(s) 202a can, for example, be via one or both of the device(s) 104 and the control unit 108 (i.e., the device(s) 104 and/or the control unit 108). The present disclosure contemplates that there can be a dedicated control part (e.g., by a drone pilot) for controlling movement (e.g., flight path/general positioning of the drone 250 relative to the structure 101) of the drone 250 and another (separate) dedicated control part (e.g., by another who is not the drone pilot) for controlling movement (e.g., 3 DoF) of the arm(s) 202a.

When the sensor(s) 202b make contact with the surface(s) of the structure 101, one or more sensor readings associated with the structure 101 can be obtained (e.g., when the sensor(s) 202b bump(s) on a desired surface for inspection). Sensor reading(s) can, for example, be indicative of/relate to information/characteristics of the structure 101 such as thickness and/or profile. In this regard, the aforementioned input signal(s) can, for example, include information corresponding to the sensor reading(s), in accordance with an embodiment of the disclosure.

It is contemplated that by use of the arm(s) 202a, the sensor(s) 202b can be positioned in a manner so as to obtain sensor readings associated with one or more surfaces which may be considered to be inaccessible (i.e., difficult or hard-to-reach areas of the structure 101). Moreover, the use of the arm(s) 202a could possibly facilitate effectiveness/ease in obtaining sensor readings for complex surfaces (e.g., a curved surface and/or an irregular surface), if any, associated with the structure 101. In this manner, effort to align/position the sensor(s) 202b can be reduced (e.g., effort by the pilot to stabilize the drone 250 when obtaining sensor readings and/or time-consuming effort in precision positioning of the drone 250 can be reduced). In this regard, it is appreciable that sensor readings can possibly be obtained more efficiently and/or in a more user-friendly manner. Generally, the arm(s) 202a can potentially facilitate flexibility and/or ease in positioning (e.g., of the drone 250 relative to the structure 101 and/or the sensor(s) 202b relative to, for example, an irregular surface) and/or precision positioning of the sensor(s) 202b. Moreover, simplification of task can, for example, be facilitated in the sense that a drone pilot can concentrate on navigating the drone 250 and another (i.e., not the drone pilot) can concentrate on positioning the sensor(s) 202b, in accordance with an embodiment of the disclosure.
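The thickness reading mentioned above can be illustrated with a pulse-echo model; this is an assumption, since the disclosure does not name the sensing principle, though ultrasonic thickness testing (mentioned later in this disclosure in connection with hand-held devices) works this way:

```python
def wall_thickness_m(round_trip_time_s: float,
                     sound_speed_m_s: float = 5900.0) -> float:
    """Pulse-echo thickness: the pulse crosses the wall twice, so d = v * t / 2.

    5900 m/s is a typical longitudinal sound speed in steel (assumed material).
    """
    return sound_speed_m_s * round_trip_time_s / 2.0

# Example: a ~3.39 microsecond round trip corresponds to ~10 mm of steel.
print(wall_thickness_m(3.39e-6))  # -> ~0.0100 (metres)
```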
In another embodiment, the arm(s) 202a can be omitted. In this regard, the drone 250 can include the sensor(s) 202b, the probe(s) 202c and the localization system 202d. The sensor(s) 202b and/or the localization system 202d can be carried by the probe(s) 202c. As mentioned earlier, the localization system 202d can, for example, include a 3D Lidar. In this regard, the localization system 202d can be referable to as a 3D localization system. In one embodiment, the localization system 202d can be referred to as a Lidar based 3D positioning system which can be configured to determine (e.g., pre-determine) location associable with the aforementioned sensor reading(s). Specifically, the localization system 202d can be configured to obtain a 3D map associated with the structure 101 which can be useful for determining one or more suitable/desirable locations associated with the structure 101 for obtaining one or more sensor reading(s).

For example, the drone 250 can be flown to perform a 3D scan (via the localization system 202d) of the structure 101 to obtain a 3D map of the structure 101. The 3D map can be helpful/useful for determining one or more suitable/desirable locations for obtaining the aforementioned sensor reading(s) via the sensor(s) 202b. Determination of suitable/desirable location(s) can, for example, be by manner of any one of AI-based determination, automatic-based determination (e.g., by manner of a pre-programmed algorithm) and manual-based determination (e.g., by the drone pilot and/or a user of a device 104), or any combination thereof, in accordance with an embodiment of the disclosure. In this regard, the aforementioned input signal(s) can, for example, include information corresponding to the 3D map, in accordance with an embodiment of the disclosure.

It is contemplated that the 3D map can be helpful in properly/more accurately locating one or more surfaces of the structure 101 to be inspected, which can possibly facilitate, for example, improved monitoring of corrosion rate.
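As a hedged sketch of the automatic-based determination mentioned above (a pre-programmed-algorithm flavour; the selection criterion is assumed, not disclosed), suitable inspection locations could be picked from the 3D map as the mapped points nearest to operator-specified regions of interest:

```python
import math
from typing import List, Tuple

Point = Tuple[float, float, float]

def nearest_inspection_point(map_points: List[Point], roi: Point) -> Point:
    """Return the 3D-map point closest to a region of interest (Euclidean distance)."""
    return min(map_points, key=lambda p: math.dist(p, roi))

# Example: pick the mapped surface point nearest a suspected corrosion area.
surface = [(0.0, 0.0, 3.0), (0.5, 0.0, 3.1), (1.0, 0.2, 3.0)]
print(nearest_inspection_point(surface, (0.6, 0.1, 3.0)))  # -> (0.5, 0.0, 3.1)
```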

In yet another embodiment, the probe(s) 202c can be omitted. The drone 250 can include the arm(s) 202a, the sensor(s) 202b and the localization system 202d. In this regard, the relevant portions of the foregoing discussion concerning the arm(s) 202a, the sensor(s) 202b and the localization system 202d analogously apply. Other combinations are also useful. In one example, the drone 250 can include a combination of the arm(s) 202a, sensor(s) 202b, the probe(s) 202c and the localization system 202d. In another example, the sensor(s) 202b can be omitted.

The present disclosure generally contemplates that, in view of the foregoing, the need for inspection of a structure at height using, for example, hand-held measurement devices (e.g., ultrasonic thickness testing type device(s) and/or surface profiling probe(s)), which could require manual testing (e.g., by a human tester/operator) at point(s) of interest (which could be at a height requiring the use of scaffolding), can be substantially eliminated (if not fully eliminated). Appreciably, the effort and cost associated with erecting scaffolding structures could possibly be eliminated (or, at least, substantially eliminated). Moreover, it is appreciable that Health, Safety and Environment (HSE) hazards could potentially be substantially reduced (e.g., a reduction in worksite accidents involving human tester(s)/operator(s)).

In view of the foregoing, it is appreciable that, in accordance with an embodiment of the disclosure, the present disclosure generally contemplates an apparatus 102 which can be capable of flight. Moreover, the apparatus 102 can be suitable for performing flight-based inspection of a structure 101 (e.g., by being flown by, for example, a drone pilot to the structure 101). The apparatus 102 can include/carry one or more arms 202a. The arm(s) 202a can be capable of carrying one or both of one or more sensors 202b and one or more localization systems 202d (i.e., at least one sensor 202b and/or at least one localization system 202d). In this regard, the arm(s) 202a can be configured to carry at least one of the sensor(s) 202b and the localization system(s) 202d. The sensor(s) 202b and/or the localization system(s) 202d can be capable of (i.e., configured for) performing inspection of the structure 101. Based on inspection of the structure 101, one or more input signals can be generated (i.e., by the sensor(s) 202b and/or by the localization system(s) 202d). The input signal(s) can be communicated from the apparatus 102 for further processing (e.g., by the aforementioned device(s) 104). In one embodiment, the arm(s) 202a can correspond to, for example, a manipulator arm having multiple degrees of freedom of movement. For example, the manipulator arm can be associated with three degrees of freedom (3 DoF) of movement.

In one embodiment, inspection performed on the structure 101 by at least one of the sensor(s) 202b and the localization system(s) 202d (i.e., the sensor(s) 202b and/or the localization system(s) 202d) can correspond to contact-based inspection where contact is established with one or more surfaces of the structure 101.

In one embodiment, the sensor(s) 202b can be configured to generate at least one sensor signal (e.g., corresponding to the aforementioned sensor reading(s)) indicative of data associated with the structure 101. Data associated with the structure 101 can, for example, include data concerning structural condition of the structure 101 and/or data related to the physical profile of the structure 101 (i.e., one or both of data concerning structural condition of the structure 101 and data related to the physical profile of the structure 101).

In one embodiment, the localization system 202d can include a light detection and ranging (Lidar) based sensor such as a 3-Dimensional Lidar (3D Lidar). The 3D Lidar can be configured to generate a 3-Dimensional (3D) map of the structure 101.

Moreover, in view of the foregoing, it is appreciable that, in accordance with an embodiment of the disclosure, the present disclosure generally contemplates an apparatus 102 which can be capable of flight. Moreover, the apparatus 102 can be suitable for performing flight-based inspection of a structure 101 (e.g., by being flown by, for example, a drone pilot to the structure 101). The apparatus 102 can include at least one localization system 202d for performing inspection of the structure 101. One or more input signals can be generated based on inspection of the structure 101. The input signal(s) can be communicated from the apparatus 102 for further processing (e.g., by the aforementioned device(s) 104). In one embodiment, the localization system 202d can include a light detection and ranging (Lidar) based sensor such as a 3-Dimensional Lidar (3D Lidar). The 3D Lidar can be configured to generate a 3-Dimensional (3D) map of the structure 101.

Earlier mentioned, the input signal(s) can, in one embodiment, be communicated from the apparatus(es) 102 to the device(s) 104 for further processing, as will be discussed later in further detail with reference to Fig. 3.

The input signal(s) can include one or both of the information relating to the aforementioned sensor reading(s) and information relating to a 3D map associated with the structure 101, in accordance with an embodiment of the disclosure. Specifically, the input signal(s) can, for example, include information relating to the aforementioned sensor reading(s) and/or information relating to a 3D map associated with the structure 101, according to an embodiment of the disclosure.
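The "one or both" composition of the input signal(s) could, purely as an illustrative assumption about the payload (the disclosure does not define a format), be captured as:

```python
from dataclasses import dataclass, field
from typing import List, Optional, Tuple

@dataclass
class InputSignal:
    """Hypothetical payload communicated from the apparatus 102 to a device 104."""
    apparatus_id: str
    sensor_readings: List[float] = field(default_factory=list)     # e.g., thicknesses (m)
    map_points: Optional[List[Tuple[float, float, float]]] = None  # 3D map data, if any

# One or both kinds of information can be present, mirroring the text above.
signal = InputSignal("drone_250", sensor_readings=[0.0101, 0.0099])
print(signal)
```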

Referring to Fig. 3, a block diagram 300 in association with a device 104 is shown, in accordance with an embodiment of the disclosure. Specifically, the block diagram 300 can, for example, be representative of the aforementioned device(s) 104, in accordance with an embodiment of the disclosure.

The block diagram 300 can, for example, include any one of an input portion 302, a processing portion 304 and an output portion 306, or any combination thereof.

In one embodiment, the block diagram 300 can include an input portion 302, a processing portion 304 and an output portion 306.

The input portion 302 can be coupled to the processing portion 304. The processing portion 304 can be coupled to the output portion 306.

In one embodiment, the input portion 302 can correspond to an electronic hardware-based receiver which can be configured to receive the input signal(s) communicated from the apparatus(es) 102. The input signal(s) can be further communicated from the input portion 302 to the processing portion 304.

In one embodiment, the processing portion 304 can be capable of processing the input signal(s). It is contemplated that the processing portion 304 can, for example, possibly correspond to an algorithm. In this regard, the processing portion 304 can be considered to be software-based, in accordance with an embodiment of the disclosure.

In one embodiment, based on the input signal(s), the processing portion 304 can be configured to generate one or more output signals which can correspond to a 3D map associated with the structure 101. In this regard, the output signal(s) can correspond to graphically perceivable output signal(s), in accordance with an embodiment of the disclosure.

In another embodiment, based on the input signal(s), the processing portion 304 can be configured to generate one or more output signals which can, for example, correspond to data concerning structural condition of the structure 101 (e.g., structural health such as corrosion related data of the structure 101) and/or data related to the physical profile of the structure 101 (e.g., thickness of a wall of the structure 101). In yet another embodiment, based on the input signal(s), the processing portion 304 can be configured to generate one or more output signals which can, for example, correspond to both data concerning structural condition of the structure 101 and a 3D map associated with the structure 101.

The output signal(s) can be further communicated from the processing portion 304 to the output portion 306. In one embodiment, the output portion 306 can correspond to an electronic hardware-based transmitter which can be configured to transmit the output signal(s). The output signal(s) can be further communicated from the output portion 306 to, for example, the apparatus(es) 102 and/or another device (not shown).

In another embodiment, the output portion 306 can correspond to a display which can be configured to display the output signal(s) (e.g., display a 3D map associated with the structure 101).

Moreover, the present disclosure contemplates the possibility that the input and output portions 302/306 can be an integrated software-based transceiver module (e.g., an electronic part carrying a software program/algorithm in association with receiving and transmitting functions, i.e., an electronic module programmed to perform the functions of receiving and transmitting). Furthermore, the processing portion 304 can, in one embodiment, correspond to a hardware-based processor (e.g., a microprocessor) carrying an algorithm.

Coupling between the input, processing and/or output portions 302/304/306 can, for example, be by manner of one or both of wired coupling and wireless coupling. Each of the input, processing and/or output portions 302/304/306 can correspond to one or both of a hardware-based module and a software-based module, according to an embodiment of the disclosure.
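A minimal sketch of the Fig. 3 flow (input portion 302 to processing portion 304 to output portion 306); the averaging performed in the processing step is purely an assumed example of "an algorithm", and all names are illustrative:

```python
class Device104:
    """Toy model of the device 104 block diagram 300; names are illustrative."""

    def receive(self, readings):
        # Input portion 302: receive input signal(s) from the apparatus 102.
        return list(readings)

    def process(self, readings):
        # Processing portion 304: here, reduce the readings to an average value.
        return sum(readings) / len(readings) if readings else None

    def output(self, value):
        # Output portion 306: transmit/display the output signal.
        print(f"output signal: {value}")

dev = Device104()
dev.output(dev.process(dev.receive([0.0101, 0.0099, 0.0100])))  # -> 0.01
```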

Referring to Fig. 4, a method 400 in association with the system 100 is shown, according to an embodiment of the disclosure. The method 400 (e.g., an inspection method) can, for example, correspond to a processing method (i.e., referable to as a “processing method 400”), in accordance with an embodiment of the disclosure.

The processing method 400 can include any one of an obtaining step 402, a sensing step 404 and an output generating step 406, or any combination thereof. Specifically, the processing method 400 can include an obtaining step 402, a sensing step 404 and/or an output generating step 406, in accordance with an embodiment of the disclosure.

With regard to the obtaining step 402, a 3D map associated with the structure 101 can be obtained, in accordance with an embodiment of the disclosure. As discussed earlier, the localization system 202d can, for example, be configured to facilitate the obtaining of the 3D map.

With regard to the sensing step 404, sensor reading(s) can be obtained via the sensor(s) 202b.


With regard to the output generating step 406, input signal(s) can be processed to generate one or more output signals. As mentioned earlier, the input signal(s) can include one or both of the information relating to the aforementioned sensor reading(s) and information relating to a 3D map associated with the structure 101. Moreover, the input signal(s) can be communicated from the apparatus(es) 102 to the device(s) 104 for processing to generate one or more output signal(s), in accordance with an embodiment of the disclosure.
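Putting steps 402 to 406 together as a hedged end-to-end sketch (the data shapes and stubbed reading values are assumptions; the disclosure fixes only the steps themselves):

```python
def obtaining_step_402(scan_points):
    """Step 402: obtain a 3D map of the structure 101 (here, the raw scan points)."""
    return list(scan_points)

def sensing_step_404(locations):
    """Step 404: obtain a sensor reading at each inspection location (stubbed)."""
    return [(loc, 0.010) for loc in locations]  # assume 10 mm thickness everywhere

def output_generating_step_406(map_points, readings):
    """Step 406: process input signals into an output correlating map and readings."""
    return {"map": map_points, "readings": readings}

scan = [(0.0, 0.0, 3.0), (0.5, 0.0, 3.1)]
print(output_generating_step_406(obtaining_step_402(scan), sensing_step_404(scan)))
```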

The processing method 400 will be discussed hereinafter in further detail in the context of one example operation, in accordance with an embodiment of the disclosure. In the example operation, a drone pilot can operate the drone 250 so that the drone 250 can be flown to the structure 101 and perform 3D scanning of the structure 101 using the localization system 202d so as to generate a 3D map associated with the structure 101. One or more input signals corresponding to information concerning the 3D map can, for example, be communicated from the drone 250 to one or more devices 104 (which can, for example, be operated by a user different from the drone pilot or by the drone pilot). The 3D map can, for example, be saved (e.g., stored) in the device(s) 104, according to an embodiment of the disclosure.

Based on the 3D map, the drone 250 can be flown in a more targeted/precise manner (e.g., flown to a more targeted/precise location associated with the structure 101) to perform inspection of the structure 101. Specifically, based on the 3D map, the present disclosure contemplates that one or more contact surfaces (of the structure 101) of interest for the purpose of inspection can possibly be identified. Thereafter, for example, the user of the device(s) 104 can communicate, via a communication device (e.g., a phone), with the drone pilot concerning such contact surface(s) of interest so that the drone pilot can fly the drone 250 to such contact surface(s) accordingly. Inspection (of the structure 101) via the use of the sensor(s) 202b and/or the localization system 202d can be by manner of contacting the desired surface(s) of the structure 101. In this regard, inspection via the use of the sensor(s) 202b and/or the localization system 202d can be by manner of, for example, contact-based inspection.

Based on the aforementioned inspection, one or more input signals can be communicated (e.g., from the drone 250) to the device(s) 104 for processing to generate one or more output signals. The output signal(s) can, for example, relate to information concerning a correlation between the 3D location(s) (associated with the structure 101) and data concerning the structure 101. Data concerning the structure 101 can, for example, relate/correspond to data concerning structural condition of the structure 101 (e.g., structural health such as corrosion related data of the structure 101) and/or data related to the physical profile of the structure 101 (e.g., thickness of a wall of the structure 101). The output signal(s) can be saved/stored for further analysis (e.g., predictive analysis based on historic trends in association with the saved/stored output signal(s)). Analysis can, for example, be based on machine-learning based processing (e.g., the device(s) 104 can be capable of machine-learning based related processing tasks).

It should be further appreciated by the person skilled in the art that variations and combinations of features described above, not being alternatives or substitutes, may be combined to form yet further embodiments. In one example, the apparatus(es) 102 can include a processor module (not shown) analogous to the processing portion 304. In this regard, processing related tasks can be carried out/performed solely by the apparatus(es) 102.
In another embodiment, processing can be carried out/performed by the apparatus(es) 102 and the device(s) 104 (e.g., identical processing can be performed by both the apparatus(es) 102 and the device(s) 104 for the purpose of, for example, redundancy in consideration of processing robustness). In yet another embodiment, one portion of the aforementioned processing can be carried out/performed by the apparatus(es) 102 and another portion of the aforementioned processing can be carried out/performed by the device(s) 104.

In another example, the processing portion 304 can be omitted (i.e., the device(s) 104 can, for example, be without significant processing capabilities) and processing can be carried out/performed solely by the apparatus(es) 102. Specifically, in one embodiment, the device(s) 104 can be without processing capabilities and can be configured solely for, for example, viewing a 3D map of the structure 101.

In yet another example, the aforementioned communication network 106 can be omitted. The apparatus(es) 102 and the device(s) 104 can be directly coupled to each other by manner of one or both of wired coupling and wireless coupling.

In yet a further example, it is appreciable that, although not explicitly described (for the purpose of brevity), the apparatus(es) 102 and/or the device(s) 104 can include one or more further elements/parts/modules/features required to perform one or more of the aforementioned functions. For example, as mentioned earlier, the input signal(s) can be communicated from the apparatus(es) 102. It is appreciable that the apparatus(es) 102 can, for example, include/carry one or more other hardware or software elements/parts/modules/features to facilitate transmission of the input signal(s) (e.g., a transceiver), though not specifically/explicitly discussed for the purposes of brevity.

In yet a further additional example, the localization system 202d can be associated with an Augmented Reality Tag (ARTag) based system to facilitate augmented reality based inspection of the structure 101. In this regard, the present disclosure contemplates the possibility that the aforementioned 3D Lidar can be replaced with ARTag, in accordance with an embodiment of the disclosure. In yet another embodiment, the localization system 202d can be based on a combination of the aforementioned 3D Lidar and ARTag.
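As a hedged sketch of fiducial-based localization: the disclosure names ARTag; the snippet below uses OpenCV's ArUco module (a closely related fiducial family) as a stand-in, with the OpenCV 4.7+ detector API. This is an illustration, not the disclosed implementation:

```python
import cv2  # requires OpenCV 4.7+ with the aruco module; ArUco as an ARTag-like stand-in

def detect_fiducials(gray_image):
    """Detect fiducial markers in a grayscale camera frame; returns corners and ids."""
    dictionary = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50)
    detector = cv2.aruco.ArucoDetector(dictionary, cv2.aruco.DetectorParameters())
    corners, ids, _rejected = detector.detectMarkers(gray_image)
    return corners, ids
```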

In the foregoing manner, various embodiments of the disclosure are described for addressing at least one of the foregoing disadvantages. Such embodiments are intended to be encompassed by the following claims, and are not to be limited to the specific forms or arrangements of parts so described; it will be apparent to one skilled in the art in view of this disclosure that numerous changes and/or modifications can be made, which are also intended to be encompassed by the following claims.