


Title:
AN AUTONOMOUS UNMANNED AERIAL VEHICLE FOR INSPECTION OF A VERTICAL BUILDING PASSAGEWAY
Document Type and Number:
WIPO Patent Application WO/2022/248124
Kind Code:
A1
Abstract:
The present invention provides a system and a method for inspecting a vertical building passageway (100), such as an elevator shaft, using an autonomous unmanned aerial vehicle (200). Furthermore, the present invention provides an inspection system and a corresponding computer platform (300) for processing and displaying the data acquired by the UAV (200) during an inspection of a vertical building passageway, such as an elevator shaft. The inspection system is provided with, in addition to the UAV (200) and the computer platform (300), a ground station (600), which is configured to measure the distance between the ground station and the UAV by emitting a laser beam and measuring the energy reflected back to the ground station (600).

Inventors:
STRILIGKAS MICHAIL (GR)
Application Number:
PCT/EP2022/060234
Publication Date:
December 01, 2022
Filing Date:
April 19, 2022
Assignee:
VERTLINER PRIVATE COMPANY (GR)
International Classes:
B64C39/02; B66B19/00; G05D1/00; G05D1/10; G06T19/00; G08G5/00; G08G5/04
Domestic Patent References:
WO2019149215A1 (2019-08-08)
WO2018041815A1 (2018-03-08)
Foreign References:
US20190049962A1 (2019-02-14)
US10996683B2 (2021-05-04)
US20190215495A1 (2019-07-11)
US20210116941A1 (2021-04-22)
US10769844B1 (2020-09-08)
US20170277180A1 (2017-09-28)
Claims:

1. An autonomous unmanned aerial vehicle, UAV, (200) for the inspection of an elevator shaft (100) within a building comprising at least one opening (110a, 110b) and an internal space defined by corresponding opposing walls (110), the UAV comprising: a frame (211) comprising: a sensor module configured to collect data associated with a set of parameters associated with the surrounding environment and/or the UAV (200) position and operation, wherein the sensor module is configured to generate, based on the collected data, a representative three-dimensional scan dataset comprising spatial measurements of the surrounding environment characteristics; and a processing unit (297) configured to collect the data generated from the sensor module, which data is communicated to a remote central computing platform for display to a user; wherein the processing unit (297) is configured, upon receiving an activation signal and with the UAV (200) in a starting position, to scan the surrounding environment and collect data from the sensor module to locate an opening (110a, 110b) of the elevator shaft (100); wherein, using the collected data, the processing unit (297) is configured to calculate a first target point to guide the UAV (200) from the starting position through the opening of the elevator shaft (100) to an entrance position within the internal space of the elevator shaft (100); wherein, upon entering the internal space, the processing unit (297) is configured to perform the inspection of the elevator shaft by the UAV (200) by successively calculating intermediate target points determined with respect to a global reference point based on data collected from the sensor module to compute a respective UAV flight path to guide the UAV (200) within the elevator shaft (100), wherein the data generated from the sensor module is collected at predetermined time steps along the computed flight path; wherein the processing unit is configured, upon completing the elevator shaft (100) inspection, to guide the UAV (200) back to the starting position via the calculated target points; and wherein the processing unit (297) is configured to calculate between each target point an occupancy grid, using the data collected from the sensor module, indicating obstacles occupying the scanned 3D space and, based on the information obtained from the occupancy grid, to calculate a corresponding minimum jerk trajectory towards each target point to avoid possible collisions with obstacles identified within the scanned 3D space.

2. The autonomous unmanned aerial vehicle, UAV, (200) of claim 1, wherein each intermediate target point is set along the vertical direction of the elevator shaft (100), the z-axis of the global reference frame, and within the maximum range of a depth camera of the sensor module.

3. The autonomous unmanned aerial vehicle, UAV, (200) of claim 1 or 2, wherein at each time step, the target position of the calculated trajectory is given as an input to a Model Predictive Controller of the processing unit (297) configured to calculate a reference velocity for the UAV along the calculated flight path between successive target points, and wherein the calculated reference velocity is transmitted to a flight controller (270) of the UAV (200) configured to calculate, based on the calculated reference velocity, a respective angular velocity for each motor so as to reduce the velocity error between the current velocity of the UAV and the calculated reference velocity.

4. The autonomous unmanned aerial vehicle, UAV, (200) of any one of the preceding claims, wherein the processing unit (297) is configured to maintain the UAV (200) at a predetermined safe distance from the walls (110) of the elevator shaft (100) based on the data collected from the sensor module.

5. The autonomous unmanned aerial vehicle, UAV, (200) of any one of the preceding claims, wherein the processing unit (297) is configured to perform at each time step a set of spatial measurements to determine the distance between a reference point in the x,y plane and the walls (120) so as to estimate during the flight the structural measurements of the elevator shaft (100).

6. The autonomous unmanned aerial vehicle, UAV, (200) of any one of the preceding claims, wherein the sensor module comprises a Stereo Camera (210) configured to collect localisation data of the UAV (200) based on Visual Inertial Odometry techniques.

7. The autonomous unmanned aerial vehicle, UAV, (200) of any one of the preceding claims, wherein the sensor module comprises a photodiode array (285) mounted at the bottom of the UAV (200) comprising a plurality of photodiodes (2851) and a semi-reflective glass (2852) positioned on top of the photodiodes (2851), wherein the semi-reflective glass (2852) is configured to reflect a part of the energy of a laser beam (645) emitted by a supporting ground station (600) positioned at a bottom part of the elevator shaft (100), while allowing the remaining energy of the laser beam (645) to be absorbed by the photodiodes and converted into a corresponding current value, and wherein the processing unit (297) is configured to measure the current value of each photodiode to determine the lateral displacement of the UAV (200) with respect to the laser beam (645).

8. The autonomous unmanned aerial vehicle, UAV, (200) of any one of the preceding claims, wherein the sensor module comprises the depth camera (220) to collect depth and distance data from a target, and a Light Detection and Ranging, LiDaR, module (230) configured to scan the surroundings and measure the distance, elevation and contours of areas of the elevator shaft (100), and wherein the processing unit (297) is configured to process the data received from the depth camera (220) and/or the LiDaR (230) and insert the received data into an occupancy grid to determine obstacles in the space between successively calculated target points along the path of the UAV (200).

9. The autonomous unmanned aerial vehicle, UAV, (200) of claim 8, wherein, upon detection of an obstacle, the processing unit (297) is configured to dynamically adjust the flight path of the UAV (200) between the successively calculated target points to avoid collision with the detected obstacle.

10. The autonomous unmanned aerial vehicle, UAV, (200) of any one of the preceding claims, wherein the sensor module comprises an altitude sensor unit (260) configured to measure the position of the UAV in relation to a reference bottom level (130) of the elevator shaft (100), and an Inertia Measurement Unit, IMU, sensor.

11. The autonomous unmanned aerial vehicle, UAV, (200) of claim 10, wherein the altitude sensor unit (260) comprises a laser unit and/or a sonar altimeter unit.

12. The autonomous unmanned aerial vehicle, UAV, (200) of any one of the preceding claims, wherein the successive target points are positioned between a predetermined threshold distance from a ground bottom level (130) and a top level (140) of the elevator shaft (100).

13. The autonomous unmanned aerial vehicle, UAV, (200) of any one of the preceding claims, wherein the set of parameters comprises any one of or a combination of: dimensions of the building passageway, position of the UAV, speed of the UAV, position of the UAV in relation to a target, obstacle detection, orientation of the UAV, verticality of the building passageway, and gravity vector in the geographic location of the building passageway.

14. A central computer platform (300) configured for processing inspection data collected from an autonomous UAV (200) according to any one of claims 1 to 13 during inspection of an elevator shaft (100), the central computer platform (300) comprising: a communication module (320) configured to receive inspection data from the UAV (200); a processing module (310) configured to process the collected inspection data to compute a set of structural characteristics of the elevator shaft (100) and, based on the computed values of the structural characteristics and the collected inspection data, to generate an as-built model of the elevator shaft (100); and a graphical user interface (330), accessible by a user, configured to display a graphical representation of the as-built model of the elevator shaft (100); wherein the processing module (310) is configured to superimpose on the graphical user interface the as-built model with a virtual reference model of the elevator shaft (100) to highlight offsets between the measured dimensions of the internal space obtained during inspection and the reference dimensions of the virtual model.

15. The central computer platform (300) of claim 14, wherein the superimposed measurement information displayed on the graphical user interface (330) is configured to provide visual and textual information to the user (500) associated with the inspection carried out by the UAV (200), the visual and textual information comprising information associated with regions of the elevator shaft (100) where the measured dimensions exceed a predetermined threshold from the reference dimensions, the geographical location of the building, and images taken during inspection.

16. An elevator shaft inspection system comprising: a UAV (200) according to any one of claims 1 to 13; a computer platform (300) according to claim 14 or 15; and a ground station (600) comprising a laser rangefinder (640) configured to emit a laser beam (645) that is aligned with the gravity vector of the elevator shaft (100); wherein the ground station is configured to measure the distance between the ground station (600) and the UAV (200) by measuring a part of the energy of the laser beam reflected by the UAV (200), and to communicate the detected distance, via a communication link, to the processing unit (297) of the UAV (200).

17. The elevator shaft inspection system of claim 16, wherein the rangefinder is mounted on a rotating base (650) of the ground station (600) configured to be rotated by two rotational actuators (625, 630) that provide two degrees of freedom to the rotating base (650) so that the laser beam (645) can be aligned with the gravity vector of the elevator shaft (100).

18. The elevator shaft inspection system of claim 16 or 17, wherein the inclination of the rotating base (650) with respect to the gravity vector is measured by an Inclinometer/Gyroscope mounted on the rotating base (650), and wherein the rotation of the rotating base (650) is controlled by a computing unit (675) based on information measured by the Inclinometer/Gyroscope.

19. A method for operating a UAV (200) according to any one of claims 1 to 13 for the inspection of an elevator shaft (100), the method comprising the steps of: positioning the UAV (200) in a starting position; collecting data, by the sensor module, associated with a set of parameters associated with the surrounding environment and/or the UAV (200) position and operation; generating, based on the collected data, a representative three-dimensional scan dataset comprising spatial measurements of the surrounding environment characteristics; processing, by a processing unit (297), the collected data and scan data generated by the sensor module to determine an opening (110a, 110b) of the elevator shaft; calculating, based on the scan dataset collected, a UAV flight path towards the detected opening; guiding, by the processing unit (297), the UAV (200) through the opening and into an internal space of the elevator shaft (100); and inspecting the internal space of the elevator shaft (100) by performing the steps of: guiding, by the processing unit, the UAV (200) between successively calculated target points defined with respect to a global reference point along a respectively computed UAV flight path within the elevator shaft (100), the successive target points and corresponding UAV flight path being calculated by the processing unit based on data collected, by the sensor module, at predetermined time steps along the UAV flight path; and collecting, by the processing unit (297), the scan data collected by the sensor module at each time step; wherein, upon completing the inspection of the internal space of the elevator shaft, the processing unit performs the steps of: guiding the UAV (200) to the entrance opening via the calculated target points; calculating a flight path from the entrance opening to the starting position; guiding the UAV (200) to the starting position; and communicating the scan data collected by the sensor module to a central computing platform; wherein the step of guiding the UAV between successively calculated target points comprises the steps of: determining an occupancy grid between the successively calculated target points to determine any obstacles occupying the scanned space; calculating a minimum jerk trajectory from a UAV current position to a calculated target point based on information from the occupancy grid; and guiding the UAV from the current position to the target point along the computed flight path.

Description:
AN AUTONOMOUS UNMANNED AERIAL VEHICLE FOR INSPECTION OF A VERTICAL BUILDING PASSAGEWAY

Field

[0001] The present invention relates to an autonomous unmanned aerial vehicle and a method for the inspection of a vertical building passageway, such as an elevator shaft.

Background

[0002] Inspecting the dimensions of internal vertical building passageways, such as the shaft of an elevator, is an essential task that should be carried out during the construction of the building. In general, elevators, also referred to as lifts, are configured to transport goods and passengers between different levels of a building. The elevator of a building is enclosed by an elevator shaft, which is typically square or rectangular, that rises vertically through the building and often terminates above roof level. Inspection of elevator shafts is primarily carried out manually using known techniques such as laser devices or measurement strips, which lack accuracy and coverage. This is because access to the internal space of an elevator shaft is difficult, leading to estimations of the dimensions. Inaccuracies in the construction of the building passageway may compromise safety and lead to delays and extra cost.

[0003] WO2018041815A1 discloses a method and a system for measuring the dimensions of an elevator shaft. The method and the proposed system work by securing, via a rope, a measurement device to a winch positioned at the top surface of the elevator shaft. The measurement system is manually lowered to the bottom of the elevator shaft, and the internal space dimensions are measured. However, the proposed system has a disadvantage in that the measurement unit is suspended via a rope and would thereby swing from side to side as it is lowered and raised in the elevator shaft, risking collision of the measurement unit with the walls of the elevator shaft. Therefore, it is left up to the operator to ensure the safe deployment of the measurement device. Furthermore, the higher the elevator shaft, the longer the rope that needs to be used, which further increases the instability of the measurement system as it is moved vertically along the elevator shaft.

[0004] US 2019/0049962, US 10,996,683, US 2019/0215495, US 2021/0116941, US 10,769,844, and WO 2019/149215 provide further examples relating to the use of autonomous vehicles.

[0005] US 2019/0049962 discloses robotic technologies for industrial inspection that can flexibly scale to meet many different types of industrial inspection. By utilizing the autonomous robotic technologies disclosed herein, increasingly large and complex industrial inspections may be completed in a fraction of the time previously required for inspections of smaller scales.

[0006] US 10,996,683 provides a technique for touchdown detection during autonomous landing by an aerial vehicle. In some embodiments, the introduced technique includes processing perception inputs with a dynamics model of the aerial vehicle to estimate the external forces and/or torques acting on the aerial vehicle. The estimated external forces and/or torques are continually monitored while the aerial vehicle is landing to determine when the aerial vehicle is sufficiently supported by a landing surface. In some embodiments, semantic information associated with objects in the environment is utilized to configure parameters associated with the touchdown detection process.

[0007] US 2019/0215495 provides a vehicle trajectory determination method whereby a vehicle equipped with a camera system that captures video while the vehicle moves. The vehicle records the captured video and/or wirelessly transmits the captured video to a remote user device for playback.

[0008] US 2021/0116941 discloses a flight system for indoor positioning that includes an unmanned aerial robot, and a station and a server of the unmanned aerial robot. The unmanned aerial robot may sense a plurality of laser beams generated from the station through a first camera and/or a first sensor, perform adjustment such that a horizontal axis position of the unmanned aerial robot is located at a centre position of a measurement space for the indoor positioning based on the plurality of sensed laser beams, and perform positioning in the measurement space while flying in a vertical direction.

[0009] US 10,769,844 discloses methods, systems, and apparatus, including computer programs encoded on a storage device, for generating a three-dimensional map of a property. The method includes storing at least a portion of an initial map modification data structure, receiving an instruction that instructs the robotic device to initiate property mapping, obtaining image data, captured by a camera mounted to the robotic device, describing a portion of the property, analysing the obtained image data to determine whether the image data depicts a marker, in response to a determination that the obtained image data depicts a marker determining the pose of the camera that is mounted to the robotic device, updating the initial map modification data structure using the pose of the camera, obtaining an initial three-dimensional map of the property, and modifying the initial three-dimensional map of the property based on the updated map modification data structure.

[0010] WO 2019/149215 provides an intelligent robot control method, a device, a system, and a storage medium. The method comprises: acquiring a current first position of an intelligent robot and a destination position (S202); determining a moving path of the intelligent robot from the first position to the destination position, wherein the moving path comprises a specific obstacle (S204); and when the intelligent robot has moved from the first position to a second position, and a distance from the second position to a third position of the specific obstacle to which the intelligent robot is approaching reaches a target distance, controlling the intelligent robot to send an access request, wherein the access request is for requesting an access instruction to be sent to the specific obstacle at the third position, and the access instruction is used to instruct removal of access restrictions of the specific obstacle at the third position before arrival of the intelligent robot (S206). The method resolves the issue of low efficiency in controlling an intelligent robot in control methods provided by the related art.

[0011] However, none of the above prior art solutions provides a system for autonomously inspecting a vertical building passageway, such as an elevator shaft or other form of vertical corridor, without the need for a pre-loaded preliminary map of the structure to be inspected.

[0012] Therefore, there is a need for a system and a method for autonomously inspecting the internal dimensions of a hard-to-reach vertical building passageway, such as an elevator shaft, without the need for a pre-loaded preliminary map of the structure.

Summary of the invention

[0013] The aim of the present invention is to provide an autonomous unmanned aerial vehicle (UAV) and a corresponding method for inspecting the internal dimensions of a building passageway such as an elevator shaft or another type of vertical corridor within a building.

[0014] A further aim of the present invention is to provide a central computer platform for processing the data collected from the UAV for display to a user.

[0015] A further aim of the present invention is to provide an inspection system for inspecting the internal dimensions of a building passageway such as an elevator shaft or another type of vertical corridor. The inspection system may be provided with a ground station configured to provide localization information to the UAV in harsh environmental conditions such as poor lighting conditions, and the like.

[0016] The aims of the present invention are achieved with the systems and the method of the independent claims, while preferred embodiments are described in the dependent claims.

[0017] According to an aspect of the present invention, an autonomous unmanned aerial vehicle, UAV, is provided for the inspection of a vertical building passageway, such as an elevator shaft, comprising at least one opening and an internal space defined by corresponding opposing walls, the UAV comprising: a frame comprising: a sensor module configured to collect data associated with a set of parameters associated with the surrounding environment and/or the UAV position and operation, wherein the sensor module is configured to generate, based on the collected data, a representative three-dimensional scan dataset comprising spatial measurements of the surrounding environment characteristics, and wherein the set of parameters comprises any one of or a combination of: dimensions of the building passageway, position of the UAV, speed of the UAV, position of the UAV in relation to a target, obstacle detection, orientation of the UAV, verticality of the building, and gravity vector in the geographic location of the building passageway; and a processing unit configured to collect the data generated from the sensor module, which data is communicated to a remote central computing platform for display to a user; wherein the processing unit is configured, upon receiving an activation signal and with the UAV in a starting position, to scan the surrounding environment and collect data from the sensor module to locate an opening of the elevator shaft; wherein, using the collected data, the processing unit is configured to calculate a first target point to guide the UAV from the starting position through the opening of the elevator shaft to an entrance position within the internal space of the elevator shaft; wherein, upon entering the internal space, the processing unit is configured to perform the inspection of the elevator shaft by the UAV by successively calculating intermediate target points determined with respect to a global reference point based on data collected from the sensor module to compute a respective UAV flight path to guide the UAV within the elevator shaft, wherein the data generated from the sensor module is collected at predetermined time steps along the computed flight path; wherein the processing unit is configured, upon completing the building passageway inspection, to guide the UAV back to the starting position via the calculated target points; and wherein the processing unit is configured to calculate between each target point an occupancy grid, using the data collected from the sensor module, indicating obstacles occupying the scanned 3D space and, based on the information obtained from the occupancy grid, to calculate a corresponding minimum jerk trajectory towards each target point to avoid possible collisions with obstacles identified within the scanned 3D space.

[0018] It has been found that the autonomous UAV of the present invention is capable of operating autonomously for the inspection of the internal dimensions of a vertical building passageway, such as an elevator shaft. In particular, the UAV is provided with a sensor module which is operated by a processing unit to scan and collect data from the surrounding environment. The sensor module is configured to collect data associated with a set of parameters associated with the surrounding environment and/or the UAV position and operation, such as the dimensions of the building passageway, e.g. width, depth, height, and the like, the position of the UAV, the speed of the UAV, the position of the UAV in relation to a target, obstacle detection, the orientation of the UAV, the verticality of the building passageway, and the gravity vector at the geographical location of the building passageway. It should be noted that the sensor module may be configured to monitor any other parameters that are required for the inspection of the vertical building passageway. The internal building passageway may be an elevator shaft, a vertical corridor, e.g. an air corridor, or another type of vertical building passageway.

[0019] Based on the data received from the sensor module, the processing unit calculates the positioning of the UAV and guides the UAV through the internal space of the building passageway for the inspection of its internal dimensions. During the operation of the UAV, the processing unit is configured to collect and process the data generated from the sensor module at predetermined time steps. Each time step may be associated with a grid of the internal space of the building passageway. The processing unit is configured to collect and store the data generated from the sensor module at each time step. The collected data may be processed at the processing unit and additional information may be added to the collected data. The data collected from the sensor module is in the form of a three-dimensional (3D) point cloud and visual data such as images or videos. Once the inspection of the building passageway is complete, the processing unit guides the UAV back to the starting position. The processing unit is configured to communicate the data collected to a central computer platform, e.g. via a wireless or wired communication network. The exchange of data to and from the central computing platform may be facilitated using a communication module. The processing unit may be pre-loaded with a set of inspection tasks to be performed during the inspection of the building passageway such as type of measurements to be taken and the like. The inspection tasks may also be communicated to the UAV via a wireless communication link from a computer platform and/or a ground station. The sensor module may be configured to collect data associated with a set of parameters such as UAV speed, UAV position, UAV proximity to objects such as the wall of the building passageway or obstacles, UAV rotation and orientation, UAV angle, dimensions of the building passageway, and the like. 
The sensor module is configured to generate a scan dataset that is representative of the characteristics of the surrounding environment. Upon activation, the processing unit is configured to calculate the distance to the opening and other parameters, and accordingly to calculate a flight path using the data collected from the sensor module. Similarly, once the UAV is positioned inside the building passageway, the processing unit calculates an obstacle-free flight path along the desired direction, based on the data collected from the sensor module.
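The minimum jerk trajectory referred to above is commonly realised as a quintic polynomial with zero velocity and acceleration at both endpoints. The sketch below is illustrative only; the function name and the rest-to-rest boundary conditions are assumptions for the example, not details taken from the application:

```python
def minimum_jerk_position(start, target, t, duration):
    """Minimum-jerk interpolation between two target points at time t.

    Uses the classic quintic profile s(tau) = 10*tau^3 - 15*tau^4 + 6*tau^5,
    which minimises the integral of squared jerk for rest-to-rest motion.
    """
    tau = min(max(t / duration, 0.0), 1.0)  # normalised time, clamped to [0, 1]
    s = 10 * tau**3 - 15 * tau**4 + 6 * tau**5
    # Interpolate each coordinate (x, y, z) between start and target.
    return [a + s * (b - a) for a, b in zip(start, target)]
```

At the midpoint of a vertical climb this profile has covered exactly half the distance, while velocity and acceleration vanish smoothly at both endpoints, which is what makes it attractive for flight in a confined shaft.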

[0020] According to embodiments of the present invention, each intermediate target point is set along the vertical direction of the elevator shaft, the z-axis of the global reference frame, and within the maximum range of a depth camera of the sensor module.

[0021] According to embodiments of the present invention, at each time step, the target position of the calculated trajectory of the UAV is given as an input to a Model Predictive Controller of the processing unit configured to calculate a reference velocity for the UAV along the calculated flight path between successive target points, and wherein the calculated reference velocity is transmitted to a flight controller of the UAV configured to calculate, based on the calculated reference velocity, a respective angular velocity for each motor so as to reduce the velocity error between the current velocity of the UAV and the calculated reference velocity.
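A full Model Predictive Controller is beyond the scope of a short example, but the final stage described above, driving the velocity error towards zero by adjusting motor speeds, can be illustrated with a simple proportional loop. The function name, hover speed and gain below are illustrative assumptions, not values from the application:

```python
def motor_speed_commands(current_vz, reference_vz, hover_rpm=5000.0, gain=400.0):
    """Map a vertical velocity error to per-motor speed commands.

    A real flight controller runs cascaded attitude and rate loops; this
    sketch shows only the outer-loop idea for a quadrotor: all four motors
    speed up or slow down together to reduce the vertical velocity error.
    """
    error = reference_vz - current_vz          # velocity error to be reduced
    rpm = hover_rpm + gain * error             # proportional correction
    return [rpm, rpm, rpm, rpm]                # one command per motor
```

When the UAV already tracks the reference velocity, the error is zero and the motors hold the hover speed; a positive error raises all four commands equally.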

[0022] According to embodiments of the present invention, the processing unit is configured to maintain the UAV at a maximum possible distance from the walls of the building passageway and other obstacles such as prominent materials based on the data collected from the sensor module.

[0023] According to embodiments of the present invention, upon guiding the UAV into the internal space of the building passageway, the processing unit is configured to perform a set of spatial measurements by means of the sensor module to determine the width and depth of the internal space. For example, the processing unit, based on the data collected from the spatial measurements, may be configured to position the UAV at a preferred distance from the walls along the calculated path.
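For a rectangular cross-section, the width and depth measurement described above reduces to summing the ranges measured towards each pair of opposing walls from a common reference point on the UAV. A minimal sketch under that assumption (the function name is illustrative, not from the application):

```python
def shaft_cross_section(d_left, d_right, d_front, d_back):
    """Estimate shaft width and depth from ranges to opposing walls.

    Assumes all four ranges are measured in the x-y plane from the same
    reference point on the UAV (e.g. LiDaR or depth-camera returns).
    """
    width = d_left + d_right    # span across the x direction
    depth = d_front + d_back    # span across the y direction
    return width, depth
```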

[0024] To ensure the safe operation of the UAV within the internal space of the building passageway, the processing unit may use the data collected from the sensor module to determine distances from the walls of the building passageway. The processing unit, based on the distances measured, may position the UAV at a preferred distance from the walls of the building passageway, to avoid possible collision and/or damage e.g. at a preferred safe distance from the walls and/or obstacles encountered along the flight path. The processing unit may be configured to maintain the position of the UAV as it moves along the calculated path within the building passageway, thereby ensuring the safe operation of the UAV and accurate inspection of the internal space of the building passageway.
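One possible way to realise the safe-distance behaviour described above is to compute a lateral position correction whenever a measured wall range falls below the preferred safe distance. This is an illustrative sketch under assumed names and values, not the method of the application:

```python
def wall_clearance_correction(ranges, safe_distance=0.5):
    """Compute a lateral position correction restoring the safe distance.

    `ranges` maps a horizontal measurement direction, given as a unit
    vector (dx, dy), to the measured distance to the wall in that
    direction. Whenever a range falls below the safe distance, the UAV
    is pushed away from that wall by the missing clearance.
    """
    correction = [0.0, 0.0]
    for (dx, dy), rng in ranges.items():
        if rng < safe_distance:
            push = safe_distance - rng
            # Move opposite to the measurement direction, away from the wall.
            correction[0] -= dx * push
            correction[1] -= dy * push
    return tuple(correction)
```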

[0025] According to embodiments of the present invention, the sensor module comprises a stereo camera used to localise the UAV with Visual Inertial Odometry, VIO, techniques and configured to collect visual data of the surrounding environment.

[0026] According to embodiments of the present invention, the sensor module comprises two depth and/or Time-of-Flight (ToF) cameras, e.g. one upward facing and one downward facing, to collect depth and distance data from a target, and a Light Detection and Ranging, LiDaR, module configured to scan the surroundings and measure the distance, elevation and contours of areas of the building passageway.

[0027] The sensor module comprises an inertial measurement unit based on acceleration sensors and inclination sensors (gyroscopes), a stereo camera for localisation, two RGB-Depth and/or Time-of-Flight (ToF) cameras, e.g. one upward facing and one downward facing, for structural data acquisition and obstacle detection, and a LiDaR module to scan the surrounding environment and measure the distance, elevation and contours of areas of the building passageway. The sensor module collects the information necessary to ensure safe and autonomous operation and to allow the processing unit to assess the UAV operation accurately, including the UAV three-dimensional position, linear and angular velocity, and orientation, by measuring the angles of inclination of the UAV during a three-axis rotation. The measurements are taken with respect to a global reference point, which is associated with the UAV starting position. For example, the global reference point may define the position (x, y, z) = (0, 0, 0) of the UAV at the starting position, along with the unit vectors which define the reference directions along which the orientation is calculated.
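As an illustrative example of the global reference frame described above, a body-frame measurement can be expressed in the global frame whose origin and axes are defined at the starting position. The Z-Y-X Euler-angle convention and the function names below are assumptions, not taken from the text:

```python
# Illustrative sketch: express a sensor measurement, taken in the UAV body
# frame, with respect to the global reference point defined at the starting
# position. Assumes Z-Y-X (yaw-pitch-roll) Euler angles in radians.
import math

def rotation_matrix(roll, pitch, yaw):
    """Body-to-global rotation matrix from the measured inclination angles."""
    cr, sr = math.cos(roll), math.sin(roll)
    cp, sp = math.cos(pitch), math.sin(pitch)
    cy, sy = math.cos(yaw), math.sin(yaw)
    return [
        [cy * cp, cy * sp * sr - sy * cr, cy * sp * cr + sy * sr],
        [sy * cp, sy * sp * sr + cy * cr, sy * sp * cr - cy * sr],
        [-sp,     cp * sr,                cp * cr],
    ]

def to_global(point_body, uav_position, roll, pitch, yaw):
    """Express a body-frame point in the global reference frame, where the
    starting position defines (0, 0, 0) and the reference axes."""
    R = rotation_matrix(roll, pitch, yaw)
    return tuple(
        uav_position[i] + sum(R[i][j] * point_body[j] for j in range(3))
        for i in range(3)
    )
```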

[0028] According to embodiments of the present invention, the processing unit is configured to process the data received from the two depth and/or ToF cameras and/or the LiDaR to determine obstacles along the path of the UAV.

[0029] According to embodiments of the present invention, when an obstacle is detected, the processing unit is configured to adjust the path of the UAV to avoid the obstacle.

[0030] The processing unit uses the information collected from the sensor module to detect objects along the flight path of the UAV, thereby ensuring its safe operation and avoiding collisions. For example, the processing unit may use the data collected from the depth/ToF cameras, among others, to determine the characteristics of the obstacle and the volume it occupies. Based on the data collected, the processing unit is configured to adjust the flight path of the UAV to avoid the obstacle and continue its operations. If the obstacle blocks the entire path of the UAV, the processing unit may guide the UAV back to the starting point and/or issue a notification to an operator.

[0031] According to embodiments of the present invention, the sensor module comprises an altitude sensor unit configured to measure the distance and/or position of the UAV in relation to a reference bottom level of the building passageway, and an Inertial Measurement Unit (IMU) sensor.

[0032] According to embodiments of the present invention, the altitude sensor unit comprises a laser unit and/or a sonar altimeter unit.

[0033] The sensor module may further be provided with an altitude sensor, which is configured to measure the position of the UAV from a bottom level and a top level of the building passageway, which may enable the inspection coverage of the UAV to be extended.

[0034] According to embodiments of the present invention, the vertical building passageway is an elevator shaft. According to embodiments of the present invention, the processing unit is configured to perform the building passageway inspection by guiding the UAV towards successively calculated target points defined with respect to a global reference point along the UAV computed flight path, the global reference point being associated with the starting position of the UAV.

[0035] The processing unit is configured to perform the inspection of the building passageway by guiding the UAV towards successive target points positioned between a predetermined threshold distance from a bottom level and a top level of the building passageway. The target points are successively calculated based on data collected from the sensor module, such that the inspection is carried out substantially along the entire internal space of the building passageway. Upon completing the inspection, the processing unit is configured to guide the UAV towards the entrance opening, where the UAV exits the internal space of the building passageway and is guided back to the starting position.

[0036] The processing unit of the UAV is configured to process the data collected from the sensor module and accordingly compute a flight path for the UAV between successively calculated target points. For example, the processing unit may use the data collected by the depth/ToF cameras to calculate a target point, or target location, within the internal space of the building passageway. The processing unit is configured to guide the UAV to the calculated target point along the respective UAV direction, e.g. upwards or downwards along the building passageway, and upon arrival at the calculated target point to calculate the next target point. As such, the processing unit guides the UAV through successive target points until a desired portion of the internal space of the building passageway is scanned. The desired portion of the internal space is defined between a bottom reference point and a top reference point of the building passageway. The bottom and top reference points may be positioned at a predetermined threshold distance from the bottom level and top level of the building passageway, respectively. For example, the processing unit is configured, upon positioning the UAV in the internal space, to guide the UAV first towards the bottom level of the building passageway; upon reaching a predetermined distance from the bottom level, to guide the UAV towards the top level of the building passageway; and upon reaching a predetermined distance from the top level, to guide the UAV towards the entrance opening, where the UAV exits the internal space of the building passageway and is guided back to the starting position.
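The successive target-point scheme described above (descend to the bottom reference point, ascend to the top reference point, then return to the entrance height) can be sketched as follows. The step size, threshold distance and function name are illustrative assumptions:

```python
# Illustrative sketch of the target-altitude sequence: descend toward the
# bottom level, stop at a threshold distance, ascend toward the top level,
# stop at a threshold distance, then return to the entrance height.
# Step size and threshold are hypothetical defaults.

def plan_target_points(entry_z, bottom_z, top_z, threshold=0.5, step=1.0):
    """Return successive target altitudes (metres) for one inspection pass."""
    targets = []
    z = entry_z
    while z - step > bottom_z + threshold:      # descend phase
        z -= step
        targets.append(z)
    z = bottom_z + threshold                    # bottom reference point
    targets.append(z)
    while z + step < top_z - threshold:         # ascend phase
        z += step
        targets.append(z)
    targets.append(top_z - threshold)           # top reference point
    targets.append(entry_z)                     # back to the entrance opening
    return targets
```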

[0037] The processing unit uses the data collected from the sensor module to guide the UAV within the building passageway, such that its internal space and surrounding walls can be inspected with greater accuracy along its entire length. During the inspection and with the UAV positioned in the internal space, the processing unit may guide the UAV first towards a first target point, as previously described, e.g., towards the bottom level of the building passageway. According to embodiments of the present invention, the processing unit is configured to collect and store the data obtained from the sensor module at each time step in the form of a set of visual data such as images or videos and three-dimensional, 3D, points expressed with respect to a global reference point associated with the starting position of the UAV. The sets of 3D points collected during inspection of the building passageway are inserted into a global 3D point cloud, which global point cloud is communicated to the central computer platform.

[0038] During the operation of the UAV, the data collected from the sensor module at each time step are collected and processed by the processing unit. The processing unit stores the data collected at each time step in the form of a set of 3D points. Each set of 3D points corresponds to a section of the building passageway, which is added to a global 3D point cloud representing the entire building passageway. The processing unit communicates the visual data, the global 3D point cloud and other information, such as the geographical location, to the central computer platform, where the data is processed and graphically represented to a user. The processing unit may be configured to flag measurement data exceeding a certain threshold. For example, if the characteristics of a wall section are outside a predetermined tolerance, e.g. the vertical wall angle is above or below a predetermined threshold, the processing unit is configured to flag the corresponding measurement data in the dataset, e.g. by adding metadata.
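The flagging behaviour described above, attaching metadata to out-of-tolerance measurements, can be sketched as follows. The record layout, threshold value and function name are illustrative assumptions; the text only states that flagging may be done "e.g. by adding metadata":

```python
# Illustrative sketch: flag wall-section measurements whose vertical angle
# deviates from 90 degrees by more than a tolerance. The dict-based record
# layout and the default tolerance are hypothetical.

def flag_wall_measurements(measurements, max_angle_dev_deg=0.5):
    """Return copies of the measurement records, adding flag metadata to
    any record whose wall angle is out of tolerance."""
    flagged = []
    for m in measurements:
        record = dict(m)  # do not mutate the caller's data
        deviation = abs(m["wall_angle_deg"] - 90.0)
        if deviation > max_angle_dev_deg:
            record["metadata"] = {"flag": "angle_out_of_tolerance",
                                  "deviation_deg": round(deviation, 3)}
        flagged.append(record)
    return flagged
```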

[0039] The sensor module may further be provided with a photodiode array mounted at the bottom of the UAV, comprising a plurality of photodiodes and a semi-reflective glass positioned on top of the photodiodes, wherein the semi-reflective glass is configured to reflect a part of the energy of a laser beam emitted by a supporting ground station positioned at a bottom part of the elevator shaft, while allowing the remaining energy of the laser beam to be absorbed by the photodiodes and converted into a corresponding current value, and wherein the processing unit is configured to measure the current values of each photodiode to determine the lateral displacement of the UAV with respect to the laser beam. The photodiode array is configured to reflect part of the energy of the emitted laser beam, while at least part of the remaining energy is absorbed by the photodiodes. The reflected energy is captured by a laser rangefinder of the ground station, which processes the sensed data to calculate the distance from the rotating plate of the ground station to the UAV. The UAV processing unit, or a dedicated computing device, measures the current generated by each photodiode in the photodiode array. Accordingly, the processing unit, based on the measured current, detects the displacement of the UAV from the gravity vector, which is represented by the emitted laser beam. The displacement is calculated based on the position, on the array, of the photodiodes capturing the emitted laser beam, and thus generating the highest current, relative to a central location of the array.
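The displacement computation described above, locating the photodiode that generates the highest current relative to the centre of the array, can be sketched as follows. The square array layout, photodiode pitch and function name are illustrative assumptions:

```python
# Illustrative sketch: find the photodiode with the highest current (the one
# absorbing most beam energy) and report its offset from the array centre.
# Assumes a rectangular array with uniform pitch; the 5 mm default pitch is
# hypothetical.

def lateral_displacement(currents, pitch_mm=5.0):
    """Given a 2D grid of photodiode currents, return the (dx, dy) offset
    in millimetres of the beam spot from the array centre."""
    rows, cols = len(currents), len(currents[0])
    # Locate the photodiode generating the highest current.
    best = max(
        ((r, c) for r in range(rows) for c in range(cols)),
        key=lambda rc: currents[rc[0]][rc[1]],
    )
    centre = ((rows - 1) / 2.0, (cols - 1) / 2.0)
    dy = (best[0] - centre[0]) * pitch_mm
    dx = (best[1] - centre[1]) * pitch_mm
    return dx, dy
```

A beam spot on the central photodiode yields zero displacement; a spot one diode to the side yields one pitch of lateral offset.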

[0040] According to a second aspect of the present invention, a central computer platform is provided configured for processing inspection data collected from an autonomous UAV according to embodiments of the first aspect during inspection of a building passageway, the central computer platform comprising: a communication module configured to receive data from a UAV according to embodiments of the first aspect during inspection of a building passageway; a processing module configured to process the collected data to compute a set of structural characteristics of the building passageway, and based on the computed values of the structural characteristics and the collected data to generate an as-built model of the building passageway; and a graphical user interface configured to display a graphical representation of the as-built model of the building passageway.

[0041] According to embodiments of the present invention, the processing unit is configured to superimpose on the graphical user interface the as-built model with a virtual reference model of the elevator shaft to highlight offsets between the measured dimensions of the internal space obtained during inspection and the reference dimensions of the virtual model. The superimposed measurement information displayed on the graphical user interface is configured to provide visual and textual information to the user associated with the inspection carried out by the UAV, the visual and textual information comprising information associated with regions of the elevator shaft where the measured dimensions exceed a predetermined threshold from the reference dimensions, the geographical location of the building, and images taken during inspection.

[0042] The present invention further provides a central computer platform that is configured to receive and process the data collected from the UAV during the inspection of the internal space of the building passageway. The central computing platform may be communicatively coupled to the UAV via a communication module of the processing unit. The processing unit may be configured to transmit the collected scan data, which is then processed at the central computer platform before being graphically displayed to a user. The central computer platform may be configured to superimpose the collected scan data onto a reference virtual model of the building passageway and accordingly highlight sections of the reference model with a discrepancy between the measured and reference data. As such, the user is able to quickly identify areas that require special attention and accordingly initiate, if required, a set of corrective actions, such as repairs or ordering adjustments to the equipment that would operate within the building passageway, such as adjustments to the elevator cart.

[0043] The processing unit may be configured to superimpose on the graphical user interface the as-built model with a virtual reference model of the elevator shaft to highlight offsets between the measured dimensions of the internal space obtained during inspection and the reference dimensions of the virtual model.

[0044] The superimposed measurement information displayed on the graphical user interface is configured to provide visual and textual information to the user associated with the inspection carried out by the UAV (200), the visual and textual information comprising information associated with regions of the elevator shaft where the measured dimensions exceed a predetermined threshold from the reference dimensions, the geographical location of the building, and images taken during inspection.

[0045] According to a third aspect of the present invention, a method for operating a UAV according to embodiments of the first aspect is provided for the inspection of a building passageway, the method comprising the steps of: positioning the UAV in a starting position; collecting data, by the sensor module, associated with a set of parameters associated with the surrounding environment and/or the UAV position and operation; generating, based on the collected data, a representative three-dimensional scan dataset comprising spatial measurements of the surrounding environment characteristics; processing, by a processing unit, the collected data and scan data generated by the sensor module to determine an opening of the elevator shaft; calculating, based on the scan dataset collected, a UAV flight path towards the detected opening; guiding, by the processing unit, the UAV through the opening and into an internal space of the elevator shaft; and inspecting the internal space of the elevator shaft by performing the steps of: guiding, by the processing unit, the UAV between successively calculated target points defined with respect to a global reference point along a respectively computed UAV flight path within the elevator shaft, the successive target points and corresponding UAV flight path being calculated by the processing unit based on data collected, by the sensor module, at predetermined time steps along the UAV flight path; and collecting, by the processing unit, the scan data collected by the sensor module at each time step; wherein, upon completing the inspection of the internal space of the elevator shaft, the processing unit performs the steps of: guiding the UAV to the entrance opening via the calculated target points; calculating a flight path from the entrance opening to the starting position; guiding the UAV to the starting position; and communicating the scan data collected by the sensor module to a central computing platform.

[0046] According to embodiments of the third aspect, the step of guiding the UAV between successively calculated target points comprises the steps of: determining an occupancy grid between the successively calculated target points to determine any obstacles occupying the scanned space; calculating a minimum jerk trajectory from the UAV current position to a calculated target point based on information from the occupancy grid; and guiding the UAV from the current position to the target point along the computed flight path.

[0047] According to a fourth aspect of the present invention, an elevator shaft inspection system is provided comprising: a UAV according to embodiments of the first aspect; a computer platform according to embodiments of the second aspect; and a ground station comprising a laser rangefinder configured to emit a laser beam that is aligned with the gravity vector of the elevator shaft; wherein the ground station is configured to measure the distance between the ground station and the UAV by measuring the energy of a laser beam reflected by the UAV, and to communicate the detected distance, via a communication link, to the processing unit of the UAV.
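The minimum jerk trajectory named in the guiding step of the third aspect is not specified further in the text; the classic rest-to-rest quintic polynomial is one standard realisation, sketched below. The function name and arguments are illustrative:

```python
# Illustrative sketch: position along a one-dimensional minimum-jerk segment
# using the standard quintic s(tau) = 10*tau^3 - 15*tau^4 + 6*tau^5, which
# gives zero velocity and zero acceleration at both endpoints.

def min_jerk_position(p0, p1, t, T):
    """Position at time t along a minimum-jerk segment from p0 to p1
    of total duration T (t is clamped to [0, T])."""
    tau = min(max(t / T, 0.0), 1.0)
    s = 10 * tau**3 - 15 * tau**4 + 6 * tau**5
    return p0 + (p1 - p0) * s
```

Evaluating this per axis between the UAV current position and the next target point yields a smooth flight segment; obstacle information from the occupancy grid would then constrain which segments are admissible.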

[0048] The rangefinder is mounted on a rotating base of the ground station configured to be rotated by two rotational actuators that provide two degrees of freedom to the rotating base, so that the laser beam can be aligned with the gravity vector of the elevator shaft.

[0049] The inclination of the rotating base with respect to the gravity vector is measured by an Inclinometer/Gyroscope mounted on the rotating base, and the rotation of the rotating base is controlled by a computing unit based on information measured by the Inclinometer/Gyroscope.

[0050] The Ground Station is configured to support the localization of an autonomous UAV according to embodiments of the first aspect during inspection of a building passageway in tall buildings, where drift of the VIO localization estimation is expected, or when the UAV is operated under harsh environmental conditions such as extremely low illumination, a dusty environment or a general lack of visual features. The Ground Station may be provided with:

- a Wireless Communication Module configured to transfer data to/from a UAV according to embodiments of the first aspect during inspection of a building passageway;

- a Laser Rangefinder configured to measure the distance from the Ground Station to the UAV during inspection of a building passageway;

- a rotating base, on which the Laser Rangefinder is mounted, configured to move the Laser Rangefinder, using three actuators, from the ground floor into the inner space of the building passageway and to align its laser beam with the gravity vector during inspection of the building passageway by the autonomous UAV;

- a high-precision Inclinometer/Gyroscope configured to measure the inclination of the rotating base with respect to the gravity vector; and

- a computing unit configured to regulate the actuators such that the rotating base is aligned with the gravity vector, to read the Laser Rangefinder sensor output, and to send it, via the Wireless Communication Module, to the UAV.
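The regulation performed by the computing unit, driving the actuators until the rotating base reads level on the inclinometer, can be sketched as a simple feedback loop. The function names, gain, tolerance and iteration cap below are illustrative assumptions:

```python
# Illustrative sketch: iteratively null the (pitch, roll) inclination of the
# rotating base so the laser beam coincides with the gravity vector.
# read_inclination() and command_actuators() stand in for the real
# inclinometer and actuator interfaces; gain and tolerance are hypothetical.

def align_with_gravity(read_inclination, command_actuators,
                       gain=0.5, tol_deg=0.01, max_iters=100):
    """Proportional alignment loop. Returns True once the base is level
    within tol_deg, or False if max_iters is exhausted."""
    for _ in range(max_iters):
        pitch, roll = read_inclination()
        if abs(pitch) <= tol_deg and abs(roll) <= tol_deg:
            return True  # base level: beam aligned with the gravity vector
        # Command a corrective rotation opposing the measured tilt.
        command_actuators(-gain * pitch, -gain * roll)
    return False
```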

[0051] The Ground Station is placed on the Ground floor of the shaft and activated by the user.

[0052] According to embodiments of the fourth aspect, a photodiode array is mounted at the bottom part of the UAV, configured to read the laser beam and provide the absolute position of the UAV along the lateral directions (x, y) in relation to the rotating base of the Ground Station.

In general, the present invention offers a range of advantages over existing solutions, such as:

- accurate inspection of indoor spaces without the need of a GPS signal such as elevator shafts, and other type of vertical corridors e.g. air corridors;

- autonomous mode of operation of the UAV for the inspection of the indoor spaces without the need of a GPS signal and without the active participation of an operator;

- direct data transfer from the autonomous UAV to a central computing platform via a wireless communication interface;

- managing, processing, and graphically displaying the collected scan data together with the reference virtual model to a user, thereby ensuring that discrepancies between the measured and reference dimensions are quickly identified;

- a novel Ground Station configured to cooperate with a photodiode array mounted on the UAV for the accurate localisation of the UAV in harsh environmental conditions e.g. low lighting, dusty environment, tall buildings, etc.

Brief Description of the drawings

[0053] The following drawings are provided as an example to explain further and describe various aspects of the invention:

Figure 1 shows an exemplified implementation of an inspection system according to embodiments of the present invention.

Figures 2 and 3 show respectively a cross-sectional and top view of an exemplified graphical representation of the autonomous unmanned aerial vehicle (UAV) according to embodiments of the present invention.

Figure 4a and 4b show respectively a cross-sectional and bottom view of an exemplified implementation of the UAV of figures 2 and 3 provided with a photodiode array mounted on the bottom of the UAV according to embodiments of the present invention.

Figure 5a and 5b show respectively a cross-sectional and a top view (as seen from the bottom of the UAV) of the photodiode array shown in Figures 4a and 4b according to embodiments of the present invention.

Figures 6a and 6b show a cross-sectional view of a supporting ground station respectively in an open and closed position according to embodiments of the present invention.

Figure 7 shows an exemplified method for measuring the dimensions of an elevator shaft according to embodiments of the present invention.

Figure 8 shows an exemplified graphical representation of the Ground Station in operation to provide localization support to the UAV during the inspection of a building passageway according to embodiments of the present invention.

Detailed Description

[0054] The present invention will be illustrated using the exemplified embodiments shown in the figures, which will be described in more detail below. It should be noted that any references made to dimensions are only indicative and do not restrict the invention in any way. While this invention has been shown and described with reference to certain illustrated embodiments thereof, it will be understood by those skilled in the art that various changes in form and details may be made therein without departing from the scope of the invention.

[0055] Figure 1 shows an exemplified implementation of an inspection system according to embodiments of the present invention. The inspection system is configured for inspecting the internal spaces of building passageways, such as elevator shafts 100. It should be noted that although the invention is exemplified using an elevator shaft as an example of a building passageway, it could equally apply to other types of building passageways, such as air vents and the like. As shown, elevator shaft 100 is provided with a plurality of openings 110, which form the entrance points to the elevator cart. During the elevator shaft construction, an inspection is carried out to ensure that the internal dimensions of the elevator shaft are according to the specified reference design dimensions. According to the present invention, the inspection is carried out by an autonomous unmanned aerial vehicle (UAV) 200, which enters through an opening 110 into the internal space of the shaft 100 and carries out an array of measurements. The internal space of the elevator shaft 100 is defined by the corresponding walls 120.

[0056] The measurements collected are transmitted via a communication network 400 to a central cloud computing platform 300, where the measurements and other information transmitted by the UAV 200 are processed and displayed to a user 500. The measurement data collected by the UAV 200 may be in the form of visual data such as videos and/or images and a 3D point cloud representing the elevator shaft. The 3D point cloud may be annotated with other data, such as the speed of the UAV 200, measurement offsets, and the like. The visual data, 3D point cloud and any other information transmitted by the UAV 200 are received at the communication module 320 of the cloud computing platform 300. The communication module 320 may store the information received in a database 340. A processing unit 310 may retrieve the transmitted information from the database 340 for processing and analysis. At the processing unit 310, the measurement information may be superimposed with a virtual reference model of the elevator shaft 100, and any offset between the measured dimensions of the internal space and the reference dimensions of the virtual model may be highlighted. The superimposed measurement information may be displayed on a graphical user interface (GUI) 330, where the user 500 is provided with visual and textual information associated with the inspection carried out by the UAV 200, such as regions of the elevator shaft 100 where the measured dimensions exceed a predetermined threshold from the reference data, the geographical location of the building, images taken at each stage, and the like. The user 500 may further use the cloud platform 300 to initiate corresponding actions. For example, the user 500, based on the information received, may design or alter the dimensions of complementary components for the building passageway.
For example, in the case of an elevator shaft 100, the user 500 may use the measured dimensions from the UAV 200 to alter the dimensions of the elevator cart so that it fits the actual dimensions of the shaft. Furthermore, the user 500 may control, via the cloud platform 300, the operation of the UAV 200, e.g. by issuing activation commands and/or requesting specific tasks to be performed during inspection of the elevator shaft 100. The UAV 200 is configured to inspect the internal space of the elevator shaft 100 defined between the bottom level 130 and the top level 140 of the elevator shaft. To avoid collision, the UAV 200 is configured to stop at a predetermined distance from the bottom level 130 and top level 140. As previously explained, the measurements are taken at distinct measurement steps along the elevator shaft 100. A supporting ground station 600 may further be provided to supply localization information to the UAV 200 in order to reduce the drift of the VIO localization approaches in tall buildings or in the case of extreme environmental conditions such as, but not limited to, low lighting, heavy dust or a general lack of visual features. The supporting ground station will be described in more detail with reference to figures 6a and 6b.

[0057] Figures 2 and 3 show a cross-sectional and a top view of an exemplified implementation of an autonomous UAV 200 according to embodiments of the present invention. The autonomous UAV 200 is configured to inspect the internal space of the elevator shaft 100 based on information gathered from a sensor module. As shown in figures 2 and 3, the sensor module comprises a range of sensory instruments and processing systems mounted on a frame 211. Frame 211 is provided with a plurality of propellers 290 arranged around frame 211 and coupled to corresponding motors controlled by the processing unit 297. Each propeller 290 may be provided with a guard 280 to protect it from a possible collision with the walls or any other obstacle. The processing unit 297 is an embedded system comprising a plurality of control and processing modules, such as a flight control system 270 configured to control the operation of the propeller motors, a communication module 240, a global positioning system (GPS) 250, and the like. The sensor module comprises a range of sensor units such as a 360-degree LiDaR module 230, a Visual Inertial Odometry camera 210, an upward facing RGB-depth/ToF camera 220, a downward facing RGB-depth/ToF camera 225, a laser and/or sonar altimeter module 260 to measure distances from objects, a battery 215, and the like.

[0058] The UAV 200 is positioned at the starting position by an operator, who then activates the UAV 200. The processing unit 297 is configured, upon receiving the activation signal, to obtain the geographic coordinates of the UAV 200 via the GPS module 250. As the UAV 200 is operated indoors, the GPS signal may not be available. In this case, the geographic coordinates may be entered manually by the operator or may be obtained from a mobile network to which the UAV 200 is connected via the communication module 240, e.g. 5G, 4G, Wi-Fi, and the like. The UAV 200 may be positioned at the lowest entry point of the elevator shaft, in case, for example, the building is under construction and access to the other floors is restricted or not possible.

[0059] The processing unit 297 is configured to control the position of the UAV 200 based on measurements obtained from the sensor module, such as measurements obtained from an inertial measurement unit based on acceleration data, the Visual Inertial Odometry (VIO) camera 210 for localisation of the UAV 200, the RGB-Depth/ToF cameras 220, 225 for structural data acquisition and obstacle detection, the LiDaR module 230, and the like. The processing unit 297 collects the data generated from the sensor module and accordingly calculates a flight path for the UAV 200, which is transmitted for execution to the flight control system 270, which is configured to operate the propeller motors. The processing unit 297 may further adjust the flight path based on the data collected, e.g. in the case of detecting an obstacle and/or to compensate for disturbances such as wind gusts and the like.

[0060] Once the autonomous UAV 200 is activated by the user, the processing unit 297 performs a set of initialisation checks, and if these are successfully completed, the processing unit calculates a flight path from the starting position to the location of the entrance opening 110 of the elevator shaft. The flight control system 270 guides the UAV 200 from the starting position to the internal space of the elevator shaft 100 through the opening 110. Once in the internal space, the processing unit 297 conducts a set of spatial measurements using the sensor module to calibrate the position of the UAV 200 in the centre of the shaft in relation to the width and depth of the shaft, to ensure the maximum possible safety distance from the walls 120. Once the calibration has been completed, the processing unit 297 calculates a flight path for guiding the UAV 200 within the internal space of the shaft 100 to start the inspection of the space while maintaining the UAV at an appropriate position with a minimum preferred distance from the walls and other obstacles. For example, the processing unit 297 may first guide the UAV 200 towards the bottom of the shaft 100 while scanning and collecting data from the surrounding environment using the sensor module. When it reaches the bottom level of the shaft, the processing unit 297 may reverse the course of the UAV 200 to begin inspection of the internal space towards the upper part of the shaft 100. During the inspection of the internal space, the processing unit 297 is configured to collect data from the sensor module associated with the width, depth, height and side openings of shaft 100. During the inspection of the internal space, the measurement data collected may be in the form of visual data and a 3D point cloud, which can be stored in a storage space of the processing unit and/or transmitted over a wireless network (Wi-Fi, 3G, 4G, 5G) to the cloud computing platform 300.
The sensor module may further measure the angle of the walls, the vertical distances per floor, the averages of the width and depth of the openings 110 per floor and between the floors, the bottom (pit) depth, the path from the bottom height of the first floor to the bottom of the top floor, and the height of the top floor.

[0061] The processing unit 297 may process the measurement data collected to determine deviations of the measured dimensions from reference dimensions. The processing unit 297 is configured to identify and label any deviations identified, and the information is included in the information to be transmitted to the cloud computing platform 300.

[0062] During the flight, the processing unit 297 may adjust the flight path of the UAV 200, e.g. by performing manoeuvres, to avoid obstacles or compensate for environmental disturbances, such as wind gusts, which may affect the accuracy and reliability of measurements by adding noise. The processing unit 297 may be configured to analyse the quality of the measurement data collected, and if the quality is below a certain threshold, the processing unit 297 is configured to repeat the measurements to correct errors.

[0063] The cloud platform 300, as previously indicated, is configured to process the 3D point cloud data, the visual data and other information transmitted from the UAV 200 to determine the dimensions of the shaft 100. The cloud platform 300 is configured to store the 3D point cloud data in a database 340 together with the geographical location of the shaft 100, the time and day of measurement, and other information, which is used to generate a unique identifier for the shaft 100. The cloud platform 300 is configured to generate a virtual representation of the shaft 100 based on the information transmitted from the UAV 200, which may be compared with and/or superimposed on a reference model of the shaft. As such, deviations between the measured dimensions and reference dimensions may be easily identified and corrected.
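The deviation identification and labelling described above can be illustrated with a minimal sketch: each measured dimension is compared against the reference model and flagged when outside a tolerance. The dictionary layout, dimension names and the 10 mm tolerance are assumptions for illustration only.

```python
TOLERANCE_M = 0.01  # assumed tolerance: 10 mm


def label_deviations(measured: dict, reference: dict, tol: float = TOLERANCE_M):
    """Return {dimension: deviation_in_metres} for every reference dimension
    whose measured value lies outside the tolerance. Dimensions within
    tolerance are omitted."""
    return {
        name: round(measured[name] - reference[name], 6)
        for name in reference
        if abs(measured[name] - reference[name]) > tol
    }
```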

[0064] The cloud platform 300 may be hosted on one or more computer servers and is accessible by a plurality of users 500 via the internet. The information may be displayed and/or shared with other registered users.

[0065] The UAV 200 may be further provided with a Universal Serial Bus (USB) 3.0 port for computer communication and for charging the UAV battery 215.

[0066] The frame 211 of the UAV 200 may be made of aluminium or carbon fibre, while the casing mounted on the frame 211, which protects the sensor module and processing module, may be made from polymer materials with an IP67 rating for dust and moisture resistance. A special moisture-protective coating may be provided to protect the sensor module and processing unit from moisture.

[0067] Referring now to figures 4a and 4b, the UAV 200 may be provided with a photodiode array 285, which is mounted to the bottom of the UAV 200. The photodiode array 285 may be provided in addition to or as an alternative to the Laser and/or sonar altimeter 260 and may be part of the sensor module of the UAV 200. The photodiode array 285 comprises a plurality of photodiodes equally spaced apart from one another with respect to a central photodiode placed at or near the centre of the photodiode array 285.

[0068] Figures 5a and 5b show a cross-sectional view and a top view, as seen from the bottom of the UAV 200, of the photodiode array 285 shown in Figures 4a and 4b. The array of photodiodes is covered with a semi-reflective glass 2852 that is configured to reflect part of the energy of a laser beam 645 emitted from the supporting ground station 600, while allowing the rest of the laser beam energy to be absorbed by the photodiodes 2851. The semi-reflective glass 2852 may be manufactured and tuned with the desired reflectivity properties using any known process, such as film coating, laser processing, and the like. For example, the semi-reflective glass 2852 may be tuned to reflect 70% of the laser beam energy while allowing 30% of the energy to pass through. However, the reflectivity-to-pass-through ratio may be tuned to any desired value. The semi-reflective glass 2852 may be placed at a predetermined distance from the top of the photodiodes 2851, as shown in figure 4a. The distance between the semi-reflective glass 2852 and the photodiodes 2851 may be in the range of 1.0-10.0 mm or any other desired distance. The photodiodes 2851 may be arranged in an array configuration on top of an electronic pad 2853 that provides an electrical connection between the photodiode array 285 and the rest of the components of the UAV 200, such as the processing unit 297, the battery 215 and the like. The photodiodes 2851 may be arranged at an equal distance from one another, e.g. in the range of 3.0 to 5.0 mm measured from the respective centres of each photodiode 2851. A photodiode 2851 is a semiconductor device that converts light into an electrical current. The electrical current may be measured by the processing unit 297 or a dedicated microcontroller (not shown). The array of photodiodes 285 is mounted in the centre of the bottom part of the UAV 200.
In operation with a supporting ground station 600, the processing unit 297 or the dedicated microcontroller processes the current generated by each photodiode 2851 to detect the location of the laser beam 645 on the photodiode array 285 and thus the displacement of the UAV 200 with respect to the laser beam emitted by the supporting ground station 600. In other words, the photodiode or photodiodes that absorb the maximum energy of the laser beam 645 generate higher current values with respect to the other photodiodes in the array. Since the photodiodes are equally spaced from one another and are evenly distributed about the array with respect to a central location, the location of each photodiode can be identified by its X and Y coordinates in relation to the central location, as shown in figure 5b. A photodiode 2851 may be placed at the central location of the array 285. In this way, the lateral displacement of the UAV 200 with respect to the ground station emitting the laser beam 645 is detected. Combining the distance measurements obtained with a Laser Rangefinder 640 of the ground station 600 from the reflected energy of the laser beam 645 and the lateral displacement measurements provided by the array of photodiodes 285, the position of the UAV 200 in relation to the Ground Station rotating base 650 can be calculated. The dimensions of the photodiode array 285 and the distance between two photodiodes 2851 may be varied according to the accuracy requirements of the desired application.

[0069] Figures 6a and 6b show cross-sectional views of an exemplified implementation of the supporting Ground Station system 600, respectively in an open position and a closed position.
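One plausible way to read the photodiode array described above is a current-weighted centroid over the known photodiode positions; the disclosure itself only requires identifying the photodiode(s) with maximum current, so the centroid refinement, the data layout and the function name are assumptions.

```python
def spot_position(readings):
    """Estimate the laser-spot location on the photodiode array.

    readings: list of ((x_mm, y_mm), current_amps) tuples, where (x, y)
    is each photodiode's position relative to the central photodiode.
    Returns the current-weighted centroid (x, y) in mm, i.e. the lateral
    displacement of the UAV from the beam axis.
    (Hypothetical sketch; not the claimed implementation.)"""
    total = sum(current for _, current in readings)
    if total == 0:
        raise ValueError("no laser energy detected on the array")
    x = sum(pos[0] * current for pos, current in readings) / total
    y = sum(pos[1] * current for pos, current in readings) / total
    return x, y
```

A spot centred on the array yields (0, 0); energy concentrated on an off-centre diode shifts the estimate towards that diode.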
The supporting ground station 600, as previously described, is configured to provide localisation information to the UAV 200 in order to reduce the drift of VIO localisation approaches in tall buildings or in extreme environmental conditions such as, but not limited to, low lighting, heavy dust or a general lack of visual features. Referring to figures 6a and 6b, the components of the Ground Station 600 are enclosed in a case 610 manufactured for the transportation and protection of its components. The upper part of the case 615 is mounted on a rotational actuator 620, i.e., an electrical motor that can rotate the upper part of the case 615 between a closed position and an open position so that it can be transferred to the inner space of the building passageway. On the upper part of the case 615, a Laser Rangefinder 640 is mounted that is configured to measure the distance from the Ground Station to the UAV 200 by emitting a laser beam 645. As previously discussed with reference to figures 4a, 4b, 5a and 5b, part of the energy of the emitted laser beam 645 is reflected by the semi-reflective glass 2852 of the photodiode array 285, which is located at the bottom part of the UAV. The reflected energy is collected by the Laser Rangefinder 640, and the information is processed by a local computer 675 to determine the distance from the Ground Station 600 to the autonomous UAV 200. The Laser Rangefinder 640 is mounted on a Rotating base 650 that is configured to be rotated by two rotational actuators 625, 630 that provide two degrees of freedom to the Rotating base 650 so that the laser beam 645 can be aligned with the gravity vector of the building passageway 100. The inclination of the Rotating base 650 with respect to the gravity vector is measured by a high-precision Inclinometer/Gyroscope mounted on the Rotating base 650.

[0070] As shown in Figure 7, the Ground Station 600 is configured to communicate during operation with the UAV 200. For that purpose, the ground station 600 is equipped with a Wireless Communication Module 670 configured to transfer data to and from the autonomous UAV 200, such as the distance measured by the Laser Rangefinder 640 and the like. A computing unit 675, provided in the case 610, is configured to operate the actuators 620, 625, 630 for the upper part of the case 615 and the rotating base 650 so that the emitted laser beam 645 is aligned with the gravity vector. The local computer 675 reads the Laser Rangefinder 640 sensor output and transmits the resulting distance and/or the raw sensor data collected to the UAV 200 via a wireless communication link provided by the communication Module 670. A battery 680 is provided to power the supporting Ground Station 600 during operation.
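The two measurements described above can be combined in a simple way: the Laser Rangefinder 640 gives the distance along the gravity-aligned beam, and the photodiode array 285 gives the lateral offset of the UAV from the beam axis; together they fix the UAV position relative to the rotating base 650. The axis convention and units below are assumptions for illustration.

```python
def uav_position(range_m: float, offset_x_mm: float, offset_y_mm: float):
    """Return the UAV position (x, y, z) in metres relative to the
    ground-station rotating base, assuming the beam is aligned with the
    gravity vector and z is measured along the beam.
    (Hypothetical sketch of the combination described in the text.)"""
    return (offset_x_mm / 1000.0, offset_y_mm / 1000.0, range_m)
```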

[0071] Referring now to Figure 8, for an elevator shaft 100 the structural measurements include specific height measurements, namely the distance between the bottom level 130 of the shaft 100 and the bottom of the first opening 110a, the distance between the bottom level of the last opening 110b and the top level 140, and the distance between the bottom level of the first opening 110a and the bottom level of the last opening 110b. Regarding the depth and width of the shaft 100, the values are calculated with respect to the gravity vector at the geographic location of the building passageway, taking into account the absolute verticality, which is calculated via the acceleration measurement from the IMU sensor and an inclinometer of the UAV. In particular, distance measurements are calculated at densely sampled levels 111 within the shaft 100 between the gravity vector and the side walls of the shaft 100, and the minimum values along the depth and width directions are stored. This calculation process can be conducted both by the UAV 200 during the flight and by the central computer platform 300 in the post-processing procedure. The sampled levels 111 are taken, as previously described, at predetermined time steps.
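The width/depth extraction in paragraph [0071] can be sketched as follows: at each sampled level 111, distances from the gravity vector to the opposing side walls are summed per direction, and the minima over all levels are retained. The tuple layout of the per-level readings is an assumption.

```python
def min_width_depth(levels):
    """levels: list of (left, right, front, back) distances, in metres,
    measured at each densely sampled level 111 from the gravity vector
    to the four side walls 120.
    Returns (min_width, min_depth) over all sampled levels, as stored
    by the UAV 200 or the central computer platform 300."""
    widths = [left + right for left, right, _, _ in levels]
    depths = [front + back for _, _, front, back in levels]
    return min(widths), min(depths)
```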

[0072] An exemplified method for operating the UAV 200 of the present invention is provided below. The method comprises the steps of:

- positioning the UAV 200 in a starting position, such as in front of the entrance opening 110 of the elevator shaft 100;

- scanning, by the sensor module and/or the LiDaR module 230, the surrounding environment;

- processing, by a processing unit 297, the scan data generated by the sensor module and/or LiDaR module to determine an opening of the elevator shaft 100;

- calculating, by the processing unit 297 and based on the scan data collected, a flight path to the opening;

- guiding, by the processing unit 297, the UAV 200 through the opening 110 and into an internal space of the elevator shaft 100;

- performing, by the sensor module and/or LiDaR module 230, a set of spatial measurements to determine distances from the walls 120 of the elevator shaft;

- positioning, by the processing unit 297, the UAV 200 at a predetermined distance from the walls 120 of the elevator shaft 100;

- inspecting the internal space of the elevator shaft 100 by performing the steps of:

- guiding, by the processing unit 297, the UAV 200 between successively calculated target points, also referred to as goals, defined with respect to a global reference point along a computed flight path within the elevator shaft 100, the flight path being calculated by the processing unit 297 based on data collected by the sensor and/or Lidar module 230 and/or instruction stored in the memory of the processing unit 297 or received by a user,

- scanning, by the sensor module and LiDaR module 230, the internal space at predetermined time steps along the calculated flight path,

- collecting, by the processing unit 297, the scan data collected at each time step;

wherein, upon completing the inspection of the internal space of the elevator shaft 100, the processing unit 297 performs the steps of:

- guiding the UAV 200 to the entrance opening 110;

- calculating a flight path from the entrance opening to the starting position;

- guiding the UAV 200 to the starting position; and

- communicating the scan data collected to a central computing platform.

[0073] The step of guiding the UAV between successively calculated target points comprises the steps of:

- calculating a minimum jerk trajectory from a UAV current position to the position of the calculated target point within the internal space of the building passageway; and

- obtaining information from an occupancy grid between the UAV current position and the target point along the computed flight path. The current UAV position is calculated with respect to the global reference point, which is associated with the UAV starting position.
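The minimum jerk trajectory step can be sketched with the standard minimum-jerk (quintic) profile, s(t) = 10(t/T)³ − 15(t/T)⁴ + 6(t/T)⁵, which starts and ends with zero velocity and acceleration. Applying this closed form independently per axis is an assumption for illustration; the disclosure does not fix the exact trajectory form.

```python
def minimum_jerk_point(start, goal, t: float, duration: float):
    """Position on a minimum-jerk trajectory from `start` to `goal`
    at time t in [0, duration]. `start` and `goal` are (x, y, z) tuples.
    (Illustrative sketch of the trajectory step described in [0073].)"""
    tau = max(0.0, min(1.0, t / duration))  # normalised time in [0, 1]
    s = 10 * tau**3 - 15 * tau**4 + 6 * tau**5  # minimum-jerk blend factor
    return tuple(p0 + s * (p1 - p0) for p0, p1 in zip(start, goal))
```

At the midpoint of the flight time, the blend factor is exactly 0.5, so the UAV is halfway between the current position and the goal.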

[0074] The different stages of the UAV 200 operation for performing the inspection of the building passageways are exemplified with the method steps shown below:

Activation stage:

1. Position the UAV 200 on the ground floor of the elevator shaft 100 facing the shaft opening 110 (door).

2. Start all sensors and activate the processing unit 297.

3. Obtain the laser value in the forward direction of the 360-degree rotating lidar 230 to determine the distance to the shaft’s back wall 120.

4. Obtain the acceleration measurement from an IMU sensor to calculate the verticality of the global reference frame, which is associated with the starting position of the UAV. For example, the global reference point may be the coordinates of the UAV starting position, defined as (x,y,z) = (0,0,0), along with the unit vectors which define the reference directions along which the orientation of the UAV is expressed. The verticality is determined by calculating the angle between the z-axis of the global reference frame and the gravity vector, as calculated by the IMU.

5. The UAV’s pose (position and orientation) with respect to a global reference frame is calculated during the flight from the VIO camera 210, the RGB-Depth/ToF Cameras 220, 225 and the 360-degree rotating lidar 230, combined with the Laser and/or sonar altimeter 260 and the IMU (Inertial Measurement Unit). This information fusion contributes not only to the accuracy of the measurements but also to the flight robustness in case one method fails due to a hardware problem or harsh environmental conditions (e.g. dust, poor illumination). At predetermined time steps, the information obtained from the RGB-Depth/ToF Cameras 220, 225, the 360-degree rotating lidar 230, and the Laser and/or sonar altimeter 260 is represented as 3D points expressed with respect to the global reference frame and is inserted into an occupancy grid, which indicates the possibility of a cuboid in the 3D space being occupied by an obstacle.

6. Arm propeller motors and take off.
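The occupancy-grid bookkeeping of step 5 can be sketched as a voxel set: each 3D point, expressed in the global reference frame, marks the cuboid that contains it as possibly occupied. The 10 cm voxel size and the set-based representation are assumptions for illustration.

```python
import math

VOXEL_SIZE_M = 0.1  # assumed: 10 cm cuboids


def voxel_index(point, voxel=VOXEL_SIZE_M):
    """Map a 3D point (metres, global reference frame) to its cuboid index."""
    return tuple(math.floor(c / voxel) for c in point)


def insert_points(grid: set, points, voxel=VOXEL_SIZE_M):
    """Insert sensor points into the occupancy grid: mark the cuboids
    containing the given points as possibly occupied."""
    for p in points:
        grid.add(voxel_index(p, voxel))
    return grid


def is_occupied(grid: set, point, voxel=VOXEL_SIZE_M) -> bool:
    """Query the grid, e.g. during trajectory calculation, to avoid
    possible collisions with obstacles in the space."""
    return voxel_index(point, voxel) in grid
```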

Entering the elevator shaft

7. Calculate, at the processing unit 297, a first goal, also referred to as a target point, to guide the UAV 200 from the starting position to the entrance position inside the elevator shaft 100.

8. Calculate the minimum jerk trajectory towards the first goal, obtaining information from the occupancy grid to avoid possible collisions with obstacles in the space.

9. At each time step, give the target position of the calculated trajectory as an input to a Model Predictive Controller of the processing module 297, which takes into account the dynamic parameters of the UAV 200, e.g. mass, inertia and the like, and the current pose and velocity of the UAV 200 (see step 5).

10. Calculate, by the Model Predictive Controller, the reference velocities for the flight controller 270.

11. At each time step, the flight controller 270 receives as input the reference velocities from step 10 and takes into account the current velocity of the UAV 200 (state estimation at step 5) in order to calculate the angular velocity of each motor, thereby reducing the error between the reference velocity and the current velocity.

Descend to the bottom level

12. Calculate, at the processing unit 297, the second goal to guide the UAV 200 from the entry position to the lowermost position of the shaft 100.

13. Calculate the minimum jerk trajectory towards the second goal, obtaining information from the occupancy grid to avoid possible collisions with obstacles in the space.

14. Repeat step 10

15. Repeat step 11

16. At each time step, the information obtained from the RGB-Depth/ToF Cameras 220, 225, and/or the 360-degree rotating lidar 230, and/or the Laser and/or sonar altimeter 260, and/or the photodiode array 285 and the supporting ground station 600 is represented as 3D points expressed with respect to the global reference frame and is inserted into a global Point Cloud, which will be used in post-processing for the extraction of structural measurements of the elevator shaft 100 in the cloud platform.

17. At each time step, calculate, by the processing unit 297, the distance between a reference point in the x,y plane and the side walls 120 to estimate, during the flight, the structural measurements of the elevator shaft. Calculate the height of each floor.

Ascend to Top Level

18. Calculate the i-th goal to guide the UAV 200 from the lowermost position gradually to the top-level position in the elevator shaft 100. The i-th intermediate goal is set along the vertical direction, the z-axis of the global reference frame, and within the maximum range of the RGB-Depth Camera 220, so the UAV 200 can have accurate information about the operating environment.

19. Calculate the minimum jerk trajectory towards the i-th goal, obtaining information from the occupancy grid to avoid possible collisions with obstacles in the space.

20. During the ascent, repeat step 10.

21. During the ascent, repeat step 11.

22. During the ascent, repeat step 16.

23. During the ascent, repeat step 17.

24. When the top-level position is reached, calculate the total height of the shaft 100.

Descend to Entrance Level

25. Use the same goal points calculated for the ascent to guide the UAV 200 from the top-level position gradually to the entrance position in the elevator shaft 100.

26. During the descent, repeat step 10.

27. During the descent, repeat step 11.

28. During the descent, repeat step 16.

29. During the descent, repeat step 17.

Exit shaft

30. Calculate the minimum jerk trajectory from the entrance position to the starting position, obtaining information from the occupancy grid to avoid possible collisions with obstacles in the space.

31. Repeat step 10

32. Repeat step 11

33. When reaching the starting position, the UAV 200 lands.

34. If possible, transmit the structural measurements and the Point Cloud data through wireless connectivity networks to the Cloud Platform 300.