Title:
CALIBRATION TECHNIQUE FOR CAPTURING PANORAMIC IMAGES
Document Type and Number:
WIPO Patent Application WO/2017/098090
Kind Code:
A1
Abstract:
A method, an apparatus and a sensor device system for calibration are provided. The method includes mounting the sensor device on a platform (405). The method further includes facilitating rotating the platform to provide a movement of the sensor device in a substantially eccentric circle (410). The method also includes receiving a set of signals from the sensor device while the platform is rotating (415). The method further includes determining a center of projection and one or more rotation angles of the sensor device, based on analysis of the set of signals (420). The method further includes determining a center of rotation of the platform (425). The method also includes determining a displacement between the center of rotation of the platform and the center of projection of the sensor device (430). The method further includes facilitating moving the sensor device by the determined displacement (435).

Inventors:
KAUHANEN HEIKKI (FI)
Application Number:
PCT/FI2016/050867
Publication Date:
June 15, 2017
Filing Date:
December 12, 2016
Assignee:
AALTO UNIV FOUND (FI)
International Classes:
F16M11/08; G01C11/02; G01M11/02; G02B7/00; G03B17/56; G03B37/02; G12B5/00; G12B13/00
Domestic Patent References:
WO2007031248A2, 2007-03-22
Foreign References:
CN101477294A, 2009-07-08
Other References:
TOMASI, C. ET AL.: "How to rotate a camera", PROCEEDINGS OF INTERNATIONAL CONFERENCE ON IMAGE ANALYSIS AND PROCESSING, 27 September 1999 (1999-09-27), Venice, Italy, pages 606 - 611, XP010354240
KUKKO, A.: "A new method for perspective centre alignment for spherical panoramic imaging", THE PHOTOGRAMMETRIC JOURNAL OF FINLAND, vol. 19, no. 1, 2004, pages 37 - 46, XP055392827
QU, Y. ET AL.: "Multimodal 3D panoramic imaging using a precise rotating platform", PROCEEDINGS OF INTERNATIONAL CONFERENCE ON ADVANCED INTELLIGENT MECHATRONICS, 6 July 2010 (2010-07-06), Montreal, Canada, pages 260 - 265, XP031855728
KAUHANEN, H. ET AL.: "Motorized panoramic camera mount - calibration and image capture", PROCEEDINGS OF ISPRS ANNALS OF THE PHOTOGRAMMETRY, REMOTE SENSING AND SPATIAL INFORMATION SCIENCES, vol. III-5, 12 July 2016 (2016-07-12), Prague, Czech Republic, pages 89 - 96, XP055392831
Attorney, Agent or Firm:
PAPULA OY (FI)
Claims:
CLAIMS

What is claimed is:

1. A method, comprising:

mounting at least one sensor device on a platform, a plane of the platform being defined by a first axis and a second axis substantially perpendicular to the first axis;

facilitating rotating the platform about a third axis substantially perpendicular to the first axis and the second axis, to provide a movement of the at least one sensor device in a substantially eccentric circle;

receiving a set of signals from the at least one sensor device while the platform is rotating;

determining a center of projection and one or more rotation angles of the at least one sensor device, based on analysis of the set of signals;

determining a center of rotation of the platform by minimizing a function based on the center of projection and the one or more rotation angles of the at least one sensor device;

determining a displacement between the center of rotation of the platform and the center of projection of the at least one sensor device; and

facilitating moving the at least one sensor device by the determined displacement.

2. The method as claimed in claim 1, wherein the at least one sensor device is fixedly oriented with respect to one of the first axis and the second axis.

3. The method as claimed in claim 1, wherein the displacement comprises distances for movement of the at least one sensor device along directions of the first axis and the second axis.

4. The method as claimed in claim 1, further comprising:

facilitating rotating the platform about the first axis, to provide a movement of the at least one sensor device in a substantially eccentric circle;

receiving a subsequent set of signals from the at least one sensor device while the platform is rotating;

determining a subsequent center of projection and one or more subsequent rotation angles of the at least one sensor device, based on analysis of the subsequent set of signals;

determining a subsequent center of rotation of the platform by minimizing a function based on the subsequent center of projection and the one or more subsequent rotation angles of the at least one sensor device;

determining a subsequent displacement between the subsequent center of rotation of the platform and the subsequent center of projection of the at least one sensor device; and

facilitating moving the at least one sensor device by the determined subsequent displacement.

5. The method as claimed in claim 4, wherein the subsequent displacement comprises a distance for movement of the at least one sensor device at least along a direction of the third axis.

6. The method as claimed in claim 4 further comprising, facilitating automating the movement of the at least one sensor device by the determined displacement and the determined subsequent displacement.

7. The method as claimed in claim 4 further comprising, facilitating automating triggering of the at least one sensor device for capturing the set of signals and the subsequent set of signals.

8. The method as claimed in claim 1, wherein the function is a circular fit function.

9. The method as claimed in claim 1, wherein the at least one sensor device is one or more of an imaging device and a laser ranging device.

10. The method as claimed in claim 9, wherein the at least one sensor device is an imaging device.

11. The method as claimed in claim 10, providing for calibrating the platform for capturing a concentric panoramic image.

12. An apparatus, comprising: a memory to store platform calibration instructions; and a processor electronically coupled with the memory, the processor configured to execute the platform calibration instructions stored in the memory to cause the apparatus to perform at least:

facilitating rotating a platform, onto which at least one sensor device is mounted and a plane of which is defined by a first axis and a second axis substantially perpendicular to the first axis, about a third axis substantially perpendicular to the first axis and the second axis, to provide a movement of the at least one sensor device in a substantially eccentric circle;

receiving a set of signals from the at least one sensor device while the platform is rotating;

determining a center of projection and one or more rotation angles of the at least one sensor device, based on analysis of the set of signals;

determining a center of rotation of the platform by minimizing a function based on the center of projection and the one or more rotation angles of the at least one sensor device;

determining a displacement between the center of rotation of the platform and the center of projection of the at least one sensor device; and

facilitating moving the at least one sensor device by the determined displacement.

13. The apparatus as claimed in claim 12, wherein the processor is configured to execute the platform calibration instructions stored in the memory to cause the apparatus to further perform at least:

facilitating rotating the platform about the first axis, to provide a movement of the at least one sensor device in a substantially eccentric circle;

receiving a subsequent set of signals from the at least one sensor device while the platform is rotating;

determining a subsequent center of projection and one or more subsequent rotation angles of the at least one sensor device, based on analysis of the subsequent set of signals;

determining a subsequent center of rotation of the platform by minimizing a function based on the subsequent center of projection and the one or more subsequent rotation angles of the at least one sensor device;

determining a subsequent displacement between the subsequent center of rotation of the platform and the subsequent center of projection of the at least one sensor device; and

facilitating moving the at least one sensor device by the determined subsequent displacement.

14. The apparatus as claimed in claim 13, wherein the processor is further configured to execute the platform calibration instructions stored in the memory to cause the apparatus to further perform at least automatic movement of the at least one sensor device by the determined displacement and the determined subsequent displacement.

15. The apparatus as claimed in claim 13, wherein the processor is further configured to execute the platform calibration instructions stored in the memory to cause the apparatus to further perform at least automatic triggering of the at least one sensor device for capturing the set of signals and the subsequent set of signals.

16. A sensor device system, comprising:

at least one sensor device;

a platform configured to allow mounting of the at least one sensor device thereon, the platform enabling a movement of the at least one sensor device about three substantially perpendicular axes: a first axis, a second axis and a third axis;

a memory to store platform calibration instructions; and

a processor electronically coupled with the memory, the processor configured to execute the platform calibration instructions stored in the memory to cause the apparatus to perform at least:

facilitating rotating the platform about the third axis to provide a movement of the at least one sensor device in a substantially eccentric circle;

receiving a set of signals from the at least one sensor device while the platform is rotating;

determining a center of projection and one or more rotation angles of the at least one sensor device, based on analysis of the set of signals;

determining a center of rotation of the platform by minimizing a function based on the center of projection and the one or more rotation angles of the at least one sensor device;

determining a displacement between the center of rotation of the platform and the center of projection of the at least one sensor device; and

facilitating moving the at least one sensor device by the determined displacement.

17. The sensor device system as claimed in claim 16, wherein the processor is configured to execute the platform calibration instructions stored in the memory to cause the apparatus to further perform at least:

facilitating rotating the platform about the first axis, to provide a movement of the at least one sensor device in a substantially eccentric circle;

receiving a subsequent set of signals from the at least one sensor device while the platform is rotating;

determining a subsequent center of projection and one or more subsequent rotation angles of the at least one sensor device, based on analysis of the subsequent set of signals;

determining a subsequent center of rotation of the platform by minimizing a function based on the subsequent center of projection and the one or more subsequent rotation angles of the at least one sensor device;

determining a subsequent displacement between the subsequent center of rotation of the platform and the subsequent center of projection of the at least one sensor device; and

facilitating moving the at least one sensor device by the determined subsequent displacement.

18. The sensor device system as claimed in claim 17, wherein the platform comprises:

three rails coupled to each other and disposed substantially perpendicular to each other along the first axis, the second axis and the third axis;

at least two translational actuators mounted, respectively, on two of the rails and configured to provide translational movement of the at least one sensor device along the corresponding two rails; and

at least one rotational actuator coupled to one of the rails and configured to provide rotational movement of the at least one sensor device about the corresponding rail.

19. The sensor device system as claimed in claim 18, wherein the platform comprises: at least three translational actuators, with each of the translational actuators mounted on each of the three rails and configured to provide translational movement of the at least one sensor device along the corresponding rail; and

at least three rotational actuators, with each of the at least three rotational actuators coupled to each of the rails and configured to provide rotational movement of the at least one sensor device about the corresponding rail.

20. The sensor device system as claimed in claim 18, wherein the processor is configured to regulate the translational and rotational actuators to automatically move the at least one sensor device by the determined displacement and subsequent displacement.

Description:
CALIBRATION TECHNIQUE FOR CAPTURING PANORAMIC IMAGES

TECHNICAL FIELD

[0001] The present disclosure generally relates to photogrammetry and, more particularly, to a calibration technique for a platform, on which a sensor device is mounted, for capturing panoramic images and other sensor data.

BACKGROUND

[0002] Panoramic images have been created almost since the advent of cameras. Panoramic images provide an impressive and immersive experience of the imaged scene. Recently, panoramic images have attracted more attention with applications in virtual navigation, such as with tools like Google Street View. In addition, developments in virtual reality glasses have created wide interest in capturing better panoramic images. In particular, these applications require capturing concentric panoramic images, which provide the viewer with a sense of depth in the captured image. Moreover, such concentric panoramic images are required for photogrammetric applications, for example, for producing mapping data utilizing Structure-from-Motion (SfM) reconstruction techniques or the like.

[0003] Panoramic images can be captured with a rotating frame or a linear array camera system. Typically, these systems have only one camera sensor, which captures multiple sub-images that are later stitched together. For creating a concentric panoramic image, it is desired that all the sub-images have the same center of projection, that is, that there are no perspective distortions. It is known that the presence of perspective distortions can make seamless image stitching more difficult, and sometimes even impossible. Generally, panoramic images are captured using a platform. Concentric image capturing requires accurate placement of the camera on the platform to ensure that the center of rotation of the platform and the center of projection of the camera coincide.

[0004] Hence, techniques are needed for solving the amounts and directions of the camera shifts necessary to calibrate the platform so that the center of rotation of the platform and the center of projection of the camera coincide, thereby ensuring concentric image capturing. Such techniques may also be beneficial for sensor devices other than a camera, or in combination with a camera, such as a laser ranging device. Further, there is a desire to provide a platform which can automate such a calibration technique.

SUMMARY

[0005] Various methods, systems and computer readable mediums for calibrating a platform for capturing concentric panoramic images are disclosed.

[0006] In an embodiment, a method is disclosed. The method includes mounting at least one sensor device on a platform. A plane of the platform is defined by a first axis and a second axis substantially perpendicular to the first axis. Further, the method includes facilitating rotating the platform about a third axis substantially perpendicular to the first axis and the second axis, to provide a movement of the at least one sensor device in a substantially eccentric circle. Furthermore, the method also includes receiving a set of signals from the at least one sensor device while the platform is rotating. The method also includes determining a center of projection and one or more rotation angles of the at least one sensor device, based on analysis of the set of signals. The method further includes determining a center of rotation of the platform by minimizing a function based on the center of projection and the one or more rotation angles of the at least one sensor device. The method further includes determining a displacement between the center of rotation of the platform and the center of projection of the at least one sensor device. Based on the determined displacement, the sensor device is moved for calibrating the platform.
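The circle-fit step summarized above can be illustrated numerically: projection centers estimated at successive platform angles lie on the eccentric circle, and a least-squares fit of that circle recovers its center, i.e. the center of rotation; the displacement to apply is the difference between that center and the sensor's current center of projection. The sketch below uses a Kasa-style circle fit on simulated data; the function names and values are illustrative, not taken from the disclosure.

```python
import numpy as np

def fit_circle(points):
    """Least-squares (Kasa) circle fit to 2D points.

    Solves a*x + b*y + c = -(x^2 + y^2) for the circle
    x^2 + y^2 + a*x + b*y + c = 0, then recovers center and radius.
    """
    pts = np.asarray(points, dtype=float)
    x, y = pts[:, 0], pts[:, 1]
    A = np.column_stack([x, y, np.ones_like(x)])
    rhs = -(x**2 + y**2)
    (a, b, c), *_ = np.linalg.lstsq(A, rhs, rcond=None)
    center = np.array([-a / 2.0, -b / 2.0])
    radius = np.sqrt(center @ center - c)
    return center, radius

# Simulated calibration run: projection centers observed while the
# platform rotates; the eccentric circle's center is the platform's
# center of rotation (unknown to the fit).
angles = np.linspace(0.0, 2.0 * np.pi, 12, endpoint=False)
true_center = np.array([5.0, -3.0])
offset = 2.0  # eccentricity of the mount
observed = true_center + offset * np.column_stack([np.cos(angles),
                                                   np.sin(angles)])

center_of_rotation, r = fit_circle(observed)
# Displacement to apply: move the sensor so its center of projection
# coincides with the fitted center of rotation.
displacement = center_of_rotation - observed[0]
```

With noisy real measurements the same fit applies, with the residual of the least-squares solve indicating how well the eccentric-circle model holds.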

[0007] In another embodiment, an apparatus for calibrating a platform is disclosed. The apparatus includes a memory and a processor. The memory stores platform calibration instructions and the processor is electronically coupled with the memory. The processor is configured to execute the platform calibration instructions stored in the memory to cause the apparatus to perform facilitating rotating the platform onto which at least one sensor device is mounted and a plane of which is defined by a first axis and a second axis substantially perpendicular to the first axis. The platform is rotated about a third axis substantially perpendicular to the first axis and the second axis, to provide a movement of the at least one sensor device in a substantially eccentric circle. The apparatus also receives a set of signals from the at least one sensor device while the platform is rotating. The apparatus further determines a center of projection and one or more rotation angles of the at least one sensor device, based on analysis of the set of signals. The apparatus also determines a center of rotation of the platform by minimizing a function based on the center of projection and the one or more rotation angles of the at least one sensor device. The apparatus also determines a displacement between the center of rotation of the platform and the center of projection of the at least one sensor device. The apparatus facilitates moving the sensor device by the determined displacement.

[0008] In yet another embodiment, a sensor device system is disclosed. The sensor device system includes at least one sensor device. The sensor device system also includes a platform configured to allow mounting of the at least one sensor device thereon. The platform enables a movement of the at least one sensor device about three substantially perpendicular axes: a first axis, a second axis and a third axis. The sensor device system further includes a memory to store platform calibration instructions. The sensor device system also includes a processor electronically coupled with the memory. The processor is configured to execute the platform calibration instructions stored in the memory to cause the apparatus to perform at least facilitating rotating the platform about the third axis to provide a movement of the at least one sensor device in a substantially eccentric circle. The apparatus further receives a set of signals from the at least one sensor device while the platform is rotating. The apparatus also determines a center of projection and one or more rotation angles of the at least one sensor device, based on analysis of the set of signals. The apparatus further determines a center of rotation of the platform by minimizing a function based on the center of projection and the one or more rotation angles of the at least one sensor device. The apparatus further determines a displacement between the center of rotation of the platform and the center of projection of the at least one sensor device. The apparatus facilitates moving the sensor device by the determined displacement.

[0009] Other aspects and example embodiments are provided in the drawings and the detailed description that follows.

BRIEF DESCRIPTION OF THE FIGURES

[0010] For a more complete understanding of example embodiments of the present technology, reference is now made to the following descriptions taken in connection with the accompanying drawings in which:

FIG. 1 illustrates a block diagram representation of an apparatus, in accordance with an example embodiment;

FIG. 2 illustrates a perspective view of a sensor device system, in accordance with an example embodiment;

FIGS. 3A and 3B illustrate schematic views of the sensor device system, in accordance with an example embodiment;

FIG. 4 illustrates a flow diagram depicting a method for calibration of a platform, in accordance with an example embodiment;

FIG. 5 illustrates a perspective view of a non-motorized mount assembly, in accordance with an example embodiment;

FIG. 6 illustrates a geometric representation to determine displacements for a sensor device about one set of axes, in accordance with an example embodiment;

FIG. 7 illustrates a geometric representation to determine displacements for a sensor device about another set of axes, in accordance with an example embodiment;

FIG. 8 illustrates a geometric representation for correlating two sets of signals to verify an accuracy of calibration, in accordance with an example embodiment; and

FIG. 9 illustrates a sample panoramic image as captured by a sensor device system, in accordance with an example embodiment.

[0011] The drawings referred to in this description are not to be understood as being drawn to scale except if specifically noted, and such drawings are only exemplary in nature.

DETAILED DESCRIPTION

[0012] In the following description, for purposes of explanation, numerous specific details are set forth in order to provide a thorough understanding of the present disclosure. It will be apparent, however, to one skilled in the art that the present disclosure can be practiced without these specific details. In other instances, apparatuses and methods are shown in block diagram form only in order to avoid obscuring the present disclosure.

[0013] Reference in this specification to "one embodiment" or "an embodiment" means that a particular feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment of the present disclosure. Appearances of the phrase "in one embodiment" in various places in the specification are not necessarily all referring to the same embodiment, nor are separate or alternative embodiments mutually exclusive of other embodiments. Moreover, various features are described which may be exhibited by some embodiments and not by others. Similarly, various requirements are described which may be requirements for some embodiments but not for other embodiments.

[0014] Moreover, although the following description contains many specifics for the purposes of illustration, anyone skilled in the art will appreciate that many variations and/or alterations to said details are within the scope of the present disclosure. Similarly, although many of the features of the present disclosure are described in terms of each other, or in conjunction with each other, one skilled in the art will appreciate that many of these features can be provided independently of other features. Accordingly, this description of the present disclosure is set forth without any loss of generality to, and without imposing limitations upon, the present disclosure.

[0015] The term 'panoramic image', as used herein, refers to images with horizontally elongated fields of view. Further, the term 'concentric panoramic image' refers to a panoramic image stitched by using sub-images with coinciding centers of projection. The term 'center of projection' for a sensor device, as used herein, refers to a virtual center where substantially all the optical rays captured by the sensor device coincide. The term 'center of rotation' for a platform, as used herein, refers to a point about which the platform rotates for movement of a sensor device mounted thereon.

[0016] FIG. 1 illustrates a block diagram of an apparatus 100 for implementing the systems and methods of the disclosure. It is understood that the apparatus 100 as illustrated and hereinafter described is merely illustrative of an apparatus that could benefit from embodiments of the disclosure and, therefore, should not be taken to limit the scope of the disclosure. The apparatus 100 may be any computing or data processing machine, for example, a laptop computer, a tablet computer, a mobile phone, a server, and the like. It is noted that the apparatus 100 may include fewer or more components than those depicted in FIG. 1. Moreover, the apparatus 100 may be implemented as a centralized device, or, alternatively, various components of the apparatus 100 may be deployed in a distributed manner while being operatively coupled to each other. In an embodiment, one or more components of the apparatus 100 may be implemented as a set of software layers on top of existing hardware systems.

[0017] In at least one example embodiment, the apparatus 100 includes at least one processor, for example, a processor 102, and at least one memory, for example, a memory 104. The memory 104 is capable of storing machine executable instructions, particularly the image processing instructions. Further, the processor 102 is capable of executing the stored machine executable instructions. The processor 102 may be embodied in a number of different ways. In an embodiment, the processor 102 may be embodied as one or more of various processing devices, such as a coprocessor, a microprocessor, a controller, a digital signal processor (DSP), processing circuitry with or without an accompanying DSP, or various other processing devices including integrated circuits such as, for example, an application specific integrated circuit (ASIC), a field programmable gate array (FPGA), a microcontroller unit (MCU), a hardware accelerator, a special-purpose computer chip, or the like. In an example embodiment, the processor 102 is an Arduino-based or similar processing unit. In at least one example embodiment, the processor 102 utilizes computer program code to cause the apparatus 100 to perform one or more actions responsible for calibration of a system for capturing concentric panoramic images. The processor 102 is programmed to provide a suitable horizontal and vertical overlap between images, depending on the image capture requirements, for creating the best possible panoramic images.
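The choice of horizontal overlap mentioned above determines how many capture stops a full sweep needs. Under a simple pinhole-camera assumption, the arithmetic is direct: each new image contributes its field of view minus the overlapped portion. A brief sketch (the function name and parameter values are illustrative, not from the disclosure):

```python
import math

def yaw_stops(h_fov_deg, overlap):
    """Number of capture positions for a 360-degree panorama, given the
    horizontal field of view (degrees) and the overlap fraction between
    adjacent sub-images (0 <= overlap < 1)."""
    step = h_fov_deg * (1.0 - overlap)  # new angular coverage per image
    return math.ceil(360.0 / step)

# Example: a 60-degree horizontal FOV with 30% overlap gives a step of
# 42 degrees per image, so 9 stops cover the full circle.
n = yaw_stops(60.0, 0.30)
```

The same relation applies vertically for the pitch sweep, with the vertical field of view and the chosen vertical overlap.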

[0018] The memory 104 may be embodied as one or more volatile memory devices, one or more non-volatile memory devices, and/or a combination of one or more volatile memory devices and non-volatile memory devices. For example, the memory 104 may be embodied as magnetic storage devices (such as hard disk drives, floppy disks, magnetic tapes, etc.), optical magnetic storage devices (e.g., magneto-optical disks), CD-ROM (compact disc read only memory), CD-R (compact disc recordable), CD-R/W (compact disc rewritable), DVD (Digital Versatile Disc), BD (Blu-ray® Disc), and semiconductor memories (such as mask ROM, PROM (programmable ROM), EPROM (erasable PROM), flash ROM, RAM (random access memory), etc.).

[0019] In at least one embodiment, the apparatus 100 includes a user interface 106 (also referred to as UI 106) for providing an output and/or receiving an input. The user interface 106 is configured to be in communication with the processor 102 and the memory 104. Examples of the user interface 106 include, but are not limited to, an input interface and/or an output interface. Examples of the input interface may include, but are not limited to, a keyboard, a mouse, a joystick, a keypad, a touch screen, soft keys, a microphone, and the like. Examples of the output interface may include, but are not limited to, a display such as light emitting diode display, thin-film transistor (TFT) display, liquid crystal display, active-matrix organic light-emitting diode (AMOLED) display, a microphone, a speaker, ringers, vibrators, and the like. In an example embodiment, the processor 102 may include user interface circuitry configured to control at least some functions of one or more elements of the user interface 106, such as, for example, a speaker, a ringer, a microphone, a display, and/or the like. The processor 102 and/or the user interface circuitry may be configured to control one or more functions of the one or more elements of the user interface 106 through computer program instructions, for example, software and/or firmware, stored in a memory, for example, the memory 104, and/or the like, accessible to the processor 102.

[0020] In an example embodiment, the apparatus 100 includes a sensor device 108. In at least one embodiment of the disclosure, the sensor device 108 is an imaging device. Hereinafter, the terms 'sensor device', 'imaging device', 'camera' and 'laser ranging device' have been used interchangeably throughout the description, without any limitations. The sensor device 108 is configured to be in communication with the processor 102 and/or other components of the apparatus 100 to capture digital image frames, videos and/or other graphic media. The sensor device 108 may include hardware and/or software necessary for taking various kinds of images, for example, planar images, spherical images, or planar panoramic or spherical panoramic images. The sensor device 108 may include hardware, such as a lens and/or other optical component(s) such as one or more image sensors. Examples of one or more image sensors may include, but are not limited to, a complementary metal-oxide semiconductor (CMOS) image sensor, a charge-coupled device (CCD) image sensor, a backside illumination sensor (BSI) and the like. In an example embodiment, the sensor device 108 may further include a processing element such as a co-processor that assists the processor 102 in processing image frame data, and an encoder and/or a decoder for compressing and/or decompressing image frame data. The encoder and/or decoder may encode and/or decode according to a standard format, for example, a Joint Photographic Experts Group (JPEG) standard format and the like.

[0021] In an example embodiment, the apparatus 100 includes a mount assembly 110. The mount assembly 110 provides a platform (not shown in FIG. 1) to allow mounting of the sensor device 108 thereon. Hereinafter, the terms 'mount assembly' and 'platform' have sometimes been used interchangeably, without any limitations. In at least one embodiment, the mount assembly 110 is a motorized mount, which has been described in detail in the description of FIG. 2. Alternatively, the mount assembly 110 is a non-motorized mount (as illustrated in FIG. 5) which may be manually calibrated, as desired. In an embodiment, the apparatus 100 provides for calibration of the platform for proper capturing of the signals by the sensor device 108 mounted thereon. In at least one embodiment, the apparatus 100 is configured to provide the necessary displacements for the sensor device 108 to calibrate the platform so that a center of rotation of the platform coincides with a center of projection of the sensor device 108, in order to enable the sensor device 108 to capture two or more signals with minimal perspective distortion, so as to produce a concentric panoramic image.

[0022] The various components of the apparatus 100, such as components (102- 110) may communicate with each other via a centralized circuit system 112 to calibrate the platform. The centralized circuit system 112 may be various devices configured to, among other things, provide or enable communication between the components (102-110) of the apparatus 100. The centralized circuit system 112 may provide wire or wireless communication between the components (102-110) of the apparatus 100. In certain embodiments, the centralized circuit system 112 may be a central printed circuit board (PCB) such as a motherboard, a main board, a system board, or a logic board. The centralized circuit system 112 may also, or alternatively, include other printed circuit assemblies (PCAs) or communication channel media.

[0023] In at least one embodiment, the memory 104 is configured to store platform calibration instructions for controlling movement of the platform. In an example embodiment, the platform calibration instructions also include information for triggering of the sensor device 108 to capture signals, as programmed. The platform calibration instructions stored in the memory 104 are executable by the processor 102 for performing a method explained later with reference to FIG. 4.

[0024] FIG. 2 illustrates a perspective view of a sensor device system 200, in accordance with an exemplary embodiment of the disclosure. In the sensor device system 200, the mount assembly 110 provides a platform 114 to which the sensor device 108 is mounted. The mount assembly 110 is manufactured to be a robust structure. In an example embodiment, the mount assembly 110 is made of a metallic substrate, such as aluminum, stainless steel or the like. In an example, the mount assembly 110 is designed to be a portable structure to be easily carried for mobile applications. It is further contemplated that the mount assembly 110 is designed to have a low center of gravity to minimize its movement from small lateral forces.

[0025] In at least one example embodiment, the sensor device 108 is an imaging device including at least one camera. The camera may be a digital or a film based camera without any limitations. In other examples, the sensor device 108 may include two or more cameras arranged in an array in a predetermined manner. As illustrated in FIG. 2, the sensor device 108 includes a body 116 which forms an outer shell of the sensor device 108. The body 116 encloses various internal components of the sensor device 108, and further supports various external assemblies which are affixed thereto in a rigid manner. It may be contemplated that the body 116 provides a rigid structure for such purposes.

[0026] The platform 114 provides a support surface for mounting the sensor device 108 and is suitably shaped to accommodate the sensor device 108 thereon. In an example embodiment, the body 116 of the sensor device 108 and the platform 114 may include complementary mechanisms (not shown) to mount the sensor device 108 on the platform 114. One simple example of such complementary mechanisms provides one or more extrusions (not shown) from the platform 114 which are configured to be coupled with one or more complementary orifices (not shown) in the body 116 of the sensor device 108; however, other elaborate and more sophisticated mechanisms may be employed without any limitations. The platform 114 may further include some securing mechanism (not shown) including, but not limited to, hooks, latches, etc. which may be utilized for the purpose of locking the sensor device 108 onto the platform 114.

[0027] FIGS. 3A-3B schematically illustrate different views of the mount assembly 110, in accordance with an exemplary embodiment of the disclosure. The mount assembly 110 enables a movement of the sensor device 108 about three substantially perpendicular axes, a first axis 'X', a second axis 'Z' and a third axis 'Y', as representatively shown in FIG. 2 and further in FIGS. 3A-3B. The terms 'first axis', 'second axis' and 'third axis' have interchangeably been used with the terms 'x-axis', 'z-axis' and 'y-axis', respectively. FIGS. 3A-3B also illustrate three rotational axes for the platform 114, namely rotation about the first axis 'X' being the 'pitch' rotation, rotation about the second axis 'Z' being the 'roll' rotation, and rotation about the third axis 'Y' being the 'yaw' rotation.

[0028] Referring back to FIG. 2, the mount assembly 110 includes three rails, namely a first rail 118 parallel to the first axis 'X', a second rail 120 parallel to the second axis 'Z' and a third rail 122 parallel to the third axis 'Y', and thereby disposed substantially perpendicular to each other. The three rails 118, 120, 122 enable the movement of the sensor device 108 along all the three axes X, Z and Y, respectively. In the present configuration, the rails 118, 120, 122 are slidably engaged with each other, that is, the first rail 118 can slide with respect to the second rail 120 and the third rail 122, and correspondingly the other two rails can slide with respect to their counterparts. In the mount assembly 110, the platform 114, onto which the sensor device 108 is mounted, may be fixed to any one of the rails 118, 120, 122; and thereby the sensor device 108 may be moved and positioned at any point within the spatial constraints of the platform 114 by sliding one or more of the rails 118, 120, 122.

[0029] In at least one embodiment of the disclosure, the mount assembly 110 is a motorized mount. In other words, the mount assembly 110 is configured to provide a movement to the platform 114 therein, without the need of any external physical input, say by an operator of the sensor device system 200. In an embodiment, the mount assembly 110 includes at least two translational actuators and at least one rotational actuator. Each translational actuator is directly coupled with one of the rails 118, 120, 122 to provide translational movement to the corresponding rail. Further, each rotational actuator is coupled with one of the rails in the mount assembly 110 to provide rotational movement of the platform 114 about the corresponding rail. In an example, the mount assembly 110 includes an actuator (not shown) mounted to a bottom of the platform 114 and provides rotational movement to the platform 114 about its longitudinal axis. It may be understood that this enables the platform 114 to rotate the sensor device 108 about its longitudinal axis, if desired.

[0030] It may be understood that the translational actuator and the rotational actuator, as used in the mount assembly 110, may have the same specifications, and have only been named distinctly to clarify their corresponding purposes. In an example embodiment, the actuators employed in the mount assembly 110 include stepper brushless motors; however, it may be contemplated that other types of actuators, like servo motors, and hydraulic or pneumatic cylinders, may alternatively be employed without any limitations. Brushless motors are preferably used instead of servo motors as they provide better stabilization and relatively fast operation, while also being mechanically less complex. Further, the use of stepper motors as the actuators allows the movement of the rails 118, 120, 122 to be controlled by a desired number of steps. In an example, each step of an actuator corresponds to approximately 1.8 μm movement of the rail along the corresponding axis. It may be contemplated by a person skilled in the art that each of the actuators may first be connected to a threaded rod (not shown), which in turn connects to a nut of the rail for converting a rotational movement of the actuator, in case of a motor, to a linear translational movement of the rail, in the mount assembly 110.
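The step-to-displacement relation above lends itself to a simple conversion routine. The sketch below is illustrative only: the function name is hypothetical, and the 1.8 μm/step figure is taken from the example in this paragraph; a real controller would additionally account for backlash and rail travel limits.

```python
STEP_SIZE_UM = 1.8  # approximate rail travel per motor step, per the example above

def displacement_to_steps(displacement_mm: float) -> int:
    """Convert a desired rail displacement in millimetres into a whole
    number of stepper-motor steps (rounded to the nearest step)."""
    return round(displacement_mm * 1000.0 / STEP_SIZE_UM)

# A residual rounding error of at most half a step (~0.9 um) remains.
```

For instance, a 0.9 mm calibration shift along one axis corresponds to 500 steps under the stated step size.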

[0031] In at least one embodiment, the mount assembly 110 includes at least two translational actuators and at least one rotational actuator; the two translational actuators being disposed with the first rail 118 along the first axis 'X' and the second rail 120 along the second axis 'Z', and the rotational actuator being disposed to provide rotational movement of the platform 114 about the third axis 'Y' (that is, yaw rotation). In an embodiment, the mount assembly 110 includes three translational actuators and three rotational actuators, with one of each disposed with each of the three rails 118, 120, 122. In particular, the mount assembly 110 includes a first translational actuator 124 coupled with the first rail 118, a second translational actuator 126 coupled with the second rail 120, and a third translational actuator 128 coupled with the third rail 122. In the illustrated embodiment, it may be seen that the third rail 122 includes two rails disposed along each side of the platform 114 and is further provided with two translational actuators which are synchronized, via belts or the like, to spin by the same amount at the same time. Such a configuration is provided to be able to lift the weight of the platform 114, along with the sensor device 108, against gravity without any substantial jerks. Further, it is shown that the mount assembly 110 includes a first rotational actuator 130 coupled to rotate the platform 114 about the first axis 'X', a second rotational actuator 132 coupled to rotate the platform 114 about the second axis 'Z' and a third rotational actuator 134 coupled to rotate the platform 114 about the third axis 'Y'. Thereby, the mount assembly 110 may allow three translational movements of the sensor device 108 along the three axes X, Y and Z; and further three rotational movements about the three axes X, Y and Z, that is, pitch, yaw and roll rotational movements.
It may be contemplated that the sizes, shapes and locations of the actuators as illustrated in FIG. 2 are exemplary only and may be varied based on the design constraints of the mount assembly 110.

[0032] In an embodiment, each of the actuators is controlled by the processor 102 to automate the movement of the sensor device 108 in the mount assembly 110. For this purpose, the actuators are disposed in signal communication with the processor 102, via the centralized circuit system 112. The actuators receive instructions from the processor 102 relating to one or more of rotational speed, torque, power, etc. This way, the processor 102 can move and/or rotate the one or more rails 118, 120, 122 to move the sensor device 108 to a desired position.

[0033] In at least one embodiment, the mount assembly 110 further includes an Inertial Measurement Unit (IMU) 136 to determine an orientation and movement of the platform 114 in the mount assembly 110. The IMU 136, basically, includes a triad of gyroscopes and a triad of accelerometers that measure the linear and angular motion of the device to which it is coupled. Such IMUs are well known in the art, particularly for their application in the field of photography, and thus have not been described herein in detail for the brevity of the disclosure. The IMU 136 can be located at a suitable location in the mount assembly 110 and connected to the processor 102, via the centralized circuit system 112, to communicate its measurement data. Alternatively, the IMU 136 is integrated/embedded on the same circuit as the processor 102, for example the centralized circuit system 112, to provide its measurement data thereto. In general, the IMU 136 measures the orientation and movement of the platform 114, and thereby of the sensor device 108 on the platform 114. In an example, the IMU 136 may additionally provide positional feedback to the actuators employed in the mount assembly 110.

[0034] In one exemplary embodiment, the entire mount assembly 110 may be mounted on a gimbal arrangement (labelled in FIG. 2 with the numeral 138). The gimbal arrangement 138 may provide stabilization of the platform 114 against unwanted external physical forces. In some examples, the gimbal arrangement 138 may utilize three brushless motors (not shown) to stabilize all three rotation axes, namely yaw, pitch and roll. It should be noted, however, that the brushless motors have no positional feedback and thus the orientation of the platform 114 relies solely on the data from the IMU 136. Specifically, the IMU 136 is used to find out the orientation and movement of the platform, and that information is fed into the processor 102, which calculates in which directions the brushless motors need to be driven to keep the platform 114 levelled and pointing in the right direction, in the mount assembly 110. The gimbal arrangement 138 may not be required for terrestrial applications where stabilization is not critical, because the mount assembly 110 will usually be mounted on a stable tripod or clamped to some solid surface. However, the gimbal arrangement 138 may be employed to enable the sensor device system 200 to also perform mobile tasks, such as mounting on a rover or Unmanned Aerial Vehicle (UAV).

[0035] In some examples, the processor 102 is also communicatively coupled with the sensor device 108, via the centralized circuit system 112, and configures the sensor device 108 to capture an image. The processor 102 may include a transistor, with its specification depending on that of the sensor device 108, to be used as a switch in order to provide an appropriate voltage signal to the sensor device 108 for capturing the image. Specifically, the processor 102 may generate two servo signals and one trigger signal to trigger the sensor device 108, in which one servo signal is fed into a control circuit of the gimbal arrangement 138 to control the yaw axis rotation and the other servo signal controls the pitch axis rotation.

[0036] Referring now to FIG. 4, a flow chart is depicted providing an example method 400 for calibrating the platform 114 in order to capture a concentric panoramic image, in accordance with an example embodiment. The method 400 depicted in the flow chart may be executed by, for example, the apparatus 100 of FIG. 1. It should be noted that to facilitate discussions of the flowchart of FIG. 4, certain operations are described herein as constituting distinct steps performed in a certain order. Such implementations are examples only and non-limiting in scope. Certain operations may be grouped together and performed in a single operation, and certain operations may be performed in an order that differs from the order employed in the examples set forth herein. Moreover, certain operations of the method 400 are performed in an automated fashion. These operations involve substantially no interaction with the user. Other operations of the method 400 may be performed in a manual fashion or semi-automatic fashion. These operations involve interaction with the user via one or more user interface presentations.

[0037] The method 400 provides the concentricity calibration based on observations of the movement of the centers of projection of the sensor device 108 when the platform 114 is rotated. The concentricity calibration provides the distances and directions of the shifts required for moving the sensor device 108 so as to align the center of projection of the sensor device 108 with the center of rotation of the platform 114. As a preliminary measure, the sensor device 108 is calibrated for interior orientation and lens distortion, for example by using the UI 106. This is done using the 'Australis calibration target' technique, which is well known in the art and therefore has not been described herein. This technique also yields the homogeneous 3D point coordinates {Xj} of the calibration rig used herein. For convenience of calculation, the global coordinate system is used. Further, the positions of the sensor device 108 are indexed with 'i' and written as 'Ci'. For the purpose of understanding the calibration method 400, the sensor device 108 shall be considered as the imaging device capturing multiple images as signals, for further discussion.

[0038] At 405, the method 400 includes mounting the sensor device 108 on the platform 114. The plane of the platform 114 is defined by the first axis 'X' and the second axis 'Z'. As discussed earlier, the sensor device 108 may be mounted to the platform 114 using any suitable arrangement. In an embodiment, the sensor device 108 is fixedly oriented with respect to one of the first axis 'X' and the second axis 'Z', i.e. an optical axis of the sensor device 108 is parallel to either one of the first axis 'X' and the second axis 'Z'.

[0039] At 410, the method 400 includes facilitating rotating of the platform 114 about the third axis 'Y', which is substantially perpendicular to the first axis 'X' and the second axis 'Z'. The rotation of the platform 114 is carried out in a manner to provide a movement of the sensor device 108 in a substantially eccentric circle. It may be contemplated by a person skilled in the art that such eccentric rotation of the platform 114 is achieved by simultaneous action of the first rail 118 and the second rail 120. In particular, the processor 102 provides instructions to the first translational actuator 124 and the second translational actuator 126 to simultaneously generate translational movements of the corresponding rails with respect to the third rail 122, imparting some eccentricity to the rotation of the platform 114 about the third axis 'Y'. The imaging device 108, mounted on the platform 114, is rotated on a circular trajectory around the yaw axis with a constant radius, R = √(X² + Z²). The radius 'R' is chosen to be large to cause a highly eccentric rotation. During rotation, the orientation of the sensor device 108 is kept fixed with respect to the platform 114, in the mount assembly 110. It is to be understood that the sensor device 108 is rotated while it is being displaced.
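The eccentric trajectory above can be sketched as a set of coordinated rail targets. The following is a hypothetical helper (the function name and the evenly spaced yaw sampling are assumptions, not taken from the disclosure) that produces (X, Z) positions satisfying R = √(X² + Z²) at each yaw angle:

```python
import math

def eccentric_circle_waypoints(radius: float, n_positions: int):
    """Generate (x, z, yaw) targets for the two translational actuators so
    that the sensor traverses a circle of the given radius about the yaw
    axis, one target per evenly spaced yaw angle (in radians)."""
    waypoints = []
    for k in range(n_positions):
        yaw = 2.0 * math.pi * k / n_positions
        # x and z satisfy radius = sqrt(x^2 + z^2) at every position
        waypoints.append((radius * math.cos(yaw), radius * math.sin(yaw), yaw))
    return waypoints
```

Keeping the sensor orientation fixed relative to the platform while stepping through these targets yields the eccentric rotation described above.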

[0040] It may be contemplated that the sensor device 108 generates a set of signals by capturing optical signals, while the platform 114 is rotating. In case of the imaging device 108, a set of sub-images are captured. The generation of the set of signals by the sensor device 108 may be triggered by the processor 102, according to a predefined logic. For example, the processor 102 may trigger the sensor device 108 to capture each subsequent signal after a gap of predefined time of, for example, six (6) seconds. Alternatively, the processor 102 may use data from the IMU 136 to trigger the sensor device 108 to capture each subsequent signal, when it is determined that the sensor device 108 has achieved enough stability after moving to and stopping at a subsequent position in the course of rotation of the platform 114. The second approach may provide faster image capture without reducing the image quality. At step 415, the method 400 includes receiving the generated set of signals from the sensor device 108, by the processor 102.
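The two triggering strategies described above (a fixed delay versus waiting for IMU-reported stability) can be sketched together. This is a minimal illustrative loop; the function and parameter names are hypothetical, and the callbacks stand in for the actual IMU read-out and camera trigger:

```python
import time

def capture_when_stable(read_rate, trigger, threshold=0.01, timeout=6.0):
    """Fire the capture trigger once the IMU-reported angular rate (rad/s)
    drops below a stability threshold, or after a fixed fallback timeout
    (the six-second cadence mentioned above)."""
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        if abs(read_rate()) < threshold:
            break  # platform has settled; capture early
        time.sleep(0.01)
    trigger()  # capture either on detected stability or at the timeout
```

The early break is what makes the IMU-based approach faster than the fixed delay without reducing image quality.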

[0041] At 420, the method 400 includes determining the center of projection and one or more rotation angles of the sensor device 108. This is achieved by the processor 102 based on analysis of the set of signals. For this purpose, at least three sets of signals (i.e. three sub-images in case of the imaging device) of the test field are taken while rotating the sensor device 108. This provides at least two positions of the sensor device 108, and for each position 'i', the rotation matrix Ri and translation vector Ti for the sensor device 108 are computed from the known 3D points {Xj} and image points {uj}. Therefore, the center of projection for the sensor device 108, for each of its positions when signals are generated during rotation, is calculated from the projection matrix Pi = K [Ri | Ti], which provides uj = Pi Xj for ideal points. In global coordinates, the centers of projection for the sensor device 108 are written as: Ci = -Ri⁻¹ Ti, where Ri⁻¹ = Riᵀ by orthogonality.
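The center-of-projection relation above reduces to a short computation. A minimal sketch, assuming NumPy, a 3×3 rotation matrix Ri and a translation vector Ti (the function name is illustrative):

```python
import numpy as np

def projection_center(R: np.ndarray, T: np.ndarray) -> np.ndarray:
    """Center of projection in global coordinates, Ci = -Ri^-1 Ti.
    Since rotation matrices are orthogonal, Ri^-1 = Ri^T, so the transpose
    replaces an explicit matrix inversion."""
    return -R.T @ T
```

Applying this to every camera position yields the set of points on the XZ-plane used by the circle fit in the next step.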

[0042] At 425, the method 400 includes determining the center of rotation, referenced as 'Crot', of the platform 114. In an example embodiment, the center of rotation 'Crot' of the platform 114 is determined by minimizing a function based on the center of projection 'Ci' and the one or more rotation angles of the sensor device 108. That is, using the determined centers of projection Ci on the XZ-plane and the a-priori knowledge that they lie on the perimeter of a circle, we perform a function to find the center of the circle, which is the center of rotation 'Crot' of the platform 114. The function used here is a least-squares circular fit function, minimizing Σi (di − R)², where the distance di = √((xi − xr)² + (zi − zr)²).

Therefore, Crot = (xr, zr) is obtained through iteration, along with R. It may be contemplated that the previously stated requirement of at least three signals is a practical minimum to obtain a solution for the circular fit function; redundancy in the form of more signals is used to avoid multiple minima and can also be used for error determination.
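The circular fit can be sketched in code. The version below uses the closed-form algebraic (Kåsa) least-squares fit as a stand-in for the iterative geometric minimization described above; the function name is illustrative, and with redundant, low-noise points the two approaches agree closely:

```python
import numpy as np

def fit_circle(points: np.ndarray):
    """Algebraic (Kasa) least-squares circle fit to Nx2 points on the
    XZ-plane. Returns (center, radius). Expanding (x-xr)^2 + (z-zr)^2 = R^2
    gives the linear model x^2 + z^2 = 2*xr*x + 2*zr*z + (R^2 - xr^2 - zr^2),
    which is solved in the least-squares sense."""
    x, z = points[:, 0], points[:, 1]
    A = np.column_stack([x, z, np.ones_like(x)])
    b = x**2 + z**2
    sol, *_ = np.linalg.lstsq(A, b, rcond=None)
    xr, zr = sol[0] / 2.0, sol[1] / 2.0
    radius = np.sqrt(sol[2] + xr**2 + zr**2)
    return np.array([xr, zr]), radius
```

Fitting the circle to the projection centers Ci recovers Crot = (xr, zr) as the fitted center, with R as the fitted radius.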

[0043] At 430, the method 400 includes determining a displacement between the center of rotation 'Crot' of the platform 114 and the center of projection of the sensor device 108. In the present embodiment, the displacement comprises distances for movement of the sensor device 108 along the directions of the first axis 'X' and the second axis 'Z'. For determining the displacement, the distance between the center of projection of the sensor device 108 and the center of rotation 'Crot' has to be separated into orthogonal components in order to recover the underlying X- and Z-directional displacements. This is done, first, by computing the distance of the point 'Crot' on the X-Z plane to the line, aiX + biZ + ci = 0, drawn by the optical axis of each position 'i' of the sensor device 108, which is the X-directional displacement; and second, by minimizing the distance of the center of rotation 'Crot' on the same plane to the line drawn perpendicularly to the optical axis of each position 'i' of the sensor device 108, which is the Z-directional displacement. In mathematical form, the distance from 'Crot' = (xr, zr) to such a line is represented as: di = |ai·xr + bi·zr + ci| / √(ai² + bi²).

[0044] Further, for both sets of displacements, the averages X = avg(Xi) and Z = avg(Zi) are computed. These are the final results and determine the displacement, that is, the distance and direction by which the sensor device 108 is to be translated in order to have its center of projection on top of the center of rotation of the platform 114.
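The decomposition and averaging above can be sketched as follows. This is a hypothetical helper: the names, the signed-distance convention, and the assumption that each optical-axis direction is supplied as an in-plane unit vector are mine, not the disclosure's.

```python
import numpy as np

def axis_displacements(centers, directions, c_rot):
    """For each camera position (center ci with in-plane optical-axis
    direction di on the XZ-plane), split the offset of the rotation center
    c_rot into a component perpendicular to the optical axis (the X shift)
    and a component along it (the Z shift); return the averaged shifts."""
    x_shifts, z_shifts = [], []
    for ci, di in zip(centers, directions):
        v = np.asarray(c_rot, dtype=float) - np.asarray(ci, dtype=float)
        di = np.asarray(di, dtype=float) / np.linalg.norm(di)
        n = np.array([-di[1], di[0]])     # in-plane normal to the optical axis
        x_shifts.append(float(v @ n))     # signed distance to the optical-axis line
        z_shifts.append(float(v @ di))    # signed distance to the perpendicular line
    return float(np.mean(x_shifts)), float(np.mean(z_shifts))
```

The averaged pair corresponds to the X = avg(Xi) and Z = avg(Zi) results above, i.e. the translation that brings the center of projection onto the center of rotation.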

[0045] In an example embodiment, the method 400 also includes determining the required displacement along the y-axis, which is calculated in a similar manner by using the rotation about one of the x-axis and the z-axis. For example, if the x-axis is used, that is, rotation over the pitch axis, the displacement along the z-axis is obtained again, but this time it ought to be close to zero. Briefly, the steps for determining the required displacement along the y-axis include: rotating the platform 114 about the first axis 'X' to provide a movement of the sensor device 108 in a substantially eccentric circle; receiving a subsequent set of signals from the sensor device 108 while the platform 114 is rotating; determining a subsequent center of projection and one or more subsequent rotation angles of the sensor device 108 based on analysis of the subsequent set of signals; determining a subsequent center of rotation of the platform 114 by minimizing a function based on the subsequent center of projection and the subsequent rotation angles of the sensor device 108; determining a subsequent displacement between the subsequent center of rotation of the platform 114 and the subsequent center of projection of the sensor device 108; and finally moving the sensor device 108 by the determined subsequent displacement. The mathematics behind determining the displacement along the y-axis remains principally the same as has been used for determining the displacements along the x-axis and the z-axis; and thus has not been repeated for the brevity of the disclosure.

[0046] In an example embodiment, this computer-aided process of determining the displacements can be streamlined into one step by rotating the sensor device 108 about multiple angles during the signal capture phase. Therefore, it is possible to calculate the displacements along all three axes from a single set of signals.

[0047] At 435, the method 400 includes facilitating moving the sensor device 108 by the determined displacement. In at least one embodiment, the movement of the sensor device 108 is carried out in an automatic fashion, using the motorized mount assembly 110 of the present disclosure. Such processes and procedures for movement of the sensor device 108 have been described in much detail with reference to FIG. 2, and have not been repeated herein. Although the embodiments of the disclosure have been described in view of implementation on the motorized mount assembly 110, it should be contemplated that similar steps for providing an eccentric rotation path to the sensor device 108 could be achieved by using a regular mount assembly 110, as illustrated in FIG. 5. In such a case, the precise movements may be manually made using a screwdriver or the like. Further, the amount of movement may be precisely measured with tools, such as a caliper or the like.

[0048] FIG. 6 illustrates a geometric representation to determine the displacements, that is, the displacements along the first axis 'X' and the second axis 'Z', in accordance with the present embodiment of the disclosure. FIG. 6 shows a plurality of solid black dots which represent the centers of projection of the sensor device 108 as it moves with the rotational movement of the platform 114. It may be seen that the centers of projection of the sensor device 108, as represented by the solid black dots, form a substantially circular arc when joined together. Further, as illustrated, a first line (shown as a dashed line) is drawn from each of the centers of projection (solid black dots) facing the direction of the optical axis (shown with an arrow head), or in the present case opposite to the direction of the optical axis, as dependent on the direction of the Z-shift. Furthermore, a second line (shown as a bold solid line) is drawn from each of the centers of projection (solid black dots) perpendicular to the first line (shown as a dashed line) and facing the direction obtained from the orientation roll parameter of the sensor device 108. Then, for each of those lines, a third line (shown as a thin solid line) is drawn from the yaw axis, which was obtained as the center of rotation 'Crot' through the center of projection coordinates, perpendicularly to the first line and the second line drawn in the previous steps. This gives two sets of resections which are marked in FIG. 6 as hollow dashed circles and hollow solid circles. Thereby, the translation for the sensor device 108 along the x-axis (ΔX in FIG. 3B) would be the radius of the circle fitted to the vectors facing forward (hollow dashed circles) and the translation for the sensor device 108 along the z-axis (ΔZ in FIG. 3A) would be the radius of the circle fitted to the vectors facing sideways (hollow solid circles), respectively.

[0049] FIG. 7 illustrates a geometric representation to determine the y-axis translation needed to move the center of projection of the sensor device 108 into the center of rotation of the platform 114 about the pitch axis. It may be contemplated that a process similar to the foregoing could be performed perpendicularly to find the displacement/shift along the y-axis. The y-axis displacement is obtained by fitting a circle into the coordinates of the sensor device 108; the radius of that circle is the required y-axis displacement, and thereby the concentricity calibration is complete.

[0050] FIG. 8 illustrates a geometric representation used to verify the accuracy of the calibration by photographing a known test field while rotating the sensor device 108 in the yaw and pitch directions. In the right representation of FIG. 8, the change in the centers of projection during the rotations is shown, in which dark colored points are centers of projection due to pitch rotations and light colored points present centers of projection due to yaw rotations before the calibration. The left representation of FIG. 8 highlights how the centers of projection move along circles. In the right representation, after the calibration, the centers of projection of the captured images (as represented by solid dots) almost coincide with each other without any regular pattern, such that there is only a single dot shown, which indicates good calibration of the sensor device 108 with respect to the platform 114. A sample panoramic image as captured by the sensor device system 200 of the present disclosure is shown in FIG. 9.

[0051] Some example embodiments of performing the operations of the method 400 are explained herein with the following description. For example, the processor 102 is configured to, with the platform calibration instructions stored in the memory 104, and optionally with other components described herein, cause the apparatus 100 to perform: facilitating rotating the platform 114 about the third axis 'Y' to provide movement of the sensor device 108 in a substantially eccentric circle; receiving the set of signals from the sensor device 108 while the platform 114 is rotating; determining a center of projection and the one or more rotation angles of the sensor device 108, based on analysis of the set of signals; determining a center of rotation of the platform 114 by minimizing a function based on the center of projection and the one or more rotation angles of the sensor device 108; determining the displacement between the center of rotation of the platform and the center of projection of the sensor device 108; and facilitating moving the sensor device 108 by the determined displacement.
In an example embodiment, the processor 102 is further configured to execute the platform calibration instructions stored in the memory 104 to cause the apparatus 100 to further perform: facilitating rotating the platform 114 about the first axis 'X' to provide a movement of the sensor device 108 in a substantially eccentric circle; receiving the subsequent set of signals from the sensor device 108 while the platform 114 is rotating; determining the subsequent center of projection and the subsequent rotation angles of the sensor device 108 based on analysis of the subsequent set of signals; determining the subsequent center of rotation of the platform 114 by minimizing a function based on the subsequent center of projection and the subsequent rotation angles of the sensor device 108; determining a subsequent displacement between the subsequent center of rotation of the platform 114 and the subsequent center of projection of the sensor device 108; and facilitating moving the sensor device 108 by the determined subsequent displacement. The processor 102 is further configured to execute the platform calibration instructions stored in the memory 104 to cause the apparatus 100 to further perform at least automatic movement of the sensor device 108 by the determined displacement and the determined subsequent displacement. The processor 102 is further configured to execute the platform calibration instructions stored in the memory 104 to cause the apparatus 100 to further perform at least automatic triggering of the sensor device 108 for capturing the set of signals and the subsequent set of signals. The mechanical and mathematical basis for performing the above mentioned steps has been described earlier, and has not been repeated herein for the brevity of the disclosure.

[0052] It may be understood that in photogrammetric applications, knowledge of the camera internal orientation is crucial for the best possible accuracy. Thus it is common practice to calibrate cameras used in measurement tasks each time the camera system is changed in any way. For many applications this is done when changing the lens or zoom settings of the camera. However, for precise tasks the camera focus should also be taken into account, as it changes the principal distance of the camera depending on the type of lens used. This also changes the coordinates of the camera projection center. So for precise panoramic applications it is desired not only to calibrate the camera internal orientation if the camera settings are changed, but also to calibrate the concentricity of the panoramic system. In practice this would require so much work that, when using the non-motorized mount, it is required to calibrate the camera internal orientation and then move the camera projection center into the crossing of the pitch and yaw axes of the mount. After this it is required that the camera parameters are kept fixed. This limits the panoramic image capture to only certain situations, because a user cannot change focus or zoom.

[0053] The present embodiments provide a motorized mount assembly 110 for the sensor device 108 that provides both internal moving of the sensor device 108 for calibration purposes and automatic image acquisition. The proposed calibration technique includes calibrating the interior orientation of the camera, solving the locations of the projection centers of sub-images after rotating the camera, computing the necessary shifts of the camera within the mount, applying the shifts, and verifying concentric imaging after the calibration. It may be contemplated that the proposed calibration technique is applicable to both non-motorized and motorized camera mounts. Compared to a non-motorized mount, the motorized mount of the present disclosure with automatic calibration makes it possible to acquire panoramic images more quickly and with higher versatility. With a motorized mount it is possible to quickly calibrate the concentricity of the system, so changing zoom, focusing or even changing the camera becomes possible. In addition to improved speed of operation and versatility, the motorized mount also offers greatly improved accuracy of the translational calibration movements. This is caused by the much better repeatability of the calibration movements offered by the stepper motor drive approach. Therefore, the present system offers not only better speed of operation and the versatility of quickly changing the camera system, but also better geometrical accuracy than conventional manually calibrated panoramic mounts.

[0054] In many photogrammetric applications there is a need to cover the area being measured from many different viewpoints. With a non-motorized panoramic camera this requires careful planning, because the number of panoramic stations that can be produced in a given time is limited. The present system offers stabilized image capture while moving. If the platform were mounted on a rover, for example, the camera could aid the system in navigating to a proper place to take the panoramic image while recording video footage that could be used for structure-from-motion (SFM) reconstruction or the like. The same system would thus produce mobile mapping data as well as high-resolution panoramic images, coupled with the versatility to change the optical configuration of the system owing to the field calibration possibilities offered by the motorized concentricity calibration.

[0055] Although the present disclosure has been primarily described in terms of mounting an imaging device onto the platform, other types of sensor devices, such as laser ranging devices, can also be used. This makes it possible to construct a multi-sensor platform incorporating both imaging and ranging technology, so that the platform could operate in a similar fashion to a laser scanner coupled with a camera. The present system together with a laser ranging device would make it possible to aim the laser ranging device using the photogrammetric data acquired from the camera. Geometrical objects such as break-lines could then be measured without the very high point density typically required by commercial laser scanners. The motorized mount also makes it possible to implement a laser ranging device with a variable point density as a function of some other parameter, such as the pitch angle of the platform. This would solve the typical problem of the laser scanner where the point density is highest near the scanner and decreases for objects that are further away.
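The variable point density mentioned above can be sketched under an assumed geometry: a scanner at height h over a flat floor hits the floor at ground distance x = h / tan(theta) for a pitch angle theta below the horizon, so choosing the angular step as spacing * sin(theta)^2 / h keeps consecutive ground hits approximately a constant distance apart. The function name, parameters, and flat-floor geometry are hypothetical illustrations; the disclosure only states that the step may be a function of the pitch angle.

```python
import math

def constant_spacing_pitch_angles(height, spacing, theta_min, theta_max):
    """Pitch angles (radians below the horizon) whose intersections with a
    flat floor are ~`spacing` apart for a scanner `height` above the floor.
    Since x = height / tan(theta), |dx/dtheta| = height / sin(theta)**2, so
    stepping by dtheta = spacing * sin(theta)**2 / height equalizes the
    ground spacing instead of oversampling near the scanner."""
    angles = []
    theta = theta_min
    while theta <= theta_max:
        angles.append(theta)
        theta += spacing * math.sin(theta) ** 2 / height
    return angles
```

A fixed angular step would concentrate points directly below the scanner; here the step shrinks at shallow angles (long range) and grows toward nadir, evening out the density.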

[0056] Without in any way limiting the scope, interpretation, or application of the claims appearing below, a technical effect of one or more of the example embodiments disclosed herein is to provide methods capable of generating a concentric panoramic image, which may be appreciated in photogrammetry applications. The present disclosure is described above with reference to block diagrams and flowchart illustrations of a method and a device embodying the present disclosure. It will be understood that the various blocks of the block diagrams and flowchart illustrations, and combinations of blocks in the block diagrams and flowchart illustrations, respectively, may be implemented by a set of computer program instructions. These sets of instructions may be loaded onto a general purpose computer, special purpose computer, or other programmable data processing apparatus, such that the set of instructions, when executed on the computer or other programmable data processing apparatus, creates a means for implementing the functions specified in the flowchart block or blocks. Other means for implementing the functions, including various combinations of hardware, firmware, and software as described herein, may also be employed.

[0057] Various embodiments described above may be implemented in software, hardware, application logic or a combination of software, hardware and application logic. The software, application logic and/or hardware may reside on at least one memory, at least one processor, an apparatus, or a non-transitory computer program product. In an example embodiment, the application logic, software or an instruction set is maintained on any one of various conventional computer-readable media. In the context of this document, a "computer-readable medium" may be any non-transitory media or means that can contain, store, communicate, propagate or transport the instructions for use by or in connection with an instruction execution system, apparatus, or device, such as a computer, with one example of a system described and depicted in FIG. 1. A computer-readable medium may comprise a computer-readable storage medium that may be any media or means that can contain or store the instructions for use by or in connection with an instruction execution system, apparatus, or device, such as a computer.

[0058] The foregoing descriptions of specific embodiments of the present disclosure have been presented for purposes of illustration and description. They are not intended to be exhaustive or to limit the present disclosure to the precise forms disclosed, and obviously many modifications and variations are possible in light of the above teaching. The embodiments were chosen and described in order to best explain the principles of the present disclosure and its practical application, to thereby enable others skilled in the art to best utilize the present disclosure and various embodiments with various modifications as are suited to the particular use contemplated. It is understood that various omissions and substitutions of equivalents are contemplated as circumstances may suggest or render expedient, but such are intended to cover the application or implementation without departing from the spirit or scope of the claims of the present disclosure.