

Title:
IMAGE CAPTURING APPARATUS FOR CAPTURING AN OMNIDIRECTIONAL VIEW
Document Type and Number:
WIPO Patent Application WO/2017/217925
Kind Code:
A1
Abstract:
An image capturing apparatus comprising: a first plurality of lenses arranged to capture an omnidirectional view of a surrounding of the image capturing apparatus; at least one reflective surface configured to reflect light captured by the first plurality of lenses respectively at an angle towards a single image plane; a second plurality of lenses for focusing the light reflected by the at least one reflective surface respectively onto the single image plane; and an image sensor comprising the single image plane, wherein patches of the light received by the single image plane are processed to form a single image containing the omnidirectional view of the surrounding of the image capturing apparatus.

Inventors:
TAN, Wei De (4 Kaki Bukit Ave 1,#06-08, Singapore 9, 417939, SG)
CHEE, Teck Lee (4 Kaki Bukit Ave 1,#06-08, Singapore 9, 417939, SG)
AU, King Shing, Affa (4 Kaki Bukit Ave 1,#06-06, Singapore 9, 417939, SG)
Application Number:
SG2016/050272
Publication Date:
December 21, 2017
Filing Date:
June 15, 2016
Assignee:
SPACEMAP PTE LTD (4 Kaki Bukit Ave 1, #06-06, Singapore 9, 417939, SG)
MOVEON TECHNOLOGIES PTE. LTD. (4 Kaki Bukit Ave 1, #06-08, Singapore 9, 417939, SG)
International Classes:
H04N13/02; G03B37/04; H04N5/225; H04N5/232; H04N5/247
Domestic Patent References:
WO2016033452A1 2016-03-03
Foreign References:
US20100045773A1 2010-02-25
Other References:
None
Attorney, Agent or Firm:
CHANG JIAN MING (RODYK IP, 80 Raffles Place#16-01 UOB Plaza 1, Singapore 4, 048624, SG)
Claims:

1. An image capturing apparatus comprising:

a first plurality of lenses arranged to capture an omnidirectional view of a surrounding of the image capturing apparatus;

at least one reflective surface configured to reflect light captured by the first plurality of lenses respectively at an angle towards a single image plane;

a second plurality of lenses for focusing the light reflected by the at least one reflective surface respectively onto the single image plane; and

an image sensor comprising the single image plane,

wherein patches of the light received by the single image plane are processed to form a single image containing the omnidirectional view of the surrounding of the image capturing apparatus.

2. The image capturing apparatus according to claim 1, wherein the at least one reflective surface is configured to reflect orthogonally light captured by the first plurality of lenses respectively towards the single image plane.

3. The image capturing apparatus according to claim 1 or 2, wherein a single image signal processor is used to process the patches of light received by the single image plane.

4. The image capturing apparatus according to any one of claims 1 to 3, wherein a plurality of sets of the first plurality of lenses are arranged to face proportionally in different directions from one another to direct light captured to the at least one reflective surface, wherein angles of field of views of the plurality of sets of the first plurality of lenses add up to more than 360 degrees, wherein the number of sets of the plurality of sets of the first plurality of lenses is more than two.

5. The image capturing apparatus according to claim 4, wherein the number of sets of the plurality of sets of the first plurality of lenses is in multiples of 2.

6. The image capturing apparatus according to claim 4, wherein the number of sets of the plurality of sets of the first plurality of lenses is an odd number.

7. The image capturing apparatus according to any one of claims 1 to 6, wherein the image plane is disposed in an orientation for receiving fully the patches of light if a shape of the image plane cannot receive fully the patches of light in another orientation of the image plane.

8. The image capturing apparatus according to claim 7, wherein the orientation for receiving fully the patches of light is such that the image plane is rotated by 25 degrees to 40 degrees about an axis normal to the image plane from the orientation of the image plane that cannot receive fully the patches of light.

9. The image capturing apparatus according to any one of claims 1 to 8, wherein different sets of lenses of the second plurality of lenses focusing different patches of light on different areas in the image plane respectively are adhered together.

10. The image capturing apparatus according to any one of claims 1 to 9, wherein one of the at least one reflective surfaces is a reflective diffraction grating.

11. The image capturing apparatus according to any one of claims 1 to 10, wherein at least one lens of the first plurality of lenses or one lens of the second plurality of lenses is made of a material including Silicon, Germanium, CaF2, ZnS, ZnSe, or chalcogenide glasses.

12. The image capturing apparatus according to any one of claims 1 to 11, wherein at least one lens of the lenses present in the first plurality of lenses operates at a short infrared operating wavelength of between 3000 - 5000 nm.

13. The image capturing apparatus according to any one of claims 1 to 11, wherein at least one lens of the lenses present in the first plurality of lenses operates at a long infrared wavelength of between 8000 - 12000 nm.

14. The image capturing apparatus according to any one of claims 1 to 13, wherein at least one lens of the lenses present in the first plurality of lenses or the second plurality of lenses has a surface with aspheric and diffractive properties.

15. The image capturing apparatus according to any one of claims 1 to 14, wherein the number of lenses present in the total number of the first plurality of lenses and the second plurality of lenses is 8 or less.

16. The image capturing apparatus according to any one of claims 1 to 15, wherein a distance from a vertex of a lens receiving light from the surrounding that is present in the first plurality of lenses to a light reflecting surface of the at least one reflective surface

is greater than

a distance from the light reflecting surface of the at least one reflective surface to the image plane.

17. The image capturing apparatus according to any one of claims 1 to 16, wherein a distance from a light reflecting surface of the at least one reflective surface to the image plane divided by

a distance from a vertex of a lens receiving light from the surrounding that is present in the first plurality of lenses to the light reflecting surface of the at least one reflective surface

is less than 1.

18. The image capturing apparatus according to any one of claims 1 to 17, wherein a ratio between a diameter of a light exiting surface of a lens receiving light from the surrounding that is present in the first plurality of lenses and a modulus of a radius of the light exiting surface is less than 1.92.

Description:
Image Capturing Apparatus for Capturing an Omnidirectional View

Field of the Invention

The present invention relates to an image capturing apparatus for capturing an omnidirectional view, in particular, for obtaining an omnidirectional image with a single image sensor.

Background

A conventional omnidirectional imaging device or camera employs a plurality of wide angle lenses specially arranged in a manner such that the plurality of wide angle lenses are able to capture optical information within a 360 degrees sphere centered about the imaging device. Each of the plurality of wide angle lenses then images the optical information captured onto one of a plurality of dedicated sensors. Thus the number of lenses used determines the number of sensors present in the imaging device. The optical information from each of the plurality of dedicated sensors is then processed and stitched together to form an omnidirectional, 360 degrees or spherical image or video.

The conventional approach of having each lens image onto a dedicated sensor is a convenient way of assembling an omnidirectional camera but has a number of drawbacks. The more sensors there are in the omnidirectional camera, the more space is required to house them, which leads to an increase in the footprint and weight of the omnidirectional camera. During operation of the omnidirectional camera, the power consumption of the sensors and their supporting electronics is high, and this reduces the battery lifetime. Conventional cameras having multiple sensors have battery lifetimes limited to less than 60 minutes. As the large number of sensors requires additional product housing, this restricts the camera to larger batteries and footprints, which discourages the proliferation of such cameras for mobile consumer applications. The large number of sensors also results in more heat being generated within the camera, which is transferred to the lens assembly. A temperature build-up of a few degrees within the camera can result in a drop in image quality for omnidirectional devices with lenses fabricated from plastics, which have coefficients of thermal expansion (CTEs) about 10 times larger than those of lenses made from glass. Plastic lenses designed for use in such devices would either require athermalization or would need sufficient heat dissipation built into the housing of the device, for example a heat shield to block the heat generated by the sensors from reaching the lens assembly. Such approaches generally lead to a larger device footprint and increased design and fabrication costs.

In a conventional omnidirectional imaging device, images collected by the different sensors have to be calibrated for image quality metrics such as distortion, white balancing, gain, focal distance, color temperature and exposure before stitching can be carried out. The calibration between the different sensors in general requires each sensor to have its own image signal processor (ISP) and accompanying electronics, which leads to a larger imaging device footprint and fabrication cost. As optical information is collected from the different sensors, the calibration process is complex and time consuming.

Summary

According to one aspect of an example of the present invention, there is provided an image capturing apparatus comprising: a first plurality of lenses arranged to capture an omnidirectional view of a surrounding of the image capturing apparatus; at least one reflective surface configured to reflect light captured by the first plurality of lenses respectively at an angle towards a single image plane; a second plurality of lenses for focusing the light reflected by the at least one reflective surface respectively onto the single image plane; and an image sensor comprising the single image plane, wherein patches of the light received by the single image plane are processed to form a single image containing the omnidirectional view of the surrounding of the image capturing apparatus. The at least one reflective surface may be configured to reflect orthogonally light captured by the first plurality of lenses respectively towards the single image plane.

A single image signal processor may be used to process the patches of light received by the single image plane.

A plurality of sets of the first plurality of lenses may be arranged to face proportionally in different directions from one another to direct light captured to the at least one reflective surface, wherein angles of field of views of the plurality of sets of the first plurality of lenses add up to more than 360 degrees, wherein the number of sets of the plurality of sets of the first plurality of lenses is more than two.

The number of sets of the plurality of sets of the first plurality of lenses may be in multiples of 2. The number of sets of the plurality of sets of the first plurality of lenses may be an odd number.

The image plane may be disposed in an orientation for receiving fully the patches of light if a shape of the image plane cannot receive fully the patches of light in another orientation of the image plane.

The orientation for receiving fully the patches of light may be such that the image plane is rotated by 25 degrees to 40 degrees about an axis normal to the image plane from the orientation of the image plane that cannot receive fully the patches of light.

Different sets of lenses of the second plurality of lenses focusing different patches of light on different areas in the image plane respectively may be adhered together.

One of the at least one reflective surfaces may be provided by a prism or a freeform prism.

One of the at least one reflective surfaces may be a mirror. The mirror may be coated with either a reflective metallic or dielectric material. One of the at least one reflective surfaces may be a reflective diffraction grating.

At least one lens of the first plurality of lenses or one lens of the second plurality of lenses may be made of a material including Silicon, Germanium, CaF2, ZnS, ZnSe, or chalcogenide glasses. At least one lens of the lenses present in the first plurality of lenses may operate at a short infrared operating wavelength of between 3000 - 5000 nm. At least one lens of the lenses present in the first plurality of lenses may operate at a long infrared wavelength of between 8000 - 12000 nm.

At least one lens of the lenses present in the first plurality of lenses or the second plurality of lenses may have a surface with aspheric and diffractive properties.

The number of lenses present in the total number of the first plurality of lenses and the second plurality of lenses may be 8 or less.

A distance (L1) from a vertex of a lens receiving light from the surrounding that is present in the first plurality of lenses to a light reflecting surface of the at least one reflective surface may be greater than a distance (L2) from the light reflecting surface of the at least one reflective surface to the image plane.

A distance (L2) from a light reflecting surface of the at least one reflective surface to the image plane divided by a distance (L1) from a vertex of a lens receiving light from the surrounding that is present in the first plurality of lenses to the light reflecting surface of the at least one reflective surface may be less than 1.

A ratio between a diameter of a light exiting surface of a lens receiving light from the surrounding that is present in the first plurality of lenses and a modulus of a radius of the light exiting surface may be less than 1.92.

Examples of the image capturing apparatus will now be described with reference to the accompanying figures in which:

Fig. 1 is a side view of a prior art arrangement of a plurality of lenses and a sensor in a camera where an optical axis is a straight line.

Fig. 2 is a side view of a prior art arrangement of a plurality of lenses and a sensor in a camera where an optical axis is bent at an angle of 90 degrees.

Fig. 3 is a side view of a prior art arrangement of a plurality of lenses and two sensors in an image capturing apparatus.

Fig. 4 is a side view of an image capturing apparatus using prisms according to an example of the present disclosure.

Fig. 5 is a side view of an image capturing apparatus using mirrors according to an example of the present disclosure.

Fig. 6 is a top view of an image capturing apparatus using four sets of wide angled lenses according to an example of the present disclosure.

Fig. 7 is a top view of an image capturing apparatus using six sets of wide angled lenses according to an example of the present disclosure.

Fig. 8 is a side view of the image capturing apparatus of Fig. 4 illustrating a dead zone not captured by lenses of the image capturing apparatus.

Fig. 9 is a top view of an image capturing apparatus having a tilted image sensor according to an example of the present disclosure.

Fig. 10 is a top view of an image plane of an image sensor illustrating a problem with light capture.

Fig. 11 is a top view of an image plane of an image sensor illustrating a solution for the problem of Fig. 10.

Fig. 12 is a top view of a prior art image plane of an image sensor.

Fig. 13 is a top view of an image plane of an image sensor according to an example of the present disclosure.

Fig. 14 is a top view of an image plane of an image sensor according to an example of the present disclosure.

Fig. 15 is a top view of an image plane of an image sensor according to an example of the present disclosure.

Fig. 16 is a block diagram illustrating system architecture of an image capturing apparatus according to an example of the present disclosure.

Fig. 17 is a block diagram illustrating prior art system architecture of an image capturing apparatus.

Fig. 18 illustrates a first step of an image stitching process according to an example of the present disclosure.

Fig. 19 illustrates a second step of an image stitching process according to an example of the present disclosure.

Fig. 20 illustrates a third step of an image stitching process according to an example of the present disclosure.

Detailed Description

An omnidirectional view in the present disclosure refers to a view of substantially 360 degrees of a surrounding or environment of where the image is captured by an image capturing apparatus or camera. An omnidirectional image refers to an image comprising such a view. An omnidirectional camera shall be taken to refer to a camera for capturing images with such a view. A 360 degrees image or spherical image has the same meaning as an omnidirectional image.

In Figures 1 to 9 described as follows, a measurement scale drawn up to 20 mm is provided to indicate the sizes/dimensions of the components/elements as shown and described. Although the actual readings of the sizes/dimensions of the components/elements in the Figures may not all be described herein, one can extract the readings by referring to the measurement scale in the Figures.

With reference to Fig. 1, a conventional camera 100 employs multiple lenses 102, 104, 106, 108, 112, 114, 116 and 118 for imaging light through a filter 120 onto a sensor 122 to obtain images of a scene. Between lens 108 and lens 112 is an aperture stop 110. Light paths 199 travelling through the various elements are shown in Fig. 1. The lenses 102, 104, 106, 112, 114, 116 and 118 and sensor 122 are positioned such that optical axes of the lenses 102, 104, 106, 112, 114, 116 and 118 form a common straight line that goes from a vertex of a first lens element 102 to that of a last lens element 118 of the lenses. However, the conventional camera 100 with such a configuration is made only to capture an optical image within a limited field of view (FOV). The FOV is typically a small fraction of a sphere of view (or at best 190 degrees or hemispherical for certain stereoscopic cameras used for capturing three dimensional (3D) images) centered about the camera. The FOV of conventional cameras or of certain stereoscopic cameras is hardly spherical or omnidirectional, which is the subject of the present disclosure.

Fig. 2 illustrates a side view of a conventional camera for capturing a wide angle image. Light paths 299 travelling through the various elements are shown in Fig. 2. Instead of arranging lenses to have a common straight optical axis, the optical axis is bent by 90 degrees about an optical reflecting device such as a triangular prism 208. This is referred to as folded optics. Light first passes through lenses 202, 204 and 206 before reaching the triangular prism 208. After the light is reflected by the triangular prism 208, the light passes through lenses 212, 214, 216 and 218 and is imaged onto a sensor 220 through a filter 220. Between the prism 208 and lens 212 is an aperture stop 210. Notably, conventional design teaches that each set of folded optics is meant for imaging light onto a dedicated sensor.

Fig. 3 illustrates a side view of a conventional omnidirectional camera for capturing an omnidirectional image. Light paths 399 travelling through the various elements are shown in Fig. 3. Similar to conventional practice, each set of folded optics is meant for imaging light onto a dedicated sensor. In Fig. 3, for two sets of folded optics comprising lenses 302, 304, 306, 312, 314, 316 and 318, two sensors 322 are used. Light is first collected by lenses 302, 304 and 306. The light is then reflected orthogonally by reflecting prisms 308 and directed through an aperture stop 310. Thereafter, the light is directed by lenses 312, 314 and 316 through filter 318 and finally onto each of the sensors 322. The two sensors 322 are arranged opposite to each other to avoid the light directed to each of them from affecting the other. A single prism 308 having two reflective surfaces may be used to reflect light captured by the two sets of lenses 302, 304, and 306 to the respective two sets of lenses 312, 314, 316 and 318 towards the two sensors 322 at their respective locations. It is appreciated that the use of more than one sensor results in a larger camera and shorter battery life. A need therefore arises to provide an omnidirectional camera that allows for smaller product footprints, less complexity, longer battery life and lighter weight.

Fig. 4 illustrates a side view of an example of an image capturing apparatus 400 for obtaining an image containing an omnidirectional view of a surrounding of the image capturing apparatus 400 (in other words an "omnidirectional image"). Light paths 499 travelling through the various elements are shown in Fig. 4. The image capturing apparatus 400 comprises a first plurality of lenses 402, 404 and 406 that are arranged to capture the omnidirectional view of the surrounding of the image capturing apparatus 400. In the present example, the first plurality of lenses 402, 404 and 406 comprises two sets of wide angle lenses. Light captured from different directions, in this case two opposing directions, by the first plurality of lenses 402, 404 and 406 is reflected respectively by at least one reflective surface 408, at an angle, in this example 90 degrees (or orthogonally), towards a single image plane residing in an image sensor 420. In this case, there is a pair of the reflective surfaces 408. The image capturing apparatus 400 includes a second plurality of lenses 412, 414, and 416 for focusing the light reflected by the at least one reflective surfaces 408 respectively onto the single image plane of the image sensor 420. In the present example, the second plurality of lenses 412, 414 and 416 comprises two sets of wide angle lenses. Filters 418, described in more detail later, are disposed between the lens 416 and the image plane of the image sensor 420 for filtering the light.

With reference to Fig. 4, the lenses 402 and 406 are negative lenses while lens 404 is a positive lens. They are so designed to ensure that they have a FOV greater than 180 degrees. The lenses 402, 404 and 406 direct light towards an aperture stop 410 disposed to receive light from the at least one reflective surfaces 408. The aperture stop 410 is a physical structure that limits the size of a bundle of rays and the amount of light that falls onto the image sensor 420. The aperture stop 410 determines the F-number of the lenses 402, 404 and 406, which is preferably as small as possible to maximize the amount of light reaching the image sensor 420. The lenses 402 and 406 can also be negative lenses in the other non-prior art figures, and the lens 404 can be a positive lens in the other non-prior art figures.

It is appreciated that in actual implementation, all the image capturing apparatuses described in the present disclosure will comprise a housing and support structures for holding the first plurality of lenses 402, 404 and 406, the corresponding number of at least one reflective surfaces 408, the corresponding number of aperture stops 410, the corresponding number of the second plurality of lenses 412, 414 and 416, the corresponding number of filters 418, and the single image sensor 420 in place as shown in the Figures of the present disclosure.

Each set of the lenses 402, 404, and 406 works in combination to provide wide angle image capture with a wide field of view (FOV). The two sets of wide angle lenses 402, 404, and 406 are arranged back to back such that each set of the lenses 402, 404, and 406 faces a direction opposite to the other. The minimum FOV for each set of the wide angle lenses 402, 404, and 406 in an omnidirectional camera employing n sets of wide angle lenses can be obtained from equation (1):

380 / n (1)

For example, Fig. 4 shows that there are 2 sets of wide angle lenses 402, 404, and 406. Hence, n takes the value of 2 and the minimum FOV to be provided by each set of the wide angle lenses 402, 404, and 406 is 190 degrees. The total sum of the FOV of the lenses exceeds 360 degrees because the overlapping portions of the FOV of the images captured by each set of the lenses are required to serve as reference positions during the image stitching process to obtain the omnidirectional image.
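As a quick check of equation (1), the minimum per-set FOV can be computed for any number of lens sets. The sketch below is illustrative only; the helper name is not from the disclosure:

```python
def min_fov_deg(n_sets):
    # Equation (1): each of the n sets of wide angle lenses must cover
    # at least 380/n degrees, so the summed FOV exceeds 360 degrees and
    # adjacent views overlap enough to serve as stitching references.
    return 380.0 / n_sets

print(min_fov_deg(2))  # two sets, as in Fig. 4: 190.0 degrees
print(min_fov_deg(4))  # four sets, as in Fig. 6: 95.0 degrees
```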

The at least one reflective surfaces 408 can be provided by a pair of prisms 408 as shown in Fig. 4, or one of the at least one reflective surfaces 408 is provided by one or more prisms. Each of the prisms can be coated with a reflective metallic or dielectric material on a slanted side of each of the one or more prisms. In another example, the at least one reflective surfaces 408 is provided by one or more freeform prisms. An example of a freeform prism is one with a combination of a freeform surface and a 90 degrees light direction prism. The freeform prism or prisms herein described can be coated with the reflective metallic or dielectric material. The surface of a freeform prism is not rotationally symmetric and can be described using non-uniform rational B-spline (NURBS) mathematics available in a variety of computer aided design (CAD) software such as SolidWorks or Catia from Dassault Systèmes.

The at least one reflective surfaces 408 can also be provided by reflection type diffraction gratings (not shown in the Figures), which are periodic structures that can reflect incoming light into various orders as defined by the equation d sin(θ) = mλ, where d is the distance between periodic structures, θ is the diffraction angle, m is the diffraction order which takes whole numbers, and λ is the operating wavelength. The diffraction gratings can be designed such that most of the energy is diffracted into a selected order and direction. Both reflection type diffraction gratings and freeform surfaces offer more degrees of freedom during optical design and can greatly increase the performance of the final lens design. However, their fabrication and assembly costs can be higher than those of prisms, which may limit their applications in consumer electronics.
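The grating equation above can be rearranged to give the diffraction angle for a chosen order. The sketch below is a minimal illustration (helper name and nanometre units are assumptions, not from the disclosure):

```python
import math

def diffraction_angle_deg(d_nm, m, wavelength_nm):
    # Grating equation d * sin(theta) = m * lambda, solved for theta.
    s = m * wavelength_nm / d_nm
    if abs(s) > 1.0:
        raise ValueError("order m is not physically realizable for this grating")
    return math.degrees(math.asin(s))

# A 1000 nm period grating sends 500 nm light in the first order
# to approximately 30 degrees (sin(theta) = 0.5).
print(diffraction_angle_deg(1000.0, 1, 500.0))
```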

In the example of Fig. 4, the light captured by the first plurality of lenses 402, 404, and 406 is reflected respectively by the pair of prisms 408 at an angle of 90 degrees or orthogonally towards the single image plane of the image sensor 420. When the pair of prisms 408 is used as the at least one reflective surfaces, an optical path length of a lens is extended by the product of the length of each prism and n_d, where n_d is the refractive index, measured at a wavelength of 587 nm, of the material used to fabricate each of the pair of prisms. Advantages of extending the optical path length of the lenses are that the manufacturing tolerances of the lenses can be relaxed and the optical performance of the lenses can be improved. Hence, the use of a prism with a higher refractive index material is more advantageous in the present example. Another advantage of using a prism is that it is easier to align the prism with the different lenses 402, 404, 406, 412, 414 and 416 during assembly of the image capturing apparatus 400.
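Following the relation stated above (path extension = prism length × n_d), a short sketch; the helper name and the sample glass values are illustrative assumptions:

```python
def optical_path_extension_mm(prism_length_mm, n_d):
    # Per the text above, folding through a prism extends the optical
    # path length by the prism length multiplied by the refractive
    # index n_d of the prism material (measured at 587 nm).
    return prism_length_mm * n_d

# A hypothetical 10 mm prism in a glass with n_d ~ 1.517
# (close to a common crown glass) extends the path by ~15.17 mm.
print(optical_path_extension_mm(10.0, 1.517))
```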

Another example of an image capturing apparatus 500 similar to the image capturing apparatus 400 of Fig. 4 is illustrated in Fig. 5. The reference numerals of elements having similar features and functions in Fig. 4 are reused in Fig. 5 for convenience. The at least one reflective surfaces 408 of Fig. 4, however, is now one or more mirrors, specifically a pair of mirrors 502 as shown in Fig. 5. The one or more mirrors 502 can be coated with the reflective metallic or dielectric material. It is appreciated that the at least one reflective surfaces 408 in Fig. 4 or 502 in Fig. 5 can also be any optical device, for instance including a diffraction grating, that can reflect light at an angle towards the single image plane of the image sensor 420. The optical device can be coated with the reflective metallic or dielectric material such that the reflectivity at an operating wavelength is at least 98%. In general, metallic coatings tend to have lower reflectivity while dielectric coatings have higher reflectivity and can be designed to have reflectivity in excess of 99.9% at an operating wavelength. Unlike prisms, the use of mirrors offers no advantages towards the optical design of the lens. A mirror is also more sensitive to alignment during assembly as compared to a prism. However, the cost of using mirrors is lower compared to the cost of using prisms, and mirrors are thus still a viable option. It is appreciated that in another example, both a prism and a mirror can be used as the at least one reflective surfaces 408 in Fig. 4 or 502 in Fig. 5 in the same image capturing apparatus 400 in Fig. 4 or 500 in Fig. 5.

In the present example, the lenses of the first plurality of lenses 402, 404, and 406 are used to direct light captured from a wide FOV into the aperture stop 410, and the second plurality of lenses 412, 414 and 416 are used to correct optical aberrations for the image capturing apparatus 400. Optical aberrations are imperfections of a lens system which cause the image formed to be blurred. Through a careful selection of lens shapes and materials, optical aberrations can be minimized or balanced, resulting in a system with good image quality.

In the example of Fig. 5, the lenses of the second plurality of lenses 412, 414 and 416 are bonded or adhered together using an optical glue to form a lens doublet that is used for correcting chromatic aberrations. The filters 418 are a pair of optical low pass filters (OLPF) 418 disposed between the second plurality of lenses and the image sensor 420. The pair of OLPF 418 is also bonded or adhered together using optical glue as shown in Fig. 5. Such bonding of the second plurality of lenses 412, 414 and 416 and the pair of OLPF 418 brings them closer to facilitate focusing of light on a limited space in the single image plane of the image sensor 420. One of the purposes of the OLPF 418 is to limit the energy of objects having spatial frequencies higher than those that can be supported by the image sensor 420. This prevents Moiré fringes from being present in an image on the single image plane of the image sensor 420. Another purpose of the OLPF 418 is to carry infrared (IR) filtering coatings so as to limit the amount of IR wavelengths that can enter the image sensor 420. This prevents the colors of the image that forms on the image sensor 420 from appearing much redder to the user than they really are. This phenomenon occurs because the materials used to fabricate the image sensor 420 are sensitive to wavelengths in the IR spectrum, and thus the light has to be filtered by additional coatings on the OLPF 418. The filters 418 in Fig. 4 and in the other non-prior art Figures can also be a pair of OLPF.

The patches of the light received by the single image plane located on the image sensor 420 are subsequently processed by a processor, for instance an image signal processor, to form a single image containing the omnidirectional view of the surrounding of the image capturing apparatus 400. The image sensor 420 may have different sizes and aspect ratios depending on the type of lenses and reflective surfaces used for the image capturing apparatus 400. An example of the image sensor 420 in Fig. 4 is one having a size of 1/2.7" (e.g. 5.37 mm by 4.04 mm) and a generic 4:3 aspect ratio. Another example of the image sensor 420 in Fig. 5 is one having a size of 1/1.8" (e.g. 7.18 mm by 5.32 mm) and a generic 4:3 aspect ratio.

In the present example, the first plurality of lenses 402, 404, and 406 and the second plurality of lenses 412, 414 and 416 are made from materials having different optical properties. The optical properties of a transparent material are defined by its n_d and v_d values. n_d is the refractive index of the optical glass material used to fabricate a lens, measured at 587 nm. n_d is a measure of the ability of an optical material to bend light, where higher values indicate stronger light bending capabilities. v_d is the Abbe number of the optical material. The Abbe number is a measure of the dispersive properties of the optical material, where a low value indicates strong dispersive properties while a large v_d value indicates weaker dispersive properties. Examples of the optical properties (or prescription) of the first plurality of lenses 402, 404, and 406 and the second plurality of lenses 412, 414 and 416, using the pair of prisms 408 as the at least one reflective surface 408 as shown in Fig. 4, are shown in Table 1 as follows. The surface in Table 1 refers to the numbering of the surfaces of the optical elements in order of position (with respect to Fig. 4), beginning from the front most light receiving surface facing the surrounding of the image capturing apparatus in Fig. 4 and ending at the final light destination, which is the image plane of the image sensor 420. The radius in Table 1 is a measure of the curvature of the respective surface of the stated lens or prism. The thickness in Table 1 refers to the axial distance of the stated lens or prism. In the optical elements of Table 1, the coordinates are rotated 90 degrees about surface 8 of the prism 408 such that the distances that follow are negative.

Table 1

The readings of the optical elements of Table 1 aim to achieve an effective focal length of 0.7 mm, an F-number of 2, and a FOV of 190 degrees. F-number refers to the focal length of a camera lens assembly divided by the entrance pupil diameter. The entrance pupil is the image of the aperture stop seen through the optical elements in front of it, namely the prism 408 and the lenses 406, 404 and 402. For example, F2 indicates that the focal length is two times the diameter of the entrance pupil. Table 1 is the optical prescription of a lens set that can be used to realise an omnidirectional camera with two identical sets of lenses 402 to 406 and 410 to 418 (including the aperture stop 410 and filter 418) and two 90° reflecting prisms 408.
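The F-number relation stated above can be computed directly; for the Table 1 targets (focal length 0.7 mm, F-number 2) the implied entrance pupil diameter is 0.35 mm:

```python
def f_number(focal_length_mm: float, entrance_pupil_mm: float) -> float:
    # F-number = focal length / entrance pupil diameter.
    return focal_length_mm / entrance_pupil_mm

# F2 means the focal length is twice the entrance pupil diameter,
# so a 0.7 mm focal length at F2 implies a 0.35 mm entrance pupil.
```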

With reference to Table 1 and Fig. 4, L1 497 is the distance from the vertex of lens 402 to the light reflecting surface 8 of the prism 408 (or specifically, to a point just before light is reflected by the light reflecting surface 8), while L2 498 is the distance from the light reflecting surface of the prism 408 (or specifically, from a point just after light is reflected off the light reflecting surface 8) to the image plane of the image sensor 420. In an omnidirectional camera, the wide angle lenses used are always the components that take up the most space. For an omnidirectional camera to have a low profile and compact design, it would be desirable that the absolute distance of L2 498 is less than the absolute distance of L1 497, as given by equation (2) below:

|L2| < |L1|     (2)
With reference to Table 1, |L1| is 18.35 mm while |L2| is 8.34 mm, thus satisfying equation (2). Ideally, the ratio of |L2| to |L1| should also satisfy equation (3), that is:

|L2| / |L1| < 1     (3)
With reference to Table 1, equation (3) yields a value of 0.454, which is less than 1, thus satisfying equation (3). When both equations (2) and (3) are satisfied, an omnidirectional camera with a compact and low profile design can be realised, which is very suitable for omnidirectional cameras with a product design that has a housing resembling a circular, rectangular or squarish shape.
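The compactness conditions of equations (2) and (3) can be checked numerically with the Table 1 values quoted in the text; absolute values are taken because the coordinates behind the reflecting surface are rotated so that distances there are negative:

```python
def is_compact(l1_mm: float, l2_mm: float) -> bool:
    """Check equations (2) and (3): the folded distance |L2| behind the
    reflecting surface must be shorter than the front distance |L1|, and
    their ratio must be below 1."""
    l1, l2 = abs(l1_mm), abs(l2_mm)
    return l2 < l1 and l2 / l1 < 1.0

# Table 1 values quoted in the text: |L1| = 18.35 mm, |L2| = 8.34 mm.
ratio = abs(8.34) / abs(18.35)   # equation (3) ratio, about 0.454
```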

With reference to Table 1 and Fig. 4, lens 402 is, for example, a strongly negative lens, sometimes called a negative meniscus lens. The fabrication cost of such a lens within a lens assembly is always the highest compared to the other lenses, as it is the largest lens and the concave shape of surface 2 (the light exiting surface of 402) is very difficult to polish optically, resulting in low yield during fabrication. It would be desirable if the ratio between the diameter, D, of surface 2 (the light exiting surface of 402) and the modulus of the radius of surface 2, |R|, can be controlled to be less than 1.92, that is, equation (4) below:

D / |R| < 1.92     (4)
In the case of lens 402 in Table 1, D is 6.84 mm and |R| is 3.567 mm, resulting in a value of 1.918, which satisfies equation (4). As the ratio D/|R| approaches 2, the shape of surface 2 (the light exiting surface of 402) approaches that of a hemisphere. A hemispherical optical surface is difficult to polish because the polishing pads cannot effectively remove optical material during the polishing process, resulting in low yield. In that respect, by ensuring that the ratio D/|R| < 1.92, one can effectively control the shape of the concave surface, therefore ensuring high yield.
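The manufacturability check of equation (4) is a one-line ratio test; the numbers below are the Table 1 values quoted in the text:

```python
def polishable(diameter_mm: float, radius_mm: float, limit: float = 1.92) -> bool:
    # Equation (4): keep D/|R| below 1.92 so the concave surface stays
    # short of a full hemisphere and can be polished with good yield.
    return diameter_mm / abs(radius_mm) < limit

# Lens 402 in Table 1: D = 6.84 mm, |R| = 3.567 mm, ratio about 1.918.
```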

Examples of the optical properties (or prescription) of the first plurality of lenses 402, 404, and 406 and the second plurality of lenses 412, 414 and 416, using the pair of mirrors as the at least one reflective surface 408 as shown in Fig. 5, are shown in Table 2 as follows. The parameters in Table 2 are the same as the parameters in Table 1, except that the readings in Table 2 apply to the optical elements in Fig. 5. In Table 2, the coordinates are rotated 90 degrees about surface 9 of the mirror 502 such that the distances that follow are negative. It is appreciated that although the shapes of the lenses in Fig. 4 and Fig. 5 look similar, they are actually not the same due to differences in the actual prescriptions of the optical elements. The drawings in Fig. 4 and Fig. 5 are drawn to look similar for convenience, to illustrate that they have similar optical elements with similar functions and purposes. Furthermore, it is appreciated that the readings in Tables 1 and 2 are non-obvious to a skilled person because they have been obtained and optimized through extensive experimentation and calculation for directing light to a single image plane of an image sensor.

Table 2

Similarly, the readings of the optical elements of Table 2 aim to achieve an effective focal length of 0.7 mm, an F-number of 2, and a FOV of 190 degrees.

With reference to Table 2 and Fig. 5, L1 497 is the distance from the vertex of lens 402 to the light reflecting surface 8 of the mirror 502 just before reflection, while L2 498 is the distance from the light reflecting surface 8 of the mirror 502, just after light is reflected, to the image plane of the image sensor 420. L1 497 and L2 498 are 17.78 mm and 6.59 mm respectively, which when applied to equations (2) and (3) satisfy both conditions. In the case of equation (3), the ratio |L2| / |L1| yields a value of 0.371, which indicates that it would be possible to use such a lens design to get a compact and low profile omnidirectional camera that fits into a compact circular, squarish or rectangular product design. Application of equation (4) to the parameters of surface 2 (the light exiting surface of 402) in Table 2 yields a D/|R| value of 1.91, which satisfies the condition stated in equation (4), indicating that the yield of this lens 402 will be high.

The lens prescriptions presented in Table 1 and Table 2 are example lenses fabricated purely from spherical lenses with only six lens elements (402, 404, 406, 412, 414, and 416). Spherical lenses are lenses whose surfaces can be fully described by the surface of a sphere. Generally, omnidirectional cameras in the prior art use a mixture of spherical lenses and at least two aspherical lenses. The surface equation of an aspherical lens is given as equation (5) below:

z = (r^2 / R) / (1 + sqrt(1 - (1 + K) r^2 / R^2)) + A_2 r^2 + A_4 r^4 + A_6 r^6 + ...     (5)
where z is the surface sag of the lens, which is the distance measured parallel to the optical axis from the vertex of the lens to the surface at radial distance r, r is the radial distance, R is the radius of curvature, K is the conic constant, A_2 is the second order aspheric constant, A_4 is the fourth order aspheric constant, A_6 is the sixth order aspheric constant, and so on. The equation above can also be used to describe the shape of a spherical lens by considering only the first term and dropping the rest.
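Equation (5) can be evaluated directly; with K = 0 and no polynomial terms it reduces to the exact sag of a sphere, R - sqrt(R^2 - r^2), as the text notes:

```python
import math

def aspheric_sag(r: float, R: float, K: float = 0.0, coeffs=()) -> float:
    """Surface sag z(r) of equation (5): conic base term plus even-order
    polynomial terms A2*r^2 + A4*r^4 + A6*r^6 + ..."""
    c = 1.0 / R  # surface curvature
    z = (c * r * r) / (1.0 + math.sqrt(1.0 - (1.0 + K) * c * c * r * r))
    for i, a in enumerate(coeffs):   # coeffs = (A2, A4, A6, ...)
        z += a * r ** (2 * (i + 1))
    return z
```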

Prior art omnidirectional cameras tend to use at least seven lens elements, of which at least two elements are aspherical lenses, to improve optical performance and reduce the track length of the camera. However, the application of aspherical lenses in a lens assembly requires very tight tolerances to control the de-center of the lenses, which leads to increased assembly costs and yield losses. Different from the prior art, the number of elements is reduced from seven to six or fewer. An example of six elements is the case of the lenses 402, 404, 406, 412, 414, and 416 in Fig. 4, 5, 6 and 7. The assembly process of the lens assembly of the present disclosure can thereby be further simplified and the cost can be further reduced.

Optical elements having the optical properties in Tables 1 and 2 have an effective focal length of 0.7 mm, an F-number of 2.0, a total track length, measured from a vertex of the lens 402 to the image sensor 420, of 27.2 mm, a FOV of 190 degrees and an image circle size of diameter 2.06 mm. The mapping function of the lens in this case is y = f·θ. The lenses can be fabricated from commercially available optical glass and comprise spherical surfaces. The mapping function herein described is how an image maps onto an image plane of an image sensor. For a rectilinear lens, this is given by y = f tan(θ), where y is the half image height formed on the image plane of the image sensor. However, for wide angle lenses this equation fails, as any FOV greater than 90° would result in an infinite image height. Other commonly used mapping functions include y = f sin(θ), y = 2 f sin(θ/2), y = 3 f tan(θ/3), y = 2 f tan(θ/2) and y = f·θ.
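The mapping functions listed above can be compared numerically; this sketch evaluates the half image height y for a field angle θ (in radians) under each mapping named in the text (the function and mapping names are illustrative, not from the application):

```python
import math

def image_height(f_mm: float, theta_rad: float, mapping: str = "f-theta") -> float:
    """Half image height y on the image plane for a field angle theta,
    under the mapping functions listed in the disclosure."""
    maps = {
        "rectilinear": f_mm * math.tan(theta_rad),         # y = f tan(theta)
        "orthographic": f_mm * math.sin(theta_rad),        # y = f sin(theta)
        "equisolid": 2.0 * f_mm * math.sin(theta_rad / 2), # y = 2 f sin(theta/2)
        "stereographic": 2.0 * f_mm * math.tan(theta_rad / 2),
        "f-theta": f_mm * theta_rad,                       # y = f * theta
    }
    return maps[mapping]

# Note how the rectilinear mapping blows up as theta approaches 90 degrees,
# while the f-theta mapping stays finite for any field angle.
```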

In examples of the present disclosure, there can be a plurality of sets of lenses, each set comprising, for instance, the first plurality of lenses 402, 404 and 406 of Fig. 4, facing in different directions of the surrounding of the image capturing apparatus 400 from one another, wherein the angles of the fields of view of the plurality of sets of lenses add up to more than 360 degrees so as to provide sufficient coverage of the surrounding. The number of sets of the plurality of sets of lenses can be in multiples of 2 or an odd number that is more than 2. Fig. 6 shows a top view of another example of an image capturing apparatus 600 that has some similar features as the image capturing apparatuses 400 of Fig. 4 and 500 of Fig. 5. The reference numerals of elements having similar features and functions in Fig. 4 and Fig. 5 are reused in Fig. 6 for convenience. The main difference between Fig. 6 and Fig. 4 or Fig. 5 is in the number of the first plurality of lenses 402, 404 and 406, the corresponding number of at least one reflective surfaces 408, the corresponding number of aperture stops 410, the corresponding number of the second plurality of lenses 412, 414 and 416, and the corresponding number of filters 418. Instead of just two sets of lenses for the first plurality of lenses 402, 404 and 406, a pair of reflective surfaces 408, two aperture stops 410, two sets of lenses for the second plurality of lenses 412, 414 and 416 and a pair of filters 418, which is the case for Fig. 4 and Fig. 5, there are now four sets of lenses for the first plurality of lenses 402, 404 and 406, two pairs of reflective surfaces 408, two pairs of aperture stops 410, four sets of lenses for the second plurality of lenses 412, 414 and 416, and two pairs of filters 418. In Fig. 6, the four sets of lenses of the first plurality of lenses 402, 404, 406 face different directions orthogonal to one another and face the surrounding of the image capturing apparatus 600 in a manner to capture light from the different directions for producing a 360 degree or omnidirectional image of the surrounding with respect to the image capturing apparatus 600. An example of the image sensor 420 in Fig. 6 has a size of 4/3" (e.g. 17.3 mm by 13 mm) and a generic 4:3 aspect ratio. The optical elements 402, 404, 406, 408, 410, 412, 414, 416, and 418 in Fig. 6 are arranged such that optical information of a sphere centred about the image capturing apparatus 600 can be imaged onto the single image plane of the image sensor 420. In the case of Fig. 6, as there are four sets of the first plurality of lenses 402, 404, and 406, n of equation 1 takes the value of 4 and the minimum FOV of each set of the lenses is 95 degrees.

Fig. 7 shows a top view of yet another example image capturing apparatus 700 that has some similar features as the image capturing apparatuses 400 of Fig. 4, 500 of Fig. 5 and 600 of Fig. 6. The reference numerals of elements having similar features and functions in Fig. 4, Fig. 5 and Fig. 6 are reused in Fig. 7 for convenience. The main difference between Fig. 7 and Fig. 4, Fig. 5 or Fig. 6 is in the number of the first plurality of lenses 402, 404 and 406, the corresponding number of at least one reflective surfaces 408, the corresponding number of aperture stops 410, the corresponding number of the second plurality of lenses 412, 414, and 416 and the corresponding number of filters 418. Instead of just two or four sets of lenses for the first plurality of lenses 402, 404 and 406, two or four reflective surfaces 408, two or four aperture stops 410, two or four sets of lenses for the second plurality of lenses 412, 414, and 416 and two or four filters 418, which is the case for Fig. 4, Fig. 5 or Fig. 6, there are now six sets of lenses for the first plurality of lenses 402, 404 and 406, six reflective surfaces 408, six aperture stops 410, six sets of lenses for the second plurality of lenses 412, 414 and 416, and six filters 418. In Fig. 7, the six sets of lenses of the first plurality of lenses 402, 404, 406 are evenly facing different directions of the surrounding of the image capturing apparatus 700 in a manner to capture light from the different directions for producing a 360 degree or omnidirectional image of the surrounding with respect to the image capturing apparatus 700. An example of the single image sensor 420 in Fig. 7 has a size of 4/3" (e.g. 17.3 mm by 13 mm) and a generic 4:3 aspect ratio. The optical elements 402, 404, 406, 408, 410, 412, 414, 416, and 418 in Fig. 7 are arranged such that optical information of a sphere centred about the image capturing apparatus 700 can be imaged onto the single image plane of the image sensor 420. In the case of Fig. 7, as there are six sets of the first plurality of lenses 402, 404, and 406, n of equation 1 takes the value of 6 and the minimum FOV of each set of the lenses is approximately 63.3 degrees.
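The minimum FOVs quoted for the different configurations (190 degrees for two sets, 95 degrees for four sets, approximately 63.3 degrees for six sets) are all consistent with equation (1) taking the form FOV_min = 380°/n, i.e. the sets' fields of view summing to slightly more than 360 degrees to guarantee overlap. This form is an inference from the quoted values, not reproduced from the application:

```python
def min_fov_per_set(n_sets: int, total_deg: float = 380.0) -> float:
    # Assumed form of equation (1): each of the n sets must cover at
    # least total/n degrees so that the combined coverage exceeds
    # 360 degrees with some overlap for stitching.
    return total_deg / n_sets
```

Note that n = 2 reproduces the 190 degree FOV of the two-set designs in Fig. 4 and Fig. 5.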

Fig. 8 shows the image capturing apparatus 400 of Fig. 4. Fig. 8 illustrates a "dead zone" or "blind spot" 800 of the image capturing apparatus 400, where nothing in this dead zone 800 can be captured as part of the omnidirectional image the image capturing apparatus 400 is configured to capture. This dead zone 800 is due to all lenses in the image capturing apparatus 400 having a finite physical length from a vertex of the first lens 402 to the image sensor 420. The size of this dead zone 800 can be reduced by increasing the FOV of the first plurality of lenses 402, 404, and 406, and by reducing the physical size of the image capturing apparatus 400 to reduce the product footprint (product refers to the image capturing apparatus 400) such that the distance d between the vertexes of the two outer most facing lenses 402 is as short as possible. The dead zone 800 exists on two opposing sides of the image capturing apparatus 400, but only one side is drawn in Fig. 8 to keep the figure simple. To increase the FOV of the first plurality of lenses 402, 404, and 406 whilst trying to maintain a short distance d between the vertexes of the two outer most facing lenses 402 would require fairly complicated optical designs and the use of aspheric lenses. This poses challenges during lens fabrication and assembly and increases manufacturing costs. By moving from the multiple sensor platform of existing omnidirectional cameras to a single image sensor, and by using folded optics involving the at least one reflective surface 408 to direct light orthogonally through the aperture stops 410, the second plurality of lenses 412, 414, and 416, and the filters 418 to the image sensor 420, the challenges during lens fabrication and assembly and the costs mentioned above can be sidestepped. The distance d between the vertexes of the two outer most facing lenses 402 can be kept short by folding the optical axis, while the total optical track length of the lenses 402, 404, 406, 412, 414 and 416 is maintained to be long. This further yields advantages in lens design, as the curvature of each optical surface of the lenses 402, 404, 406, 412, 414 and 416 can be more relaxed, resulting in a design with less sensitive manufacturing tolerances compared to lenses of similar optical performance that do not employ folded optics, have a lower optical track length, and use multiple image sensors.

To minimize cost in the assembly of an omnidirectional camera, for instance the image capturing apparatus 400, 500, 600 and 700, it would be ideal to use an image sensor 420 that is as small as possible. However, due to opto-mechanical constraints, it can sometimes be difficult to fit two images onto a single image plane of an image sensor 420. For example, in the image plane 1000 of an image sensor (for example, the image sensor 420) shown in Fig. 10, parts of the light patches 1002 and 1004 directed to the image plane 1000 from lenses of the omnidirectional camera fall outside the image plane 1000 due to opto-mechanical constraints of the omnidirectional camera. This will affect the omnidirectional image to be generated. In cases like this, with reference to Fig. 11, it would be advantageous to dispose the image plane 1100 in an orientation for fully receiving the patches of light 1102 and 1104, if the shape of the image plane 1100 cannot fully receive the patches of light 1102 and 1104 in another orientation of the image plane 1100 (for example, the orientation of the image plane 1000 in Fig. 10). Specifically, the orientation of the image plane 1100 for fully receiving the patches of light 1102 and 1104 can be such that the image plane 1100 is rotated by 25 degrees to 45 degrees, about an axis normal (or perpendicular) to the image plane 1100, from the orientation that cannot fully receive the patches of light 1102 and 1104. In the case of the image plane 1100, its orientation is rotated by 45 degrees. This maximizes the image plane surface provided by a smaller image sensor. Rotating the image plane of the image sensor into a diagonal configuration as shown in Fig. 11 allows a smaller, lower cost image sensor to capture multiple images in the form of the light patches 1102 and 1104.

If the rotation illustrated by the image plane 1100 is required, the exact rotation required depends on a combination of factors such as the physical sizes of the folded optics, that is, the size of the first plurality of lenses 402, 404, and 406, the number of sets of the first plurality of lenses 402, 404, and 406 used, the image size (which relates to the light patch size) to be formed on the image plane 1100, the aspect ratio of the image sensor 420, the design of the lenses used, and the size of the image sensor 420. For example, with reference to Fig. 9, an image capturing apparatus 900 is shown. The image capturing apparatus 900 has similar features as the image capturing apparatuses 400 of Fig. 4, 500 of Fig. 5, 600 of Fig. 6, and 700 of Fig. 7. The reference numerals of elements having similar features and functions in Fig. 4, Fig. 5, Fig. 6 and Fig. 7 are reused in Fig. 9 for convenience. The main difference between Fig. 9 and Fig. 4, Fig. 5, Fig. 6 and Fig. 7 is in the orientation of the image plane of the image sensor 420. Like the case in Fig. 11, the image plane of the image sensor 420 in Fig. 9 is rotated. Specifically, the single image sensor 420 in Fig. 9 has a size of 4/3" (e.g. 17.3 mm by 13 mm) to accommodate a generic 4:3 aspect ratio. The same images in Fig. 9 could fit onto a smaller generic 4:3 aspect ratio sensor of size 1" (12.8 mm by 9.6 mm) if the image plane were rotated by an angle of about 33 degrees. By rotating at an angle of 33 degrees, the image plane area of the image sensor 420 is maximised to accommodate the two images or light patches. If the image sensor is arranged in a horizontal orientation and not rotated, the light patches would not fit fully on the smaller sensor and the situation would be the same as that shown in Fig. 10.
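The geometric intuition behind the rotation can be sketched as follows: the separation between the two image circle centres is fixed by the opto-mechanics, so rotating the line of centres towards the sensor diagonal trades unused height for the extra length needed. The separation and circle diameter used below are hypothetical illustrative values, not taken from the application:

```python
import math

def circles_fit(w_mm: float, h_mm: float, sep_mm: float,
                d_mm: float, rot_deg: float) -> bool:
    """Do two image circles of diameter d, whose centres sit a fixed
    opto-mechanical separation sep apart on a line through the sensor
    centre, fit within a w x h image plane when that line is rotated
    rot_deg away from the sensor's long axis?"""
    r, half = d_mm / 2.0, sep_mm / 2.0
    phi = math.radians(rot_deg)
    # Each circle centre is at (+-half*cos(phi), +-half*sin(phi)); the
    # circle fits if it stays inside both half-width and half-height.
    return (half * math.cos(phi) + r <= w_mm / 2.0 and
            half * math.sin(phi) + r <= h_mm / 2.0)

# Hypothetical numbers for a 1" 4:3 sensor (12.8 mm x 9.6 mm): circles
# that overflow the long axis unrotated fit once rotated by 33 degrees.
```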

Most commercially available sensors have an aspect ratio of 4:3, which is most commonly used for mobile phones or low cost digital cameras. Although the examples shown in the present disclosure use sensors of 4:3 aspect ratio, other types of commercially available sensors having aspect ratios of 3:2 and 16:9, which are typically used in professional grade cameras, can also be used. These sensors have a much longer width compared with 4:3 aspect ratio sensors. In cases like these, it is much easier to fit multiple images onto a single image sensor, even without having to rotate the image plane of the image sensor, as the physical length of the image sensor is long enough to accommodate multiple images. This is illustrated by Fig. 13, 14, and 15. Fig. 12 shows how a single image 1202 (or light patch) is focused on an image plane 1200 having a 16:9 ratio in conventional cameras.

Fig. 13 shows how two images 1302 and 1304 (or light patches) directed by two sets of lenses, for example, each set comprising a set of the first plurality of lenses 402, 404, and 406 in Fig. 4 or 5, one reflective surface 408 in Fig. 4 or 5, one aperture stop 410 in Fig. 4 or 5, a set of the second plurality of lenses 412, 414, and 416 in Fig. 4 or 5, and a filter 418 in Fig. 4 or 5, can be formed on an image plane 1300 of an image sensor with a 16:9 format.

Fig. 14 shows how four images 1402, 1404, 1406 and 1408 (or light patches) directed by four sets of lenses, for example, each set comprising a set of the first plurality of lenses 402, 404, and 406 in Fig. 6, one reflective surface 408 in Fig. 6, one aperture stop 410 in Fig. 6, a set of the second plurality of lenses 412, 414, and 416 in Fig. 6, and a filter 418 in Fig. 6, can be formed on an image plane 1400 of an image sensor with a 16:9 format.

Fig. 15 shows how six images 1502, 1504, 1506, 1508, 1510, and 1512 (or light patches) directed by six sets of lenses, for example, each set comprising a set of the first plurality of lenses 402, 404, and 406 in Fig. 7, one reflective surface 408 in Fig. 7, one aperture stop 410 in Fig. 7, a set of the second plurality of lenses 412, 414, and 416 in Fig. 7, and a filter 418 in Fig. 7, can be formed on an image plane 1500 of an image sensor with a 16:9 format.

As the aspect ratio of the image sensor increases, the area of the image plane of the image sensor increases and it becomes more advantageous to direct multiple images (or light patches) onto the image plane.

The examples in the present disclosure advocate imaging of images onto a common single image plane of an image sensor. As there is only a single image plane of the image sensor, the electronics and processing of the example image capturing apparatuses can be simplified, as shown in the block diagram in Fig. 16. Fig. 16 shows the block diagram of the components required to capture and process two images imaged onto a single image plane of an image sensor 1606 (similar to the image sensor 420 in the earlier figures). The two images are each captured by a lens set 1602 or 1604 comprising the first plurality of lenses 402, 404 and 406 in the earlier figures, the corresponding number of at least one reflective surfaces 408 in the earlier figures, the corresponding number of aperture stops 410 in the earlier figures, the corresponding number of the second plurality of lenses 412, 414, and 416 in the earlier figures, and the corresponding number of filters 418 in the earlier figures. Electronic components are housed on a printed circuit board (PCB) known as a System on Chip (SoC) 1608. The SoC 1608 houses electronics for image processing such as an Image Signal Processor (ISP) 1610, a microprocessor 1614, memory 1612 and input/output interfaces such as a universal serial bus (USB) interface 1616 for wired data communication with external devices and a wireless connection interface 1618 (which can be a wireless integrated chip) for wireless data communication. The features of each of the lens sets 1602 and 1604 and the image sensor 1606 in Fig. 16 are similar to each optical set comprising the optical elements 402, 404, 406, 408, 410, 412, 414, 416, 418 as shown in Fig. 4, 5, 6 and 7. That is, each of the lens sets 1602 and 1604 directs optical information to be collected onto an image plane of the image sensor 1606. The optical information refers to data of the images (or light patches) directed onto the image sensor 1606. In the case of Fig. 16, there are two images or light patches captured by the image sensor 1606.

The ISP 1610 communicates with the image sensor 1606 and extracts the optical information stored on it. The images (or light patches) stored on the image sensor 1606 are each in the form of a circle due to the lens design. Images (or light patches) in omnidirectional cameras can be circular, as lenses providing such circular images are easier to design and of lower cost to fabricate and assemble. One purpose of the ISP 1610 is to carry out a process known as dewarping, which maps the circular image into a rectilinear (rectangular) form. The ISP 1610 also carries out additional operations such as sharpening to make the images appear sharper to the human eye. Data of the images (or light patches) directed or focused on the image plane of the image sensor 1606 are sent to a microprocessor 1614 to be stitched together to form an omnidirectional image. As the lens sets 1602 and 1604 used to capture the images have a wide FOV, there will be some overlap between the rectilinear images formed from the circular images or light patches. This overlap between the images or light patches serves as reference points or locations to indicate where stitching is to be carried out. Once the microprocessor 1614 determines where these reference points or locations are, the microprocessor 1614 carries out all the required calibration, such as hue, gain and white balancing, to ensure that each image or light patch appears similar to the human eye after the stitching process has been carried out. Once that has been done, the images are stitched together to form a spherical or omnidirectional image. After the image has been formed, it is converted into JPEG format and can optionally be transferred to a computer 1620 (which can be a desktop computer, laptop, tablet computer, smart phone, and the like) via the USB interface 1616 or through the wireless connection interface 1618.
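The mapping from a circular fisheye image to a rectilinear panorama is typically implemented as an inverse per-pixel lookup: for each output pixel, compute which source pixel in the circle it came from. The sketch below is a generic illustration of this step, not the ISP 1610's actual algorithm; it assumes an equidistant (f-theta) lens whose 180 degree field of view fills the circle (the disclosed lens covers 190 degrees, but 180 keeps the geometry simple):

```python
import math

def dewarp_pixel(u, v, out_w, out_h, cx, cy, radius):
    """Map an output panorama pixel (u, v) back to a source pixel in a
    circular fisheye image of the given radius centred at (cx, cy),
    assuming an equidistant (f-theta) lens with a 180-degree FOV."""
    lon = (u / out_w - 0.5) * math.pi       # longitude: -90 to +90 degrees
    lat = (0.5 - v / out_h) * math.pi       # latitude:  +90 to -90 degrees
    # Unit direction vector of this panorama pixel
    x = math.cos(lat) * math.sin(lon)
    y = math.sin(lat)
    z = math.cos(lat) * math.cos(lon)
    theta = math.acos(z)                    # angle off the optical axis
    rho = radius * theta / (math.pi / 2.0)  # f-theta: rho proportional to theta
    psi = math.atan2(y, x)                  # azimuth around the circle centre
    return cx + rho * math.cos(psi), cy + rho * math.sin(psi)
```

A full dewarp loops this over every output pixel and samples (with interpolation) the source image at the returned coordinates; the overlap between the two dewarped images then provides the reference locations for stitching.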

Fig. 17 shows a conventional omnidirectional camera that requires processing of two images imaged, through the use of two lens sets 1702 and 1704, onto two different image sensors 1706 and 1708. Similar to the image capturing apparatus 300 in Fig. 3, each lens set 1702 or 1704 is responsible for directing light to a respective image sensor 1706 or 1708. For every image sensor 1706 or 1708 present, a corresponding ISP 1710 or 1724 respectively is used to carry out image processing to obtain an omnidirectional image. Furthermore, before the image stitching process to obtain the omnidirectional image can be carried out, the individual images (or received light patches) processed by each ISP 1710 and 1724 have to be combined into a single frame. The images are combined into the same frame by sending the processed images to a Field-Programmable Gate Array (FPGA) 1722. The FPGA 1722 and the extra ISP 1724 are not required if imaging is carried out on a common single sensor, as the images are already collected within the same frame of the common single image sensor. Hence, the example shown in Fig. 16 has a better design than the conventional omnidirectional camera shown in Fig. 17. The memory 1712, microprocessor 1714, USB interface 1716, wireless connection interface 1718, and computer 1720 may be similar to the memory 1612, microprocessor 1614, USB interface 1616, wireless connection interface 1618, and computer 1620, and they are provided for completeness and illustration only. An actual conventional omnidirectional camera may not have any one of the memory 1712, microprocessor 1714, USB interface 1716, wireless connection interface 1718, and computer 1720.

Fig. 18 illustrates a simplified diagram of the stitching process discussed earlier for an omnidirectional camera (for example, the image capturing apparatuses 400, 500, 600, and 700 shown in Fig. 4, 5, 6 and 7) with two lens sets (for example, the optical sets comprising the optical elements 402, 404, 406, 408, 410, 412, 414, 416, 418 shown in Fig. 4, 5, 6 and 7) imaging onto a common image sensor (for example, 420 in Fig. 4, 5, 6, and 7). I1 1802 and I2 1804 are circular images (or light patches) formed on an image plane of the common image sensor. The images I1 1802 and I2 1804 are thereafter dewarped into the rectilinear forms R1 1902 and R2 1904 respectively, as shown in Fig. 19. With reference to Fig. 20, overlapping regions in R1 1902 and R2 1904 are then detected, and both images R1 1902 and R2 1904 are stitched at the overlapping regions to form a spherical or omnidirectional image. From Fig. 16 to 20, it is clear that by moving from a multiple sensor platform to one that uses a common image sensor, the stitching process can be simplified and the electronics required can be reduced. Thus, a simpler and better design that costs less is realised.

In conventional omnidirectional cameras, the sensors and the supporting electronics (ISP and FPGA) consume a large part of the battery life. For some conventional omnidirectional devices that are commercially available, the battery life for continuous video shooting is no longer than 60 minutes. Thus, any reduction in the electronics required would prolong battery life and generate less heat. High heat build-up within an air and water tight device would result in the device heating up locally, and this can result in a drop in imaging quality if plastic optics are used, which is common in consumer electronics, as the thermal expansion of plastics is about 10 times higher than that of glass (a costlier alternative to plastics).

The examples of image capturing apparatuses 400, 500, 600 and 700 described with reference to Fig. 4, 5, 6 and 7 respectively may be used for consumer electronics operating within a visible spectrum of 400-700 nm by employing the required type of glass and optics lens. However, the operating wavelength of these image capturing apparatuses 400, 500, 600 and 700 may be shifted to short wave infrared (SWIR) that operates between 3000-5000 nm or to long wave infrared (LWIR) that operates between 8000-12000 nm. This can be achieved by replacing the lenses (for example, 402, 404, 406, 412, 414 and 416 in Fig. 4, 5, 6 and 7) with ones made from materials such as Silicon, Germanium, CaF2, ZnS, ZnSe and the chalcogenide glasses. A SWIR based or a LWIR based omnidirectional camera with a single image sensor according to the examples in the present disclosure would allow for the fabrication of compact devices for surveillance and detection applications. It is possible for the lenses to make use of a special surface such as an aspheric, diffractive or hybrid surface, which, within the context of this disclosure, is defined to be a surface having aspheric and diffractive properties. These surfaces may be manufactured through conventional plastic or glass molding processes, or directly diamond turned using single point diamond turning (SPDT) machining techniques. The application of these special surfaces can simplify the optical design of the lens by reducing the number of optical elements while at the same time improving the optical performance of the lens. That is, in other examples, the lenses 402, 404, 406, 412, 414 and 416 in Fig. 4, 5, 6 and 7 may be machined with such special surfaces.

Furthermore, between Fig. 4, 5, 6 and 7 and the prior art Fig. 1, 2 and 3, there are clearly differences in the sizes and dimensions, and the designs in Fig. 4, 5, 6 and 7 have a better form factor. Fig. 6 and 7 also have more elements, but they provide better accuracy and have smaller dead zones.

Whilst there has been described in the foregoing description several examples, it will be understood by those skilled in the technology concerned that many variations or modifications in details of design or construction may be made without departing from the scope of the present disclosure in relation to the non-prior art subject matter.