Title:
A MULTI-IMAGE SENSOR APPARATUS
Document Type and Number:
WIPO Patent Application WO/2018/077446
Kind Code:
A1
Abstract:
An apparatus comprising: a first set of optical elements arranged in a single plane, wherein each optical element is configured to focus light from a different field of view onto an associated different sensor and wherein each optical element has an optical axis within a single plane; and a first set of sensors, not lying in any single plane, wherein each sensor is configured to sense light focused from an associated one of the first set of optical elements.

Inventors:
JÄRVENPÄÄ TONI JOHAN (FI)
SALMIMAA MARJA PAULIINA (FI)
Application Number:
PCT/EP2016/076262
Publication Date:
May 03, 2018
Filing Date:
October 31, 2016
Assignee:
NOKIA TECHNOLOGIES OY (FI)
International Classes:
H04N5/232; G03B5/04; H04N5/225
Foreign References:
US20110069148A1 (2011-03-24)
US20150234198A1 (2015-08-20)
JP2013003756A (2013-01-07)
EP2037685A1 (2009-03-18)
Other References:
ANONYMOUS: "Tilt-shift photography - Wikipedia", 15 August 2016 (2016-08-15), XP055380864, Retrieved from the Internet [retrieved on 20170613]
TAO SUN ET AL: "A Novel Multi-Digital Camera System Based on Tilt-Shift Photography Technology", SENSORS, vol. 15, no. 4, 31 March 2015 (2015-03-31), pages 7823 - 7843, XP055380900, DOI: 10.3390/s150407823
Attorney, Agent or Firm:
HIGGIN, Paul et al. (GB)
Claims:
CLAIMS

1. An apparatus comprising:

a first set of optical elements arranged in a single plane, wherein each optical element is configured to focus light from a different field of view onto an associated different sensor and wherein each optical element has an optical axis within a single plane; and

a first set of sensors, not lying in any single plane, wherein each sensor is configured to sense light focused from an associated one of the first set of optical elements.

2. An apparatus as claimed in claim 1, wherein each of the first set of optical elements has a field of view, and the fields of view of the first set of optical elements in combination provide a field of view that is 360° within the single plane and 180° out of the single plane.

3. An apparatus as claimed in any preceding claim, wherein each of the first set of optical elements has the same optical power.

4. An apparatus as claimed in any preceding claim, wherein each of the first set of optical elements is or comprises a wide-angle lens.

5. An apparatus as claimed in any preceding claim, wherein the first set of optical elements is arranged on a curve in the single plane.

6. An apparatus as claimed in claim 5, wherein the curve is a plane, closed, simple curve that curves in one sense only to form a closed loop.

7. An apparatus as claimed in any preceding claim, wherein a first plurality of the first set of sensors are offset in a first direction orthogonal to the single plane and a second plurality of the first set of sensors are offset in a second direction orthogonal to the single plane, wherein the second direction is opposite the first direction.

8. An apparatus as claimed in claim 7, wherein at least a first portion of each of the first plurality of the first set of sensors lies in the single plane and at least a second portion of each of the first plurality of the first set of sensors lies outside the single plane in the first direction, and at least a first portion of each of the second plurality of the first set of sensors lies in the single plane and at least a second portion of each of the second plurality of the first set of sensors lies outside the single plane in the second direction.

9. An apparatus as claimed in claim 8, wherein the second portions of the first plurality of the first set of sensors and the second portions of the second plurality of the first set of sensors form a repeating pattern.

10. An apparatus as claimed in claim 8 or 9, wherein the second portions of the first plurality of the first set of sensors are regularly spaced and the second portions of the second plurality of the first set of sensors are regularly spaced.

11. An apparatus as claimed in any of claims 7 to 10, wherein each of the first plurality of the first set of sensors has as nearest neighbors from the second plurality of the first set of sensors, sensors on opposite sides and wherein each of the second plurality of the first set of sensors has as nearest neighbors from the first plurality of the first set of sensors, sensors on opposite sides.

12. An apparatus as claimed in any one of claims 7 to 11, wherein each of the first plurality of the first set of sensors is positioned to lie on a first closed, plane, simple curve and wherein each of the second plurality of the first set of sensors is positioned to lie on a second closed, plane, simple curve wherein the first closed, plane, simple curve and the second closed, plane, simple curve are the same curve offset in a direction orthogonal to a plane parallel to each of the curves.

13. An apparatus as claimed in any preceding claim, wherein image circles defined by the first set of optical elements are not concentric with sensing areas of the associated sensors.

14. An apparatus as claimed in any preceding claim, wherein only a part of image circles defined by the first set of optical elements fall upon sensing areas of the associated sensors.

15. An apparatus as claimed in any preceding claim, wherein the sensing area of each sensor of the first set of sensors is fitted to a width of the image circle of its associated optical element but not fitted to a height of the image circle.

16. An apparatus as claimed in any preceding claim, wherein the sensing area of each sensor is a rectangle lying in a plane orthogonal to the single plane, wherein one of the edges of the sensing area parallel to the single plane transects the image circle defined by the optical element associated with the respective sensor.

17. An apparatus as claimed in any preceding claim, wherein a sensing area of each sensor of the first set of sensors is a rectangle lying in a plane orthogonal to the single plane wherein the edges of the sensing area orthogonal to the single plane are tangential to an image circle defined by the optical element associated with the respective sensor and wherein only one of the edges of the sensing area parallel to the single plane is tangential to the image circle.

18. An apparatus as claimed in claim 7, wherein the first set of optical elements defines an image plane, orthogonal to the single plane comprising a central portion where the single plane and the image plane intersect, a first portion, contiguous to the central portion of the image plane and offset from the central portion in the first direction and a second portion of the image plane, contiguous to the central portion of the image plane and offset from the central portion of the image plane in a second direction in the image plane, opposite the first direction,

wherein the first plurality of the first set of sensors are configured to capture central portions of the image plane and first portions of the image plane but not second portions of the image plane and wherein the second plurality of the first set of sensors are configured to capture central portions of the image plane and second portions of the image plane but not first portions of the image plane.

19. An apparatus as claimed in claim 18, wherein the first portions of the image plane captured by the first plurality of the first set of sensors cover all of the first portion of the image plane and wherein the second portions of the image plane captured by the second plurality of the first set of sensors cover all of the second portion of the image plane.

20. An apparatus as claimed in claim 18 or 19, wherein the first portions of the image plane captured by the first plurality of the first set of sensors overlap to cover a zenith and wherein the second portions of the image plane captured by the second plurality of the first set of sensors overlap to cover a nadir.

21. An apparatus as claimed in any preceding claim comprising a controller configured to create a composite image of a scene, wherein the composite image comprises portions of images captured by the first set of sensors.

22. An apparatus as claimed in claim 21, wherein the controller is configured to preferentially select which one of two images to use for each portion of an overlapping area, wherein the overlapping area is defined as where the two images, associated with different sensors, overlap.

23. An apparatus as claimed in claim 21 or 22, wherein the controller is configured to determine a putative boundary between two overlapping images wherein the composite image on one side of the boundary is or is predominantly determined by the first one of the two overlapping images and the composite image on the other side of the boundary is or is predominantly determined by the second one of the two overlapping images.

24. An apparatus as claimed in claim 23, wherein the controller is configured to adapt the boundary to avoid passing through an object in the first image and/or the second image.

25. An apparatus as claimed in claim 23 or 24, wherein the controller is configured to adapt the boundary to avoid passing through a perspective discontinuity arising from different perspectives associated with the different sensors that capture the first and second images.

26. An apparatus as claimed in any one of claims 21 to 25, wherein the controller is configured to control how an object is represented in the composite image, wherein when an object extends from a first portion of a first one of two overlapping images into a second portion of a second one of the two overlapping images, wherein the first portion is not in an overlapping area where the first one of the two overlapping images and the second one of the two overlapping images overlap and the second portion is in an overlapping area where the first one of the two overlapping images and the second one of the two overlapping images overlap, the controller being configured to define in a composite image the object using only the first one of the two overlapping images and not using the second one of the two overlapping images.

27. An apparatus as claimed in any one of claims 21 to 26, wherein the composite image provides a field of view that is 360° within the single plane and 180° out of the single plane.

28. An apparatus as claimed in any preceding claim, comprising a second set of sensors, not lying in any single plane wherein each sensor of the second set of sensors is configured to sense light focused from an associated one of the first set of optical elements.

29. An apparatus as claimed in claim 28, wherein the first set of sensors and the second set of sensors form a set of stereoscopic sensor pairs.

30. An apparatus as claimed in claim 29, wherein each optical element of the first set of optical elements operates as a left eye pupil for one stereoscopic sensor pair and operates as a right eye pupil for a different stereoscopic sensor pair.

31. An apparatus as claimed in any preceding claim, wherein adjacent optical elements of the first set of optical elements have a separation of between 5 and 8 cm.

32. An apparatus as claimed in any preceding claim, configured to capture images for a virtual reality system.

33. An apparatus comprising:

a first set of optical elements wherein each optical element is fixedly configured to focus light from a different field of view onto an associated different sensor; and

a first set of sensors, wherein each sensor is fixedly configured to sense light focused from an associated one of the first set of optical elements,

wherein the relative positioning of the optical elements is different to the relative positioning of the sensors.

34. An apparatus comprising:

a first set of optical elements wherein each optical element is configured to focus light from a different field of view onto an associated different sensor; and

a first set of sensors, wherein each sensor is configured to sense light focused from an associated one of the first set of optical elements,

wherein image circles of the first set of optical elements and their associated sensors are not concentric.

35. An apparatus comprising:

a first set of optical elements wherein each optical element is configured to focus light from a different field of view onto an associated different sensor; and

a first set of sensors, wherein each sensor is configured to sense light focused from an associated one of the first set of optical elements,

wherein only a part of image circles defined by the first set of optical elements fall upon sensing areas of the associated sensors.

36. An apparatus comprising:

a first set of optical elements wherein each optical element is configured to focus light from a different field of view onto an associated different sensor; and

a first set of sensors, wherein each sensor is configured to sense light focused from an associated one of the first set of optical elements,

wherein, for each of the first set of sensors, an edge of a sensing area for a sensor transects an image circle defined by the optical element associated with the sensor.

Description:
TITLE

A multi-image sensor apparatus

TECHNOLOGICAL FIELD

Embodiments of the present invention relate to a multi-image sensor apparatus.

BACKGROUND

There is a current trend towards the use of multi-image sensor (multi-camera) systems that capture images simultaneously from different points of view.

In some implementations the fields of view of the sensors overlap at their edges to create a continuous field of view which may extend 360° around the apparatus and up to and beyond a zenith above the apparatus and down to and beyond a nadir below the apparatus.

Current apparatus use a constellation of cameras. Each camera comprises a paired optical element (lens) and an image sensor, and points in a different direction.

It can be difficult to create the continuous field of view without either using a large number of cameras or using expensive optical arrangements. It would be desirable to provide an improved multi-image sensor apparatus.

BRIEF SUMMARY

According to various, but not necessarily all, embodiments of the invention there is provided an apparatus comprising: a first set of optical elements arranged in a single plane, wherein each optical element is configured to focus light from a different field of view onto an associated different sensor and wherein each optical element has an optical axis within a single plane; and a first set of sensors, not lying in any single plane, wherein each sensor is configured to sense light focused from an associated one of the first set of optical elements.

According to various, but not necessarily all, embodiments of the invention there are provided examples as claimed in the appended claims.

BRIEF DESCRIPTION

For a better understanding of various examples that are useful for understanding the detailed description, reference will now be made by way of example only to the accompanying drawings in which:

Fig 1 illustrates a cross-section through an example of an apparatus;

Fig 2A illustrates a cross-section through the apparatus of Fig 1 through the line X-X';

Fig 2B illustrates a cross-section through the apparatus of Fig 1 through the line Y-Y';

Fig 3 illustrates a camera comprising an optical element and an image sensor;

Fig 4 illustrates an example of an image circle;

Fig 5 illustrates a relationship between the image circles of the first set of optical elements and their associated sensors;

Fig 6 illustrates an example of an image plane comprising overlapping images captured by the first set of sensors;

Fig 7 illustrates an example of a controller configured to create a composite image;

Fig 8 illustrates an example of a composite image;

Fig 9 illustrates an example of adapting a boundary between overlapping images in a composite image; and

Fig 10 illustrates an example of the apparatus configured to provide stereoscopic (3D) images.

DETAILED DESCRIPTION

Referring to Figs 1, 2A, 2B and 3, there is illustrated an example of an apparatus 10 comprising: a first set of optical elements 20 arranged in a single plane (Figs 2A, 2B: 24), wherein each optical element 20 is configured to focus light (Fig 3: 31) from a different field of view (Fig 3: 23) onto an associated different sensor 30 and wherein each optical element 20 has an optical axis 21 within a single plane 24; and a first set of sensors 30, not lying in any single plane, wherein each sensor 30 is configured to sense light 31 focused from an associated one of the first set of optical elements 20.

The first set of optical elements 20 is arranged on a curve 22 in the single plane 24. In this example, the curve 22 is a plane, closed, simple curve that curves in one sense only to form a closed loop. In this example, but not necessarily all examples, the curve 22 is a circle.

In this example, each of the first set of optical elements 20 has a fixed position and orientation so that fixed perspective entrance pupils to the sensors 30 are defined.

The sensors 30 are image sensors. In this example, each of the sensors 30 has a fixed position and orientation so that fixed perspective entrance pupils to the sensors 30 are defined.

Fig 1 illustrates a cross-section through the apparatus 10 in the single plane 24. Fig 2A illustrates a cross-section through the apparatus 10 in a plane orthogonal to the single plane 24 and through the line X-X'. Fig 2B illustrates a cross-section through the apparatus 10 in a plane orthogonal to the single plane 24 and through the line Y-Y'.

As illustrated in Fig 2A, a first plurality of the first set of sensors 30 (first sensors 41) are offset in a first direction D1 orthogonal to the single plane 24. As illustrated in Fig 2B, a second plurality of the first set of sensors 30 (second sensors 42) are offset in a second direction D2 orthogonal to the single plane 24. The second direction D2 is opposite the first direction D1.

In this example, but not necessarily all examples, the sensors 30 are the same sensors, with sensing areas of the same aspect ratio, for example, 4:3. In other examples, the sensors 30 are different sensors, with sensing areas of different aspect ratios.

The sensing area is defined by an array of sensels in a sensing plane 34 orthogonal to the single plane 24. A sensel is a single sensor element of the array of sensor elements. The sensors 30 may be configured to capture video images or still images. In some but not necessarily all examples, the sensing area defined by the array of sensels is planar (flat). In other examples, the sensing area defined by the array of sensels is curved in two dimensions, e.g. it may be spherically curved.

Each of the first set of optical elements 20 has the same or different optical power. In this example, each of the first set of optical elements 20 is or comprises a wide-angle lens. For example, each optical element may be a fisheye lens of short focal length, symmetric about a central optical axis 21. The optical element 20 causes symmetric bending of light rays.

Referring to Fig 3, each of the first set of optical elements 20 has a field of view 23 centered on its optical axis 21. Each optical element 20 re-directs light 31 from within the field of view 23 onto the sensor 30. The field of view 23 may, in some examples, be less than or equal to 200 or 220 degrees.

The fields of view 23 of the first set of optical elements 20 in combination provide a field of view for the apparatus that is 360° or less within the single plane 24 and 180° or less out of the single plane 24.

Referring back to Fig 1, a polar axis extends orthogonally from the plane 24 out of the page through the origin O. The fields of view of the first set of optical elements 20 in combination provide a field of view for the apparatus 10 that extends all the way around the polar axis (the polar angle θ is 0 ≤ θ < 360° in a spherical coordinate system) and all the way from a zenith pole to a nadir pole (the azimuthal angle α is 0 ≤ α < 180° in a spherical coordinate system).

Each optical element 20 causes symmetric bending of light rays to form a light cone. Where this light cone falls on a sensor 30, an image circle 50 is defined, as illustrated in Fig 4. The image circle 50 is a cross-section of the cone of light transmitted by an optical element 20 onto the sensing plane 34 defined by the associated sensor 30.

Fig 5 illustrates a relationship between the image circles 50 of the first set of optical elements 20 and the sensors 30. The image circle 50 for each of the optical elements is centered on the single plane 24.
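The overlap condition implicit in the preceding paragraphs can be made concrete with a short sketch. This is a minimal illustration, not part of the patent: it assumes N identical optical elements whose optical axes are equally spaced within the single plane 24, and the function name and numeric values are invented for the example.

```python
# Minimal sketch (illustrative only): adjacent fields of view overlap within
# the single plane whenever each lens's field of view exceeds the angular
# separation of adjacent optical axes.

def in_plane_overlap(n_lenses: int, fov_deg: float) -> bool:
    """Return True if adjacent in-plane fields of view overlap at their edges.

    Assumes n_lenses identical optical elements with optical axes equally
    spaced (360/n_lenses degrees apart) within the single plane.
    """
    axis_separation_deg = 360.0 / n_lenses
    return fov_deg > axis_separation_deg

# Four fisheye lenses with a 200 degree field of view overlap comfortably,
# since 200 > 360/4 = 90; narrow 80 degree lenses would leave gaps.
print(in_plane_overlap(4, 200.0))  # True
print(in_plane_overlap(4, 80.0))   # False
```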

In Fig 5 the curve 22 on which the optical elements 20 are located is represented as a rectilinear line. As can be seen from Fig 5, the image circles 50 of the first set of optical elements 20 and the associated sensors 30 are not concentric. Only a part of the field of view 23 defined by an optical element 20 falls onto the associated sensor 30.

As has previously been described with reference to Figs 2A and 2B, in Fig 5 a first plurality of the first set of sensors 30 (first sensors 41) are offset in a first direction D1 orthogonal to the single plane 24 and a second plurality of the first set of sensors 30 (second sensors 42) are offset in a second direction D2 orthogonal to the single plane 24. The second direction D2 is opposite the first direction D1.

A first portion 41₁ of each of the first sensors 41 lies in the single plane 24 and a second portion 41₂ of each of the first sensors 41 lies outside the single plane 24 in the first direction D1. The center of each of the first sensors 41 is above the plane 24.

A first portion 42₁ of each of the second sensors 42 lies in the single plane 24 and a second portion 42₂ of each of the second sensors 42 lies outside the single plane 24 in the second direction D2. The center of each of the second sensors 42 is below the plane 24.

The second portions 41₂ of the first sensors 41 and the second portions 42₂ of the second sensors 42 form a repeating pattern. The second portions 41₂ of the first sensors 41 are regularly spaced in a direction parallel to the single plane 24. The second portions 42₂ of the second sensors 42 are regularly spaced in a direction parallel to the single plane 24.

In the example illustrated, the first sensors 41 and the second sensors 42 alternate along the length of the curve 22. The first sensors 41 are separated from each other by a separation distance S along the curve 22 and the second sensors 42 are separated from each other by the same separation distance S along the curve 22. Each of the first sensors 41 is separated from a second sensor 42 to its left by S/2 and from a second sensor 42 to its right by S/2. Therefore each of the first sensors 41 has, as nearest neighbors, second sensors 42 on opposite sides, and each of the second sensors 42 has, as nearest neighbors, first sensors 41 on opposite sides.

In this example the separation distance between first sensors (and their associated optical elements 20) is constant, the separation distance between second sensors (and their associated optical elements 20) is constant, and the separation distance between adjacent first and second sensors (and their associated optical elements 20) is constant; in other examples any of these separations may vary.
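As a minimal sketch of this layout, assuming the curve 22 is a circle of radius R with equal angular spacing and symmetric offsets (the function name and numeric values are illustrative, not taken from the patent), the alternating sensor centers can be generated as follows.

```python
import math

def sensor_positions(n: int, radius: float, offset: float):
    """Return (x, y, z) centers for n alternating sensors on a circle.

    Even-indexed sensors model the first sensors 41 (offset +z, direction D1);
    odd-indexed sensors model the second sensors 42 (offset -z, direction D2).
    Same-plurality sensors are then separated by S along the curve, and
    adjacent first/second sensors by S/2.
    """
    positions = []
    for i in range(n):
        angle = 2.0 * math.pi * i / n          # equal spacing along the curve
        z = offset if i % 2 == 0 else -offset  # alternate D1 / D2 offsets
        positions.append((radius * math.cos(angle),
                          radius * math.sin(angle),
                          z))
    return positions

# Eight sensors on a 6 cm radius circle, offset 1 cm above/below the plane.
for p in sensor_positions(8, 0.06, 0.01):
    print(tuple(round(c, 4) for c in p))
```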

It will be appreciated from Fig 1 and Fig 5 that each of the first sensors 41 is positioned to lie on a first closed, plane, simple curve 43 that is parallel to the single plane 24 and each of the second sensors 42 is positioned to lie on a second closed, plane, simple curve 44 that is parallel to the single plane 24. The curves 43, 44 have the same shape. The first closed, plane, simple curve 43 and the second closed, plane, simple curve 44 are the same curve offset in a direction D1, D2 orthogonal to a plane parallel to each of the curves 43, 44. Consequently the first sensors 41 and the second sensors 42, in the example of Fig 1, lie on the surface of a virtual cylinder.

As mentioned above, the image circles 50 defined by the first set of optical elements 20 are not concentric with sensing areas 36 of the associated sensors. Consequently, only a part of image circles 50 defined by the first set of optical elements 20 fall upon sensing areas 36 of the associated sensors.

The sensing area 36 of each sensor 30 is fitted to a width of the image circle 50 of its associated optical element 20 but not fitted to a height of the image circle 50. In the illustrated example, the sensing area 36 of each sensor 30 is a rectangle lying in a sensing plane 34 orthogonal to the single plane 24. One of the horizontal edges of the sensing area 36 parallel to the single plane 24 transects the image circle 50 defined by the optical element 20 associated with the respective sensor 30. The vertical edges of the sensing area 36 orthogonal to the single plane 24 are tangential to the image circle 50 and the other horizontal edge of the sensing area 36 parallel to the single plane 24 is tangential to the image circle 50.
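A short numeric sketch may make this fit concrete. It is an illustration only, assuming the 4:3 aspect ratio mentioned earlier, a landscape orientation, and a sensor offset in the first direction D1; the function name is invented.

```python
def sensing_rect_for_image_circle(r: float, aspect_w: float = 4.0,
                                  aspect_h: float = 3.0):
    """Fit a rectangle to the width (but not the height) of an image circle.

    The image circle of radius r is centered at (0, 0) in the sensing plane,
    i.e. on the single plane. The vertical edges are tangential at x = -r and
    x = +r; for a sensor offset in direction D1 the top edge is tangential at
    y = +r, and the bottom edge lands wherever the aspect ratio puts it.
    """
    width = 2.0 * r                       # fitted to the image circle width
    height = width * aspect_h / aspect_w  # not fitted to the circle height
    top_edge, bottom_edge = r, r - height
    return {"width": width, "height": height,
            "top_edge": top_edge, "bottom_edge": bottom_edge,
            "bottom_edge_transects_circle": -r < bottom_edge < r}

print(sensing_rect_for_image_circle(1.0))
# {'width': 2.0, 'height': 1.5, 'top_edge': 1.0, 'bottom_edge': -0.5,
#  'bottom_edge_transects_circle': True}
```

With a 4:3 sensing area the lower edge sits at y = -0.5r, so the rectangle covers the central plane plus the region above it, consistent with the first/second portion split described above.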

Referring to Fig 6, the first set of optical elements 20 defines an image plane 25, orthogonal to the single plane 24, comprising: a central portion 60, where the single plane 24 and the image plane 25 intersect; a first outlying portion 62, contiguous with the central portion 60 and offset from it in the first direction D1; and a second outlying portion 64, contiguous with the central portion 60 and offset from it in a second direction D2 in the image plane 25, opposite the first direction D1.

The first sensors 41 are configured to capture images 66₁ that in combination capture central portions 60 of the image plane 25 via the first portions 41₁ of the first sensors 41, and capture first outlying portions 62 of the image plane 25 (but not second outlying portions 64) via the second portions 41₂ of the first sensors 41.

The second sensors 42 are configured to capture images 66₂ that in combination capture central portions 60 of the image plane 25 via the first portions 42₁ of the second sensors 42, and capture second outlying portions 64 of the image plane 25 (but not first outlying portions 62) via the second portions 42₂ of the second sensors 42.

The first outlying portions 62 of the image plane 25 captured by the first sensors 41 cover all of the first outlying portion 62 of the image plane 25. The first outlying portions 62 of the image plane 25 captured by the first sensors 41 overlap each other to cover a zenith and overlap sequentially to extend 360° around the polar axis.

The second outlying portions 64 of the image plane 25 captured by the second sensors 42 cover all of the second outlying portion 64 of the image plane 25. They overlap each other to cover a nadir and overlap sequentially to extend 360° around the polar axis.

The central portions 60 of the image plane 25 captured by the first sensors 41 and the central portions 60 of the image plane 25 captured by the second sensors 42 overlap each other sequentially to extend 360° around the polar axis.

In some examples, the first sensors 41 have the same resolution (number of sensels) as the second sensors 42. The images 66₁ captured by the first sensors 41 have the same resolution (number of pixels) as the images 66₂ captured by the second sensors 42.

In other examples, the first sensors 41 have a different, higher resolution (number of sensels) than the second sensors 42. The images 66₁ captured by the first sensors 41 have a higher resolution (number of pixels) than the images 66₂ captured by the second sensors 42. The high resolution images 66₁ may be used to provide a high resolution panorama. The low resolution images 66₂ may be used to provide a low resolution panorama.

Fig 7 illustrates an example of a controller 100 configured to create a composite image 70.

Fig 8 illustrates an example of a composite image 70. The composite image 70 comprises portions 72 of images 66 captured by the first set of sensors 30 and creates an image of a scene external to the apparatus 10. Boundaries 80 may separate the portions 72.

The composite image 70 provides a field of view that is 360° within the single plane 24 and 180° out of the single plane 24. It is a panoramic image.

The controller 100 is configured to preferentially select which one of two images 66 to use for each portion of an overlapping area. The overlapping area is defined as the image area where the two images, associated with different sensors, overlap.

An overlapping area may be a peripheral overlapping area, where two first outlying portions 62 defined by different overlapping images 66₁ (captured by first sensors 41) overlap or where two second outlying portions 64 defined by different overlapping images 66₂ (captured by second sensors 42) overlap, or a central overlapping area, where two central portions 60 defined by different overlapping images 66₁, 66₂ (captured by a first sensor 41 and a second sensor 42) overlap.

The controller 100 is, in some but not necessarily all examples, configured to determine a putative boundary 80 between two overlapping images 66a, 66b. As described above, the subscripts a, b may be respectively 1,1; 2,2; 1,2; 2,1. As illustrated in Fig 9, a composite image 70 on one side of the boundary 80 is or is predominantly determined by a first one of the two overlapping images 66a, and the composite image 70 on the other side of the boundary 80 is or is predominantly determined by the second one of the two overlapping images 66b.

The controller 100 in some but not necessarily all examples is configured to adapt the boundary 80 to avoid it passing through an object 90 in the first image 66a and/or the second image 66b (or through a perspective discontinuity arising from different perspectives associated with the different sensors that capture the first and second images 66a, 66b). The object 90 may, for example, be in the foreground or middle ground of the image.

For example, the controller 100 may, in some examples, be configured to control how the object 90 is represented in the composite image 70.

In the example of Fig 9, the object 90 is in a first part of a first image 66b (of two overlapping images 66a, 66b) and is also in a second part of a second image 66a (of the two overlapping images 66a, 66b). The first part of the first image 66b is not in an overlapping area where the first image 66b and the second image 66a overlap. The second part of the second image 66a is in an overlapping area where the first image 66b and the second image 66a overlap. The controller 100 is configured to define in a composite image 70 how the object is rendered using only the first image 66b and not using the second image 66a. The boundary between the first image 66b and the second image 66a therefore extends around the object 90.
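A compact sketch of this object-aware selection follows. It is not the patent's algorithm, only an illustration under assumptions: the two images have been registered onto the same overlap grid, a default boundary mask exists, and a hypothetical object mask marks pixels of an object anchored in the image that owns it.

```python
import numpy as np

def composite_overlap(img_a, img_b, prefer_a_mask, object_from_a_mask):
    """Select, per pixel of an overlapping area, which image to use.

    img_a, img_b        : HxWx3 arrays covering the same overlap region
    prefer_a_mask       : HxW bool, the putative boundary's default choice
    object_from_a_mask  : HxW bool, pixels of an object that extends into the
                          overlap from img_a's non-overlapping part
    The boundary effectively bends around the object: any object pixel is
    taken from img_a regardless of the default choice.
    """
    take_a = prefer_a_mask | object_from_a_mask
    return np.where(take_a[..., None], img_a, img_b)

# Toy 4x4 overlap: the default boundary splits it down the middle, but an
# object occupying the top-right corner is rendered entirely from img_a
# (which stands in for the image that owns the object, 66b in Fig 9).
h = w = 4
img_a = np.zeros((h, w, 3), np.uint8)
img_b = np.full((h, w, 3), 255, np.uint8)
prefer_a = np.zeros((h, w), bool); prefer_a[:, :2] = True
obj_a = np.zeros((h, w), bool);    obj_a[0, 2:] = True
print(composite_overlap(img_a, img_b, prefer_a, obj_a)[..., 0])
```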

Fig 10 illustrates an example of the apparatus 10 configured to provide stereoscopic (3D) images. The apparatus 10 in this example additionally comprises a second set of sensors 30', not lying in any single plane. Each sensor 30' of the second set of sensors 30' is configured to sense light focused from an associated one of the first set of optical elements 20. A single optical element 20 is shared between a sensor 30 and a sensor 30'. A single sensor component may operate as the sensor 30 and the sensor 30' that share a common optical element 20.

The first set of sensors 30 and the second set of sensors 30' form a set of stereoscopic sensor pairs 92. Each optical element 20 of the first set of optical elements 20 operates as a left eye pupil for one stereoscopic sensor pair 92 and operates as a right eye pupil for a different stereoscopic sensor pair 92.

Adjacent optical elements 20 of the first set of optical elements 20 may have a separation of between 5 and 8 cm between the entrance pupils. They may for example have a separation of 6.4 or 6.5 cm, which corresponds to human inter-ocular distance. The apparatus 10 may be used to capture 3D video which may, for example, be used in a virtual reality system.
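The shared-pupil arrangement can be sketched as follows. This is an illustration with an assumed indexing convention, not taken from the patent: with n optical elements around the loop, element i serves as the left-eye pupil of pair i and as the right-eye pupil of the preceding pair.

```python
def stereoscopic_pairs(n_elements: int):
    """Return (left_element, right_element) index pairs around the loop.

    Element i is the left-eye pupil of pair i and the right-eye pupil of
    pair (i - 1) mod n_elements, so every element is used twice.
    """
    return [(i, (i + 1) % n_elements) for i in range(n_elements)]

# Four optical elements yield four stereo pairs; each element appears once
# as a left pupil and once as a right pupil.
print(stereoscopic_pairs(4))  # [(0, 1), (1, 2), (2, 3), (3, 0)]
```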

A left-eye composite image 70 is created for the left eye using the images from the sensors 30 and a right-eye composite image is created for the right eye using the images from the sensors 30'. The rendering of the left-eye composite image to a viewer's left eye and the simultaneous rendering of the right-eye composite image to a viewer's right eye creates a stereoscopic (3D) panoramic image.

In some, but not necessarily all examples, the sensors 30 and sensors 30' record images at the same resolutions.

In some, but not necessarily all examples, the sensors 30 and sensors 30' record images at different resolutions. In some, but not necessarily all examples, the sensors 30 record images at first resolutions and sensors 30' record images at second resolutions.

The first resolution may be the same (common) for all sensors 30. The second resolution may be the same (common) for all sensors 30'. The common first resolution may be different from the common second resolution. The first resolution may be different for different ones of the sensors 30. The second resolution may be different for different ones of the sensors 30'. The first resolution and the second resolution may be the same for sensors 30, 30' that share the same single optical element 20.

Implementation of a controller 100 may be as controller circuitry. The controller 100 may be implemented in hardware alone, have certain aspects in software including firmware alone or can be a combination of hardware and software (including firmware).

As illustrated in Fig 7 the controller 100 may be implemented using instructions that enable hardware functionality, for example, by using executable instructions of a computer program 106 in a general-purpose or special-purpose processor 102 that may be stored on a computer readable storage medium (disk, memory etc) to be executed by such a processor 102.

The processor 102 is configured to read from and write to the memory 104. The processor 102 may also comprise an output interface via which data and/or commands are output by the processor 102 and an input interface via which data and/or commands are input to the processor 102.

The memory 104 stores a computer program 106 comprising computer program instructions (computer program code) that control the operation of the controller 100 when loaded into the processor 102. The computer program instructions, of the computer program 106, provide the logic and routines that enable the apparatus to perform the methods illustrated in the figures. The processor 102 by reading the memory 104 is able to load and execute the computer program 106.

The controller 100 therefore comprises:

at least one processor 102; and

at least one memory 104 including computer program code

the at least one memory 104 and the computer program code configured to, with the at least one processor 102, cause the controller 100 at least to perform:

the creation of one or more composite images 70 as described above.

The computer program 106 may arrive at the controller 100 via any suitable delivery mechanism. The delivery mechanism may be, for example, a non-transitory computer-readable storage medium, a computer program product, a memory device, a record medium such as a compact disc read-only memory (CD-ROM) or digital versatile disc (DVD), or an article of manufacture that tangibly embodies the computer program 106. The delivery mechanism may be a signal configured to reliably transfer the computer program 106. The controller 100 may propagate or transmit the computer program 106 as a computer data signal.

Although the memory 104 is illustrated as a single component/circuitry it may be implemented as one or more separate components/circuitry some or all of which may be integrated/removable and/or may provide permanent/semi-permanent/dynamic/cached storage.

Although the processor 102 is illustrated as a single component/circuitry it may be implemented as one or more separate components/circuitry some or all of which may be integrated/removable. The processor 102 may be a single-core or multi-core processor.

References to 'computer-readable storage medium', 'computer program product', 'tangibly embodied computer program' etc. or a 'controller', 'computer', 'processor' etc. should be understood to encompass not only computers having different architectures such as single/multi-processor architectures and sequential (Von Neumann)/parallel architectures but also specialized circuits such as field-programmable gate arrays (FPGA), application specific circuits (ASIC), signal processing devices and other processing circuitry. References to computer program, instructions, code etc. should be understood to encompass software for a programmable processor or firmware such as, for example, the programmable content of a hardware device, whether instructions for a processor or configuration settings for a fixed-function device, gate array or programmable logic device etc.

As used in this application, the term 'circuitry' refers to all of the following:

(a) hardware-only circuit implementations (such as implementations in only analog and/or digital circuitry);

(b) combinations of circuits and software (and/or firmware), such as (as applicable): (i) a combination of processor(s) or (ii) portions of processor(s)/software (including digital signal processor(s)), software, and memory(ies) that work together to cause an apparatus, such as a mobile phone or server, to perform various functions; and

(c) circuits, such as a microprocessor(s) or a portion of a microprocessor(s), that require software or firmware for operation, even if the software or firmware is not physically present.

This definition of 'circuitry' applies to all uses of this term in this application, including in any claims. As a further example, as used in this application, the term 'circuitry' would also cover an implementation of merely a processor (or multiple processors) or portion of a processor and its (or their) accompanying software and/or firmware. The term 'circuitry' would also cover, for example and if applicable to the particular claim element, a baseband integrated circuit or applications processor integrated circuit for a mobile phone or a similar integrated circuit in a server, a cellular network device, or other network device.

The blocks illustrated in the figures may represent steps in a method and/or sections of code in the computer program 106. The illustration of a particular order to the blocks does not necessarily imply that there is a required or preferred order for the blocks, and the order and arrangement of the blocks may be varied. Furthermore, it may be possible for some blocks to be omitted.

It should be noted that while the example apparatus 10 in Fig 1 and corresponding examples in Figs 2A, 2B and 5 have four optical elements in the first set of optical elements, separated by 360°/4, in other examples a different number N of optical elements 20 (and a corresponding number of sensors 30, 30') may be used, separated for example by 360°/N. In addition, the apparatus may comprise additional optical elements (and their corresponding sensors), e.g. for improving the composite image 70 quality, achieving larger stereoscopic coverage, or if the composite image 70 would not otherwise cover the "full sphere".

In addition to being a panoramic image, the composite image 70 can be projected, compressed, and stored in various different ways. For example, the composite image can be projected into a cube map or into a panorama image using equirectangular projection. Alternatively, the composition can be done e.g. on the player application (or a remote application) after streaming the raw camera data. In addition to image data, other related data may be captured and stored simultaneously and/or using the same setup, such as depth information captured from the surrounding environment.
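As a sketch of one of the projections mentioned above, the standard equirectangular mapping from a viewing direction to panorama pixel coordinates is shown below. The function name, panorama size and axis convention are assumptions (the polar axis of Fig 1 is taken as +z), not details from the patent.

```python
import math

def equirectangular_pixel(x: float, y: float, z: float,
                          width: int, height: int):
    """Map a unit direction (x, y, z) to (column, row) in a W x H panorama."""
    lon = math.atan2(y, x)                    # -pi..pi around the polar axis
    lat = math.asin(max(-1.0, min(1.0, z)))   # -pi/2..pi/2, zenith at +pi/2
    col = (lon + math.pi) / (2.0 * math.pi) * (width - 1)
    row = (math.pi / 2.0 - lat) / math.pi * (height - 1)
    return col, row

# The zenith maps to the top row; directions in the single plane 24 map to
# the middle row of the panorama.
print(equirectangular_pixel(0.0, 0.0, 1.0, 4096, 2048))  # row 0.0 (zenith)
print(equirectangular_pixel(1.0, 0.0, 0.0, 4096, 2048))  # middle row
```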

Where a structural feature has been described, it may be replaced by means for performing one or more of the functions of the structural feature whether that function or those functions are explicitly or implicitly described.

As used here, 'module' refers to a unit or apparatus that excludes certain parts/components that would be added by an end manufacturer or a user. The apparatus 10 may be a module.

The term 'comprise' is used in this document with an inclusive, not an exclusive, meaning. That is, any reference to X comprising Y indicates that X may comprise only one Y or may comprise more than one Y. If it is intended to use 'comprise' with an exclusive meaning then it will be made clear in the context by referring to "comprising only one" or by using "consisting".

In this brief description, reference has been made to various examples. The description of features or functions in relation to an example indicates that those features or functions are present in that example. The use of the term 'example' or 'for example' or 'may' in the text denotes, whether explicitly stated or not, that such features or functions are present in at least the described example, whether described as an example or not, and that they can be, but are not necessarily, present in some of or all other examples. Thus 'example', 'for example' or 'may' refers to a particular instance in a class of examples. A property of the instance can be a property of only that instance or a property of the class or a property of a sub-class of the class that includes some but not all of the instances in the class. It is therefore implicitly disclosed that a feature described with reference to one example but not with reference to another example can, where possible, be used in that other example but does not necessarily have to be used in that other example.

Although embodiments of the present invention have been described in the preceding paragraphs with reference to various examples, it should be appreciated that modifications to the examples given can be made without departing from the scope of the invention as claimed.

Features described in the preceding description may be used in combinations other than the combinations explicitly described.

Although functions have been described with reference to certain features, those functions may be performable by other features whether described or not.

Although features have been described with reference to certain embodiments, those features may also be present in other embodiments whether described or not.

Whilst endeavoring in the foregoing specification to draw attention to those features of the invention believed to be of particular importance it should be understood that the Applicant claims protection in respect of any patentable feature or combination of features hereinbefore referred to and/or shown in the drawings whether or not particular emphasis has been placed thereon.