


Title:
AN OPTICAL ARRANGEMENT FOR FOCUSING IMAGES OF A THREE-DIMENSIONAL SPACE FROM DIFFERENT PERSPECTIVES ONTO ONE OR MORE CAMERA SENSORS
Document Type and Number:
WIPO Patent Application WO/2018/149488
Kind Code:
A1
Abstract:
An apparatus comprising: a common optical arrangement for focusing images of a three-dimensional space from different perspectives onto a camera sensor; and multiple optical arrangements each configured to provide an optical path from a respective perspective, defined by the optical arrangement, towards the common optical arrangement such that the optical paths from different perspectives converge at the common optical arrangement, wherein the multiple optical arrangements (30) comprise at least three different optical arrangements.

Inventors:
SALMIMAA MARJA PAULIINA (FI)
JÄRVENPÄÄ TONI JOHAN (FI)
KIMMEL JYRKI SAKARI (FI)
Application Number:
PCT/EP2017/053327
Publication Date:
August 23, 2018
Filing Date:
February 14, 2017
Assignee:
NOKIA TECHNOLOGIES OY (FI)
International Classes:
G03B35/10; G03B35/02; G03B5/06; G03B37/04
Domestic Patent References:
WO2015090960A1 (2015-06-25)
WO2015189403A1 (2015-12-17)
WO2012132088A1 (2012-10-04)
Foreign References:
US20120026417A1 (2012-02-02)
Other References:
None
Attorney, Agent or Firm:
HARRISON, Scott et al. (GB)
Claims:
CLAIMS

1. An apparatus comprising:

a common optical arrangement for focusing images of a three-dimensional space from different perspectives onto a camera sensor; and

multiple optical arrangements each configured to provide an optical path from a respective perspective, defined by the optical arrangement, towards the common optical arrangement such that the optical paths from different perspectives converge at the common optical arrangement,

wherein the multiple optical arrangements include at least three different optical arrangements.

2. An apparatus as claimed in claim 1, wherein the common optical arrangement is configured to control optical paths from the different perspectives to be more convergent, or less divergent, at the camera sensor by reducing an angular spread of output light paths from the common optical arrangement compared to input light paths to the common optical arrangement from different optical arrangements.

3. An apparatus as claimed in claim 1 or 2, wherein the common optical arrangement is configured for focusing at least three different images of a three-dimensional space from different perspectives and/or fields of view onto the camera sensor.

4. An apparatus as claimed in any preceding claim, wherein an image plane defined by the camera sensor is not orthogonal to an optical axis of at least some of the multiple optical arrangements, the common optical arrangement being configured to tilt a focal plane of at least some of the multiple optical arrangements so that it is parallel to the image plane of the camera sensor.

5. An apparatus as claimed in claim 4, wherein the optical axis of at least one of the multiple optical arrangements is displaced from a centre of the camera sensor.

6. An apparatus as claimed in any preceding claim, wherein the common optical arrangement comprises one or more peripheral, reflective portions configured to reduce an angular spread of output light paths from the common optical arrangement compared to divergent peripherally input light paths to the common optical arrangement.

7. An apparatus as claimed in any preceding claim, wherein the common optical arrangement has rotational symmetry about an axis.

8. An apparatus as claimed in any preceding claim, wherein the common optical arrangement comprises a central non-reflective portion configured to transmit centrally input light to the common optical arrangement without significant angular deviation.

9. An apparatus as claimed in any preceding claim, comprising the camera sensor, the camera sensor comprising a plurality of sensels that are configured to sense images from different perspectives focused by the common optical arrangement onto the camera sensor.

10. An apparatus as claimed in any preceding claim, wherein the camera sensor comprises alternating rows/columns of first sensels and second sensels, wherein the first sensels are configured to sense a first image from a first peripheral perspective and wherein the second sensels are configured to sense a second image from a second peripheral perspective that significantly diverges from the first peripheral perspective.

11. An apparatus as claimed in any preceding claim, wherein the multiple optical arrangements comprise a first peripheral optical arrangement configured to provide a first peripheral optical path from a respective first peripheral perspective towards the common optical arrangement, a second peripheral optical arrangement configured to provide a second peripheral optical path from a respective second peripheral perspective towards the common optical arrangement, and a first central optical arrangement configured to provide a central optical path from a respective first central perspective between the first peripheral perspective and the second peripheral perspective, towards the common optical arrangement.

12. An apparatus as claimed in claim 11, wherein the first peripheral optical arrangement and the second peripheral optical arrangement each have an equally sized field of view that differs in size from the field of view of the first central optical arrangement.

13. An apparatus as claimed in claim 11 or 12, wherein the first peripheral optical arrangement has a first peripheral field of view and the second peripheral optical arrangement has a second peripheral field of view, and wherein the first peripheral perspective and the second peripheral perspective are configured such that the first peripheral field of view and the second peripheral field of view overlap.

14. An apparatus as claimed in any one of claims 11 to 13, wherein the first peripheral optical arrangement and the second peripheral optical arrangement both comprise wide-angle lenses having a field of view greater than 180°.

15. An apparatus as claimed in any one of claims 11 to 14, wherein the first peripheral optical arrangement and the second peripheral optical arrangement both comprise a telecentric back focal design.

16. An apparatus as claimed in any one of claims 11 to 15, wherein the first central optical arrangement is configured to provide a macro-lens imaging system.

17. An apparatus as claimed in any one of claims 11 to 15, wherein the first central optical arrangement is configured to provide a depth-mapping system.

18. An apparatus as claimed in claim 17, further comprising:

a light projector configured to project a projection pattern into the three-dimensional space to paint a scene; wherein the first central optical arrangement is configured to provide the central optical path from a central perspective, defined by the first central optical arrangement, towards the common optical arrangement and the common optical arrangement is configured to focus an image of the projected pattern from the central perspective onto the camera sensor, the apparatus further comprising a controller configured to:

cause projection by the light projector of an initial projection pattern into the three-dimensional space to paint a scene;

cause capture of an initial image of the initial projection pattern projected into the three-dimensional space to paint the scene;

cause analysis of the captured initial image;

cause projection by the light projector of a new projection pattern into the three-dimensional space to paint the scene, wherein the new projection pattern is dependent upon said analysis of the captured initial image; and

cause capture of a new image of the new projection pattern projected into the three-dimensional space to paint the scene.

19. An apparatus as claimed in claim 18, wherein the controller is additionally configured to analyze the captured new image to create a depth map for the three-dimensional space.

20. An apparatus as claimed in claim 18 or 19, wherein the new projection pattern is a spatially variable pattern and wherein said spatial variation is dependent upon said analysis of the captured initial image.

21. An apparatus as claimed in claim 18, 19 or 20, wherein the new projection pattern has dense areas and less dense areas, wherein the positions of the dense areas and the less dense areas relative to the scene are determined by analyzing the captured initial image.

22. An apparatus as claimed in any one of claims 18 to 21, wherein the new projection pattern has dense areas and less dense areas, wherein the density of the projection pattern for an area is determined by said analysis of the captured initial image.

23. An apparatus as claimed in any one of claims 18 to 22, wherein the new projection pattern has dense areas and less dense areas,

wherein there is a correlation between a density of the projection pattern and a proximity of the portion of the scene it is projected onto, such that proximal portions of the scene are more likely to have a dense projection pattern projected onto them and distal portions of the scene are significantly less likely to have a dense projection pattern projected onto them; and/or

wherein there is a correlation between a density of the projection pattern and a variation in depth of the portion of the scene it is projected onto, such that portions of the scene that have quickly varying depth are more likely to have a dense projection pattern projected onto them and portions of the scene that have unvarying depth are significantly less likely to have dense projection patterns projected onto them; and/or

wherein there is a correlation between a density of the projection pattern and image recognition of a portion of the scene it is projected onto, such that particular portions of the scene that are tracked using image recognition are more likely to have a dense projection pattern projected onto them and portions of the scene that are not tracked using image recognition are less likely to have a dense projection pattern projected onto them; and/or

wherein there is a correlation between a density of the projection pattern and a spatial variation in the portion of the scene it is projected onto, such that particular portions of the scene that are spatially complex are more likely to have a dense projection pattern projected onto them and portions of the scene that are spatially simple are less likely to have a dense projection pattern projected onto them.

24. An apparatus as claimed in any one of claims 18 to 23, wherein the common optical arrangement is a dichroic arrangement that is transparent to infrared light transmitted by the light projector and is reflective to visible light focused by the first peripheral optical arrangement and the second peripheral optical arrangement.

25. An apparatus as claimed in any one of claims 18 to 24, wherein the controller is configured to control the new projection pattern in dependence upon feedback from a rendering engine for rendering a panoramic image of the three-dimensional space.

26. An apparatus as claimed in any one of claims 11 to 25, further comprising a timing controller configured to time-separate operation of capturing images from the central optical arrangement and capturing images from the first and second peripheral optical arrangements.

27. An apparatus as claimed in claim 26, wherein the timing controller is configured to time-separate operation of capturing images from the first peripheral optical arrangement and the second peripheral optical arrangement.

28. An apparatus as claimed in claim 26 or 27, wherein the timing controller is configured to switch transmission of light to the sensor between the first central optical arrangement and the peripheral optical arrangements.

29. An apparatus as claimed in any of claims 26 to 28, further comprising a memory controller synchronized with the timing controller to control storage of captured images in an addressable memory.

30. An apparatus as claimed in any one of claims 11 to 29, additionally comprising: a first additional optical arrangement configured to provide an optical path from a first additional perspective, defined by the first additional optical arrangement, towards a first additional common optical arrangement for focusing images of the three-dimensional space from the first additional perspective onto a first additional camera sensor;

a second additional optical arrangement configured to provide an optical path from a second additional perspective, defined by the second additional optical arrangement, towards a second additional common optical arrangement for focusing images of the three-dimensional space from the second additional perspective onto a second additional camera sensor;

wherein the common optical arrangement, the first additional common optical arrangement and the second additional common optical arrangement are distinct, separated arrangements,

wherein the camera sensor, the first additional camera sensor, and the second additional camera sensor are distinct, separated camera sensors.

31. An apparatus as claimed in claim 30, further comprising a first optical element shared by the first peripheral optical arrangement and the first additional optical arrangement; and

comprising a second optical element shared by the second peripheral optical arrangement and the second additional optical arrangement.

32. A system comprising multiple apparatus as claimed in any preceding claim, wherein for each apparatus the fields of view of the multiple optical arrangements and the respective perspectives of the multiple optical arrangements create overlapping images that form a combined image for each apparatus, and wherein the combined images for the multiple apparatus overlap to form a panoramic image for the system.

33. A system as claimed in claim 32, wherein the multiple apparatus are arranged such that the panoramic image extends 360° in the horizontal.

34. A system as claimed in claim 33, wherein the multiple apparatus are arranged such that the panoramic image extends 180° in the vertical.

35. A system as claimed in any of claims 32 to 34, wherein at least some of the multiple apparatus lie within a common plane and are evenly distributed with the same angular separation within that plane and wherein at least some of the multiple optical arrangements of the multiple apparatus lie within the common plane and are evenly distributed with the same angular separation within that plane.

36. A system as claimed in any one of claims 32 to 35, comprising multiple optical elements shared by adjacent ones of the multiple optical arrangements of different apparatus.

37. A system as claimed in claim 36, wherein each of the multiple optical arrangements is symmetric about its respective optical path as it passes through the optical arrangement and wherein the one or more shared optical elements are symmetrical with respect to each optical path.

38. A system as claimed in claim 36 or 37, wherein each of the optical arrangements is or comprises an objective lens arrangement, comprising compound lenses and configured to provide a wide angle field of view.

39. A system as claimed in any of claims 36 to 38, wherein the shared optical element is an outermost objective optical element of the optical arrangements.

40. A system as claimed in any one of claims 36 to 39, wherein the shared optical element has a common exterior surface for each of the different optical paths.

41. A system as claimed in any of claims 36 to 40, wherein the shared optical element has a curved exterior surface and an interior surface comprising a concave cavity for each of the different optical paths.

42. A system as claimed in any of claims 36 to 41, wherein the shared optical element is positioned where the multiple optical paths intersect.

43. A system as claimed in claim 42, wherein the shared optical element is an intermediate optical element within a series of optical elements.

44. A system as claimed in any one of claims 36 to 42, wherein the shared optical refractive element is an innermost optical element of the multiple compound optical arrangements.

45. A method comprising:

projecting an initial projection pattern into a three-dimensional space to paint a scene;

capturing an initial image of the initial projection pattern projected into the three-dimensional space to paint the scene;

analyzing the captured initial image;

projecting a new projection pattern into the three-dimensional space to paint the scene, wherein the new projection pattern is determined by said analyzing the captured initial image; and

capturing a new image of the new projection pattern projected into the three-dimensional space to paint the scene.

46. An apparatus comprising:

a common optical arrangement for focusing images of a three-dimensional space from different perspectives onto a camera sensor; and

multiple peripheral optical arrangements each configured to provide an optical path from a respective peripheral perspective, defined by the peripheral optical arrangement, towards the common optical arrangement such that the optical paths from different peripheral perspectives converge at the common optical arrangement for focusing onto the camera sensor.

Description:
TITLE

An optical arrangement for focusing images of a three-dimensional space from different perspectives onto one or more camera sensors

TECHNOLOGICAL FIELD

Embodiments of the present invention relate to an apparatus and a method. In particular they relate to an optical arrangement for focusing images of a three-dimensional space from different perspectives onto a camera sensor.

BACKGROUND

A recent development has been the manufacture of multi-camera systems that allow images of a three-dimensional space to be captured from different perspectives. Typically, each of the multiple cameras of the multi-camera system captures an image from a different perspective (point of view) and each of these perspectives diverges from an origin, which may for example be a point, area or volume. The multiple cameras capture different portions of the three-dimensional space, potentially at the same time, either as still images or as video images. In some examples the images that are captured may overlap and it may be possible to create a panoramic image.

The ability to capture an image of a three-dimensional space from the same position within that space but with a different perspective is particularly useful in a virtual reality or augmented reality application.

It would be desirable to provide an improved or different optical arrangement for focusing images of the three-dimensional space from different perspectives.

BRIEF SUMMARY

According to various, but not necessarily all, embodiments of the invention there is provided an apparatus comprising: a common optical arrangement for focusing images of a three-dimensional space from different perspectives onto a camera sensor; and multiple optical arrangements each configured to provide an optical path from a respective perspective, defined by the optical arrangement, towards the common optical arrangement such that the optical paths from different perspectives converge at the common optical arrangement, wherein the multiple optical arrangements include at least three different optical arrangements.

According to various, but not necessarily all, embodiments of the invention there is provided a method comprising: projecting an initial projection pattern into a three-dimensional space to paint a scene; capturing an initial image of the initial projection pattern projected into the three-dimensional space to paint the scene; analyzing the captured initial image; projecting a new projection pattern into the three-dimensional space to paint the scene, wherein the new projection pattern is determined by said analyzing the captured initial image; and capturing a new image of the new projection pattern projected into the three-dimensional space to paint the scene.
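The project/capture/analyze/re-project loop summarized above can be sketched as follows. This is an illustrative sketch only, not the claimed implementation: `make_initial_pattern` and `refine_pattern` are hypothetical stand-ins for the projector and analysis stages, and the per-point depth-variation scores are assumed outputs of the analysis step.

```python
def make_initial_pattern(width, height, step=8):
    """Uniform dot grid: a simple initial projection pattern to paint the scene."""
    return [(x, y) for y in range(0, height, step) for x in range(0, width, step)]

def refine_pattern(initial_pattern, variation_scores, offset=4):
    """Build a new projection pattern that is denser where analysis of the
    captured initial image found quickly varying depth (score above 0.5)."""
    new_pattern = []
    for point in initial_pattern:
        new_pattern.append(point)
        if variation_scores.get(point, 0.0) > 0.5:
            x, y = point
            # Densify around regions of rapidly varying depth.
            new_pattern.extend([(x + offset, y), (x, y + offset),
                                (x + offset, y + offset)])
    return new_pattern
```

In the full method, the new pattern would then be projected and a new image captured, closing the loop.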

According to various, but not necessarily all, embodiments of the invention there is provided examples as claimed in the appended claims.

BRIEF DESCRIPTION

For a better understanding of various examples that are useful for understanding the detailed description, reference will now be made by way of example only to the accompanying drawings in which:

Fig. 1 illustrates an example of an apparatus that is configured to capture multiple images from multiple different perspectives using a common optical arrangement;

Fig. 2A illustrates an example of divergent perspectives;

Figs 2B and 2C illustrate examples of overlapping images;

Fig. 3 illustrates an example of a system that is configured to capture multiple images from multiple different perspectives using multiple common optical arrangements;

Fig. 4 illustrates an example of a system that is configured to capture multiple images from multiple different perspectives using multiple common optical arrangements and using multiple shared refractive elements;

Figs. 5A and 5B illustrate further examples of an apparatus comprising multiple compound optical arrangements that share one or more shared optical elements;

Fig. 5C illustrates an example of a system that is configured to capture multiple images from multiple different perspectives using multiple common optical arrangements and using multiple shared refractive elements;

Fig. 6 illustrates a single camera sensor configured to simultaneously sense two images from different perspectives;

Figs 7A and 7B illustrate suitable examples of a common optical arrangement;

Fig. 8 illustrates an example of a method for depth mapping;

Fig 9A illustrates an example of an initial projection pattern and Fig 9B illustrates an example of a new projection pattern; and

Fig 10A illustrates an example of a controller and Fig 10B illustrates an example of a computer program.

DETAILED DESCRIPTION

Fig. 1 illustrates an example of an apparatus 100 that is configured to capture multiple images from multiple different perspectives 40.

The apparatus 100 comprises a common optical arrangement 80 for focusing images of a three-dimensional space 20 from different perspectives 40 onto a camera sensor 10. The apparatus 100 also comprises multiple optical arrangements 30 each configured to provide an optical path 50 from a respective perspective 40, defined by the optical arrangement 30, towards the common optical arrangement 80 such that optical paths 50 from the different perspectives 40 converge at the common optical arrangement 80.

The term image is used before and at capture to refer to the collection of light rays focused on a camera sensor 10 and is used, after capture, to refer to the camera sensor output as a consequence of image capture. A captured image may, for example, be transiently stored in a buffer or permanently or semi-permanently stored as an addressable data structure in an addressable memory.

The term "divergent/convergent" means divergent or convergent in relation to direction.

An optical path 50 is a path that light will take when travelling through the apparatus 100. The term light is not limited to visible light: the light may be visible light or it may be light outside the visible spectrum such as, but not limited to, near infrared, infrared, or near ultraviolet.

The example of the apparatus 100 uses a common optical arrangement 80 that is shared between the multiple optical arrangements 30. This has the advantage of making the apparatus 100 more compact.

In some examples, the apparatus 100 may comprise a camera sensor 10 for capturing images 12 of the three-dimensional space 20 from different perspectives 40. However, in other examples the apparatus 100 does not comprise the camera sensor 10 and the camera sensor is added later during manufacture or by a user. In this example, the camera sensor 10 comprises a plurality of sensels and is configured to sense, using the sensels, images 12 from different perspectives 40.

The common optical arrangement 80 is configured to control optical paths 50 from the different perspectives 40 to be more convergent, or less divergent, at the camera sensor 10 by reducing an angular spread of output light paths compared to input light paths. The input light paths from the different multiple optical arrangements 30 may have a wide angular spread and may be significantly divergent/convergent, that is, they are not parallel. However, the light path 50 output from the common optical arrangement 80 is substantially orthogonal to an image plane 14 of the camera sensor 10 and this light path is substantially common for the light paths originating from the different multiple optical arrangements 30. The optical paths 50 from the different perspectives 40 of the different multiple optical arrangements 30 converge towards and meet at the common optical arrangement 80.

The image plane 14 defined by the camera sensor 10 is not orthogonal to an optical axis of at least some of the multiple optical arrangements 30. The common optical arrangement 80 is configured to tilt a focal plane of at least some of the multiple optical arrangements 30 so that it is parallel to the image plane 14 of the camera sensor. Any distortions that are introduced by the common optical arrangement 80 are well specified and may be corrected during post-processing of a captured image.

The optical axis of at least one of the multiple optical arrangements 30 may be displaced from a centre of the camera sensor 10 such that the light path 50 originating from this optical arrangement is displaced relative to a light path originating from a different optical arrangement. Although these light paths may be spatially displaced, they may also be parallel.

In the example illustrated in Fig. 1, the multiple optical arrangements 30 include three different optical arrangements. The common optical arrangement 80 is configured for focusing the different images 12 from the different perspectives 40 of the different multiple optical arrangements 30 and/or fields of view of those multiple optical arrangements 30 onto the camera sensor 10. In this example the multiple optical arrangements 30 include a first peripheral optical arrangement 30₁, a second peripheral optical arrangement 30₃ and a central optical arrangement 30₂.

The first peripheral optical arrangement 30₁ is configured to provide a first peripheral optical path 50₁ from a first peripheral perspective 40₁ towards the common optical arrangement 80. The common optical arrangement 80 is configured to direct that first peripheral optical path 50₁ onto the camera sensor 10.

The second peripheral optical arrangement 30₃ is configured to provide a second peripheral optical path 50₃ from a respective second peripheral perspective 40₃ towards the common optical arrangement 80. The common optical arrangement 80 is configured to direct that second peripheral optical path 50₃ onto the camera sensor 10.

The central optical arrangement 30₂ is configured to provide a central optical path 50₂ from a respective central perspective 40₂ towards the common optical arrangement 80. The central perspective 40₂ is between the first peripheral perspective 40₁ and the second peripheral perspective 40₃. In this example the central perspective 40₂ is exactly halfway between the first peripheral perspective 40₁ and the second peripheral perspective 40₃. The common optical arrangement 80 is configured to direct the central optical path 50₂ towards the camera sensor 10.

The first peripheral optical arrangement 30₁ has a first peripheral field of view 13₁ centered on the first peripheral perspective 40₁. The second peripheral optical arrangement 30₃ has a second peripheral field of view 13₃ centered on the second peripheral perspective 40₃. The first peripheral perspective 40₁ and the second peripheral perspective 40₃ are configured such that the first peripheral field of view 13₁ and the second peripheral field of view 13₃ overlap. In this example, but not necessarily all examples, the first peripheral optical arrangement 30₁ and the second peripheral optical arrangement 30₃ have equally sized fields of view 13 that are different in size to the field of view 13₂ of the central optical arrangement 30₂. In some examples, the first peripheral optical arrangement 30₁ and the second peripheral optical arrangement 30₃ both comprise wide-angle lenses providing a horizontal field of view greater than 180°.

In some but not necessarily all examples, the apparatus 100 may comprise a controller 170 that operates as a timing controller 120 for controlling operation of the apparatus 100 and, in particular, timings of operations of the apparatus 100. For example, in at least some examples, the timing controller 120 is configured to time-separate the capturing of images 12 from the first central optical arrangement 30₂ and the capturing of images 12 from the first and second peripheral optical arrangements 30₁, 30₃. In some examples, the timing controller 120 may additionally be configured to time-separate the capturing of images 12 from the first peripheral optical arrangement 30₁ and from the second peripheral optical arrangement 30₃. The timing controller 120 may therefore prevent interference between the images focused onto the sensor 10 by the central optical arrangement 30₂ and by the peripheral optical arrangements 30₁, 30₃.
The timing controller 120 may also, in some examples, prevent interference between the images focused onto the sensor 10 by the first peripheral optical arrangement 30₁ and the second peripheral optical arrangement 30₃.

The timing controller 120 may for example use shutters or other optical elements such as controllable filters or refraction elements to control which light path from which optical arrangement 30 reaches the camera sensor 10 at what time and/or which light paths 50 from which optical arrangement 30 reach which parts of the camera sensor 10 at which times. The control of a light path 50 may for example be achieved by using a shutter which switches the opacity of the light path, or it may be achieved by using a variable refractive element which refracts light in different directions in dependence upon a control signal from the timing controller 120.

As the images 12 captured from the light paths 50 from the different optical arrangements 30 may be utilized by the apparatus 100 in different ways, it is important that they are correctly stored into an addressable memory 122 by a memory controller 124. The memory controller 124 may be synchronized with the timing controller 120 such that the captured images 12 are stored with metadata indicating the origin, that is, the perspective/optical arrangement, associated with that image.
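As an illustrative sketch of this storage scheme, the memory controller below tags each captured image with its originating perspective so that images captured in different time slots can later be retrieved by origin. The names (`CapturedImage`, `MemoryController`, the perspective labels) are assumptions for illustration, not part of the claimed apparatus.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class CapturedImage:
    pixels: bytes
    perspective: str  # metadata indicating the origin, e.g. "peripheral-1"
    timestamp_us: int

@dataclass
class MemoryController:
    """Stores captured images in an addressable memory, tagged with metadata
    indicating the perspective/optical arrangement that produced them."""
    memory: List[CapturedImage] = field(default_factory=list)

    def store(self, pixels: bytes, perspective: str, timestamp_us: int) -> None:
        self.memory.append(CapturedImage(pixels, perspective, timestamp_us))

    def images_from(self, perspective: str) -> List[CapturedImage]:
        return [img for img in self.memory if img.perspective == perspective]
```

In a real system the perspective label and timestamp would come from the timing controller's capture schedule rather than being passed in by hand.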

In one example embodiment, the first central optical arrangement 30₂ may be configured to provide a macro-lens imaging system. This imaging system may have a short focal length and it may for example have a significantly shorter focal length than the focal length of the peripheral optical arrangements 30₁, 30₃. This may be used to enable "close-up" images to be captured from the three-dimensional space 20. In some examples, the macro-lens system may be a zoom system that provides a magnification greater than 1:1. It may be desirable to use the timing controller 120 to synchronize a shutter used for the macro-lens system 30₂ and operation of the camera sensor 10. The field of view of the central optical arrangement 30₂ will typically be much less than the field of view 13 of the peripheral optical arrangements.

The timing controller 120 may therefore be used to switch between a panoramic imaging mode using the first and second peripheral optical arrangements 30₁, 30₃ to create a panoramic image captured by the camera sensor 10 and a macro or zoom optical mode that uses the central optical arrangement 30₂ (but not the peripheral optical arrangements 30₁, 30₃) to capture an image at the camera sensor 10.
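This mode switching might be modelled as follows. A hypothetical sketch only: the mode names and shutter labels are assumptions, and only the selection logic is shown, not real shutter hardware control.

```python
from enum import Enum

class CaptureMode(Enum):
    PANORAMIC = "panoramic"  # first and second peripheral optical arrangements
    MACRO = "macro"          # central (macro/zoom) optical arrangement only

class TimingController:
    """Opens only the shutters for the active mode so that images from the
    central and peripheral optical arrangements do not interfere on the sensor."""
    def __init__(self) -> None:
        self.mode = CaptureMode.PANORAMIC

    def open_shutters(self) -> list:
        if self.mode is CaptureMode.PANORAMIC:
            return ["peripheral-1", "peripheral-2"]
        return ["central"]
```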

In another example embodiment, the first central optical arrangement 30 2 may be configured to provide a depth-mapping system. A depth-mapping system is a system that provides information concerning the distance to objects in the three-dimensional space 20. This may be useful, for example, to determine whether an object will be within an image from a particular perspective at a particular field of view, to accurately focus an image, to track objects, or for other reasons. The operation of the depth-mapping system will be explained in significantly more detail later in relation to Fig. 8. However, it should be recognized that the depth-mapping system may be configured to image infrared light rather than visible light. The timing controller 120 may be configured to switch between visual imaging using the first peripheral optical arrangement 30 1 and the second peripheral optical arrangement 30 3 and infrared imaging using the central optical arrangement 30 2 (but not the first and second peripheral optical arrangements 30 1, 30 3) to capture images at the camera sensor 10.

In the example of Fig. 2A, the divergent perspectives 40 lie within a common plane 54 and are evenly distributed with the same angular separation Θ within that plane 54. It can be seen that the vectors defining the divergent perspectives 40 pass through the origin 52. In this example the optical paths 50 are rectilinear and are aligned with the different perspectives 40.
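As an illustrative aside (not part of the application; the function name and example values are assumptions), the evenly distributed in-plane perspectives can be sketched as unit vectors through the origin:

```python
import math

def divergent_perspectives(count, separation_deg):
    """Unit direction vectors for `count` divergent perspectives that lie
    in a common plane, pass through a shared origin and are evenly
    separated by the given angle (the angular separation called theta
    in the description). Sketch with assumed names."""
    return [(math.cos(math.radians(i * separation_deg)),
             math.sin(math.radians(i * separation_deg)))
            for i in range(count)]

# Three perspectives separated by 120 degrees (illustrative values only).
vectors = divergent_perspectives(3, 120)
for v in vectors:
    assert abs(math.hypot(*v) - 1.0) < 1e-9  # each vector has unit length
# Adjacent perspectives are separated by the chosen angle: cos(120°) = -0.5.
dot = vectors[0][0] * vectors[1][0] + vectors[0][1] * vectors[1][1]
assert abs(dot + 0.5) < 1e-9
```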

Each of the multiple optical arrangements 30 has an associated field of view 13. In the example illustrated in Figs. 2B and 2C, the fields of view of at least some of the multiple optical arrangements 30 overlap spatially so that they form a continuum (a panoramic image) 16.

In the example of Fig. 2B, the images 12 captured for different perspectives overlap in a horizontal direction and may be used to create a panoramic image 16. The field of view 13 of the panoramic image in the horizontal direction may be 360° or less.

In the example of Fig. 2C the images 12 captured for different perspectives overlap both horizontally and vertically and may be used to create a panoramic image that extends both vertically and horizontally. The field of view of the panoramic image in the horizontal direction may be 360° or less. The field of view of the panoramic image in the vertical direction may be 180° or less, for example up to 90° to a zenith above and down to minus 90° to a nadir below. It will therefore be appreciated that the capture of multiple images for multiple different divergent perspectives, for example using the peripheral optical arrangements 30i, 30 3 for one or more apparatus 100, allows a combined image of a three-dimensional space to be created by combining the images captured. Fig. 3 illustrates an example of a system 200 comprising multiple apparatus 100 m as previously described, each comprising multiple optical arrangements 30 n including a first peripheral optical arrangement 30i, a second peripheral optical arrangement 30 3 and a central optical arrangement 30 2 . Sets of peripheral optical arrangements 30 nm shares a common optical arrangement 80 m .

In this example, the multiple optical arrangements 30 n for each apparatus 100 m (common optical arrangement 80 m) are labelled 30 nm. The respective perspective 40 associated with an optical arrangement 30 nm is labelled 40 nm.

For each apparatus 100, the fields of view of the multiple peripheral optical arrangements 30 and the respective perspectives 40 of the multiple peripheral optical arrangements 30 create overlapping images that form a combined image for the apparatus 100. The combined images for each of the apparatus 100 overlap to form a panoramic image.

The panoramic image may have a field of view in the horizontal direction that is 360° or less. The field of view of the panoramic image in the vertical direction may be 180° or less. In the example illustrated in Fig 3, the multiple apparatus 100 lie within a common plane and are evenly distributed with the same angular separation within that plane. In this particular arrangement, this causes the peripheral optical arrangements of the different multiple apparatus 100 to be evenly distributed with the same angular separation within the plane.

Fig 4 illustrates an example of the system 200 as illustrated in Fig 3. In this example, the system 200 comprises multiple apparatus 100 m as previously described, each comprising multiple optical arrangements 30 n including a first peripheral optical arrangement 30 1, a second peripheral optical arrangement 30 3 and a central optical arrangement 30 2.

Sets of peripheral optical arrangements 30 nm share a common optical arrangement 80 m.

Adjacent peripheral optical arrangements 30 1a, 30 3b, where a ≠ b, share an optical element 60 a. In this example the multiple optical arrangements 30 n for each shared optical element 60 a are labelled 30 na. The respective perspective 40 associated with an optical arrangement 30 nm is labelled 40 nm. The respective light path 50 associated with an optical arrangement 30 nm is labelled 50 nm.

The common optical element 60 1 forms a part of a first peripheral optical arrangement 30 11 and also forms part of a second peripheral optical arrangement 30 33 of a third apparatus 100 3 adjacent the first apparatus 100 1.

Another shared optical element 60 2 forms part of the second peripheral optical arrangement 30 31 of the first apparatus 100 1 and also forms part of a first peripheral optical arrangement 30 12 of a second apparatus 100 2 of the system 200. A further shared optical element 60 3 forms part of a second peripheral optical arrangement 30 32 of the second apparatus 100 2 of the system 200 and also forms a part of the first peripheral optical arrangement 30 13 of the third apparatus 100 3.

Each optical element 60 a may be a refractive optical element, for example a lens or other element with refractive power, or a reflective element, for example a mirror.

Referring to Fig 4, a first peripheral optical arrangement 30 11 is configured to provide an optical path 50 11 from a first perspective 40 11, defined by the first peripheral optical arrangement 30 11, towards a common optical arrangement 80 1 for focusing images of the three-dimensional space from the first perspective 40 11 onto a camera sensor 10 1.

A second peripheral optical arrangement 30 31 is configured to provide an optical path 50 31 from a second perspective 40 31, defined by the second peripheral optical arrangement 30 31, towards the common optical arrangement 80 1 for focusing images of the three-dimensional space from the second perspective 40 31 onto the camera sensor 10 1.

A first additional peripheral optical arrangement 30 33 is configured to provide an optical path 50 33 from a first additional perspective 40 33, defined by the first additional peripheral optical arrangement 30 33, towards a first additional common optical arrangement 80 3 for focusing images of the three-dimensional space from the first additional perspective 40 33 onto a first additional camera sensor 10 3.

A second additional peripheral optical arrangement 30 12 is configured to provide an optical path 50 12 from a second additional perspective 40 12, defined by the second additional peripheral optical arrangement 30 12, towards a second additional common optical arrangement 80 2 for focusing images of the three-dimensional space from the second additional perspective 40 12 onto a second additional camera sensor 10 2.

A third additional peripheral optical arrangement 30 13 is configured to provide an optical path 50 13 from a third additional perspective 40 13, defined by the third additional peripheral optical arrangement 30 13, towards the first additional common optical arrangement 80 3 for focusing images of the three-dimensional space from the third additional perspective 40 13 onto the first additional camera sensor 10 3.

A fourth additional peripheral optical arrangement 30 32 is configured to provide an optical path 50 32 from a fourth additional perspective 40 32, defined by the fourth additional peripheral optical arrangement 30 32, towards the second additional common optical arrangement 80 2 for focusing images of the three-dimensional space from the fourth additional perspective 40 32 onto the second additional camera sensor 10 2.

The common optical arrangement 80 1, the first additional common optical arrangement 80 3 and the second additional common optical arrangement 80 2 are distinct, separated arrangements.

The camera sensor 10 1, the first additional camera sensor 10 3 and the second additional camera sensor 10 2 are distinct, separated camera sensors.

A first optical element 60 1 is shared by the first peripheral optical arrangement 30 11 and the first additional peripheral optical arrangement 30 33.

A second optical element 60 2 is shared by the second peripheral optical arrangement 30 31 and the second additional peripheral optical arrangement 30 12. A third optical element 60 3 is shared by the third additional peripheral optical arrangement 30 13 and the fourth additional peripheral optical arrangement 30 32. The first optical element 60 1, the second optical element 60 2 and the third optical element 60 3 are distinct and separated.

Each of the common optical arrangements 80 1, 80 2, 80 3 is additionally associated with a central optical arrangement 30 21, 30 22, 30 23. Each is configured to provide an optical path 50 21, 50 22, 50 23 from a respective perspective 40 21, 40 22, 40 23, defined by the central optical arrangement 30 21, 30 22, 30 23, towards the respective common optical arrangement 80 1, 80 2, 80 3 and associated camera sensor 10 1, 10 2, 10 3.

Figs 5A and 5B illustrate different configurations of adjacent peripheral optical arrangements 30 1, 30 3 that share an optical element 60 in their respective light paths 50 1, 50 3 but, as illustrated in Figs 4 and 5C, do not share the same common optical arrangement 80 and use different camera sensors 10.

Fig 5C illustrates an example of the system 200 illustrated in Fig 4 that uses, at different perspectives, the configuration of adjacent peripheral optical arrangements illustrated in Fig 5B.

In Figs 5A-5C, each peripheral optical arrangement 30 is configured to provide an optical path 50 n from a respective perspective 40 n, defined by the optical arrangement 30 n, through an optical element 60 that is shared with one adjacent peripheral optical arrangement 30, to a common optical arrangement 80 (and camera sensor 10) shared with a different adjacent peripheral optical arrangement 30.

Adjacent peripheral optical arrangements 30 that share a common optical arrangement 80 (and camera sensor 10) are illustrated in Figs 1, 4 and 5C.

Adjacent peripheral optical arrangements 30 that share a common optical element 60 are, for example, illustrated in Figs 5A-5C. The shared optical element 60 is common to the adjacent peripheral optical arrangements 30 and provides at least a part of each of the multiple optical paths 50 n from the multiple divergent perspectives 40 n to different respective spatially separated common optical arrangements 80 and camera sensors 10.

Where the optical path 50 meets its respective camera sensor 10, an image 12, defined by the perspective 40, is captured by the camera sensor 10.

The use of the term 'compound' in relation to an optical arrangement is the same as its use in relation to a lens: it means that the overall optical effect is achieved using multiple optical elements. Each peripheral optical arrangement 30 may be a compound optical arrangement.

In the illustrated examples of Figs 5A-5C, but not necessarily all examples, each compound optical arrangement 30 is an objective lens arrangement, comprising compound lenses, and is configured to provide a wide-angle field of view centered on the respective perspective 40.

Each compound optical arrangement 30 is configured to provide an optical path 50 from a respective perspective 40, defined by that compound optical arrangement 30, to a camera sensor 10 via a unique combination of shared optical element 60 and common optical arrangement 80. It will be appreciated that the shared optical element 60 is common to two compound optical arrangements 30 in that it is present in a first peripheral compound optical arrangement 30 which is aligned in a first direction (vertically in Figs 5A and 5B) and is simultaneously present in a second peripheral compound optical arrangement 30 which is arranged in a second different direction (horizontally in Figs 5A and 5B).

The shared optical element 60 is a central shared optical element 60 positioned where the multiple optical paths intersect. In the example of Fig 5A, the central shared optical element 60 is positioned at an origin 52. It is an innermost optical element 32 of the multiple compound optical arrangements 30. In the example of Fig 5B, the central shared optical element 60 that is located at the origin is an intermediate optical element 60 within a series of optical elements 32.

In the example of Fig 5B, the apparatus 100 comprises not only an intermediate shared optical element 60 at the origin 52 but also a peripheral shared optical element 60. In this example, the peripheral shared optical element 60 is an outermost objective optical element 60 (e.g. lens) of the compound optical arrangements 30. That is, the peripheral shared optical element 60 is that part of the compound optical arrangements 30 that is furthest from the image planes 14 of the camera sensors 10 and/or is closest to the three-dimensional space 20 being imaged.

In this example, but not necessarily all examples, the peripheral shared optical element 60 is a refractive optical element and has a continuous exterior surface 62 for each of the different optical paths 50. That is, the exterior surface 62 of the peripheral shared optical element 60 is continuous and is a part of a first optical path 50 through the first compound optical arrangement 30 and is also part of a second optical path 50 through the second compound optical arrangement 30. It should be appreciated that different parts of the peripheral shared optical element 60 are used for the respective first and second optical paths 50.

In this example, but not necessarily all examples, the peripheral shared optical element 60 has a curved exterior surface 62 and an interior surface 64 comprising a concave cavity 66 for each of the different optical paths 50. The concave cavity 66 is aligned with each respective compound peripheral optical arrangement 30.

In this example, the curved exterior surface 62 has a spherical shape and the portions of the exterior surface 62 that are part of the first optical path and the second optical path lie on the same continuous spherical surface.

Also in this example, the concave cavity 66 and its interior surface 64 define a part of a sphere that has a smaller radius of curvature than the radius of curvature of the sphere defining the exterior curved surface 62.

It will be appreciated that in each of Figs 5A and 5B the apparatus 100 uses multiple compound optical arrangements 30 that are rectilinear arrangements defining substantially rectilinear optical paths and aligned with the respective perspective 40. Each of the multiple compound optical arrangements 30 is symmetrical about its respective perspective 40/rectilinear optical path 50 and the one or more shared optical elements 60 are symmetrical with respect to each perspective 40/optical path 50.

The camera sensor 10 may comprise a plurality of sensels 11 and is configured to sense different images from different perspectives 40 focused by the common optical arrangement 80 onto the camera sensor.

Sensels 11 may only be used to sense one image at a time. Therefore the same sensels may be used to sense different images at different times (time-division) and/or different sensels may be used to sense different images at the same (or different) times (spatial division). Where different sensels are used to sense different images at the same (or different) times (spatial division), the sensels used to detect different images may be overlapping and interleaved or they may be discrete non-overlapping areas of the sensor 10.

This provides spatially resolved sensing of the images 12A, 12B. The different images may be read out of the camera sensor 10 separately in some examples. In the particular example illustrated in Fig 6, the single camera sensor 10 comprises alternating rows/columns of first sensels (dash) and second sensels (dot). The first sensels are configured to sense a first image 12A from a first peripheral perspective 40 1 and the second sensels (different from the first sensels) are configured to sense a second image 12B from a second, different peripheral perspective 40 that is significantly divergent (e.g. the perspectives are separated by an angle greater than 45°, 60° or 90°) from the first peripheral perspective 40 1.
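The row-interleaved (spatial-division) read-out described above can be sketched as follows; which row parity carries which image is an assumption for illustration, not stated in the application:

```python
def deinterleave(frame):
    """Split a spatially multiplexed sensor read-out into the two images:
    in this sketch the even ("dash") sensel rows carry image 12A and the
    interleaved odd ("dot") rows carry image 12B."""
    return frame[0::2], frame[1::2]

# Toy 4x4 frame whose rows are tagged by originating perspective.
frame = [["A"] * 4, ["B"] * 4, ["A"] * 4, ["B"] * 4]
image_12a, image_12b = deinterleave(frame)
assert image_12a == [["A"] * 4, ["A"] * 4]
assert image_12b == [["B"] * 4, ["B"] * 4]
```

Each de-interleaved image then has half the vertical resolution of the full sensor, which is the price of sensing both perspectives simultaneously rather than by time-division.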

Each of the multiple optical arrangements may comprise a telecentric back focal design comprising a barrel lens 31. The telecentric back focal design provides a limited spread of light. A separation of the optical axes of the optical arrangements 30 greater than that spread of light enables the spatial resolution of the different images from the different optical arrangements 30.

The camera sensor 10 may therefore be used during first time periods for capturing images via the peripheral optical arrangements but not the central optical arrangement, and during second time periods, different from and not overlapping the first time periods, for capturing images via the central optical arrangement but not the peripheral optical arrangements.

Figs 7A and 7B illustrate examples of a common optical arrangement 80 suitable for use as previously described.

The common optical arrangement 80 comprises one or more peripheral reflective portions 82 configured to reduce an angular spread of output light paths 50' compared to divergent peripherally input light paths 50.

The common optical arrangement 80 comprises one or more portions 84 configured to transmit centrally input light received along an input central light path 50 2 without significant angular deviation as output light along an output central light path 50 2 .

The common optical arrangement 80 may, for example, have rotational symmetry about an axis 86; this axis may be orthogonal to an image plane 14 of the camera sensor (not illustrated in this Fig).

Fig 7A illustrates a cross-section through a common optical arrangement 80 comprising peripheral convexly-curved reflective portions 82 at its lateral edges configured to reduce an angular spread of output light paths 50' compared to divergent peripherally input light paths 50. The peripheral convexly-curved reflective portions 82 at its lateral edges are separated by a central portion 84 configured to transmit centrally input light received along an input central light path 50 2 without significant angular deviation as output light along an output central light path 50 2. The central portion 84 may, for example, be a non-reflective portion.

Fig 7B illustrates a cross-section through a common optical arrangement 80 comprising peripheral reflective portions 82 at its lateral edges configured to reduce an angular spread of output light paths 50' compared to divergent peripherally input light paths 50. The peripheral reflective portions 82 at its lateral edges meet at a central portion. The peripheral reflective portions 82 are configured to transmit centrally input light received along an input central light path 50 2 without significant angular deviation as output light along an output central light path 50 2 . The common optical arrangement 80 may, for example, be a dichroic arrangement that is transparent to infrared light transmitted by a light projector used for depth mapping and is reflective to visible light focused by the first peripheral optical arrangement and the second peripheral optical arrangement.

As previously described, the first central optical arrangement 30 2 may be configured to provide a depth-mapping system. A depth-mapping system is a system that provides information concerning the distance to objects in the three-dimensional space 20.

Each depth-mapping system comprises a light projector 160 configured to project a projection pattern into the three-dimensional space 20. The central optical arrangement 30 2, the common optical arrangement 80 and the camera sensors 10 are configured to detect the projected pattern, reflected from the three-dimensional space. In some examples, the light projector 160 may transmit light of a particular frequency or range of frequencies. For example, the light projector 160 may transmit infrared light or other light not visible to a human.

The apparatus 100 or system 200 therefore comprises a light projector 160 configured to project a projection pattern 170 into the three-dimensional space 20 to paint a scene; wherein the first central optical arrangement 30 2 is configured to provide an optical path 50 2 from a central perspective 40 2 , defined by the first central optical arrangement 30 2 , towards the common optical arrangement 80 and the common optical arrangement 80 is configured to focus an image of the projected pattern from the central perspective 40 2 onto the camera sensor 10.

The apparatus further comprises a controller 180 configured to perform the method 400 illustrated in Fig 8. Fig 8 illustrates a method 400 that may, for example, be performed by the controller 180.

At block 402, the method 400 comprises causing projection by the light projector 160 of an initial projection pattern 170 into the three-dimensional space 20 to paint a scene.

At block 404, the method 400 comprises causing capture of an initial image 12 of the initial projection pattern 170 projected into the three-dimensional space 20 to paint the scene.

At block 406, the method 400 comprises causing analysis of the captured initial image 12.

At block 408, the method 400 comprises causing projection by the light projector 160 of a new projection pattern 170 into the three-dimensional space 20 to paint the scene, wherein the new projection pattern 170 is dependent upon said analysis of the captured initial image 12.

At block 410, the method 400 comprises causing capture of a new image 12 of the new projection pattern 170 projected into the three-dimensional space 20 to paint the scene.

Optionally, at block 412, the method 400 comprises causing analysis of the captured new image 12 to create a depth map for the three-dimensional space 20.
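A minimal sketch of blocks 402-412 as plain control flow; the five callables are hypothetical stand-ins for the light projector 160, the camera sensor 10 and the controller's analysis steps, and only the sequencing comes from the description:

```python
def method_400(project, capture, analyse, make_pattern, make_depth_map):
    """Blocks 402-412 of Fig 8 (sketch; all parameter names are assumed)."""
    initial_pattern = make_pattern(analysis=None)  # block 402: initial pattern 170
    project(initial_pattern)
    initial_image = capture()                      # block 404: capture initial image 12
    analysis = analyse(initial_image)              # block 406: analyse captured image
    new_pattern = make_pattern(analysis=analysis)  # block 408: pattern depends on analysis
    project(new_pattern)
    new_image = capture()                          # block 410: capture new image 12
    return make_depth_map(new_image)               # block 412 (optional): build depth map

# Toy drive of the loop with recording stubs.
projected = []
depth = method_400(
    project=projected.append,
    capture=lambda: "image",
    analyse=lambda image: {"near": True},
    make_pattern=lambda analysis: "dense" if analysis else "uniform",
    make_depth_map=lambda image: {"image": image},
)
assert projected == ["uniform", "dense"]  # two projections: initial, then refined
assert depth == {"image": "image"}
```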

Fig 9A illustrates an example of an initial projection pattern 170 that could be projected into the three-dimensional space 20 to paint a scene at block 402. In this example, the pattern is a combination of distinct elements 172. In this example, the elements are symbols.

Fig 9B illustrates an example of a new projection pattern 170 that could be projected into the three-dimensional space 20 to paint a scene at block 408. In this example, the pattern is a combination of distinct elements 172. The new projection pattern 170 is a spatially variable pattern. The distinct elements 172 of the pattern are unevenly distributed in space. The spatial variation is dependent upon said analysis of the captured initial image that occurs at block 406.

The new projection pattern 170 has dense areas 174 (more distinct elements 172 per area) and less dense areas 176 (fewer distinct elements 172 per area). The positions of the dense areas 174 and the less dense areas 176 relative to the scene are determined by analyzing the captured initial image. The density of the projection pattern 170 for an area may also, for example, be determined by the analysis of the captured initial image.

There may, for example, be a correlation between a density of the new projection pattern 170 and a proximity of the portion of the scene it is projected onto, such that proximal portions of the scene are more likely to have a dense projection pattern 174 projected onto them and distal portions of the scene are significantly less likely to have a dense projection pattern projected onto them.

Additionally or alternatively, there may, for example, be a correlation between a density of the new projection pattern and a variation in depth of the portion of the scene it is projected onto, such that portions of the scene that have quickly varying depth are more likely to have a dense projection pattern 174 projected onto them and portions of the scene that have unvarying depth are significantly less likely to have dense projection patterns projected onto them.

Additionally or alternatively, there may for example be a correlation between a density of the new projection pattern and image recognition of a portion of the scene it is projected onto such that particular portions of the scene that are tracked using image recognition are more likely to have a dense projection pattern 174 projected onto them and portions of the scene that are not tracked using image recognition are less likely to have a dense projection pattern projected onto them.

Additionally or alternatively, there may, for example, be a correlation between a density of the new projection pattern 170 and a spatial variation in the portion of the scene it is projected onto, such that particular portions of the scene that are spatially complex are more likely to have a dense projection pattern projected onto them and portions of the scene that are spatially simple are less likely to have a dense projection pattern 174 projected onto them. Spatial complexity may, for example, be determined by performing spatial-frequency analysis of the image; the spectral complexity may be used to indicate image complexity. Thus a portion of an image that has regularity in space (distinct spatial frequencies) is less complex than a portion of an image that varies abruptly and unpredictably (a broad band of spatial frequencies of varying intensity).
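As an illustrative one-dimensional sketch of the spatial-frequency idea (the function names, the complexity measure and the toy signals are assumptions, not part of the application), a regular patch concentrates its spectral energy in a distinct bin while an abruptly varying patch spreads it across a broad band:

```python
import cmath
import math

def dft_magnitudes(signal):
    """Naive discrete Fourier transform; magnitude per frequency bin."""
    n = len(signal)
    return [abs(sum(signal[t] * cmath.exp(-2j * cmath.pi * f * t / n)
                    for t in range(n))) / n
            for f in range(n)]

def spectral_complexity(signal):
    """Fraction of positive-frequency (non-DC) spectral energy that lies
    outside the dominant bin: near 0 for a regular patch whose energy sits
    at one distinct spatial frequency, larger for an abruptly varying
    patch whose energy spreads over a broad band."""
    n = len(signal)
    energies = [m * m for m in dft_magnitudes(signal)[1:n // 2 + 1]]
    total = sum(energies)
    return 0.0 if total == 0 else 1.0 - max(energies) / total

# One distinct spatial frequency: low complexity.
regular = [math.sin(2 * math.pi * 2 * t / 32) for t in range(32)]
# Abrupt, unpredictable impulses: energy spread over many frequencies.
irregular = [1.0 if t in (3, 11, 17, 29) else 0.0 for t in range(32)]

assert spectral_complexity(regular) < 0.1 < spectral_complexity(irregular)
```

A two-dimensional version of the same measure, evaluated per image patch, could then drive the per-area density of the new projection pattern 170.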

In some but not necessarily all examples, the controller 180 is configured to control the new projection pattern 170 in dependence upon feedback from a rendering engine for rendering a panoramic image of the three-dimensional space.

Implementation of a controller 180 may be as controller circuitry. The controller 180 may be implemented in hardware alone, have certain aspects in software including firmware alone or can be a combination of hardware and software (including firmware).

As illustrated in Fig 10A the controller 180 may be implemented using instructions that enable hardware functionality, for example, by using executable instructions of a computer program 185 in a general-purpose or special-purpose processor 182 that may be stored on a computer readable storage medium (disk, memory etc) to be executed by such a processor 182.

The processor 182 is configured to read from and write to the memory 122. The processor 182 may also comprise an output interface via which data and/or commands are output by the processor 182 and an input interface via which data and/or commands are input to the processor 182.

The memory 122 stores a computer program 185 comprising computer program instructions (computer program code) that controls the operation of the apparatus 100 when loaded into the processor 182. The computer program instructions, of the computer program 185, provide the logic and routines that enable the apparatus to perform the method illustrated in Fig 8. The processor 182, by reading the memory 122, is able to load and execute the computer program 185.

The apparatus 100 therefore comprises:

at least one processor 182; and

at least one memory 122 including computer program code;

the at least one memory 122 and the computer program code configured to, with the at least one processor 182, cause the apparatus 100 at least to perform: causing projection by the light projector 160 of an initial projection pattern 170 into the three-dimensional space 20 to paint a scene;

causing capture of an initial image 12 of the initial projection pattern 170 projected into the three-dimensional space 20 to paint the scene;

causing analysis of the captured initial image 12;

causing projection by the light projector 160 of a new projection pattern 170 into the three-dimensional space 20 to paint the scene, wherein the new projection pattern 170 is dependent upon said analysis of the captured initial image 12; and

causing capture of a new image 12 of the new projection pattern 170 projected into the three-dimensional space 20 to paint the scene.

As illustrated in Fig 10B, the computer program 185 may arrive at the apparatus 100 via any suitable delivery mechanism 186. The delivery mechanism 186 may be, for example, a non-transitory computer-readable storage medium, a computer program product, a memory device, a record medium such as a compact disc read-only memory (CD-ROM) or digital versatile disc (DVD), or an article of manufacture that tangibly embodies the computer program 185. The delivery mechanism may be a signal configured to reliably transfer the computer program 185. The apparatus 100 may propagate or transmit the computer program 185 as a computer data signal.

Although the memory 122 is illustrated as a single component/circuitry, it may be implemented as one or more separate components/circuitry some or all of which may be integrated/removable and/or may provide permanent/semi-permanent/dynamic/cached storage.

Although the processor 182 is illustrated as a single component/circuitry, it may be implemented as one or more separate components/circuitry some or all of which may be integrated/removable. The processor 182 may be a single core or multi-core processor.

References to 'computer-readable storage medium', 'computer program product', 'tangibly embodied computer program' etc. or a 'controller', 'computer', 'processor' etc. should be understood to encompass not only computers having different architectures such as single/multi-processor architectures and sequential (Von Neumann)/parallel architectures but also specialized circuits such as field-programmable gate arrays (FPGA), application-specific integrated circuits (ASIC), signal processing devices and other processing circuitry. References to computer program, instructions, code etc. should be understood to encompass software for a programmable processor or firmware such as, for example, the programmable content of a hardware device whether instructions for a processor, or configuration settings for a fixed-function device, gate array or programmable logic device etc.

As used in this application, the term 'circuitry' refers to all of the following:

(a) hardware-only circuit implementations (such as implementations in only analog and/or digital circuitry) and

(b) to combinations of circuits and software (and/or firmware), such as (as applicable): (i) to a combination of processor(s) or (ii) to portions of processor(s)/software (including digital signal processor(s)), software, and memory(ies) that work together to cause an apparatus, such as a mobile phone or server, to perform various functions and

(c) to circuits, such as a microprocessor(s) or a portion of a microprocessor(s), that require software or firmware for operation, even if the software or firmware is not physically present. This definition of 'circuitry' applies to all uses of this term in this application, including in any claims. As a further example, as used in this application, the term "circuitry" would also cover an implementation of merely a processor (or multiple processors) or portion of a processor and its (or their) accompanying software and/or firmware. The term "circuitry" would also cover, for example and if applicable to the particular claim element, a baseband integrated circuit or applications processor integrated circuit for a mobile phone or a similar integrated circuit in a server, a cellular network device, or other network device.

The blocks illustrated in Fig. 8 may represent steps in a method and/or sections of code in the computer program 185. The illustration of a particular order to the blocks does not necessarily imply that there is a required or preferred order for the blocks, and the order and arrangement of the blocks may be varied. Furthermore, it may be possible for some blocks to be omitted. Where a structural feature has been described, it may be replaced by means for performing one or more of the functions of the structural feature, whether that function or those functions are explicitly or implicitly described.

The term 'comprise' is used in this document with an inclusive, not an exclusive, meaning. That is, any reference to X comprising Y indicates that X may comprise only one Y or may comprise more than one Y. If it is intended to use 'comprise' with an exclusive meaning then it will be made clear in the context by referring to 'comprising only one' or by using 'consisting'.

In this brief description, reference has been made to various examples. The description of features or functions in relation to an example indicates that those features or functions are present in that example. The use of the term 'example', 'for example' or 'may' in the text denotes, whether explicitly stated or not, that such features or functions are present in at least the described example, whether described as an example or not, and that they can be, but are not necessarily, present in some of or all other examples. Thus 'example', 'for example' or 'may' refers to a particular instance in a class of examples. A property of the instance can be a property of only that instance, a property of the class, or a property of a sub-class of the class that includes some but not all of the instances in the class. It is therefore implicitly disclosed that a feature described with reference to one example, but not with reference to another example, can, where possible, be used in that other example but does not necessarily have to be used in that other example.

Although embodiments of the present invention have been described in the preceding paragraphs with reference to various examples, it should be appreciated that modifications to the examples given can be made without departing from the scope of the invention as claimed. Features described in the preceding description may be used in combinations other than the combinations explicitly described.

Although functions have been described with reference to certain features, those functions may be performable by other features whether described or not. Although features have been described with reference to certain embodiments, those features may also be present in other embodiments whether described or not.

Whilst endeavoring in the foregoing specification to draw attention to those features of the invention believed to be of particular importance it should be understood that the Applicant claims protection in respect of any patentable feature or combination of features hereinbefore referred to and/or shown in the drawings whether or not particular emphasis has been placed thereon.

I/we claim: