
Title:
TRANSFORMATIONS FOR IMAGE STABILIZATION AND REFOCUS
Document Type and Number:
WIPO Patent Application WO/2015/158953
Kind Code:
A1
Abstract:
Apparatus, methods and computer programs are provided. A method comprises: causing an adjustment to the relative positioning of an image sensor of a camera and one or more optical devices of the camera which convey light to the image sensor; causing at least part of an image frame to be captured, using the image sensor, while the image sensor and the one or more optical devices are in the adjusted relative positioning; determining the adjusted relative positioning of the image sensor and the one or more optical devices; and adjusting the image frame by performing a transform on the image frame using the determined adjusted relative positioning of the image sensor and the one or more optical devices. In one embodiment, the invention relates to correcting image distortions remaining after optical image stabilization. In another embodiment, it relates to scaling the image during refocusing.

Inventors:
WINDMARK JOHAN (SE)
MÅRTENSSON ANDERS (SE)
Application Number:
PCT/FI2015/050197
Publication Date:
October 22, 2015
Filing Date:
March 24, 2015
Assignee:
NOKIA TECHNOLOGIES OY (FI)
International Classes:
G02B27/64; G02B7/04; G06T5/00; H04N5/232; G06T15/20
Foreign References:
US20130321650A12013-12-05
US20130044229A12013-02-21
US6396961B12002-05-28
US20070132856A12007-06-14
US20140111661A12014-04-24
Attorney, Agent or Firm:
NOKIA TECHNOLOGIES OY et al. (IPR Department, Karakaari 7, Espoo, FI)
Claims:
1. A method, comprising:

causing an adjustment to the relative positioning of an image sensor of a camera and one or more optical devices of the camera which convey light to the image sensor;

causing at least part of an image frame to be captured, using the image sensor, while the image sensor and the one or more optical devices are in the adjusted relative positioning;

determining the adjusted relative positioning of the image sensor and the one or more optical devices; and

adjusting the image frame by performing a transform on the image frame using the determined adjusted relative positioning of the image sensor and the one or more optical devices.

2. The method as claimed in claim 1, wherein causing the adjustment to the relative positioning of the image sensor and the one or more optical devices comprises causing an adjustment to the position of the image sensor and/or causing an adjustment to the position of the one or more optical devices.

3. The method as claimed in claim 1 or 2, wherein causing the adjustment to the relative positioning of the image sensor and the one or more optical devices comprises causing a translational movement of the image sensor and/or the one or more optical devices.

4. The method as claimed in claim 1, 2 or 3, wherein the transform performed on the image frame warps the image frame.

5. The method as claimed in any of the preceding claims, wherein the adjustment to the relative positioning of the image sensor and the one or more optical devices occurs when performing optical image stabilization.

6. The method as claimed in claim 5, wherein the adjustment to the relative positioning of the image sensor and the one or more optical devices is caused in response to a rotation of the camera, and the transform performed on the image frame adjusts the image frame to at least partially compensate for a change in the perspective of the camera.

7. The method as claimed in claim 1, 2 or 3, wherein the adjustment to the relative positioning of the image sensor and the one or more optical devices occurs when performing refocusing.

8. The method as claimed in claim 7, wherein the transform performed on the image frame resizes the image frame.

9. A computer program comprising computer program code that, when performed by at least one processor, causes the method as claimed in any of the preceding claims to be performed.

10. An apparatus comprising means for performing the method as claimed in at least one of claims 1 to 8.

11. An apparatus, comprising:

at least one processor; and

at least one memory storing computer program code configured, working with the at least one processor, to cause:

causing an adjustment to the relative positioning of an image sensor of a camera and one or more optical devices of the camera which convey light to the image sensor;

causing at least part of an image frame to be captured, using the image sensor, while the image sensor and the one or more optical devices are in the adjusted relative positioning;

determining the adjusted relative positioning of the image sensor and the one or more optical devices; and

adjusting the image frame by performing a transform on the image frame using the determined adjusted relative positioning of the image sensor and the one or more optical devices.

12. The apparatus as claimed in claim 11, wherein causing the adjustment to the relative positioning of the image sensor and the one or more optical devices comprises causing an adjustment to the position of the image sensor and/or causing an adjustment to the position of the one or more optical devices.

13. The apparatus as claimed in claim 11 or 12, wherein causing the adjustment to the relative positioning of the image sensor and the one or more optical devices comprises causing a translational movement of the image sensor and/or the one or more optical devices.

14. The apparatus as claimed in claim 11, 12 or 13, wherein the transform performed on the image frame warps the image frame.

15. The apparatus as claimed in any of claims 11 to 14, wherein the adjustment to the relative positioning of the image sensor and the one or more optical devices occurs when performing optical image stabilization.

16. The apparatus as claimed in claim 15, wherein the adjustment to the relative positioning of the image sensor and the one or more optical devices is caused in response to a rotation of the camera, and the transform performed on the image frame adjusts the image frame to at least partially compensate for a change in the perspective of the camera.

17. The apparatus as claimed in claim 1 1 , 12 or 13, wherein the adjustment to the relative positioning of the image sensor and the one or more optical devices occurs when performing refocusing.

18. The apparatus as claimed in claim 17, wherein the transform performed on the image frame resizes the image frame.

19. A non-transitory computer readable medium storing computer program instructions that, when performed by at least one processor, cause:

causing an adjustment to the relative positioning of an image sensor of a camera and one or more optical devices of the camera which convey light to the image sensor;

causing at least part of an image frame to be captured, using the image sensor, while the image sensor and the one or more optical devices are in the adjusted relative positioning;

determining the adjusted relative positioning of the image sensor and the one or more optical devices; and

adjusting the image frame by performing a transform on the image frame using the determined adjusted relative positioning of the image sensor and the one or more optical devices.

20. A method, comprising:

responding to rotational movement of a camera by performing a transform on an image frame, in order to at least partially compensate for a change in perspective of the camera caused by the rotational movement.

21. The method as claimed in claim 20, wherein the transform performed on the image frame warps the image frame.

22. An apparatus, comprising:

at least one processor; and

at least one memory storing computer program code configured, working with the at least one processor, to cause: responding to rotational movement of a camera by performing a transform on an image frame, in order to at least partially compensate for a change in perspective of the camera caused by the rotational movement.

23. The apparatus as claimed in claim 22, wherein the transform performed on the image frame warps the image frame.

24. A non-transitory computer readable medium storing computer program instructions that, when performed by at least one processor, cause:

responding to rotational movement of a camera by performing a transform on an image frame, in order to at least partially compensate for a change in perspective of the camera caused by the rotational movement.

25. The non-transitory computer readable medium as claimed in claim 24, wherein the transform performed on the image frame warps the image frame.

Description:
TITLE

TRANSFORMATIONS FOR IMAGE STABILIZATION AND REFOCUS

TECHNOLOGICAL FIELD

Embodiments of the present invention relate to imaging. In particular, they relate to adjusting image frames by performing a transform.

BACKGROUND

In a digital camera, one or more optical devices (such as one or more lenses) are configured to convey light to an image sensor. At least one optical device may be movable relative to the image sensor to allow for refocusing. The image sensor and/or at least one of the optical devices may be movable relative to one another in order to provide optical image stabilization.

BRIEF SUMMARY

According to various, but not necessarily all, embodiments of the invention there is provided a method, comprising: causing an adjustment to the relative positioning of an image sensor of a camera and one or more optical devices of the camera which convey light to the image sensor; causing at least part of an image frame to be captured, using the image sensor, while the image sensor and the one or more optical devices are in the adjusted relative positioning; determining the adjusted relative positioning of the image sensor and the one or more optical devices; and adjusting the image frame by performing a transform on the image frame using the determined adjusted relative positioning of the image sensor and the one or more optical devices.

According to various, but not necessarily all, embodiments of the invention there is provided an apparatus, comprising: means for causing an adjustment to the relative positioning of an image sensor of a camera and one or more optical devices of the camera which convey light to the image sensor; means for causing at least part of an image frame to be captured, using the image sensor, while the image sensor and the one or more optical devices are in the adjusted relative positioning; means for determining the adjusted relative positioning of the image sensor and the one or more optical devices; and means for adjusting the image frame by performing a transform on the image frame using the determined adjusted relative positioning of the image sensor and the one or more optical devices.

According to various, but not necessarily all, embodiments of the invention there is provided an apparatus, comprising: at least one processor; and at least one memory storing computer program code configured, working with the at least one processor, to cause: causing an adjustment to the relative positioning of an image sensor of a camera and one or more optical devices of the camera which convey light to the image sensor; causing at least part of an image frame to be captured, using the image sensor, while the image sensor and the one or more optical devices are in the adjusted relative positioning; determining the adjusted relative positioning of the image sensor and the one or more optical devices; and adjusting the image frame by performing a transform on the image frame using the determined adjusted relative positioning of the image sensor and the one or more optical devices.

According to various, but not necessarily all, embodiments of the invention there is provided a non-transitory computer readable medium storing computer program instructions that, when performed by at least one processor, cause: causing an adjustment to the relative positioning of an image sensor of a camera and one or more optical devices of the camera which convey light to the image sensor; causing at least part of an image frame to be captured, using the image sensor, while the image sensor and the one or more optical devices are in the adjusted relative positioning; determining the adjusted relative positioning of the image sensor and the one or more optical devices; and adjusting the image frame by performing a transform on the image frame using the determined adjusted relative positioning of the image sensor and the one or more optical devices.

According to various, but not necessarily all, embodiments of the invention there is provided a method, comprising: responding to rotational movement of a camera by performing a transform on an image frame, in order to at least partially compensate for a change in perspective of the camera caused by the rotational movement.

According to various, but not necessarily all, embodiments of the invention there is provided an apparatus, comprising: means for responding to rotational movement of a camera by performing a transform on an image frame, in order to at least partially compensate for a change in perspective of the camera caused by the rotational movement.

According to various, but not necessarily all, embodiments of the invention there is provided an apparatus, comprising: at least one processor; and at least one memory storing computer program code configured, working with the at least one processor, to cause: responding to rotational movement of a camera by performing a transform on an image frame, in order to at least partially compensate for a change in perspective of the camera caused by the rotational movement.

According to various, but not necessarily all, embodiments of the invention there is provided a non-transitory computer readable medium storing computer program instructions that, when performed by at least one processor, cause: responding to rotational movement of a camera by performing a transform on an image frame, in order to at least partially compensate for a change in perspective of the camera caused by the rotational movement.

BRIEF DESCRIPTION

For a better understanding of various examples that are useful for understanding the brief description, reference will now be made by way of example only to the accompanying drawings in which:

Fig. 1 illustrates an apparatus;

Fig. 2 illustrates a further apparatus;

Fig. 3 illustrates an optical arrangement comprising an image sensor and an optical device;

Fig. 4 illustrates rotational movement of a camera;

Fig. 5 illustrates a scene being imaged using optical image stabilization;

Fig. 6A illustrates a schematic showing an image sensor of the further apparatus imaging an object in a scene;

Fig. 6B illustrates the image sensor of the further apparatus shown in figure 6A undergoing translational movement for optical image stabilization, which is performed to compensate for rotational movement of the further apparatus;

Fig. 6C illustrates a virtual plane indicating how the image sensor shown in figure 6A would have had to have moved in order to fully compensate for the rotational movement of the further apparatus;

Fig. 6D is a schematic illustrating the image sensor illustrated in figure 6B, a first plane at the scene representing the scene imaged by the image sensor, the virtual plane illustrated in figure 6C and a second plane at the scene representing how the scene would have been imaged if the image sensor were positioned at the virtual plane;

Figures 7A and 7B illustrate an image plane of the further apparatus after optical image stabilization has been performed to compensate for rotational movement of the further apparatus, and a projection plane which represents where the image plane would have to be positioned in order to fully compensate for the rotational movement of the further apparatus;

Figure 8 illustrates a first flow chart of a method; and

Figure 9 illustrates a second flow chart of a method.

DETAILED DESCRIPTION

Embodiments of the invention relate to adjusting an image frame by performing a mathematical transform on the image frame. In some embodiments, a transform is performed in order to at least partially compensate for rotational movement of a camera which may, for example, occur due to user handshake. In other embodiments, a transform is performed in response to a change in the field of view of the camera which occurs during refocusing of the camera.

Figure 1 illustrates an apparatus 10 that may be a chip or a chipset. The apparatus 10 may form part of an electronic device such as that illustrated in figure 2.

The apparatus 10 comprises at least one processor 12 and at least one memory 14. The at least one processor 12 may, for example, include a central processing unit (CPU), a graphics processing unit (GPU) and/or an optical image stabilization controller. A single processor 12 and a single memory 14 are shown in figure 1 for illustrative purposes.

The processor 12 is configured to read from and write to the memory 14. The processor 12 may comprise an output interface via which data and/or commands are output by the processor 12 and an input interface via which data and/or commands are input to the processor 12.

The memory 14 is illustrated as storing a computer program 17 which comprises computer program instructions/code 18 that control the operation of the apparatus 10 when loaded into the processor 12. The processor 12, by reading the memory 14, is able to load and execute the computer program code 18. The computer program code 18 provides the logic and routines that enable the apparatus 10 to perform the methods illustrated in figures 8 and 9 and described below. In this regard, the processor 12 and the computer program code 18 provide means for performing the methods illustrated in figures 8 and 9 and described below.

Although the memory 14 is illustrated as a single component in figure 1, it may be implemented as one or more separate components, some or all of which may be integrated/removable and/or may provide permanent/semi-permanent/dynamic/cached storage.

The computer program code 18 may arrive at the apparatus 10 via any suitable delivery mechanism 28. The delivery mechanism 28 may be, for example, a non-transitory computer-readable storage medium such as an optical disc or a memory card. The delivery mechanism 28 may be a signal configured to reliably transfer the computer program code 18. The apparatus 10 may cause the propagation or transmission of the computer program code 18 as a computer data signal.

Figure 2 illustrates an apparatus 20 in the form of an electronic device. The apparatus 20 has camera functionality and may be referred to hereinafter as a camera. The electronic device 20 may have other functionality in addition to camera functionality. The apparatus 20 may, for example, function as a mobile telephone, a tablet computer, a games console or a personal music player. In some cases, the camera function may not be the primary function of the apparatus 20.

The example of the apparatus 20 illustrated in figure 2 includes one or more optical devices 24, an image sensor 26, position detection circuitry 22, position adjustment circuitry 23 and the apparatus 10 illustrated in figure 1 co-located in a housing 25. The apparatus 20 optionally includes one or more motion detectors 27 and might, for example, comprise other elements such as a display and/or a radio frequency transceiver.

The elements 12, 14, 22, 23, 24, 26 and 27 are operationally coupled and any number or combination of intervening elements can exist between them (including no intervening elements).

The one or more optical devices 24 are configured to convey light to the image sensor 26. The one or more optical devices 24 may, for example, comprise one or more lenses and/or one or more mirrors. In some embodiments, the one or more optical devices 24 consist merely of a single lens.

Figure 2 schematically illustrates light 6 entering the housing 25 of the apparatus 20 via an aperture 21 and being conveyed to the image sensor 26. The image sensor 26 is configured to convert incident light into digital image data. The processor 12 is configured to retrieve image data captured by the image sensor 26 and process it. In this regard, the image sensor 26 is configured to provide an input to the processor 12.

The image sensor 26 may be any type of image sensor. It may, for example, be a charge coupled device (CCD) or a complementary metal-oxide-semiconductor (CMOS) sensor.

The position adjustment circuitry 23 is configured to adjust the relative positioning of the image sensor 26 and at least one of the optical devices 24. In this regard, the position adjustment circuitry 23 may adjust the position of the image sensor 26 and/or at least one of the optical devices 24. The position adjustment circuitry 23 may, for example, comprise at least one electric motor that is configured to move the image sensor 26 and/or at least one of the optical devices 24.

The image sensor 26 and/or at least one of the optical devices 24 may be moved for the purpose of performing optical image stabilization. Alternatively or additionally, the image sensor 26 and/or at least one of the optical devices 24 may be moved to perform focusing and refocusing.

The position detection circuitry 22 is configured to detect the position of the image sensor 26 and/or at least one of the optical devices 24. The position detection circuitry 22 might, for example, comprise a Hall effect sensor. The position detection circuitry 22 might be configured to determine the position of the image sensor 26 and/or at least one of the optical devices 24 periodically. The position detection circuitry 22 is also configured to provide a signal indicating the position of the image sensor 26 and/or at least one of the optical devices 24 to the processor 12.

The one or more motion sensors 27 may, for example, comprise one or more accelerometers and/or one or more gyroscopes. The one or more motion sensors 27 may provide inputs to the processor 12 that are indicative of the motion of the apparatus 20 (for example, during still image or video image capture).

Figure 3 illustrates an example of an optical arrangement in which the one or more optical devices 24 consist of a single lens. The optical axis of the optical arrangement is illustrated by a dotted line 28 in figure 3. Cartesian coordinate axes 70 have also been illustrated in figure 3. The image sensor 26 illustrated in figure 3 is substantially planar in nature and extends in the x and y dimensions.

As mentioned above, the position adjustment circuitry 23 may be configured to move the image sensor 26 and/or the optical device 24 to perform optical image stabilization. In this regard, the position adjustment circuitry 23 may be configured to move the image sensor 26 laterally relative to the optical device 24 and relative to the direction of propagation of light from the optical device 24 (that is, in the +x, -x, +y and -y directions). Alternatively or additionally, the position adjustment circuitry 23 might be configured to move the optical device 24 laterally relative to the image sensor 26 and relative to the direction of propagation of light from the optical device 24 (that is, in the +x, -x, +y and -y directions).

The position adjustment circuitry 23 might be configured such that it cannot or does not rotate the image sensor 26 or the optical device(s) 24. The position adjustment circuitry 23 might be configured in this way in order to save space inside the housing 25.

The position adjustment circuitry 23 may be configured to move the image sensor 26 and/or the optical device 24 to perform focusing or refocusing. In this regard, the position adjustment circuitry 23 might be configured to move the image sensor 26 to and from the optical device 24 (that is, in the +z and the -z directions). Alternatively or additionally, the position adjustment circuitry 23 might be configured to move the optical device 24 to and from the image sensor 26 (that is, in the -z and the +z directions).

Figure 4 illustrates the further apparatus/camera 20 being directed towards a scene 50. In this example, the user is capturing a video of the scene 50 using the camera 20. The triangle labeled with the reference numeral 33 represents a cone of light which enters the aperture 21 of the camera 20 when the camera 20 is in its initial position. Due to user handshake, the camera 20 rotates in a direction indicated by the arrow labeled with the reference numeral 34 while video is being captured. The cone of light entering the camera 20 after the rotation has occurred is labeled with the reference numeral 35 in figure 4.

Figure 5 illustrates how optical image stabilization compensates for the rotational motion illustrated in figure 4. In figure 5, a plane 36 is illustrated which is representative of a region of the scene 50 that is imaged by the camera 20 prior to the rotation of the camera. As the camera 20 rotates, the processor 12 responds by causing the position adjustment circuitry 23 to make an adjustment to the relative positioning of the image sensor 26 and at least one optical device 24 to try to stabilize the video being captured.

The relative positioning of the image sensor 26 and at least one optical device 24 is adjusted by causing the image sensor 26 and/or at least one optical device 24 to undergo a translational movement (and not a rotational movement) as explained above in relation to figure 3. This means that adjustment of the relative positioning of the image sensor 26 and at least one optical device 24 cannot fully compensate for the rotation of the camera 20 using optical image stabilization.

The plane labeled with the reference numeral 37 indicates the imaged region of the scene 50 following rotation of the camera 20 and following adjustment of the relative positioning of the image sensor 26 and the optical device(s) 24. The misalignment of the planes 36, 37 indicates that the image sensor 26 is effectively viewing the scene 50 from a slightly different perspective in each instance, showing that the optical image stabilization has not fully compensated for the rotation of the camera 20. This can manifest itself as "jitter" or "wobble" during video capture. This "jitter" is particularly prevalent at the periphery of the captured video.

In embodiments of the invention, this is remedied, at least partially, by adjusting captured image frames by performing a transform on the image frames. In this example, the transform which is performed warps the image, such that the image seems as if it was captured from the same perspective as that "seen" by the image sensor 26 prior to the rotation of the camera 20.

An example of how the transform may be performed is described below in relation to figures 6A to 7B. In this example, the relative positioning of the image sensor 26 and an optical device 24 is adjusted by moving the image sensor 26. It will be appreciated from reading the above, however, that the optical device 24 could be moved instead of or in addition to the image sensor 26 to achieve the same effect.

Figure 6A illustrates an image sensor 26 positioned in a housing 25 of a camera 20 that is imaging an object 51 in a scene 50. The image sensor 26 is co-incident with the image plane of the one or more optical devices 24 in the camera 20. Figure 6B illustrates a rotation of the camera 20 being simulated by moving the image sensor 26 in the direction indicated by the arrow 31.

Figure 6C illustrates a virtual plane 39 which represents where the image plane/image sensor 26 would have to be positioned in order to fully compensate for the rotation that is simulated in figure 6B. In figure 6C, the virtual plane 39 is a rotation of the image plane/image sensor 26 in the direction illustrated by the arrow labeled with the reference numeral 32.

Figure 6D is a schematic illustrating the misalignment between the image plane/image sensor 26 shown in figure 6B and the virtual plane 39 shown in figure 6C, in two dimensions. Figure 6D also illustrates a plane 63 which represents a region of the scene 50 that would be captured by an image sensor positioned at the virtual plane 39 and a plane 64 representing a region of the scene 50 that would be captured when the image sensor 26 is in the adjusted position illustrated in figure 6B.

It can be seen in figure 6D that the plane 63 is positioned at an angle A relative to the plane 64, as indicated by the dotted normal line 62 in figure 6D. In order to compensate for the misalignment between the position of the adjusted image plane/image sensor 26 in figure 6B and the virtual plane 39 representing the ideal position of the image sensor, a transform should be performed on the image which projects the image at the image plane/image sensor 26 onto the virtual plane 39 by mapping each pixel at the image plane/image sensor 26 onto the virtual plane 39.

Referring now to figure 7A, the camera projection x_p of a point x onto an image plane/image sensor 26/P can be written as:

x_p = K R T x    (1)

where K is a projection matrix, R is a rotation matrix and T is a translation.

The translation of the image sensor 26 is introduced into the projection matrix K_1 as ΔX and ΔY, along with the focal length f of the optical device(s) 24:

x_p1 = K_1 x, where K_1 = [ f 0 ΔX ; 0 f ΔY ; 0 0 1 ]    (2)

For the virtual/projection plane, which is rotated but not translated, the corresponding projection uses K_0 = [ f 0 0 ; 0 f 0 ; 0 0 1 ]:

x_p0 = K_0 R x    (3)

where x_p1 is a point on the image plane/image sensor 26/P_1 which corresponds with a point x_p0 on the projection/virtual plane 39/P_0, as illustrated in figure 7B.

From equations (2) and (3) we have:

x_p1 = P_1 x = K_1 x    (4)

Simplifying this we have:

x = K_1^(-1) x_p1    (5)

Rewriting, we have:

K_0^(-1) x_p0 = R x    (6)

It therefore follows that:

K_0^(-1) x_p0 = R K_1^(-1) x_p1    (7)

In order to determine how to transform x_p1 to x_p0, the following equation can therefore be used:

x_p0 = K_0 R K_1^(-1) x_p1    (8)

If we consider the rotation to be around the x and y axes which correspond with the dimensions in which the image sensor 26 is translated, it follows that:

R = R_θ R_γ    (9)

where the angle γ relates to a rotation around the y axis and the angle θ relates to a rotation around the x axis.

With equations (8) and (9) we have a set of equations that we can use to map each pixel on the image plane/image sensor 26/P_1 to the virtual/projection plane 39/P_0. The only unknowns are the angles θ and γ. If we apply the constraint that the center of the captured image is projected to the center of the projection, we have:

K_0 R K_1^(-1) c_1 ∝ c_0    (10)

where c_1 and c_0 are the centers of the captured image and of the projection, respectively, in homogeneous coordinates.

By using equations (8), (9) and (10) we can determine the angles θ and γ. Equation (8) can then be used to project the pixels on the image plane/image sensor 26/P_1 (in a captured image) onto the virtual/projection plane 39/P_0 in order to digitally adjust the image to compensate for the rotation of the camera 20.
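By way of illustration only (this sketch is not part of the application text), the mapping of equations (8) and (9) can be assembled as a 3x3 transform in Python/NumPy. The recovery of the angles from the center constraint as γ = atan(ΔX/f) and θ = atan(ΔY/f), the sign conventions and the example numbers are assumptions of the sketch, not statements from the specification; all quantities are taken to be in pixel units.

# Illustrative sketch: build x_p0 = K0 R K1^(-1) x_p1 (equation (8)) from the
# optical image stabilization sensor translation (dx, dy) and focal length f.
import numpy as np

def stabilization_homography(dx, dy, f):
    # Projection matrices K1 (translated image sensor) and K0 (virtual plane),
    # as introduced around equations (2) and (3).
    K1 = np.array([[f, 0.0, dx],
                   [0.0, f, dy],
                   [0.0, 0.0, 1.0]])
    K0 = np.array([[f, 0.0, 0.0],
                   [0.0, f, 0.0],
                   [0.0, 0.0, 1.0]])
    # Assumed recovery of the rotation angles from the center constraint:
    # to first order, gamma = atan(dx / f) (about y) and theta = atan(dy / f) (about x).
    gamma = np.arctan2(dx, f)
    theta = np.arctan2(dy, f)
    R_gamma = np.array([[np.cos(gamma), 0.0, np.sin(gamma)],
                        [0.0, 1.0, 0.0],
                        [-np.sin(gamma), 0.0, np.cos(gamma)]])
    R_theta = np.array([[1.0, 0.0, 0.0],
                        [0.0, np.cos(theta), -np.sin(theta)],
                        [0.0, np.sin(theta), np.cos(theta)]])
    R = R_theta @ R_gamma              # equation (9): R = R_theta R_gamma
    return K0 @ R @ np.linalg.inv(K1)  # equation (8)

# Hypothetical example: map one pixel of the captured frame onto the virtual plane.
H = stabilization_homography(dx=12.0, dy=-5.0, f=2800.0)
p1 = np.array([640.0, 360.0, 1.0])     # homogeneous pixel coordinate on plane P_1
p0 = H @ p1
p0 = p0 / p0[2]                        # back to inhomogeneous coordinates on plane P_0
# A full frame could then be resampled with a perspective warp routine,
# for example OpenCV's cv2.warpPerspective(img, H, (width, height)).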

Advantageously, embodiments of the invention provide a method of digitally compensating for the rotation of a camera 20 using a relatively straightforward computational technique. In this regard, those skilled in the art will realize from reading the above that the only variable inputs to the above process for projecting a captured image onto the virtual/projection plane 39/P_0 are ΔX and ΔY, which relate to the translational movement that the image sensor 26 and/or the optical device 24 undergoes during optical image stabilization. The processor 12 of the camera 20 can therefore perform the transform using a simple input from the position detection circuitry 22.

A first example of a method according to embodiments of the invention will now be described in relation to figure 8. In this first method, a transform is performed in accordance with that described above in relation to figures 4 to 7B.

In the first example, the user provides user input to cause the camera 20 to begin capturing video. The expression "capturing video" is intended to encompass both transient capture of video when the camera 20 is in a "viewfinder mode" and more permanent capture of video.

The processor 12 responds to the user input by commencing a video capture process in which it begins to read image data from the image sensor 26. When the processor 12 begins the video capture process, the image sensor 26 and the one or more optical devices 24 are in their initial positions.

The position detection circuitry 22 may be configured to detect the relative positioning of the image sensor 26 and the one or more optical devices 24 at least once per captured image frame and in some cases it may detect the relative positioning of the image sensor 26 and the one or more optical devices 24 multiple times per captured image frame.

After commencement of the video capturing process and subsequent to the capture of one or more image frames, the user then inadvertently causes the camera 20 to rotate, as described above, through user handshake. At block 801 in figure 8, the processor 12 responds to the rotation of the camera 20 by initiating optical image stabilization. In this regard, the processor 12 causes an adjustment to the relative positioning of the image sensor 26 of the camera 20 and the one or more optical devices 24 of the camera 20 while an image frame is being captured. The processor 12 may, for example, cause a translational movement of the image sensor 26 and/or the one or more optical devices 24, as described above. This is done by controlling the position adjustment circuitry 23 to adjust the position of the image sensor 26 and/or the one or more optical devices 24.

In block 802 in figure 8, the processor 12 causes the image sensor 26 to capture at least part of an image frame, while the image sensor 26 and the one or more optical devices 24 are in the adjusted relative positioning.

In block 803 in figure 8, the processor 12 determines the adjusted relative positioning of the image sensor 26 and the one or more optical devices 24. For example, the processor 12 may determine the distance ΔΧ, ΔΥ that the image sensor 26 and/or the one or more optical devices 24 have moved from their respective initial positions.

In practice, in blocks 802 and 803 in figure 8, at least part of the image frame may be formed as the relative positioning of the image sensor 26 and the one or more optical devices 24 is being adjusted. If so, and the position detection circuitry 22 is detecting the relative positioning of the image sensor 26 and the one or more optical devices 24 multiple times per captured image frame, the "adjusted relative positioning" can be considered to be the average adjusted relative positioning of the image sensor 26 and the one or more optical devices 24 while the image frame was being captured.

In block 804 in figure 8, the processor 12 adjusts the image frame captured in block 802 by performing a transform on the image frame using the adjusted relative positioning of the image sensor 26 and the one or more optical devices 24 determined in block 803. The processor 12 may, for example, warp the image frame captured in block 802 by projecting the image frame onto a virtual/projection plane as described above. The transform that is performed on the image frame at least partially compensates for the change in the perspective of the camera 20 that results from the rotation of the camera 20.

A second example of a method according to embodiments of the invention will now be described in conjunction with figure 8. The second example is different from the first example in that there is not necessarily any rotation of the camera 20 while capturing video and the transform that is performed does not compensate for any such rotation. In the second example, a transform is performed on image frames to digitally stabilize the image frames during refocusing of the camera 20.

In the second example, the user provides user input to cause the camera 20 to begin capturing video. The processor 12 may cause an adjustment to the relative positioning of the image sensor 26 and the one or more optical devices 24 in order to focus on the scene being captured. In this regard, the processor 12 may alter the distance between the image sensor 26 and at least one optical device 24 by moving the image sensor 26 and/or an optical device 24 in the z-dimension illustrated in figure 3.

Changes to the scene being imaged then cause the processor 12 to attempt to refocus on the scene by causing a further adjustment to the relative positioning of the image sensor 26 and the optical device 24 in block 801 of figure 8.

In block 802 in figure 8, the processor 12 causes the image sensor 26 to capture at least part of an image frame, while the image sensor 26 and the one or more optical devices 24 are in the adjusted relative positioning.

In block 803 in figure 8, the processor 12 determines the adjusted relative positioning of the image sensor 26 and the one or more optical devices 24. For example, the processor 12 may determine the distance ΔZ that the image sensor 26 and/or the one or more optical devices 24 have moved from their respective initial positions.

In practice, as mentioned above, in blocks 802 and 803 in figure 8, at least part of the image frame may be formed as the relative positioning of the image sensor 26 and the one or more optical devices 24 is being adjusted. If so, and the position detection circuitry 22 is detecting the relative positioning of the image sensor 26 and the one or more optical devices 24 multiple times per captured image frame, the "adjusted relative positioning" can be considered to be the average adjusted relative positioning of the image sensor 26 and the one or more optical devices 24 while the image frame was being captured.

In block 804 in figure 8, the processor 12 adjusts the image frame captured in block 802 by performing a transform on the image frame using the adjusted relative positioning of the image sensor 26 and the one or more optical devices 24 determined in block 803. In this second example of the method, however, the transform that is performed on the image is a resizing/scaling of the image rather than a projection. Advantageously, this stabilizes the video being captured by mitigating the zooming in and out that occurs when the camera 20 is refocusing.
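As an illustration only (not part of the application text), the following sketch shows one way such a resizing/scaling transform could be realized, together with the cropping described next. It assumes, as a thin-lens-style approximation, that the imaged field of view scales inversely with the lens-to-sensor separation; the separations, frame size and nearest-neighbor resampling are hypothetical choices of the sketch.

# Illustrative sketch: crop and rescale a frame so that its field of view
# matches the previously captured frame after the focusing distance changed.
import numpy as np

def refocus_scale(prev_dist, new_dist):
    # Assumed relation: the portion of the new (wider) frame that covers the
    # previous field of view is new_dist / prev_dist of the frame size.
    return new_dist / prev_dist

def stabilize_refocused_frame(frame, prev_dist, new_dist):
    h, w = frame.shape[:2]
    s = refocus_scale(prev_dist, new_dist)
    if s >= 1.0:
        return frame  # field of view narrowed or unchanged; nothing to crop here
    # Field of view widened: crop the central region matching the previous
    # field of view and enlarge it back to the full frame size.
    ch, cw = int(round(h * s)), int(round(w * s))
    y0, x0 = (h - ch) // 2, (w - cw) // 2
    crop = frame[y0:y0 + ch, x0:x0 + cw]
    # Nearest-neighbor resize with plain NumPy to keep the sketch dependency-free.
    yi = (np.arange(h) * ch / h).astype(int)
    xi = (np.arange(w) * cw / w).astype(int)
    return crop[yi][:, xi]

# Hypothetical usage: separation moved from 4.20 mm to 4.05 mm while refocusing.
frame = np.zeros((720, 1280, 3), dtype=np.uint8)
stable = stabilize_refocused_frame(frame, prev_dist=4.20, new_dist=4.05)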

In this second example of the method, when relative movement between the image sensor 26 and an optical device 24 causes the field of view of the camera 20 to widen, a portion of the captured image frame containing the wider field of view may be cropped, and the transform causes the portion of the captured image frame which corresponds with the field of view in the previously captured image frame to be enlarged.

Figure 9 illustrates a third example of a method according to embodiments of the invention. The third example is similar to the first example in that a transform is performed on an image frame which at least partially compensates for rotational movement of the camera 20.

The third example may incorporate any of the features of the first example described above in relation to figure 8. However, the third example differs from the first example in that there need not be any adjustment to the relative positioning of the image sensor 26 and the one or more optical devices 24. For example, the camera 20 need not have an optical image stabilization system. The processor 12 may, for instance, receive inputs from the one or more motion detectors 27, rather than from the position detection circuitry 22, for use in the projection of the captured image onto the projection plane.

In the third example of the method, rotational movement of the camera 20 is detected, for example, while video of a scene is being captured by the camera 20. The rotational movement of the camera 20 causes a change in the perspective from which the scene is being imaged. Optical image stabilization may or may not be performed, as described in the first example. At block 901 in figure 9, the processor 12 responds to rotational movement of the camera 20 by performing a transform on a captured image frame, in order to at least partially compensate for a change in perspective of the camera caused by the rotational movement.

The transform that is performed may be the same as that described above in relation to figures 4 to 7B. If no optical image stabilization is performed, the angles γ and Θ referred to in equation (9) above may be determined from inputs provided by the one or more motion detectors 27. In this regard, the processor 12 may determine how the position of the camera 20 is changing over a period of time by receiving multiple readings from the motion detector(s) 27 per image frame. If so, averages of those readings may be used as inputs to the transform.
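As a rough sketch (not part of the application text) of how the angles γ and θ might be obtained from the motion detector(s) 27 in this third example, the angular-rate readings taken during a frame could be averaged and multiplied by the frame time; the sample values, frame time and sign conventions below are hypothetical.

# Illustrative sketch: average gyroscope readings over one frame to obtain the
# rotation angles used in R = R_theta R_gamma (equation (9)).
import numpy as np

def frame_rotation_angles(gyro_rates_x, gyro_rates_y, frame_time):
    # Average angular rates (rad/s) sampled during the frame, multiplied by the
    # frame time, give theta (about the x axis) and gamma (about the y axis).
    theta = float(np.mean(gyro_rates_x)) * frame_time
    gamma = float(np.mean(gyro_rates_y)) * frame_time
    return theta, gamma

# Hypothetical readings from the motion detector(s) 27 during a 33 ms frame.
theta, gamma = frame_rotation_angles(
    gyro_rates_x=[0.010, 0.012, 0.011],
    gyro_rates_y=[-0.020, -0.018, -0.022],
    frame_time=0.033)
# theta and gamma can then be used in equation (8); with no sensor translation,
# K_1 reduces to K_0, leaving a pure rotation-compensating warp.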

References to 'non-transitory computer-readable storage medium', 'computer', 'processor' etc. should be understood to encompass not only computers having different architectures such as single/multi-processor architectures and sequential (Von Neumann)/parallel architectures but also specialized circuits such as field-programmable gate arrays (FPGA), application-specific integrated circuits (ASIC), signal processing devices and other processing circuitry.

References to computer program, instructions, code etc. should be understood to encompass software for a programmable processor or firmware such as, for example, the programmable content of a hardware device whether instructions for a processor, or configuration settings for a fixed-function device, gate array or programmable logic device etc.

As used in this application, the term 'circuitry' refers to all of the following:

(a) hardware-only circuit implementations (such as implementations in only analog and/or digital circuitry) and

(b) to combinations of circuits and software (and/or firmware), such as (as applicable): (i) to a combination of processor(s) or (ii) to portions of processor(s)/software (including digital signal processor(s)), software, and memory(ies) that work together to cause an apparatus, such as a mobile phone or server, to perform various functions) and

(c) to circuits, such as a microprocessor(s) or a portion of a microprocessor(s), that require software or firmware for operation, even if the software or firmware is not physically present. This definition of 'circuitry' applies to all uses of this term in this application, including in any claims. As a further example, as used in this application, the term "circuitry" would also cover an implementation of merely a processor (or multiple processors) or portion of a processor and its (or their) accompanying software and/or firmware. The term "circuitry" would also cover, for example and if applicable to the particular claim element, a baseband integrated circuit or applications processor integrated circuit for a mobile phone or a similar integrated circuit in a server, a cellular network device, or other network device.

The blocks illustrated in figures 8 and 9 may represent steps in a method and/or sections of code in the computer program 17. The illustration of a particular order to the blocks does not necessarily imply that there is a required or preferred order for the blocks, and the order and arrangement of the blocks may be varied. Furthermore, it may be possible for some blocks to be omitted.

Although embodiments of the present invention have been described in the preceding paragraphs with reference to various examples, it should be appreciated that modifications to the examples given can be made without departing from the scope of the invention as claimed.

Features described in the preceding description may be used in combinations other than the combinations explicitly described.

Although functions have been described with reference to certain features, those functions may be performable by other features whether described or not. Although features have been described with reference to certain embodiments, those features may also be present in other embodiments whether described or not.

Whilst endeavoring in the foregoing specification to draw attention to those features of the invention believed to be of particular importance it should be understood that the applicant claims protection in respect of any patentable feature or combination of features hereinbefore referred to and/or shown in the drawings whether or not particular emphasis has been placed thereon.

I/we claim: