Title:
HAND HELD DEVICE FOR CONTROLLING DIGITAL MAGNIFICATION ON A PORTABLE DISPLAY
Document Type and Number:
WIPO Patent Application WO/2019/239162
Kind Code:
A1
Abstract:
A smart glass system (10) arranged to permit a visually impaired user to control magnification of an image provided by the system according to their eye condition, comprising: a portable computing device (100) comprising a motion sensor; a smart glass based wearable device (160) comprising a display portion (162), the display portion (162) being provided in a field of view of the user; and an image capture device (163), wherein the portable computing device (100) is operatively coupled with the smart glass based wearable device (160), the system (10) being configured to display on the display portion (162) an image corresponding to at least a portion of an image captured by the image capture device (163), wherein the system (10) is configured to detect rotational movement of the portable computing device (100) in the hand of the user by means of the motion sensor (102), wherein magnification of the image displayed on the wearable device (160) is controlled based on the rotational movement of the portable computing device (100).

Inventors:
HICKS STEPHEN LLOYD FREDERICK (GB)
RUSSELL NOAH AARON (GB)
Application Number:
PCT/GB2019/051688
Publication Date:
December 19, 2019
Filing Date:
June 17, 2019
Assignee:
OXSIGHT LTD (GB)
International Classes:
G09G3/00; G02B27/01; G06F3/01; G06F3/0346; G06F3/0485; G09G5/391
Foreign References:
US20110221656A12011-09-15
US20170336882A12017-11-23
Attorney, Agent or Firm:
YEADON IP LIMITED (GB)
Claims:
CLAIMS

1. A smart glass system (10) arranged to permit a visually impaired user to control magnification of an image provided by the system (10) according to their eye condition, comprising:

a portable computing device (100) comprising a motion sensor (102);

a smart glass based wearable device (160) comprising a display portion (162), the display portion (162) being provided in a field of view of the user; and

an image capture device (163),

wherein the portable computing device (100) is operatively coupled with the smart glass based wearable device (160),

the system (10) being configured to display on the display portion (162) an image corresponding to at least a portion of an image captured by the image capture device (163),

wherein the system (10) is configured to detect rotational movement of the portable computing device (100) in the hand of the user by means of the motion sensor (102), wherein magnification of the image displayed on the wearable device (160) is controlled based on the rotational movement of the portable computing device (100).

2. The system of claim 1 wherein the image capture device is configured to capture a scene having at least a portion in a direction a person wearing the wearable device is facing, the system being configured to display on the display portion of the wearable device at least a portion of the scene captured by the image capture device corresponding to the location of the display in the field of view of the person wearing the wearable device.

3. The system of any preceding claim, wherein the motion sensor comprises a gyroscope that is positioned in a manner such that upon actuation of said portable computing device, orientation of the portable computing device is determined, and upon hand rotation of said portable computing device, magnification of the image displayed on the wearable device is controlled based on rotational velocity computed using said gyroscope.

4. The system of any preceding claim, wherein the motion sensor comprises an accelerometer and is arranged to transmit magnitude of linear acceleration of the portable computing device relative to gravity.

5. The system of any preceding claim, wherein the motion sensor comprises a magnetometer that determines and transmits instantaneous orientation of the portable computing device relative to Earth’s magnetic field.

6. The system of claim 5 as depending through claims 4 and 3, wherein respective outputs from said gyroscope, said accelerometer, and said magnetometer are fused to yield the orientation and motion of the portable computing device in any direction.

7. The system of any preceding claim, wherein the portable computing device is actuated by pressing of a button present in/on said computing device, wherein the magnification is controlled only during the time the button is kept pressed.

8. The system of any preceding claim, wherein clockwise hand rotation results in zooming in of the image, and anti-clockwise hand rotation results in zooming out of the image.

9. The system of any preceding claim, wherein the extent of hand rotation is proportional to the extent of magnification.

10. The system of claim 1, wherein the orientation of the portable computing device is determined based on fusion of positional data from one or more components of the motion sensor, the one or more components comprising at least an accelerometer.

11. The system of any preceding claim, wherein when the magnification operation is paused, user of said wearable device can view said magnified image and/or pan around said magnified image in X and Y axis and/or scroll around said magnified image.

12. The system of any preceding claim, wherein an absolute position of said portable computing device is configured to be indicative of level of zoom.

13. The portable computing device according to any preceding claim, wherein the magnification operation comprises any or a combination of stepped zooming or dynamic zooming.

14. A system according to any preceding claim wherein the wearable device comprises the image capture device.

15. A system according to any one of claims 1 to 13 wherein the image capture device is provided external to the wearable device, the image capture device being operatively coupled to the wearable device.

16. A system according to claim 15 wherein the image capture device is operatively coupled to the wearable device by means of a wireless connection or a wired connection.

17. A system according to any preceding claim wherein the motion sensor is provided by an inertial measurement unit (IMU).

18. A method of enabling a visually impaired user to control magnification operation of an image being displayed in a smart glass based wearable device by means of a portable computing device, said method comprising the step of:

detecting rotational movement of a portable computing device in the hand of a user by means of a motion sensor comprised by the device, the portable computing device being operatively coupled to the wearable device;

capturing by means of the image capture device an image of a scene;

displaying on the display portion an image corresponding to at least a portion of the image captured by the image capture device,

the method comprising controlling magnification of the image displayed on the wearable device based on the rotational movement of the portable computing device.

19. The method of claim 18 comprising capturing by means of the image capture device a scene having at least a portion in a field of view of a person wearing the wearable device, the method comprising displaying on the display portion of the wearable device at least a portion of the scene captured by the image capture device.

20. The method of claim 19 whereby displaying on the display portion of the wearable device at least a portion of the scene comprises displaying on the display portion at least a portion of the scene corresponding to the location of the display in the field of view of the person wearing the wearable device.

21. The method of any one of claims 18 to 20 whereby the motion sensor comprises a gyroscope sensor, the method comprising: receiving from the gyroscope sensor a change in gyroscope value indicative of extent of hand rotation of said portable computing device;

determining, at the portable computing device, using an accelerometer configured in the portable computing device, orientation of the portable computing device; and

generating, from the portable computing device, a magnification control signal to said smart glass based wearable device based on the change in gyroscope value and the determined orientation.

22. The method of claim 21, wherein said method further comprises the step of:

receiving new gyroscope values as part of the change in gyroscope value;

smoothing, using a sliding window filter, the received new gyroscope values;

normalizing said smoothened gyroscope values; and

accumulating the normalized gyroscope values to indicate the extent of hand rotation of said portable computing device.

23. The method of any one of claims 18 to 22, wherein, during zoom-out of the image being displayed in the smart glass based wearable device, the magnification control signal increases the center crop of the image, the crop being enlarged to fill the display configured in the wearable device.

24. The method of any one of claims 18 to 23 wherein, during zoom-in of the image being displayed in the smart glass based wearable device, the magnification control signal reduces the center crop of the image, the crop being enlarged to fill the display configured in the wearable device.

Description:
HAND HELD DEVICE FOR CONTROLLING DIGITAL MAGNIFICATION ON A PORTABLE DISPLAY

FIELD OF THE INVENTION

[0001] The present disclosure pertains to a system, device and method thereof for controlling digital magnification on a portable display, such as that of a smart glass.

BACKGROUND

[0002] The following description includes information that may be useful in understanding the present invention. It is not an admission that any of the information provided herein is prior art or relevant to the presently claimed invention, or that any publication specifically or implicitly referenced is prior art.

[0003] Human vision is capable of perceiving a very broad range of scales from the largest field of view such as an entire room, to very narrow such as reading letters or concentrating on smaller objects.

[0004] Wearable display systems such as augmented reality (AR) or virtual reality (VR) devices typically have a fixed field of view that is determined by the resolution of the imaging system (e.g. an LCD, OLED, or LCoS display panel) and the optics devised to bring the image into focus (in the case of a near-eye display). These mechanisms necessarily limit the amount of information that can be displayed at any one time.

[0005] Due to this field limitation, such systems must rely on the user being able to quickly and accurately change the magnification of the image on the display. The magnification, or zoom, can be used to increase visibility of small objects (i.e. magnify or zoom in), or to expand the field of view (i.e. minify or zoom out).

[0006] For a personal device that is designed for long-term use, particularly in the case of a vision enhancement device (used, for instance, by people with severely impaired sight), there is an advantage in making the process of zooming as quick, intuitive and accurate as possible.

[0007] A few techniques exist for controlling the zoom level of a wearable display, such as a transparent display configured in a smart glass. One existing technique uses separate buttons for zoom in and zoom out (e.g. + and -), another uses “pinch and zoom” on a tactile display (e.g. mobile phones), and yet another uses hand gestures (e.g. expand or contract). Each of these techniques has significant limitations when using a device discreetly, quickly and accurately in a real-life scenario. This is especially true for systems with video pass-through, i.e. when the video from a camera is displayed live on a wearable transparent display to help with daily tasks. Separate buttons can be slow for large range changes (a button press for each zoom step) and have limited magnification steps (the smaller the steps, the longer it takes to reach maximum or minimum zoom). In addition, small buttons can be difficult to press for elderly people or those with reduced sensitivity in the fingers. The pinch-and-zoom technique, on the other hand, is only intuitive on a tactile display, where visual feedback of the scale of motion of the fingers is directly related to resizing of the image. Large hand gestures are impractical in crowded situations: they are obvious to those around the user and may be a source of embarrassment, and they are also challenging to detect accurately by mechanical or computer systems in difficult lighting conditions.

[0008] There is therefore a need in the art for a device and a method of operating the same that enables efficient and quick implementation of zooming functionality.

[0009] All publications herein are incorporated by reference to the same extent as if each individual publication or patent application were specifically and individually indicated to be incorporated by reference. Where a definition or use of a term in an incorporated reference is inconsistent or contrary to the definition of that term provided herein, the definition of that term provided herein applies and the definition of that term in the reference does not apply.

SUMMARY

[0010] The present disclosure pertains to a device and method thereof for controlling digital magnification on a portable display, such as that of a smart glass. By smart glass is meant wearable computer glasses (or ‘spectacles’) that provide visual information in a user’s field of view in addition to that which the user is able to view substantially directly (either with or without an intermediate optical element such as a lens, which may be substantially transparent or partially transparent). The provision of visual information in addition to that which the user views substantially directly may be by superimposing information onto the user’s field of view, for example on a display element in the user’s field of view (which display element may be substantially transparent, at least partially transparent or opaque). For example, the display may be a substantially opaque LED or LCD display of the type used in mobile telephones, or a partially transparent display. In some embodiments an image may be projected onto the display from a light source in the form of a projector device, for example of the type used in head-up display (HUD) systems or augmented reality (AR) overlays, and the reflected image viewed by the user.

[0011] Embodiments of the present invention may be understood by reference to the appended claims.

[0012] In an aspect of the invention there is provided a smart glass system arranged to permit a visually impaired user to control magnification of an image provided by the system according to their eye condition, comprising:

a portable computing device comprising a motion sensor;

a smart glass based wearable device comprising a display portion, the display portion being provided in a field of view of the user; and

an image capture device, wherein the portable computing device is operatively coupled with the smart glass based wearable device,

the system being configured to display on the display portion an image corresponding to at least a portion of an image captured by the image capture device,

wherein the system is configured to detect rotational movement of the portable computing device in the hand of the user by means of the motion sensor, wherein magnification of the image displayed on the wearable device is controlled based on the rotational movement of the portable computing device.

[0013] This has the advantage that a user may adjust the image provided to them by the system in order to enhance their ability to view a scene. The user may endeavour to optimise the magnification of a scene as viewed using the smart glass based wearable device. Optionally, the system may be configured wherein the user may adjust the magnification in substantially real time. The system may be configured to permit optimisation of the magnification in substantially real time, the magnification being changed dynamically as the user manipulates the portable computing device in their hand.

[0014] The image capture device may comprise a video image capture device. The image capture device may comprise at least one CMOS image capture device and/or at least one CCD image capture device. Other image capture devices may be useful. The wearable device may generate a substantially real time stream of images captured by the image capture device.

[0015] Optionally, the image capture device is configured to capture a scene having at least a portion in a field of view of a person wearing the wearable device, the system being configured to display on the display portion of the wearable device at least a portion of the scene captured by the image capture device corresponding to the location of the display in the field of view of the person wearing the wearable device. In some embodiments a zoomed image (or a 1:1 scale image with no magnification or demagnification with respect to the remainder of the user’s field of view) may appear inset in the user’s field of view, e.g. to one side or corner of the field of view, or in a portion away from an edge of the field of view. It is to be understood that by field of view is meant the view in a direction a user is facing.

[0016] It is to be understood that reference to the “field of view of a person wearing the wearable device” is to be understood to be with respect to a person wearing the wearable device such that the display portion is in their field of view, optionally their field of view when looking substantially directly ahead, optionally their field of view with their eyes directed in a prescribed direction, the direction being any prescribed direction from directly upwards (a ‘12 o’clock’ direction), directly downwards (a ‘6 o’clock’ direction) or any prescribed direction from 12 o’clock clockwise around to 12 o’clock.

[0017] Optionally, the motion sensor comprises a gyroscope that is positioned in a manner such that upon actuation of said portable computing device, orientation of the portable computing device is determined, and upon hand rotation of said portable computing device, magnification of the image displayed on the wearable device is controlled based on rotational velocity computed using said gyroscope.

[0018] Optionally, the motion sensor comprises an accelerometer and is arranged to transmit magnitude of linear acceleration of the portable computing device relative to gravity.

[0019] Optionally, the motion sensor comprises a magnetometer that determines and transmits instantaneous orientation of the portable computing device relative to Earth’s magnetic field.

[0020] Optionally, respective outputs from said gyroscope, said accelerometer, and said magnetometer are fused to yield the orientation and motion of the portable computing device in any direction.

[0021] Optionally, the portable computing device is actuated by pressing of a button present in/on said computing device, wherein the magnification is controlled only during the time the button is kept pressed.

[0022] Optionally, clockwise hand rotation results in zooming in of the image, and anti-clockwise hand rotation results in zooming out of the image. Alternatively, clockwise hand rotation results in zooming out of the image, and anti-clockwise hand rotation results in zooming in of the image.

[0023] Optionally, the extent of hand rotation is proportional to the extent of magnification.

[0024] Optionally, the orientation of the portable computing device is determined based on fusion of positional data from one or more components of the motion sensor, the one or more components comprising at least an accelerometer.

[0025] Optionally, when the magnification operation is paused, user of said wearable device can view said magnified image and/or pan around said magnified image in X and Y axis and/or scroll around said magnified image.

[0026] Optionally, an absolute position of said portable computing device is configured to be indicative of level of zoom.

[0027] Optionally, the magnification operation comprises any or a combination of stepped zooming or dynamic zooming.

[0028] Optionally, the wearable device comprises the image capture device.

[0029] The image capture device may be an integral part of the wearable device. The image capture device may be arranged to be sufficiently forward facing so as to include at least a portion of a user’s field of view. The image capture device may be arranged to be substantially fully forward facing.

[0030] Optionally, the image capture device is provided external to the wearable device, the image capture device being operatively coupled to the wearable device.

[0031] Optionally, the image capture device is operatively coupled to the wearable device by means of a wireless connection or a wired connection.

[0032] Optionally, the motion sensor is provided by an inertial measurement unit (IMU).

[0033] The IMU may comprise a gyroscope, an accelerometer and/or a magnetometer as described above.

[0034] In a further aspect of the invention there is provided a method of enabling a visually impaired user to control magnification operation of an image being displayed in a smart glass based wearable device by means of a portable computing device, said method comprising the step of:

detecting rotational movement of the portable computing device in the hand of a user by means of a motion sensor comprised by the portable computing device, the portable computing device being operatively coupled to the wearable device;

capturing by means of the image capture device an image of a scene;

displaying on the display portion an image corresponding to at least a portion of the image captured by the image capture device,

the method comprising controlling magnification of the image displayed on the wearable device based on the rotational movement of the portable computing device.

[0035] Optionally, the method comprises capturing by means of the image capture device a scene having at least a portion in a field of view of a person wearing the wearable device, the method comprising displaying on the display portion of the wearable device at least a portion of the scene captured by the image capture device.

[0036] Thus, the person wearing the wearable device will see the image captured by the image capture device within their field of view.

[0037] Optionally, displaying on the display portion of the wearable device at least a portion of the scene comprises displaying on the display portion at least a portion of the scene corresponding to the location of the display in the field of view of the person wearing the wearable device.

[0038] Optionally, the image displayed by the display portion occupies a portion and not the whole of the field of view of the user. Optionally, at a magnification corresponding to unity, i.e. at substantially no magnification, the image displayed is substantially continuous with a remainder of a field of view of the user such that the image displayed appears to be superimposed upon or part of the remainder of the field of view. It is to be understood that the display portion may be at least partially transparent, allowing the user to see objects in the portion of the field of view occupied by the display portion through the display portion as well as information displayed on the display portion by the system.

[0039] Optionally, the motion sensor comprises a gyroscope sensor, the method comprising: receiving from the gyroscope sensor a change in gyroscope value indicative of extent of hand rotation of said portable computing device;

determining, at the portable computing device, using an accelerometer configured in the portable computing device, orientation of the portable computing device; and

generating, from the portable computing device, a magnification control signal to said smart glass based wearable device based on the change in gyroscope value and the determined orientation.

[0040] Optionally, said method further comprises the step of:

receiving new gyroscope values as part of the change in gyroscope value;

smoothing, using a sliding window filter, the received new gyroscope values;

normalizing said smoothened gyroscope values; and

accumulating the normalized gyroscope values to indicate the extent of hand rotation of said portable computing device.

[0041] Optionally, during zoom-out of the image being displayed in the smart glass based wearable device, the magnification control signal increases the center crop of the image, the crop being enlarged to fill the display configured in the wearable device.

[0042] Optionally, during zoom-in of the image being displayed in the smart glass based wearable device, the magnification control signal reduces the center crop of the image, the crop being enlarged to fill the display configured in the wearable device.

[0043] In an aspect of the invention there is provided a portable computing device for use with a smart glass system arranged to permit a visually impaired user to control magnification of an image provided by the system according to their eye condition, the portable computing device comprising a motion sensor; the portable computing device being arranged to be operatively coupled with a smart glass based wearable device of the system, the smart glass based wearable device comprising a display portion, the display portion being provided in a field of view of the user, the smart glass system further comprising an image capture device, the system being configured to display on the display portion of the wearable device an image corresponding to at least a portion of an image captured by the image capture device, wherein the system is configured to detect rotational movement of the portable computing device in the hand of the user by means of the motion sensor, wherein magnification of the image displayed on the wearable device is controlled based on the rotational movement of the portable computing device.

[0044] In an aspect of the invention there is provided a smart glass based wearable device arranged to be operatively coupled with the portable computing device of the preceding aspect, the system being arranged to permit a visually impaired user to control magnification of an image provided by the system according to their eye condition.

[0045] In an aspect, the present disclosure relates to a portable computing device (interchangeably referred to as hand-held device or zoom-controlling computing device hereinafter) that is operatively coupled with a smart glass based wearable device (interchangeably referred to as portable display), said portable computing device comprising an inertial measurement unit (IMU) having a gyroscope that is positioned in a manner such that upon actuation of said portable computing device, orientation of the portable computing device is determined, and upon hand rotation of said portable computing device, magnification of the image displayed on the wearable device is controlled based on rotational velocity computed using said gyroscope.

[0046] In an aspect, the IMU can further include an accelerometer to transmit magnitude of linear acceleration of the portable computing device relative to gravity. The IMU can further include a magnetometer that can determine and transmit instantaneous orientation of the portable computing device relative to Earth’s magnetic field. In an aspect, respective outputs from the gyroscope, the accelerometer, and the magnetometer are fused to yield the orientation and motion of the portable computing device in any direction.

[0047] In an aspect, the portable computing device can be actuated by pressing of a button present in/on said computing device, wherein the magnification is controlled only during the time the button is kept pressed.

[0048] In another aspect, clockwise hand rotation can result in zooming in of the image, and anti-clockwise hand rotation results in zooming out of the image.

[0049] In yet another aspect, the extent of hand rotation can be proportional to the extent of magnification.

[0050] In another aspect, the orientation of the portable computing device can be determined based on fusion of positional data from one or more components of the IMU, said one or more components comprising at least an accelerometer.

[0051] In another aspect, when the magnification operation is paused, user of said wearable device can view said magnified image and/or pan around said magnified image in X and Y axis and/or scroll around said magnified image.

[0052] In another aspect, an absolute position of said portable computing device can be configured to be indicative of level of zoom.

[0053] In yet another aspect, the magnification operation can include any or a combination of stepped zooming or dynamic zooming.

[0054] The present disclosure further relates to a method of controlling magnification operation on an image being displayed in a smart glass based wearable device by a portable computing device, said method comprising the step of: receiving, at the portable computing device, from a gyroscope sensor configured in the portable computing device, a change in gyroscope value indicative of extent of hand rotation of said portable computing device; determining, at the portable computing device, using an accelerometer configured in the portable computing device, orientation of the portable computing device; and generating, from the portable computing device, a magnification control signal to said glass based wearable device based on the change in gyroscope value and the determined orientation.

[0055] In an aspect, the method can further include steps of receiving new gyroscope values as part of the change in gyroscope value; smoothing, using a sliding window filter, the received new gyroscope values; normalizing said smoothened gyroscope values; and accumulating the normalized gyroscope values to indicate the extent of hand rotation of said portable computing device.

[0056] In another aspect, during zoom-out of the image being displayed in the smart glass based wearable device, the magnification control signal increases the center crop of the image, the crop being enlarged to fill the display configured in the wearable device. Similarly, during zoom-in of the image being displayed in the smart glass based wearable device, the magnification control signal reduces the center crop of the image, the crop being enlarged to fill the display configured in the wearable device.

[0057] Some embodiments of the present invention provide a portable computing device that is operatively coupled with a smart glass based wearable device. In an aspect, the portable computing device can include an inertial measurement unit (IMU) having a gyroscope that is positioned in a manner such that upon actuation of the portable computing device, orientation of the portable computing device is determined, and upon hand rotation of the portable computing device, magnification of image displayed on the wearable device is controlled based on rotational velocity computed using the gyroscope.

BRIEF DESCRIPTION OF DRAWINGS

[0058] FIGs. 1 and 2 illustrate exemplary representations of the proposed device in accordance with an embodiment of the present disclosure.

[0059] FIG. 3 illustrates an exemplary flow diagram to enable zooming of captured videos/images in accordance with an embodiment of the present disclosure.

[0060] FIG. 4 illustrates an exemplary flow diagram to enable head pan operation during pause action during magnification in accordance with an embodiment of the present disclosure.

DETAILED DESCRIPTION OF DRAWINGS

[0061] Embodiments of the present disclosure include various steps, which will be described below. The steps may be performed by hardware components or may be embodied in machine-executable instructions, which may be used to cause a general-purpose or special-purpose processor programmed with the instructions to perform the steps. Alternatively, steps may be performed by a combination of hardware, software, firmware and/or by human operators.

[0062] Embodiments of the present disclosure may be provided as a computer program product, which may include a machine-readable storage medium tangibly embodying thereon instructions, which may be used to program a computer (or other electronic devices) to perform a process. The machine-readable medium may include, but is not limited to, fixed (hard) drives, magnetic tape, floppy diskettes, optical disks, compact disc read-only memories (CD-ROMs), and magneto-optical disks, semiconductor memories, such as ROMs, PROMs, random access memories (RAMs), programmable read-only memories (PROMs), erasable PROMs (EPROMs), electrically erasable PROMs (EEPROMs), flash memory, magnetic or optical cards, or other type of media/machine-readable medium suitable for storing electronic instructions (e.g., computer programming code, such as software or firmware).

[0063] Various methods described herein may be practiced by combining one or more machine-readable storage media containing the code according to the present disclosure with appropriate standard computer hardware to execute the code contained therein. An apparatus for practicing various embodiments of the present disclosure may involve one or more computers (or one or more processors within a single computer) and storage systems containing or having network access to computer program(s) coded in accordance with various methods described herein, and the method steps of the disclosure could be accomplished by modules, routines, subroutines, or subparts of a computer program product.

[0064] If the specification states a component or feature “may”, “can”, “could”, or “might” be included or have a characteristic, that particular component or feature is not required to be included or have the characteristic.

[0065] Arrangements and embodiments may now be described more fully with reference to the accompanying drawings, in which exemplary embodiments may be shown. Embodiments may, however, be embodied in many different forms and should not be construed as being limited to embodiments set forth herein; rather, embodiments may be provided so that this disclosure will be thorough and complete, and will fully convey the concept to those skilled in the art.

[0066] The suffixes ‘module’, ‘unit’ and ‘part’ may be used for elements in order to facilitate the disclosure. Significant meanings or roles may not be given to the suffixes themselves and it is understood that ‘module’, ‘unit’ and ‘part’ may be used together or interchangeably.

[0067] In an aspect, the present disclosure relates to a device (also interchangeably referred to as apparatus or system) that permits rapid, accurate, and finely tuned zooming of a live image with a simple physical behaviour that is reminiscent of common actions, such as that used when one “turns up the volume” on a traditional sound system. By developing a control based on this common action, the proposed device and associated mechanism/behaviour improves on the current state of the art.

[0068] In an aspect, the present disclosure provides an electronic system that contains an inertial measurement unit (IMU) having a gyroscope and an accelerometer (it may also alternatively or additionally include a magnetometer), wherein during implementation/operation, a user can press and hold a button on the proposed device and then rotate his/her hand as if controlling the volume knob on an audio device. Rotations to the right cause the image to be magnified, whereas rotations to the left cause the image to be minified. These directions can of course, if desired, be reversed, and all such variations are well within the scope of the present invention. In an aspect, the degree of rotation determines the degree to which the image is resized. A small rotation makes a small change to the image, whereas a larger rotation causes the image to be changed by a greater magnitude.

[0069] In an exemplary aspect, the proposed system can be activated by a button press, wherein at a first press, the orientation of the portable device/handset can be calculated. In an exemplary implementation, the orientation can be determined based on fusion of positional data from one or more components of the IMU. The accelerometer can then give a real-time value of the device/handset’s position relative to gravity. The accelerometer can also be configured to give/transmit/send the magnitude of any linear acceleration in three dimensions. The gyroscope, on the other hand, can be configured to indicate instantaneous rotational velocity in three dimensions. The magnetometer can be configured to determine and transmit instantaneous orientation of the proposed device/handset relative to Earth’s magnetic field (i.e. a compass). These three sources of data can be combined, or “fused”, to give the orientation and motion of the handset in any direction. This data fusion can be derived through any number of well-known algorithms, such as a Kalman filter.
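By way of illustration only, the following sketch (in Python, with hypothetical class and parameter names that are not taken from this disclosure) shows one simple way such a fusion could be realised, here using a complementary filter as a lightweight stand-in for the Kalman-style fusion mentioned above; any suitable fusion algorithm could be substituted.

import math

class OrientationEstimator:
    """Minimal complementary-filter fusion of gyroscope, accelerometer and
    magnetometer readings (an illustrative sketch, not the disclosed method).
    Angles are in radians; dt is the sample period in seconds."""

    def __init__(self, alpha=0.98):
        self.alpha = alpha   # weight given to the integrated gyroscope signal
        self.roll = 0.0      # rotation about the long axis of the handset
        self.pitch = 0.0
        self.yaw = 0.0       # heading, taken from the magnetometer

    def update(self, gyro, accel, mag, dt):
        gx, gy, gz = gyro    # rotational velocities from the gyroscope (rad/s)
        ax, ay, az = accel   # linear acceleration including gravity (m/s^2)
        mx, my, _ = mag      # magnetic field components from the magnetometer

        # The accelerometer gives an absolute (but noisy) attitude relative to gravity.
        accel_roll = math.atan2(ay, az)
        accel_pitch = math.atan2(-ax, math.hypot(ay, az))

        # The gyroscope gives a smooth (but drifting) change in attitude; the
        # complementary filter blends the two sources.
        self.roll = self.alpha * (self.roll + gx * dt) + (1 - self.alpha) * accel_roll
        self.pitch = self.alpha * (self.pitch + gy * dt) + (1 - self.alpha) * accel_pitch

        # The magnetometer pins the heading to Earth's magnetic field (a compass);
        # this simple form is not tilt-compensated.
        self.yaw = math.atan2(-my, mx)
        return self.roll, self.pitch, self.yaw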

[0070] As a next step, once the button is pressed, the initial orientation of the handset is set as zero. Any rotation about a defined axis of the handset can be interpreted as an increase or a decrease of the level of zoom. In an exemplary embodiment, the axis of rotation can be defined to be along the length of the device, which is the same axis as the wrist. A clockwise roll increases zoom, and an anticlockwise roll decreases zoom.

[0071] In an exemplary aspect, zoom control can be active only while the button is depressed. When the button is released, the level of zoom can either return to its previous pre-activation level, or it can remain at the new zoom level.
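The button-gated behaviour described in paragraphs [0070] and [0071] could be sketched as follows (illustrative Python only; the class name, the zoom gain, the zoom limits, and the assumption that a clockwise roll produces a positive change in the roll reading are all assumptions made for the example):

class RollZoomControl:
    """Illustrative sketch of button-gated roll-to-zoom control (not the
    disclosed implementation). Roll is the rotation about the long axis of
    the handset, in degrees."""

    def __init__(self, gain=0.05, min_zoom=1.0, max_zoom=8.0, revert_on_release=False):
        self.gain = gain                          # zoom-level change per degree of roll
        self.min_zoom, self.max_zoom = min_zoom, max_zoom
        self.revert_on_release = revert_on_release
        self.zoom = 1.0
        self._reference_roll = None               # captured when the button is pressed
        self._zoom_at_press = 1.0

    def button_down(self, current_roll):
        # On the first press, capture the current orientation as the zero reference.
        self._reference_roll = current_roll
        self._zoom_at_press = self.zoom

    def update(self, current_roll):
        # Zoom control is active only while the button is held; a clockwise roll
        # (assumed to give a positive delta) zooms in, an anticlockwise roll zooms out.
        if self._reference_roll is None:
            return self.zoom
        delta = current_roll - self._reference_roll
        self.zoom = min(self.max_zoom,
                        max(self.min_zoom, self._zoom_at_press + delta * self.gain))
        return self.zoom

    def button_up(self):
        # On release the zoom either reverts to the pre-activation level or stays.
        if self.revert_on_release:
            self.zoom = self._zoom_at_press
        self._reference_roll = None
        return self.zoom

Adjusting the gain in this sketch corresponds to the ratio of zoom level per degree of rotation discussed in paragraph [0072] below.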

[0072] In an aspect, the ratio of zoom level per degree of rotation can be adjusted to allow a greater degree of zoom accuracy (e.g. a larger rotation equates to small zoom steps), or lower fidelity and greater speed (e.g. a smaller rotation equates to large zoom steps).

[0073] As would be appreciated, the proposed device allows the speed of zoom to be very fast. For instance, turning the wrist 45 degrees could be sufficient to achieve maximum zoom. In existing solutions/techniques, on the other hand, a button would have to be pressed many times to achieve the same effect. The proposed zoom-enabling product offers the user a much finer resolution of zoom control than individually set zoom steps using buttons. This can allow a person to magnify objects so that they are very easy to see. This is particularly important for people with impaired vision; for instance, it may be necessary to magnify text to a particular size in order to see it. Buttons would have to have predefined steps, and hence it might be challenging to find the right level that way. Additionally, using the proposed rotation product/technique can allow the zoom level to move smoothly from one level to another, which is more comfortable and less disorienting than if the user experiences large disconnected and discrete jumps from one zoom level to another.

[0074] In an exemplary aspect, it would be appreciated that the zoom level steps can be configured for different eye conditions.

[0075] Furthermore, the proposed device of the present disclosure can also enable functions/operations of any or a combination of pause, zoom and pan. The proposed device can include an architecture built on top of the rotational zoom control wherein, when the button is pressed, a live image is captured and paused and the user can employ the rotational zoom control as described above. The pause operation/function can allow a person with severely impaired sight to spend a longer time viewing the scene. This is necessary as sight-impaired people often require a longer time to see certain objects, for example text in the environment, such as train times. When the user zooms in to the paused image, he/she can then move his/her head to pan around the image in X and Y. The headset, such as a smart glass that includes a camera and a display, can also include an IMU that is similar to the one in the proposed handheld device. Rotations or translations of the head can be converted into pan signals in X and Y, and through this method, an individual is able to scroll around the image, as well as zoom in and out.

[0076] In a different implementation, the absolute position of the handset/smart glass can be taken as indicating the level of zoom. Hence, a person can learn that a rotation to, say, 2 o’clock will increase the magnitude of zoom to 2x.
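As a purely illustrative sketch of this absolute-position variant (the 60-degrees-per-unit-of-zoom figure is an assumption chosen so that a rotation to roughly the 2 o'clock position yields about 2x; the function name and limits are likewise hypothetical):

def zoom_from_absolute_roll(roll_degrees, degrees_per_unit_zoom=60.0,
                            min_zoom=1.0, max_zoom=8.0):
    # Map the handset's absolute roll angle (measured from the 12 o'clock
    # reference) directly to a zoom level, clamped to the supported range.
    zoom = 1.0 + roll_degrees / degrees_per_unit_zoom
    return min(max_zoom, max(min_zoom, zoom))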

[0077] Aspects of the present disclosure can also enable dynamic zoom (continuous rather than stepped). Delivered quickly and intuitively, this means that the operator experiences a shorter feedback loop, which avoids disorientation, allows fine tuning (particularly useful when reading text), and enables quick correction (zooming in, out and then relocating the zoom) to find the object being searched for.

[0078] FIGs. 1 and 2 illustrate exemplary representations of the proposed device in accordance with an embodiment of the present disclosure, wherein the device can either be physically connected to the smart glass/wearable device, or can be wirelessly coupled through Bluetooth, or can be mounted onto the frame of the smart glass/wearable device, or take any other configuration, all of which are well within the scope of the present disclosure.

[0079] In the embodiment of FIG. 1 (a) a system 10 is provided with a smart glass based wearable device 160 that has a display screen 162 and an image capture device in the form of a video camera 163. FIG. 1 (b) shows a corresponding embodiment in which the image capture device of the system 10A is not provided integral to the wearable device 160. Rather, it may be coupled to the device 160 via a wireless connection. In some embodiments the image capture device may be coupled by means of a wired connection in addition or instead. In the embodiments of FIG. 1 (a) and (b) the display screen 162 is a transparent waveguide with diffractive optics arranged to direct an image from an organic light emitting diode (OLED) micro display into the user’s eye. Other arrangements may be useful, such as a transparent waveguide with a beamsplitter instead of diffractive optics. Other displays may be useful, such as liquid crystal on silicon (LCOS) displays or liquid crystal displays (LCDs). In some embodiments, an opaque display including a high resolution OLED panel and one or more optical elements such as a biconvex or Fresnel lens arrangement may be employed to direct the image into the user’s eye. In the embodiment of FIG. 1(a) the video camera 163 is a CMOS (complementary metal oxide semiconductor) camera but other cameras may be useful in some embodiments.

[0080] As mentioned above, the present disclosure provides an electronic system/product 100 that can include an inertial measurement unit (IMU) 102 having a gyroscope 104 and an accelerometer 106 (it may also alternatively or additionally include a magnetometer 108), wherein during implementation/operation, a user can press and hold a button 150 on the proposed device 100, and then rotate his/her hand as if controlling the dial on a volume control. Rotations to the right cause the image presented on the display 162 of the coupled smart glass/wearable device 160 to be magnified, whereas rotations to the left cause the image to be minified. These directions can of course, if desired, be reversed, and all such variations are well within the scope of the present invention. In an aspect, the degree of rotation dictates the amount by which the image is resized. A small rotation makes a small change to the image, whereas a larger rotation causes the image to be changed by a greater magnitude. The system/product 100 in combination with the smart glass/wearable device 160 may be referred to as a smart glass system or smart glasses system or ‘system’ 10, 10A.

[0081] During implementation, once the button 150 is pressed, the initial orientation of the handset 100 is set as zero. Any rotation about a defined axis of the handset can be interpreted as an increase or a decrease of the level of zoom. In an exemplary embodiment, the axis of rotation (see FIG. 2) can be defined to be along the length of the device 100, which is the same axis as the wrist. A clockwise roll increases zoom, and an anticlockwise roll decreases zoom.

[0082] In an exemplary aspect, zoom control can be active only while the button 150 is depressed. When the button 150 is released, the level of zoom can either return to its previous pre-activation level, or it can remain at the new zoom level.

[0083] In an aspect, the ratio of zoom level per degree of rotation can be adjusted to allow a greater degree of zoom accuracy (e.g. a larger rotation equates to small zoom steps), or lower fidelity and greater speed (e.g. a smaller rotation equates to large zoom steps).

[0084] FIG. 3 illustrates an exemplary flow diagram to enable zooming of captured videos/images in accordance with an embodiment of the present disclosure. As can be seen, at step 302, a smart glass/wearable device/portable display can be operatively coupled with the proposed zoom-enabling computing device (also simply referred to as hand-held device or computing device or portable device), wherein the computing/hand-held device can listen to a sensor that is configured therein, based on which, at step 304, the zoom-enabling computing device can receive a change in y-axis gyroscope value (from a gyroscope sensor), the output of which can be smoothened, for instance, with a sliding window filter, at step 306. It would be appreciated that a part of these steps can be performed in the smart glass/wearable device/portable display as well, or in any desired combination of the smart glass/wearable device and the zoom-enabling computing device, all of which possible combinations are therefore well within the scope of the present invention.

[0085] At step 308, the output of gyroscope values can be normalized, followed by, at step 310, accumulation of the gyro values. At step 312, the zoom-enabling computing device can be rotated in a defined direction (clockwise or anti-clockwise) so as to, at step 312, increase the center crop image and perform a zoom-out operation, enlarging the crop to fill the display of the wearable device, and, at step 314, reduce the center crop image so as to zoom in, again enlarging the crop to fill the display. At step 316, a key up instruction can be received by the zoom-enabling computing device, based on which, at step 318, zoom values can return to the original setting.

[0086] FIG. 4 illustrates an exemplary flow diagram 400 to enable head pan operation during a pause action during magnification in accordance with an embodiment of the present disclosure. As can be seen, at step 402, during the magnification/zooming operation, when the operation is paused through key down on the button of the portable computing device, at step 404, the portable computing device (handset) can listen to the sensor that is configured in the (zoom-enabling) portable computing device, based on which, at step 406, the zoom-enabling computing device receives a change in y-axis gyroscope value, the output of which is smoothed with a sliding window filter at step 408. At step 410, the output can be normalized, followed by, at step 412, accumulation of the gyro values. At step 414, the zoom-enabling computing device can be rotated in a defined direction (clockwise or anti-clockwise) so as to, at step 414, increase the center crop image so as to zoom out and enlarge the crop to fill the screen, and, at step 416, reduce the center crop image so as to zoom in and enlarge the crop to fill the screen. At step 418, a key up instruction is received by the zoom-enabling computing device, based on which, at step 420, zoom values can return to the original setting.
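Purely by way of illustration, the handset-side processing described for FIGs. 3 and 4 (smoothing new gyroscope values with a sliding-window filter, normalising and accumulating them, and adjusting the center crop accordingly) could be sketched as follows; the window size, normalisation constant and zoom gain are assumptions for the example and are not taken from this disclosure.

from collections import deque

class GyroZoomPipeline:
    """Illustrative sketch of the handset-side gyro-to-crop pipeline (not the
    disclosed implementation)."""

    def __init__(self, window=5, full_scale=250.0, gain=2.0):
        self.samples = deque(maxlen=window)  # sliding window of recent y-axis gyro values
        self.full_scale = full_scale         # reading treated as a "full" rotation rate
        self.gain = gain                     # strength of accumulated rotation -> zoom mapping
        self.accumulated = 0.0

    def push(self, gyro_y):
        # Smooth the incoming value with a moving average over the sliding window.
        self.samples.append(gyro_y)
        smoothed = sum(self.samples) / len(self.samples)
        # Normalise to roughly [-1, 1] and accumulate to track total hand rotation.
        self.accumulated += smoothed / self.full_scale
        return self.accumulated

    def crop_rect(self, frame_w, frame_h, min_zoom=1.0, max_zoom=8.0):
        # Zooming in reduces the center crop (which is then enlarged to fill the
        # display); zooming out increases it, up to the full camera frame.
        zoom = min(max_zoom, max(min_zoom, 1.0 + self.gain * self.accumulated))
        crop_w, crop_h = int(frame_w / zoom), int(frame_h / zoom)
        x0 = (frame_w - crop_w) // 2
        y0 = (frame_h - crop_h) // 2
        return x0, y0, crop_w, crop_h

    def reset(self):
        # On key up (steps 318/420) the zoom can return to its original setting.
        self.accumulated = 0.0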

[0087] At the headset (wearable device/smart glass), on the other hand, at step 450, the headset waits for a signal from the sensor, based on which, at step 452, changes to the X and Y axis gyroscope values are received; at step 454, the received gyroscope values are smoothed; and at step 456, the X and Y gyro values are mapped into 2D X and Y pan values, after which, at step 458, the enlarged image can be translocated across the view window.
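Again purely by way of illustration, the headset-side flow at steps 450 to 458 (smoothing the X and Y gyroscope changes and mapping them to 2D pan values that translocate the enlarged image) could be sketched as follows; the smoothing factor, the pan gain and the axis-to-pan mapping are assumptions made for the example.

def _clamp(value, lo, hi):
    # Keep a pan offset within the valid range.
    return max(lo, min(hi, value))

class HeadPanControl:
    """Illustrative sketch of head-rotation-to-pan mapping on the headset
    (not the disclosed implementation)."""

    def __init__(self, gain=4.0, smoothing=0.3):
        self.gain = gain            # pixels of pan per unit of smoothed gyro signal
        self.smoothing = smoothing  # exponential smoothing factor for the gyro input
        self._sx = self._sy = 0.0   # smoothed X and Y gyro values
        self.pan_x = self.pan_y = 0.0

    def update(self, gyro_x, gyro_y, crop_w, crop_h, frame_w, frame_h):
        # Smooth the raw gyro changes so small head tremors do not jitter the view.
        self._sx += self.smoothing * (gyro_x - self._sx)
        self._sy += self.smoothing * (gyro_y - self._sy)
        # Map head rotation to X/Y pan offsets and keep the crop inside the frame
        # (here yaw is assumed to drive horizontal pan and pitch vertical pan).
        self.pan_x = _clamp(self.pan_x + self.gain * self._sy, 0, frame_w - crop_w)
        self.pan_y = _clamp(self.pan_y + self.gain * self._sx, 0, frame_h - crop_h)
        return int(self.pan_x), int(self.pan_y)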

[0088] Some embodiments of the present invention may be understood by reference to the following numbered clauses:

1. A portable computing device operatively coupled with a smart glass based wearable device, said portable computing device comprising an inertial measurement unit (IMU) having a gyroscope that is positioned in a manner such that upon actuation of said portable computing device, orientation of the portable computing device is determined, and upon hand rotation of said portable computing device, magnification of image displayed on the wearable device is controlled based on rotational velocity computed using said gyroscope.

2. The portable computing device as described in clause 1, wherein said IMU further comprises an accelerometer to transmit magnitude of linear acceleration of the portable computing device relative to gravity.

3. The portable computing device as described in clause 2, wherein said IMU further comprises a magnetometer that determines and transmits instantaneous orientation of the portable computing device relative to Earth’s magnetic field.

4. The portable computing device as described in clause 3, wherein respective outputs from said gyroscope, said accelerometer, and said magnetometer are fused to yield the orientation and motion of the portable computing device in any direction.

5. The portable computing device as described in clause 1, wherein said portable computing device is actuated by pressing of a button present in/on said computing device, wherein the magnification is controlled only during the time the button is kept pressed.

6. The portable computing device as described in clause 1, wherein clockwise hand rotation results in zooming in of the image, and anti-clockwise hand rotation results in zooming out of the image.

7. The portable computing device as described in clause 1, wherein the extent of hand rotation is proportional to the extent of magnification.

8. The portable computing device as described in clause 1, wherein the orientation of the portable computing device is determined based on fusion of positional data from one or more components of the IMU, said one or more components comprising at least an accelerometer.

9. The portable computing device as described in clause 1, wherein when the magnification operation is paused, user of said wearable device can view said magnified image and/or pan around said magnified image in X and Y axis and/or scroll around said magnified image.

10. The portable computing device as described in clause 1, wherein an absolute position of said portable computing device is configured to be indicative of level of zoom.

11. The portable computing device as described in clause 1, wherein the magnification operation comprises any or a combination of stepped zooming or dynamic zooming.

12. A method of controlling magnification operation on an image being displayed in a smart glass based wearable device by a portable computing device, said method comprising the step of: receiving, at the portable computing device, from a gyroscope sensor configured in the portable computing device, a change in gyroscope value indicative of extent of hand rotation of said portable computing device;

determining, at the portable computing device, using an accelerometer configured in the portable computing device, orientation of the portable computing device; and

generating, from the portable computing device, a magnification control signal to said glass based wearable device based on the change in gyroscope value and the determined orientation.

13. The method of clause 12, wherein said method further comprises the step of:

receiving new gyroscope values as part of the change in gyroscope value;

smoothing, using a sliding window filter, the received new gyroscope values;

normalizing said smoothened gyroscope values; and

accumulating the normalized gyroscope values to indicate the extent of hand rotation of said portable computing device.

14. The method of clause 12, wherein, during zoom-out of the image being displayed in the smart glass based wearable device, the magnification control signal increases center crop image so as to enlarge display configured in the wearable device.

15. The method of clause 12, wherein, during zoom-in of the image being displayed in the smart glass based wearable device, the magnification control signal reduces center crop image so as to enlarge display configured in the wearable device.

[0089] As used herein, and unless the context dictates otherwise, the term “coupled to” is intended to include both direct coupling, in which two elements that are coupled to each other contact each other, and indirect coupling, in which at least one additional element is located between the two elements. Therefore, the terms “coupled to” and “coupled with” are used synonymously. Within the context of this document the terms “coupled to” and “coupled with” are also used euphemistically to mean “communicatively coupled with” over a network, where two or more devices are able to exchange data with each other over the network, possibly via one or more intermediary devices.

[0090] It should be apparent to those skilled in the art that many more modifications besides those already described are possible without departing from the inventive concepts herein. The inventive subject matter, therefore, is not to be restricted except in the spirit of the appended claims. Moreover, in interpreting both the specification and the claims, all terms should be interpreted in the broadest possible manner consistent with the context. In particular, the terms “comprises” and “comprising” should be interpreted as referring to elements, components, or steps in a non-exclusive manner, indicating that the referenced elements, components, or steps may be present, or utilized, or combined with other elements, components, or steps that are not expressly referenced. Where the specification claims refer to at least one of something selected from the group consisting of A, B, C ... and N, the text should be interpreted as requiring only one element from the group, not A plus N, or B plus N, etc. The foregoing description of the specific embodiments will so fully reveal the general nature of the embodiments herein that others can, by applying current knowledge, readily modify and/or adapt for various applications such specific embodiments without departing from the generic concept, and, therefore, such adaptations and modifications should and are intended to be comprehended within the meaning and range of equivalents of the disclosed embodiments. It is to be understood that the phraseology or terminology employed herein is for the purpose of description and not of limitation. Therefore, while the embodiments herein have been described in terms of preferred embodiments, those skilled in the art will recognize that the embodiments herein can be practiced with modification within the spirit and scope of the appended claims.

[0091] While various embodiments of the present disclosure have been illustrated and described herein, it will be clear that the disclosure is not limited to these embodiments only. Numerous modifications, changes, variations, substitutions, and equivalents will be apparent to those skilled in the art, without departing from the spirit and scope of the disclosure, as described in the claims.