

Title:
SYSTEM AND METHOD FOR MODIFYING IMAGE ENHANCEMENT PARAMETERS FOR A PORTABLE DISPLAY
Document Type and Number:
WIPO Patent Application WO/2019/239161
Kind Code:
A1
Abstract:
A smart glass system (10) arranged to permit a visually impaired user to modify one or more control parameters of an image provided by the system (10) according to their eye condition, comprising: a portable computing device (100) comprising a motion sensor; a smart glass based wearable device (160) comprising a display portion (162), the display portion being provided in a field of view of the user; and an image capture device (163), wherein the portable computing device (100) is operatively coupled with the smart glass based wearable device (160), the system (10) being configured to display on the display portion (162) an image corresponding to at least a portion of an image captured by the image capture device (163), wherein the system (10) is configured to detect rotational movement of the portable computing device (100) in the hand of the user by means of the motion sensor (102), wherein one or more control parameters of the image that is displayed on the wearable device (160) are modified based on the rotational movement of the portable computing device (100).

Inventors:
HICKS STEPHEN LLOYD FREDERICK (GB)
RUSSELL NOAH AARON (GB)
Application Number:
PCT/GB2019/051687
Publication Date:
December 19, 2019
Filing Date:
June 17, 2019
Assignee:
OXSIGHT LTD (GB)
International Classes:
G06F3/01; G06F3/0484
Foreign References:
US20110221656A12011-09-15
US20170336882A12017-11-23
Other References:
MICHAEL KALLONIATIS; CHARLES LUU, VISUAL ACUITY, Retrieved from the Internet
Attorney, Agent or Firm:
YEADON IP LIMITED (GB)
Claims:
CLAIMS

1. A smart glass system (10) arranged to permit a visually impaired user to modify one or more control parameters of an image provided by the system (10) according to their eye condition, comprising:

a portable computing device (100) comprising a motion sensor;

a smart glass based wearable device (160) comprising a display portion (162), the display portion being provided in a field of view of the user; and

an image capture device (163),

wherein the portable computing device (100) is operatively coupled with the smart glass based wearable device (160),

the system (10) being configured to display on the display portion (162) an image corresponding to at least a portion of an image captured by the image capture device (163), wherein the system (10) is configured to detect rotational movement of the portable computing device (100) in the hand of the user by means of the motion sensor (102), wherein one or more control parameters of the image that is displayed on the wearable device (160) are modified based on the rotational movement of the portable computing device (100).

2. The system of claim 1 wherein the image capture device is configured to capture a scene having at least a portion in a field of view of a person wearing the wearable device, the system being configured to display on the display portion of the wearable device at least a portion of the scene captured by the image capture device corresponding to the location of the display in the field of view of the person wearing the wearable device.

3. The system of claim 1 or claim 2, wherein said one or more control parameters are selected from any or a combination of passthrough of said image, colour or grayscale display of said image, brightness of said image, edge detection/enhancement in said image, contrast of said image, colour enhancement of said image, line thickness in said image, enhancement of text that forms part of said image, lighting that forms part of said image, and white:black ratio in said image.

4. The system of any preceding claim, wherein said device comprises a selection interface that allows user of said wearable device to select a set of control parameters from said one or more control parameters that need to be modified for said image.

5. The system of any preceding claim, wherein the motion sensor comprises a gyroscope that is positioned in a manner such that upon actuation of said portable computing device, orientation of the portable computing device is determined, and upon hand rotation of said portable computing device, one or more control parameters of an image that is displayed on the wearable device are modified based on rotational velocity computed using said gyroscope.

6. The system of any preceding claim, wherein the motion sensor comprises an accelerometer arranged to determine information indicative of linear acceleration of the portable computing device relative to gravity.

7. The system of claim 6 configured to transmit information indicative of the linear acceleration of the portable computing device relative to gravity to the wearable device.

8. The system of any preceding claim, wherein the motion sensor comprises a magnetometer that determines information indicative of instantaneous orientation of the portable computing device relative to Earth’s magnetic field.

9. The system of claim 8 configured to transmit the information indicative of the instantaneous orientation of the portable computing device relative to Earth’s magnetic field to the wearable device.

10. The system of claim 8 or 9 as depending through claim 6 as dependent on claim 5, wherein respective outputs from said gyroscope, said accelerometer, and said magnetometer are fused to yield the orientation and motion of the portable computing device in any direction.

11. The system of any preceding claim, wherein the portable computing device is actuated by pressing of a button present in/on said computing device, wherein the one or more control parameters of the images are modified only during the time the button is kept pressed.

12. The system of any preceding claim, wherein the extent of changes in said one or more control parameters is proportional to the extent of hand rotation.

13. The system of claim 6 or any one of claims 7 to 12 depending through claim 6, wherein the orientation of the portable computing device is determined based on fusion of positional data from one or more components of the motion sensor.

14. The system of any preceding claim, wherein when the control parameter modification operation is paused to generate a modified image, user of said wearable device is enabled to view said modified image and/or pan around said modified image in X and Y axis and/or scroll around said modified image.

15. The system of any preceding claim, wherein an absolute position of said portable computing device is configured to be indicative of level of control parameter.

16. The system of claim 15 as depending through claim 11, wherein when the button is pressed, the system determines the absolute position of said portable computing device and sets the level of the control parameter in dependence on the absolute position.

17. The system of any preceding claim wherein the motion sensor comprises an inertial measurement unit (IMU).

18. The system of any preceding claim wherein the motion sensor is an inertial measurement unit (IMU).

19. A system according to any preceding claim wherein the wearable device comprises the image capture device.

20. A system according to any one of claims 1 to 18 wherein the image capture device is provided external to the wearable device, the image capture device being operatively coupled to the wearable device.

21. A system according to claim 20 wherein the image capture device is operatively coupled to the wearable device by means of a wireless connection or a wired connection.

22. A method of controlling an image displayed on a display of a smart glass based wearable device of a smart glass system arranged to permit a visually impaired user to modify one or more control parameters of the image according to their eye condition, the display being provided in the field of view of a user, comprising:

detecting rotational movement of a portable computing device of the system in the hand of a user by means of a motion sensor comprised by the device, the portable computing device being operatively coupled to the wearable device;

capturing by means of an image capture device of the system an image of a scene;

displaying on the display portion an image corresponding to at least a portion of the image captured by the image capture device,

the method comprising modifying one or more control parameters of the image displayed on the display portion based on the rotational movement of the portable computing device.

23. The method of claim 22 comprising capturing by means of the image capture device a scene having at least a portion in a field of view of a person wearing the wearable device, the method comprising displaying on the display portion of the wearable device at least a portion of the scene captured by the image capture device corresponding to the location of the display in the field of view of the person wearing the wearable device.

24. The method of claim 22 or 23 whereby the method further comprises the step of:

receiving from the motion sensor new gyroscope values as part of a change in gyroscope value due to movement of the portable computing device;

smoothing, using a sliding window filter, the received new gyroscope values;

normalizing said smoothened gyroscope values; and

accumulating the normalized gyroscope values to indicate the extent of hand rotation of said portable computing device.

25. The method of any one of claims 22 to 24, whereby the one or more control parameters are selected from any or a combination of passthrough of said image, colour or grayscale display of said image, brightness of said image, edge detection/enhancement in said image, contrast of said image, colour enhancement of said image, line thickness in said image, enhancement of text that forms part of said image, lighting that forms part of said image, and white:black ratio in said image.

Description:
SYSTEM AND METHOD FOR MODIFYING IMAGE ENHANCEMENT

PARAMETERS FOR A PORTABLE DISPLAY

FIELD OF THE INVENTION

[0001] The present disclosure pertains to a system and method thereof for modifying image enhancement parameters for a portable display, such as one configured in a smart glass.

BACKGROUND

[0002] The following description includes information that may be useful in understanding the present invention. It is not an admission that any of the information provided herein is prior art or relevant to the presently claimed invention, or that any publication specifically or implicitly referenced is prior art.

[0003] There exist a number of image enhancement techniques for improving vision in sight impaired individuals. Such techniques include but are not limited to video passthrough of a colour/RGB image, edge detection and presentation of these edges as white on a black background, application of white edges on top of a colour or grayscale image, presentation of a black and white high-contrast image with a global threshold that applies to the entire screen, presentation of a black and white high-contrast image with multiple regional thresholds to compensate for lighting changes across a screen, and an algorithm, for instance, to detect large regions of similar hues (regardless of brightness) and then re-drawing these regions as high brightness swatches of the same colour, to aid low vision.
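One of the techniques listed above, edge detection with the edges presented as white on a black background, can be sketched in a few lines. The following Python fragment is an illustrative example only; the finite-difference gradient and the threshold value are assumptions made for the sketch, not details taken from this disclosure:

```python
import numpy as np

def edges_white_on_black(gray, threshold=0.25):
    """Detect edges in a grayscale image (2-D float array in [0, 1])
    and return them as white pixels on a black background."""
    # Horizontal and vertical intensity differences (simple gradient).
    gx = np.zeros_like(gray)
    gy = np.zeros_like(gray)
    gx[:, :-1] = np.diff(gray, axis=1)
    gy[:-1, :] = np.diff(gray, axis=0)
    magnitude = np.hypot(gx, gy)
    # Pixels whose gradient magnitude exceeds the threshold become white.
    return np.where(magnitude > threshold, 1.0, 0.0)
```

A production system would more likely use a tuned operator such as Sobel or Canny; the point here is only the white-on-black presentation of detected edges.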

[0004] These image processing methods are good at improving the visibility of objects in a real-world scenario, particularly for people with poor vision. These methods/techniques have parameters that can be set to find an optimal setting for many visual scenes. However, the visible world is highly dynamic and hence the preset parameters of each method/technique may not be suitable at all times. Examples of the dynamic nature of the visual world include ambient lighting changes by many orders of magnitude as people move between environments or as they turn their heads, contrast of surface details on objects varying dramatically (and hence detection parameters set for edge detection algorithms are not appropriate in all situations), and specific objects such as faces and text having highly different contrast spectrums (and hence automatic thresholding algorithms will not optimally enhance the visibility of key features on different objects).

[0005] There is therefore a need in the art for a system and method for modifying image enhancement parameters for a portable display in real-world scenario.

[0006] All publications herein are incorporated by reference to the same extent as if each individual publication or patent application were specifically and individually indicated to be incorporated by reference. Where a definition or use of a term in an incorporated reference is inconsistent or contrary to the definition of that term provided herein, the definition of that term provided herein applies and the definition of that term in the reference does not apply.

SUMMARY

[0007] The present disclosure pertains to a system and method thereof for modifying image enhancement parameters for a portable display, such as one configured in a smart glass. By smart glass or ‘smart glasses’ is meant wearable computer glasses (or ‘spectacles’) that provide visual information in a user’s field of view in addition to that which the user is able to view substantially directly (either with or without an intermediate optical element such as a lens, which may be substantially transparent or partially transparent). The provision of visual information in addition to that which the user views substantially directly may be by superimposing information onto the user’s field of view, for example on a display element in the user’s field of view (which display element may be substantially transparent, at least partially transparent or opaque). For example, the display may be a substantially opaque LED or LCD display of the type used in mobile telephones, or partially transparent. In some embodiments an image may be projected onto the display from a light source in the form of a projector device, for example of the type used in head-up display (HUD) systems or augmented reality (AR) overlays, and the reflected image viewed by the user.

[0008] A smart glass system arranged to permit a visually impaired user to modify one or more control parameters of an image provided by the system according to their eye condition, comprising:

a portable computing device comprising a motion sensor;

a smart glass based wearable device comprising a display portion, the display portion being provided in a field of view of the user; and

an image capture device,

wherein the portable computing device is operatively coupled with the smart glass based wearable device,

the system being configured to display on the display portion an image corresponding to at least a portion of an image captured by the image capture device,

wherein the system is configured to detect rotational movement of the portable computing device in the hand of the user by means of the motion sensor, wherein one or more control parameters of the image that is displayed on the wearable device are modified based on the rotational movement of the portable computing device.

[0009] This has the advantage that a user may adjust the image provided to them by the system in order to enhance their ability to view a scene. The user may endeavour to optimise the scene as viewed using the smart glass based wearable device. Optionally, the system may be configured such that the user may adjust the parameters in substantially real time.

[0010] The image capture device may comprise a video image capture device. The image capture device may comprise at least one CMOS image capture device and/or at least one CCD image capture device. Other image capture devices may be useful. The wearable device may generate a substantially real time stream of images captured by the image capture device.

[0011] Optionally, the image capture device is configured to capture a scene having at least a portion in a field of view of a person wearing the wearable device, the system being configured to display on the display portion of the wearable device at least a portion of the scene captured by the image capture device corresponding to the location of the display in the field of view of the person wearing the wearable device.

[0012] It is to be understood that reference to the “field of view of a person wearing the wearable device” is to be understood to be with respect to a person wearing the wearable device such that the display portion is in their field of view, optionally their field of view when looking substantially directly ahead, optionally their field of view with their eyes directed in a prescribed direction, the direction being any prescribed direction from directly upwards (a ‘12 o’clock’ direction), directly downwards (a ‘6 o’clock’ direction) or any prescribed direction from 12 o’clock clockwise around to 12 o’clock.

[0013] Optionally, said one or more control parameters are selected from any or a combination of passthrough of said image, colour or grayscale display of said image, brightness of said image, edge detection/enhancement in said image, contrast of said image, colour enhancement of said image, line thickness in said image, enhancement of text that forms part of said image, lighting that forms part of said image, and white:black ratio in said image.
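By way of illustration, two of these control parameters, brightness and contrast, might be applied to a captured frame as in the following sketch; the mid-grey contrast pivot and the [0, 1] pixel range are assumptions made for the example, not details from this disclosure:

```python
import numpy as np

def apply_brightness_contrast(image, brightness=0.0, contrast=1.0):
    """Apply a brightness control parameter (additive offset) and a
    contrast control parameter (gain about mid-grey) to an image whose
    pixel values lie in [0, 1], clipping the result back into range."""
    adjusted = (image - 0.5) * contrast + 0.5 + brightness
    return np.clip(adjusted, 0.0, 1.0)
```

The other listed parameters (edge enhancement, line thickness, white:black ratio, and so on) would each have an analogous per-frame transform selected via the user's interface.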

[0014] Optionally, said device comprises a selection interface that allows a user of said wearable device to select a set of control parameters from said one or more control parameters that need to be modified for said image.

[0015] Optionally, the motion sensor comprises a gyroscope that is positioned in a manner such that upon actuation of said portable computing device, orientation of the portable computing device is determined, and upon hand rotation of said portable computing device, one or more control parameters of an image that is displayed on the wearable device are modified based on rotational velocity computed using said gyroscope.

[0016] Optionally, the motion sensor comprises an accelerometer arranged to determine information indicative of linear acceleration of the portable computing device relative to gravity.

[0017] Optionally, the system is configured to transmit information indicative of the linear acceleration of the portable computing device relative to gravity to the wearable device.

[0018] Optionally, the motion sensor comprises a magnetometer that determines information indicative of instantaneous orientation of the portable computing device relative to Earth’s magnetic field.

[0019] Optionally, the system is configured to transmit the information indicative of the instantaneous orientation of the portable computing device relative to Earth’s magnetic field to the wearable device.

[0020] Optionally, the system is configured wherein respective outputs from said gyroscope, said accelerometer, and said magnetometer are fused to yield the orientation and motion of the portable computing device in any direction.

[0021] Optionally, the portable computing device is actuated by pressing of a button present in/on said computing device, wherein the one or more control parameters of the images are modified only during the time the button is kept pressed.

[0022] Optionally, the extent of changes in said one or more control parameters is proportional to the extent of hand rotation.
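A minimal sketch of this proportionality follows; the gain constant and the normalisation of the parameter level to [0, 1] are hypothetical choices for the example, not values given in this disclosure:

```python
def update_parameter(level, rotation_deg, gain=0.01):
    """Adjust a control parameter level (normalised to [0, 1]) by an
    amount proportional to the extent of hand rotation in degrees,
    clamping the result to the valid range."""
    return min(1.0, max(0.0, level + gain * rotation_deg))
```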

[0023] Optionally, the orientation of the portable computing device is determined based on fusion of positional data from one or more components of the motion sensor.

[0024] Optionally, when the control parameter modification operation is paused to generate a modified image, a user of said wearable device is enabled to view said modified image and/or pan around said modified image in the X and Y axes and/or scroll around said modified image.

[0025] Optionally, an absolute position of said portable computing device is configured to be indicative of level of control parameter.

[0026] Optionally, when the button is pressed, the system determines the absolute position of said portable computing device and sets the level of the control parameter in dependence on the absolute position.
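An illustrative sketch of mapping an absolute orientation, sampled at the moment the button is pressed, onto a parameter level; the choice of the roll axis and the ±90 degree span are assumptions made for the example:

```python
def level_from_orientation(roll_deg, min_deg=-90.0, max_deg=90.0):
    """Map the absolute roll orientation of the handheld device (degrees)
    linearly onto a control parameter level in [0, 1], clamping angles
    outside the assumed working span."""
    clamped = min(max_deg, max(min_deg, roll_deg))
    return (clamped - min_deg) / (max_deg - min_deg)
```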

[0027] This feature has the advantage that a user may substantially instantly set the level of the control parameter by first setting their hand in the orientation corresponding to the desired value of the control parameter and pressing the button.

[0028] Optionally, the motion sensor comprises an inertial measurement unit (IMU).

[0029] Optionally, the motion sensor is an inertial measurement unit (IMU).

[0030] Optionally, the wearable device comprises the image capture device.

[0031] The image capture device may be an integral part of the wearable device.

[0032] Optionally, the image capture device is provided external to the wearable device, the image capture device being operatively coupled to the wearable device.

[0033] Optionally, the image capture device is operatively coupled to the wearable device by means of a wireless connection or a wired connection.

[0034] In an aspect of the invention there is provided a method of controlling an image displayed on a display of a smart glass based wearable device of a smart glass system arranged to permit a visually impaired user to modify one or more control parameters of the image according to their eye condition, the display being provided in the field of view of a user, comprising: detecting rotational movement of a portable computing device of the system in the hand of a user by means of a motion sensor comprised by the device, the portable computing device being operatively coupled to the wearable device;

capturing by means of an image capture device of the system an image of a scene;

displaying on the display portion an image corresponding to at least a portion of the image captured by the image capture device,

the method comprising modifying one or more control parameters of the image displayed on the display portion based on the rotational movement of the portable computing device.

[0035] Optionally, the method comprises capturing by means of the image capture device a scene having at least a portion in a field of view of a person wearing the wearable device, the method comprising displaying on the display portion of the wearable device at least a portion of the scene captured by the image capture device corresponding to the location of the display in the field of view of the person wearing the wearable device.

[0036] Thus, the person wearing the wearable device will see the image captured by the image capture device within their field of view.

[0037] Optionally, the image displayed by the display portion occupies a portion and not the whole of the field of view of the user wherein the image displayed is substantially continuous with a remainder of a field of view of the user such that the image displayed appears to be superimposed upon the scene. It is to be understood that the display portion may be at least partially transparent, allowing the user to see objects in the portion of the field of view occupied by the display portion through the display portion as well as information displayed on the display portion by the system.

[0038] Optionally, the method further comprises the step of:

receiving from the motion sensor new gyroscope values as part of a change in gyroscope value due to movement of the portable computing device;

smoothing, using a sliding window filter, the received new gyroscope values;

normalizing said smoothed gyroscope values; and

accumulating the normalized gyroscope values to indicate the extent of hand rotation of said portable computing device.
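The four steps above (receive, smooth with a sliding-window filter, normalise, accumulate) can be sketched as follows; the moving-average window size and the full-scale rotational rate are illustrative assumptions, not values taken from this disclosure:

```python
from collections import deque

class RotationAccumulator:
    """Sketch of the described pipeline: smooth incoming gyroscope values
    with a sliding-window (moving-average) filter, normalise them against
    an assumed full-scale rate, and accumulate the result as an indication
    of the total extent of hand rotation."""

    def __init__(self, window=5, full_scale=250.0):
        self.window = deque(maxlen=window)   # sliding window of raw samples
        self.full_scale = full_scale         # deg/s treated as magnitude 1.0
        self.total = 0.0                     # accumulated normalised rotation

    def update(self, gyro_value):
        # 1. Smooth: moving average over the sliding window.
        self.window.append(gyro_value)
        smoothed = sum(self.window) / len(self.window)
        # 2. Normalise to [-1, 1] against the assumed full-scale rate.
        normalised = max(-1.0, min(1.0, smoothed / self.full_scale))
        # 3. Accumulate to indicate the extent of hand rotation.
        self.total += normalised
        return self.total
```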

[0039] Optionally, the one or more control parameters are selected from any or a combination of passthrough of said image, colour or grayscale display of said image, brightness of said image, edge detection/enhancement in said image, contrast of said image, colour enhancement of said image, line thickness in said image, enhancement of text that forms part of said image, lighting that forms part of said image, and white:black ratio in said image.

[0040] In an aspect of the invention there is provided a portable computing device for use with a smart glass system arranged to permit a visually impaired user to modify one or more control parameters of an image provided by the system according to their eye condition, the portable computing device comprising a motion sensor; the portable computing device being arranged to be operatively coupled with the smart glass based wearable device, the smart glass based wearable device comprising a display portion, the display portion being provided in a field of view of the user, the smart glass system further comprising an image capture device, the system being configured to display on the display portion of the wearable device an image corresponding to at least a portion of an image captured by the image capture device, wherein the system is configured to detect rotational movement of the portable computing device in the hand of the user by means of the motion sensor, wherein one or more control parameters of the image that is displayed on the wearable device are modified based on the rotational movement of the portable computing device.

[0041] In an aspect of the invention there is provided a smart glass based wearable device arranged to be operatively coupled with the portable computing device of the preceding aspect, the system being arranged to permit a visually impaired user to modify one or more control parameters of an image provided by the system according to their eye condition.

[0042] In an aspect, the present disclosure relates to a portable computing device that is operatively coupled with a smart glass based wearable device, wherein the portable computing device can include an inertial measurement unit (IMU) having a gyroscope that is positioned in a manner such that upon actuation of the portable computing device, orientation of the portable computing device can be determined, and upon hand rotation of the portable computing device, one or more control parameters of an image that is displayed on the wearable device can be modified based on rotational velocity computed using the gyroscope.

[0043] In an aspect, the one or more control parameters can be selected from any or a combination of passthrough of the image, colour or grayscale display of the image, brightness of the image, edge detection/enhancement in the image, contrast of the image, colour enhancement of the image, line thickness in the image, enhancement of text that forms part of the image, lighting that forms part of the image, and white:black ratio in the image.

[0044] In an aspect, the device can include a selection interface that allows a user of the wearable device to select a set of control parameters from said one or more control parameters that need to be modified for the image.

[0045] In an aspect, the IMU can further include an accelerometer to transmit magnitude of linear acceleration of the portable computing device relative to gravity. In another aspect, the IMU can further include a magnetometer that determines and transmits instantaneous orientation of the portable computing device relative to Earth’s magnetic field. In yet another aspect, respective outputs from the gyroscope, the accelerometer, and the magnetometer can be fused to yield the orientation and motion of the portable computing device in any direction.
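One common way to fuse such sensor outputs is a complementary filter. The disclosure does not specify a fusion algorithm, so the following is a hedged sketch of one plausible approach for a single (roll) axis, with the filter coefficient chosen purely for illustration:

```python
import math

def accel_roll(ay, az):
    """Roll angle (degrees) implied by the gravity vector measured by
    the accelerometer along its y and z axes."""
    return math.degrees(math.atan2(ay, az))

def fuse_orientation(prev_angle, gyro_rate, accel_angle, dt, alpha=0.98):
    """Complementary filter: integrate the gyroscope rate (deg/s) for
    short-term responsiveness and blend in the accelerometer-derived
    angle to correct long-term drift."""
    return alpha * (prev_angle + gyro_rate * dt) + (1.0 - alpha) * accel_angle
```

A magnetometer heading can be blended into the yaw axis in the same way; a full implementation would typically use quaternions to cover orientation and motion in any direction.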

[0046] In an aspect, the portable computing device can be actuated by pressing of a button present in/on said computing device, wherein the one or more control parameters of the images are modified only during the time the button is kept pressed.

[0047] In another aspect, the extent of hand rotation is proportional to the extent of changes in said one or more control parameters.

[0048] In yet another aspect, the orientation of the portable computing device can be determined based on fusion of positional data from one or more components of the IMU, said one or more components comprising at least an accelerometer.

[0049] In yet another aspect, when the control parameter modification operation is paused to generate a modified image, a user of said wearable device is enabled to view said modified image and/or pan around said modified image in the X and Y axes and/or scroll around said modified image.

[0050] In an aspect, an absolute position of said portable computing device is configured to be indicative of level of control parameter.

[0051] In another aspect, the present disclosure relates to a method of modifying, by a portable computing device, one or more control parameters of an image that is displayed in a smart glass based wearable device, said method comprising the steps of: receiving, at the portable computing device, from a gyroscope sensor configured in the portable computing device, a change in gyroscope value indicative of the extent of hand rotation of said portable computing device, said hand rotation being mapped to one or more control parameters; determining, at the portable computing device, using an accelerometer configured in the portable computing device, orientation of the portable computing device; and generating, from the portable computing device, an image modification signal to said smart glass based wearable device based on the change in gyroscope value and the determined orientation, wherein said image is modified with respect to said one or more control parameters based on said image modification signal.

[0052] In an aspect, the method can further include the steps of: receiving new gyroscope values as part of the change in gyroscope value; smoothing, using a sliding window filter, the received new gyroscope values; normalizing said smoothed gyroscope values; and accumulating the normalized gyroscope values to indicate the extent of hand rotation of said portable computing device.

[0053] In an aspect of the invention there is provided a portable computing device that is operatively coupled with a smart glass based wearable device. In an aspect, the portable computing device can include an inertial measurement unit (IMU) having a gyroscope that is positioned in a manner such that upon actuation of the portable computing device, orientation of the portable computing device can be determined, and upon hand rotation of the portable computing device, one or more control parameters of an image that is displayed on the wearable device can be modified based on rotational velocity computed using said gyroscope.

BRIEF DESCRIPTION OF DRAWINGS

[0054] FIGs. 1 and 2 illustrate exemplary representations of the proposed device in accordance with an embodiment of the present disclosure.

[0055] FIGs. 3A-3E illustrate exemplary representations showing how the proposed device can be used for controlling at least one parameter of imaging/image enhancement techniques.

[0056] FIGs. 4A-4E illustrate exemplary flow diagrams to enable tuning of different attributes of captured videos/images in accordance with an embodiment of the present disclosure.

DETAILED DESCRIPTION OF DRAWINGS

[0057] Embodiments of the present disclosure include various steps, which will be described below. The steps may be performed by hardware components or may be embodied in machine- executable instructions, which may be used to cause a general-purpose or special-purpose processor programmed with the instructions to perform the steps. Alternatively, steps may be performed by a combination of hardware, software, firmware and/or by human operators.

[0058] Embodiments of the present disclosure may be provided as a computer program product, which may include a machine-readable storage medium tangibly embodying thereon instructions, which may be used to program a computer (or other electronic devices) to perform a process. The machine-readable medium may include, but is not limited to, fixed (hard) drives, magnetic tape, floppy diskettes, optical disks, compact disc read-only memories (CD-ROMs), and magneto-optical disks, semiconductor memories, such as ROMs, PROMs, random access memories (RAMs), programmable read-only memories (PROMs), erasable PROMs (EPROMs), electrically erasable PROMs (EEPROMs), flash memory, magnetic or optical cards, or other type of media/machine-readable medium suitable for storing electronic instructions (e.g., computer programming code, such as software or firmware).

[0059] Various methods described herein may be practiced by combining one or more machine-readable storage media containing the code according to the present disclosure with appropriate standard computer hardware to execute the code contained therein. An apparatus for practicing various embodiments of the present disclosure may involve one or more computers (or one or more processors within a single computer) and storage systems containing or having network access to computer program(s) coded in accordance with various methods described herein, and the method steps of the disclosure could be accomplished by modules, routines, subroutines, or subparts of a computer program product.

[0060] If the specification states a component or feature “may”, “can”, “could”, or “might” be included or have a characteristic, that particular component or feature is not required to be included or have the characteristic.

[0061] Arrangements and embodiments may now be described more fully with reference to the accompanying drawings, in which exemplary embodiments may be shown. Embodiments may, however, be embodied in many different forms and should not be construed as being limited to embodiments set forth herein; rather, embodiments may be provided so that this disclosure will be thorough and complete, and will fully convey the concept to those skilled in the art.

[0062] The suffixes ‘module’, ‘unit’ and ‘part’ may be used for elements in order to facilitate the disclosure. Significant meanings or roles may not be given to the suffixes themselves and it is understood that ‘module’, ‘unit’ and ‘part’ may be used together or interchangeably.

[0063] The present disclosure pertains to a system and method thereof for modifying image enhancement parameters for a portable display, such as one configured in a smart glass.

[0064] The present disclosure pertains to a real-time image processing system that is designed to improve vision for people who are severely sight impaired. The proposed system can include a video input mechanism that can be a wired or a wireless camera, or can include an externally streamed video or a video that is a file on the device, wherein the video input can be presented to the user via a head-mounted screen such as an augmented or virtual reality transparent display of, for instance, a smart glass. In the embodiment 10 of FIG. 1(a) a smart glass based wearable device 160 has a display screen 162 and an image capture device in the form of a video camera 163. FIG. 1(b) shows a corresponding embodiment 10A in which the image capture device is not provided integral to the wearable device 160. Rather, it may be coupled to the device 160 via a wireless connection. In some embodiments the image capture device may be coupled by means of a wired connection in addition or instead. In the embodiment of FIG. 1 the display screen 162 is a transparent waveguide with diffractive optics arranged to direct an image from an organic light emitting diode (OLED) micro display into the user’s eye. Other arrangements may be useful, such as a transparent waveguide with a beamsplitter instead of diffractive optics. Other displays may be useful, such as liquid crystal on silicon (LCOS) displays or liquid crystal displays (LCDs). In some embodiments, an opaque display including a high resolution OLED panel and one or more optical elements such as a biconvex or Fresnel lens arrangement may be employed to direct the image into the user’s eye. In the embodiment of FIG. 1(a) the video camera 163 is a CMOS (complementary metal oxide semiconductor) camera, but other cameras may be useful in some embodiments.

[0065] In an aspect, the present disclosure relates to a physical device that can be given to a user to modify the primary control parameter(s) of each of the existing methods/techniques mentioned above: A. video passthrough of a colour/RGB image; B. edge detection and presentation of these edges as white on a black background; C. application of white edges on top of a colour or grayscale image; D. presentation of a black and white high-contrast image with a global threshold that applies to the entire screen; E. presentation of a black and white high-contrast image with multiple regional thresholds to compensate for lighting changes across a screen; and F. an algorithm, for instance, to detect large regions of similar hues (regardless of brightness) and then re-draw these regions as high-brightness swatches of the same colour, to aid low vision.

[0066] In an aspect, the proposed device can be configured to receive one parameter from each of the above-mentioned techniques (A-F), and make the respective received parameter adjustable. In an aspect, the proposed device can be configured as a portable gesture device and can enable provision of an intuitive control that mimics other well-known control mechanisms such as a volume knob on an audio device. The proposed device can rapidly and intuitively change parameters for one or more existing image enhancement techniques over a wide range, due to the relationship between movement of the device and the rate of parameter change.

[0067] In an aspect, the proposed device can be activated by a button press, wherein at the first press the orientation of the device can be calculated. In an exemplary implementation, the orientation of the device can be determined based on fusion of positional data from components of an inertial measurement unit (IMU) that comprises an accelerometer configured to determine and transmit real-time values of the device’s position relative to gravity. The accelerometer can also be configured to transmit the magnitude of any linear acceleration in three dimensions. The IMU of the present disclosure can further include a gyroscope that indicates instantaneous rotational velocity in three dimensions. An optional magnetometer can also be configured in the IMU to give the instantaneous orientation of the proposed device/handset relative to Earth’s magnetic field (i.e. a compass). These three sources of data can be combined, or “fused”, to give the orientation and motion of the device/handset in any direction. This data fusion can be derived through any number of well-known algorithms, such as a Kalman filter.
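The fusion described above can be sketched with a simple complementary filter, a lighter-weight alternative to the Kalman filter the disclosure mentions; the function below is an illustrative assumption, not the disclosed implementation, and all names and constants are hypothetical:

```python
import math

def complementary_roll(prev_roll, gyro_x, accel_y, accel_z, dt, alpha=0.98):
    """Fuse gyroscope and accelerometer readings into a roll-angle estimate.

    gyro_x: instantaneous rotational velocity about the roll axis (rad/s).
    accel_y, accel_z: linear acceleration components (in g) giving an
    absolute roll reference relative to gravity.
    alpha: weight given to the integrated gyro path versus the accel path.
    """
    gyro_roll = prev_roll + gyro_x * dt        # integrate rotational velocity
    accel_roll = math.atan2(accel_y, accel_z)  # gravity-based absolute roll
    return alpha * gyro_roll + (1.0 - alpha) * accel_roll
```

The gyro term tracks fast hand rotation while the accelerometer term slowly corrects drift, which is the essence of the "fusion" the paragraph describes.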

[0068] In an aspect, once the button on the proposed device is pressed, the initial orientation of the handset is set to zero. Any rotation about a defined axis of the handset can be interpreted as an increase or a decrease of the primary control parameter. In an exemplary embodiment, the axis of rotation can be defined to be along the length of the device, which is the same axis as the wrist. For instance, a clockwise roll can increase the white:black ratio on a high-contrast display. An anticlockwise roll, on the other hand, can decrease the white:black ratio on a high-contrast display. Alternatively, an anticlockwise roll can increase the white:black ratio on a high-contrast display and a clockwise roll can decrease it.

[0069] In an aspect, the user can be given the ability to watch the parameter change in real-time as they rotate the proposed device/handset, which can create an intuitive feedback system, allowing the user to be very specific with their modifications to the image. During the “tuning” phase, the video can continue to be passed in real-time to the display.

[0070] In an exemplary implementation, upon release of the button, the modified imaging parameter can be set. Thus, the modified imaging parameter can be set as the new state of the system, becoming the state in which the system continues to operate until the state is changed. In some embodiments the state of contrast may revert to a default state upon release of the button. In some embodiments the state of contrast may revert to a default state after a predetermined time period has elapsed. Other arrangements may be useful. In some embodiments a user may select how the system behaves when the button is released, for example whether the changed parameter, such as the instant black:white contrast setting, is maintained, or whether the system reverts to a default value of the parameter.

[0071] As mentioned above, each imaging/image enhancement technique can have at least one parameter that can be modified by the proposed tuning device. For instance, for the video pass-through (colour or grayscale display) technique, the parameter can include the tuning that increases or decreases general image brightness. Similarly, for the white edges on black technique, the parameter can include the tuning that modifies the threshold for edge detection. Decreasing this threshold increases the number of edges displayed; increasing this threshold decreases the number of edges displayed. For the video pass-through plus white edges technique, the same parameter as for the white edges on black technique can be used, except that the edges are displayed on a live video (colour or grayscale). On the other hand, for the high contrast, global threshold technique, the parameter can include tuning that moves the black:white threshold towards the white or towards the black. This either increases the amount of white on the screen, or increases the amount of black. For the high contrast, multiple regional thresholds technique, the parameter can include tuning that modifies the erode and dilate parameters, subsequently increasing line thickness (a process called “erode”) or decreasing line thickness (a process called “dilate”). Finally, for the colour detection and saturation image enhancement technique, the parameter can include tuning that rotates the detection window through the colour spectrum, allowing for specific colours to be saturated.
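For illustration, the global-threshold technique reduces to a single comparison per pixel; this minimal sketch (a hypothetical function, assuming grayscale values 0-255) shows how moving the one tunable threshold trades white area for black area:

```python
def high_contrast(pixels, threshold):
    """Render a row of grayscale pixels as pure black (0) or pure white (255).

    Raising `threshold` pushes more pixels to black; lowering it pushes
    more pixels to white -- the single parameter tuned by the device.
    """
    return [255 if p >= threshold else 0 for p in pixels]
```

Rolling the handset clockwise or anticlockwise would then simply raise or lower `threshold` over the 0-255 range.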

[0072] It is to be understood that contrast is an important parameter in assessing vision clinically. Clinical visual acuity measurements typically use high contrast images such as black letters on a white background. In reality, contrast between objects and their surroundings varies. The relationship between visual acuity and contrast allows a more detailed understanding of visual perception.

[0073] The resolving power of the eye may be measured by means of sinusoidal grating patterns having adjustable spacing (spatial periodicity). The contrast of the grating is its differential intensity threshold, defined as the ratio:

C = (Lmax - Lmin) / (Lmax + Lmin)

where L is the luminance of the grating pattern as a function of spatial distance in a direction normal to the orientation of the parallel elements of the grating, and C may be referred to as the modulation or Rayleigh or Michelson contrast. C can have a value between 0.0 and 1.0. Further details may be found in “Visual Acuity” by Michael Kalloniatis and Charles Luu, available at https://webvision.med.utah.edu/book/part-viii-psychophysics-of-vision/visual-acuity/ portions of which are discussed below.

[0074] As the spatial frequency of a set of black/white lines increases, i.e. the thickness of the lines decreases, they become harder to resolve and begin to look like a homogeneous grey area. The sensitivity of a person’s eyes to contrast can be measured by determining the minimum grating spacing that each eye can resolve as a function of image contrast. This may be done, for example, by lowering the contrast for a given spatial frequency until the person can no longer detect the grating - this value is the ‘contrast threshold’ for that grating size (spatial frequency). The reciprocal of this contrast threshold is called the ‘contrast sensitivity’. The contrast threshold can be expressed as a sensitivity on a decibel (dB) scale: contrast sensitivity in dB = -20 log10 C, where C is the threshold value of modulation contrast (described above). A plot of (contrast) sensitivity versus spatial frequency is called the spatial contrast sensitivity function (referred to as the SCSF or simply CSF).
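The two formulas above translate directly into code; this is a plain transcription of the definitions, not part of the disclosed system:

```python
import math

def michelson_contrast(l_max, l_min):
    """Modulation (Michelson/Rayleigh) contrast: (Lmax - Lmin) / (Lmax + Lmin)."""
    return (l_max - l_min) / (l_max + l_min)

def contrast_sensitivity_db(c_threshold):
    """Contrast sensitivity in dB: -20 * log10(C) for a threshold contrast C."""
    return -20.0 * math.log10(c_threshold)
```

For example, a grating alternating between luminances 100 and 50 has a contrast of (100 - 50)/(100 + 50) = 1/3, and a threshold contrast of 0.01 corresponds to a sensitivity of 40 dB.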

[0075] FIG. 1(b) illustrates schematically the manner in which the contrast sensitivity function (CSF) of an individual may be affected depending upon their medical condition. The plot shows log (contrast sensitivity) as a function of log (spatial frequency) (c/deg). Trace N represents the expected CSF of a healthy individual. Trace A represents that of an individual with contrast losses in the mid to low region of (log spatial frequency), characteristic of individuals having multiple sclerosis; trace B represents the CSF of individuals with an overall reduction in CSF across the range of spatial frequencies, characteristic of cataract patients, whilst trace C represents the CSF of individuals with mild refractive error or mild amblyopia (trace B being characteristic of individuals with more severe cases of either).

[0076] Further information may also be found at: https://www.semanticscholar.org/paper/Comparing-the-Shape-of-Contrast-Sensitivity-for-and-Chung-Legge/92c9647ee47507ce50e2792eb9504106734d37ea

[0077] In an aspect, the proposed device can be coupled to any portable display such as a smart glass that is operatively coupled with a camera that receives a range of different video sources. Other exemplary portable display devices to which the proposed device can be applied include, but are not limited to, a head mounted camera, an external wireless camera, video streaming from a broadcast source (e.g. TV), closed-loop video, such as a theatre, concert or live sport event, and an on-device video source, e.g. a movie file, internet-streamed video, etc. In each of these cases, the proposed device can apply any of the image enhancement algorithms previously listed, and each of these enhancements can be modified in real-time by the “tuning” device outlined in this disclosure.

[0078] FIGs. 1(a) and 2 illustrate exemplary representations of the proposed system in accordance with an embodiment of the present disclosure, wherein a portable, handheld computing device 100 having a motion sensor 102 can be either physically connected to the smart glass based wearable device 160, wirelessly coupled through Bluetooth, or mounted onto the frame of the smart glass/wearable device, or any other configuration, all of which are well within the scope of the present disclosure.

[0079] As mentioned above, the present disclosure provides an electronic system/product 100 that can include an inertial measurement unit (IMU) 102 having a gyroscope 104 and an accelerometer 106 (it may also alternatively or additionally include a magnetometer 108), wherein during implementation/operation, a user can press and hold a button 150 on the proposed device 100, and then rotate his/her hand as if controlling the dial on a volume control.

[0080] In an aspect, once the button 150 on the proposed device 100 is pressed, the initial orientation of the device 100 is set to zero. Any rotation about a defined axis of the device/handset 100 can be interpreted as an increase or a decrease of the primary control parameter. In an exemplary embodiment, the axis of rotation (refer to FIG. 2) can be defined to be along the length of the device, which is the same axis as the wrist. For instance, a clockwise roll can increase the white:black ratio on a high-contrast display, say the transparent display 162 of a smart glass 160. An anticlockwise roll, on the other hand, can decrease the white:black ratio on the high-contrast display 162.

[0081] In an aspect, the user can be given the ability to watch the parameter change in real-time as they rotate the proposed device/handset, which can create an intuitive feedback system, allowing the user to be very specific with their modifications to the image. During the “tuning” phase, the video can continue to be passed in real-time to the display 162. In an exemplary implementation, upon release of the button, the modified imaging parameter can be set.

[0082] FIGs. 3A-3E illustrate exemplary representations showing how the proposed device can be used for controlling at least one parameter of imaging/image enhancement techniques. For instance, for the video pass-through (colour or grayscale display) technique, as shown in FIG. 3A, the parameter can include the tuning that increases or decreases general image brightness. Similarly, for the white edges on black technique, as shown in FIG. 3B, the parameter can include the tuning that modifies the threshold for edge detection. Decreasing this threshold increases the number of edges displayed; increasing this threshold decreases the number of edges displayed. For the video pass-through plus white edges technique, as shown in FIG. 3C, the same parameter as for the white edges on black technique can be used, except that the edges are displayed on a live video (colour or grayscale). On the other hand, for the high contrast, global threshold technique, as shown in FIG. 3D, the parameter can include tuning that moves the black:white threshold towards the white or towards the black. This either increases the amount of white on the screen, or increases the amount of black. For the high contrast, multiple regional thresholds technique, as shown in FIG. 3E, the parameter can include tuning that modifies the erode and dilate parameters, subsequently increasing line thickness (a process called “erode”) or decreasing line thickness (a process called “dilate”).
Finally, for the colour detection and saturation image enhancement technique, the parameter can include tuning that rotates the detection window through the colour spectrum, allowing for specific colours to be saturated.

[0083] As would be appreciated, using the present invention, a user simply rotates a handheld device, making image enhancement easy to perform and intuitive to describe. In addition, the ratio of rotation to parameter control can be varied so that either small rotations lead to large changes, for highly dynamic environments, or large rotations lead to small changes, for fine tuning of an image parameter.

[0084] In an aspect, the proposed device can also be configured so that other axes of rotation of the device modify different parameters of the image. For instance, the roll axis of the proposed device can change the primary control parameter, while the pitch axis can change the ratio of rotation to primary control parameter change, which can allow a person to first make a large change to the general image, and then increase the sensitivity in order to fine tune the adjustment to suit the environment and the user’s level of vision.
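A minimal sketch of this two-axis scheme (all names and rates are assumptions for illustration, not the disclosed design): roll deltas move the primary parameter, while pitch deltas adjust the gain applied to subsequent rolls:

```python
def update_parameter(param, gain, roll_delta, pitch_delta,
                     gain_rate=0.5, lo=0.0, hi=1.0):
    """Apply one motion sample: pitch retunes the rotation-to-parameter
    ratio (gain); roll then moves the primary control parameter by that gain."""
    gain = max(0.01, gain + pitch_delta * gain_rate)     # coarse/fine control
    param = min(hi, max(lo, param + roll_delta * gain))  # clamped parameter
    return param, gain
```

Lowering the gain with a pitch movement lets the same roll produce a finer adjustment, matching the coarse-then-fine tuning described above.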

[0085] FIGs. 4A-4E illustrate exemplary flow diagrams to enable tuning of different attributes of captured videos/images in accordance with an embodiment of the present disclosure.

[0086] With reference to FIG. 4A, which illustrates a brightness based tuning operation 400, at step 402 the smart glass/wearable device that is operatively coupled with the proposed tuning-enabling computing device can listen to the sensor that is configured in the tuning-enabling computing device, based on which, at step 404, the tuning-enabling computing device receives a change in y-axis gyroscope value, the output of which is smoothed with a sliding window filter at step 406. It would be appreciated that a part of these steps can be performed in the smart glass/wearable device as well, or in any desired combination of the smart glass/wearable device and the tuning-enabling portable computing device, all of which possible combinations are therefore well within the scope of the present invention. At step 408, the output can be normalized by, for instance, dividing by a value proportional to the effect range, followed by, at step 410, accumulating the gyro values. The tuning-enabling portable computing device can then be rotated in a defined direction (clockwise or anticlockwise) so as to decrease the brightness variable (step 412) or increase the brightness variable (step 414). At step 416, a key up instruction is received by the tuning-enabling computing device, based on which, at step 418, the current effect values can be set as default.
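The FIG. 4A flow (listen, smooth with a sliding window, normalise by the effect range, accumulate, then fix the value on key up) could be sketched as follows; the class, window size, and effect range are hypothetical illustrations, not the patented implementation:

```python
from collections import deque

class BrightnessTuner:
    """Sketch of the FIG. 4A brightness-tuning loop."""

    def __init__(self, brightness=0.5, window=5, effect_range=10.0):
        self.samples = deque(maxlen=window)  # sliding window (step 406)
        self.effect_range = effect_range
        self.brightness = brightness

    def on_gyro(self, gyro_y):
        """Handle one y-axis gyroscope reading (steps 404-414)."""
        self.samples.append(gyro_y)
        smoothed = sum(self.samples) / len(self.samples)  # step 406
        delta = smoothed / self.effect_range              # normalise (step 408)
        # accumulate (step 410); the sign of the rotation raises or
        # lowers the brightness variable (steps 412/414)
        self.brightness = min(1.0, max(0.0, self.brightness + delta))

    def on_key_up(self):
        """Key up (step 416): current value becomes the default (step 418)."""
        return self.brightness
```

The same skeleton applies to the flows of FIGs. 4B-4E, with the accumulated value driving an edge threshold, contrast threshold, colour window, or erode/dilate variable instead of brightness.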

[0087] With reference to FIG. 4B, which illustrates an edge enhancement based tuning operation 420, at step 422 the smart glass/wearable device that is operatively coupled with the proposed tuning-enabling computing device can listen to the sensor that is configured in the tuning-enabling computing device, based on which, at step 424, the tuning-enabling computing device receives a change in y-axis gyroscope value, the output of which is smoothed with a sliding window filter at step 426. It would be appreciated that a part of these steps can be performed in the smart glass/wearable device as well, or in any desired combination of the smart glass/wearable device and the tuning-enabling portable computing device, all of which possible combinations are therefore well within the scope of the present invention. At step 428, the output can be normalized by, for instance, dividing by a value proportional to the effect range, followed by, at step 430, accumulating the gyro values. The tuning-enabling portable computing device can then be rotated in a defined direction (clockwise or anticlockwise) so as to increase the threshold for line detection (step 432) or decrease the threshold for line detection (step 434). At step 436, a key up instruction is received by the tuning-enabling computing device, based on which, at step 438, the current effect values can be set as default.

[0088] With reference to FIG. 4C, which illustrates a contrast based tuning operation 440, at step 442 the smart glass/wearable device that is operatively coupled with the proposed tuning-enabling computing device can listen to the sensor that is configured in the tuning-enabling computing device, based on which, at step 444, the tuning-enabling computing device receives a change in y-axis gyroscope value, the output of which is smoothed with a sliding window filter at step 446. It would be appreciated that a part of these steps can be performed in the smart glass/wearable device as well, or in any desired combination of the smart glass/wearable device and the tuning-enabling portable computing device, all of which possible combinations are therefore well within the scope of the present invention. At step 448, the output can be normalized by, for instance, dividing by a value proportional to the effect range, followed by, at step 450, accumulating the gyro values. The tuning-enabling portable computing device can then be rotated in a defined direction (clockwise or anticlockwise) so as to increase the threshold for white (increasing % black) (step 452) or decrease the threshold for white (increasing % white) (step 454). At step 456, a key up instruction is received by the tuning-enabling computing device, based on which, at step 458, the current effect values can be set as default.

[0089] With reference to FIG. 4D, which illustrates a colour based tuning operation 460, at step 462 the smart glass/wearable device that is operatively coupled with the proposed tuning-enabling computing device can listen to the sensor that is configured in the tuning-enabling computing device, based on which, at step 464, the tuning-enabling computing device receives a change in y-axis gyroscope value, the output of which is smoothed with a sliding window filter at step 466. It would be appreciated that a part of these steps can be performed in the smart glass/wearable device as well, or in any desired combination of the smart glass/wearable device and the tuning-enabling portable computing device, all of which possible combinations are therefore well within the scope of the present invention. At step 468, the output can be normalized by, for instance, dividing by a value proportional to the effect range, followed by, at step 470, accumulating the gyro values. The tuning-enabling portable computing device can then be rotated in a defined direction (clockwise or anticlockwise) so as to increase the colour display in the blue-green range (as an example) (step 472) or display colours within the yellow-red range (for instance) (step 474). At step 476, a key up instruction is received by the tuning-enabling computing device, based on which, at step 478, the current effect values can be set as default.

[0090] With reference to FIG. 4E, which illustrates an enhanced text based tuning operation 480, at step 481 the smart glass/wearable device that is operatively coupled with the proposed tuning-enabling computing device can listen to the sensor that is configured in the tuning-enabling computing device, based on which, at step 482, the tuning-enabling computing device receives a change in y-axis gyroscope value, the output of which is smoothed with a sliding window filter at step 483. It would be appreciated that a part of these steps can be performed in the smart glass/wearable device as well, or in any desired combination of the smart glass/wearable device and the tuning-enabling portable computing device, all of which possible combinations are therefore well within the scope of the present invention. At step 484, the output can be normalized by, for instance, dividing by a value proportional to the effect range, followed by, at step 485, accumulating the gyro values. The tuning-enabling portable computing device can then be rotated in a defined direction (clockwise or anticlockwise) so as to thicken text by increasing erode variables (step 486) or thin text by increasing dilate variables (step 487). At step 488, a key up instruction is received by the tuning-enabling computing device, based on which, at step 489, the current effect values can be set as default.

[0091] Some aspects of the present invention may be understood by reference to the following numbered clauses:

1. A portable computing device operatively coupled with a smart glass based wearable device, said portable computing device comprising: an inertial measurement unit (IMU) having a gyroscope that is positioned in a manner such that upon actuation of said portable computing device, orientation of the portable computing device is determined, and upon hand rotation of said portable computing device, one or more control parameters of an image that is displayed on the wearable device are modified based on rotational velocity computed using said gyroscope.

2. The portable computing device of clause 1, wherein said one or more control parameters are selected from any or a combination of passthrough of said image, colour or grayscale display of said image, brightness of said image, edge detection/enhancement in said image, contrast of said image, colour enhancement of said image, line thickness in said image, enhancement of text that forms part of said image, lighting that forms part of said image, and white:black ratio in said image.

3. The portable computing device of clause 1, wherein said device comprises a selection interface that allows a user of said wearable device to select a set of control parameters from said one or more control parameters that need to be modified for said image.

4. The portable computing device as described in clause 1, wherein said IMU further comprises an accelerometer to transmit magnitude of linear acceleration of the portable computing device relative to gravity.

5. The portable computing device as described in clause 4, wherein said IMU further comprises a magnetometer that determines and transmits instantaneous orientation of the portable computing device relative to Earth’s magnetic field.

6. The portable computing device as described in clause 5, wherein respective outputs from said gyroscope, said accelerometer, and said magnetometer are fused to yield the orientation and motion of the portable computing device in any direction.

7. The portable computing device as described in clause 1, wherein said portable computing device is actuated by pressing of a button present in/on said computing device, wherein the one or more control parameters of the image are modified only during the time the button is kept pressed.

8. The portable computing device as described in clause 1, wherein the extent of hand rotation is proportional to the extent of changes in said one or more control parameters.

9. The portable computing device as described in clause 1, wherein the orientation of the portable computing device is determined based on fusion of positional data from one or more components of the IMU, said one or more components comprising at least an accelerometer.

10. The portable computing device as described in clause 1, wherein when the control parameter modification operation is paused to generate a modified image, a user of said wearable device is enabled to view said modified image and/or pan around said modified image in the X and Y axes and/or scroll around said modified image.

11. The portable computing device as described in clause 1, wherein an absolute position of said portable computing device is configured to be indicative of level of control parameter.

12. A method of modifying, by a portable computing device, one or more control parameters on an image that is displayed in a smart glass based wearable device, said method comprising the step of:

receiving, at the portable computing device, from a gyroscope sensor configured in the portable computing device, a change in gyroscope value indicative of extent of hand rotation of said portable computing device, said hand rotation being mapped to one or more control parameters;

determining, at the portable computing device, using an accelerometer configured in the portable computing device, orientation of the portable computing device; and

generating, from the portable computing device, an image modification signal to said glass based wearable device based on the change in gyroscope value and the determined orientation, wherein said image is modified with respect to said one or more control parameters based on said image modification signal.

13. The method of clause 12, wherein said method further comprises the step of:

receiving new gyroscope values as part of the change in gyroscope value;

smoothing, using a sliding window filter, the received new gyroscope values;

normalizing said smoothened gyroscope values; and

accumulating the normalized gyroscope values to indicate the extent of hand rotation of said portable computing device.

14. The method of clause 12, wherein said one or more control parameters are selected from any or a combination of passthrough of said image, colour or grayscale display of said image, brightness of said image, edge detection/enhancement in said image, contrast of said image, colour enhancement of said image, line thickness in said image, enhancement of text that forms part of said image, lighting that forms part of said image, and white:black ratio in said image.

[0092] As used herein, and unless the context dictates otherwise, the term “coupled to” is intended to include both direct coupling, in which two elements that are coupled to each other contact each other, and indirect coupling, in which at least one additional element is located between the two elements. Therefore, the terms “coupled to” and “coupled with” are used synonymously. Within the context of this document the terms “coupled to” and “coupled with” are also used euphemistically to mean “communicatively coupled with” over a network, where two or more devices are able to exchange data with each other over the network, possibly via one or more intermediary devices.

[0093] It should be apparent to those skilled in the art that many more modifications besides those already described are possible without departing from the inventive concepts herein. The inventive subject matter, therefore, is not to be restricted except in the spirit of the appended claims. Moreover, in interpreting both the specification and the claims, all terms should be interpreted in the broadest possible manner consistent with the context. In particular, the terms “comprises” and “comprising” should be interpreted as referring to elements, components, or steps in a non-exclusive manner, indicating that the referenced elements, components, or steps may be present, or utilized, or combined with other elements, components, or steps that are not expressly referenced. Where the specification or claims refer to at least one of something selected from the group consisting of A, B, C ... and N, the text should be interpreted as requiring only one element from the group, not A plus N, or B plus N, etc.
The foregoing description of the specific embodiments will so fully reveal the general nature of the embodiments herein that others can, by applying current knowledge, readily modify and/or adapt for various applications such specific embodiments without departing from the generic concept, and, therefore, such adaptations and modifications should and are intended to be comprehended within the meaning and range of equivalents of the disclosed embodiments. It is to be understood that the phraseology or terminology employed herein is for the purpose of description and not of limitation. Therefore, while the embodiments herein have been described in terms of preferred embodiments, those skilled in the art will recognize that the embodiments herein can be practiced with modification within the spirit and scope of the appended claims.

[0094] While various embodiments of the present disclosure have been illustrated and described herein, it will be clear that the disclosure is not limited to these embodiments only. Numerous modifications, changes, variations, substitutions, and equivalents will be apparent to those skilled in the art, without departing from the spirit and scope of the disclosure, as described in the claims.