Title:
WEARABLE VIBRO-TACTILE FEEDBACK DEVICE
Document Type and Number:
WIPO Patent Application WO/2022/011432
Kind Code:
A1
Abstract:
Described herein is a wearable tactile feedback device (100, 300, 500) for attachment to a body part such as a forearm (102) of a user. The device (100, 300, 500) comprises a body (104) adapted to fit to the forearm (102) of the user. The device (100, 300, 500) also includes a plurality of vibration motors (106-121) attached to or embedded within the body (104) at spaced apart locations. A microcontroller (124) is configured to send control signals to the plurality of vibration motors (106-121). The plurality of vibration motors (106-121) are responsive to respective ones of the control signals to vibrate at predetermined intensities, frequencies and durations to provide vibratory tactile feedback to the user in the form of three dimensional direction cues for movement of the forearm (102). (Figure 1)

Inventors:
PRABHU DEEPA (AU)
MCCARTHY CHRISTOPHER (AU)
HASAN MUHAMMAD (AU)
WISE LISA (AU)
MACMAHON CLARE (AU)
Application Number:
PCT/AU2021/050768
Publication Date:
January 20, 2022
Filing Date:
July 16, 2021
Assignee:
UNIV SWINBURNE TECHNOLOGY (AU)
International Classes:
G06F3/01; A61B5/00; A61F9/08; A61H3/06; G01C21/16; G06T19/00; G08B6/00
Foreign References:
US 2018/0303702 A1 (2018-10-25)
US 2018/0321056 A1 (2018-11-08)
US 2013/0218456 A1 (2013-08-22)
US 2014/0180582 A1 (2014-06-26)
US 2019/0385424 A1 (2019-12-19)
US 9,311,789 B1 (2016-04-12)
Attorney, Agent or Firm:
PHILLIPS ORMONDE FITZPATRICK (AU)
Claims:
What is claimed is:

1. A wearable tactile feedback device for attachment to a body part of a user, the device comprising: a body adapted to fit to the body part of the user; a plurality of vibration motors attached to or embedded within the body at spaced apart locations; and a microcontroller configured to send control signals to the plurality of vibration motors; wherein, the plurality of vibration motors are responsive to respective ones of the control signals to vibrate at predetermined intensities, frequencies and durations to provide vibratory tactile feedback to the user in the form of three dimensional direction cues for movement of the body part.

2. The device according to claim 1 wherein the body comprises a compression fabric.

3. The device according to claim 2 wherein the vibration motors are woven into the compression fabric.

4. The device according to any one of the preceding claims wherein the body is in the form of a sleeve to be sleeved around the body part.

5. The device according to claim 4 wherein the body part is an arm.

6. The device according to claim 5 wherein the plurality of vibration motors are disposed into four linear arrays, each comprising four vibration motors.

7. The device according to claim 6 wherein the four linear arrays are disposed on the sleeve at separate circumferential locations such that, when worn by the user, the linear motor arrays extend along a medial, dorsal, ventral and lateral side of the arm.

8. The device according to any one of the preceding claims wherein the vibration motors are spaced apart by a distance in the range of 25 mm to 40 mm.

9. The device according to claim 8 wherein the vibration motors are spaced apart by a distance in the range of 25 mm to 30 mm.

10. The device according to any one of the preceding claims including a wireless communication device for wirelessly connecting the microcontroller to other electronic devices.

11. The device according to any one of the preceding claims comprising one or more sensors to sense the position or motion of the body part.

12. The device according to claim 11 wherein the one or more sensors comprise one or more inertial measurement units.

13. The device according to claim 11 or claim 12 wherein the one or more sensors comprise one or more accelerometers.

14. The device according to any one of claims 11 to 13 wherein the one or more sensors comprise an image sensor in electrical communication with the microcontroller to image and track the position of the body part, and/or objects of interest, in three dimensional space.

15. The device according to any one of claims 11 to 14 wherein the one or more sensors are adapted to generate a sensor signal indicative of a position or motion of the body part and wherein the microcontroller is responsive to the sensor signal for generating three dimensional direction cues.

16. The device according to any one of the preceding claims adapted to communicate with a virtual reality module and wherein the control signals are responsive to inputs from the virtual reality module such that the three dimensional direction cues provided to the user are dependent on a virtual reality environment.

17. The device according to any one of the preceding claims adapted to communicate with a wireless beacon attached to an object of interest and wherein the control signals are responsive to inputs from the beacon such that the three dimensional direction cues provided to the user are dependent on the tracked location of the beacon.

18. The device according to any one of the preceding claims wherein the microcontroller is programmed with one or more preset training modules that provide a predetermined sequence of three dimensional direction cues to the user.

19. The device according to any one of the preceding claims wherein the three dimensional direction cue conveys to the user a direction of a target, a distance to the target and a direction of hand movement.

Description:
WEARABLE VIBRO-TACTILE FEEDBACK DEVICE

FIELD OF THE INVENTION

[0001] The present application relates to a tactile feedback device and in particular to a wearable device for providing vibratory tactile feedback to a user.

[0002] Embodiments of the present invention are particularly adapted for providing a wearable tactile feedback device adapted to fit to a user’s forearm to monitor and improve visuo-motor performance of the user. However, it will be appreciated that the invention is applicable in broader contexts and other applications.

BACKGROUND

[0003] In recent times, implantable visual prostheses (also known as 'bionic eyes') have become the leading treatment for patients with advanced-stage Retinitis Pigmentosa and Age-related Macular Degeneration. The abstract, low-resolution form of visual perception that is restored with current prosthetic vision devices presents significant challenges for interpreting three-dimensional (3D) space and guiding self-motion during real-world tasks. While some functional outcomes have been demonstrated with prosthetic vision devices, vision remains significantly impeded compared to normal healthy vision. In particular, visuo-motor tasks such as reaching and pointing remain challenging and have received only limited attention to date.

[0004] While technical developments continue to improve visual perception, training and rehabilitation are central to improving visuo-motor outcomes for prosthetic vision users. Specifically, in order to adapt to the newly acquired vision and effectively apply it for performing real-world tasks with accuracy, extensive visual rehabilitation is essential. Complementary information from other sensory modalities may facilitate learning in a newly restored sensory modality.

[0005] Typically, such training forms part of rehabilitation with prosthetic vision devices and is provided by experts at designated facilities. However, this requires regular interaction between a patient and his or her trainers/physicians/clinicians. Accordingly, the inventors have identified a desire for a more autonomous solution to rehabilitation.

[0006] Any discussion of the background art throughout the specification should in no way be considered as an admission that such art is widely known or forms part of common general knowledge in the field.

SUMMARY OF THE INVENTION

[0007] In accordance with one aspect of the present invention, there is provided a wearable tactile feedback device for attachment to a body part of a user, the device comprising: a body adapted to fit to the body part of the user; a plurality of vibration motors attached to or embedded within the body at spaced apart locations; and a microcontroller configured to send control signals to the plurality of vibration motors; wherein, the plurality of vibration motors are responsive to respective ones of the control signals to vibrate at predetermined intensities, frequencies and durations to provide vibratory tactile feedback to the user in the form of three dimensional direction cues for movement of the body part.

[0008] In some embodiments, the body comprises a compression fabric. In some embodiments, the vibration motors are woven into the compression fabric. Preferably, the body is in the form of a sleeve to be sleeved around the body part. In some embodiments, the body part is an arm.

[0009] In some embodiments, the plurality of vibration motors are disposed into a plurality of linear arrays. The linear arrays are preferably circumferentially disposed around the body. In some embodiments, the vibration motors are disposed into four linear arrays, each comprising four vibration motors. Preferably, the four linear arrays are disposed on the sleeve at separate circumferential locations such that, when worn by the user, the linear motor arrays extend along a medial, dorsal, ventral and lateral side of the arm.

[0010] In some embodiments, the vibration motors are spaced apart by a distance in the range of 25 mm to 40 mm. Preferably, the vibration motors are spaced apart by a distance in the range of 25 mm to 30 mm.

[0011] In some embodiments, the wearable tactile feedback device includes a wireless communication device for wirelessly connecting the microcontroller to other electronic devices.

[0012] In some embodiments, the wearable tactile feedback device includes one or more sensors to sense the position or motion of the body part. In some embodiments, the one or more sensors comprise one or more inertial measurement units. In some embodiments, the one or more sensors comprise one or more accelerometers. In some embodiments, the one or more sensors comprise an image sensor in electrical communication with the microcontroller to image and track the position of the body part and/or objects of interest in three dimensional space.

[0013] The one or more sensors are preferably adapted to generate a sensor signal indicative of a position or motion of the body part and wherein the microcontroller is responsive to the sensor signal for generating three dimensional direction cues.

[0014] In some embodiments, the wearable tactile feedback device is adapted to communicate with a virtual reality module and wherein the control signals are responsive to inputs from the virtual reality module such that the three dimensional direction cues provided to the user are dependent on a virtual reality environment.

[0015] In some embodiments, the wearable tactile feedback device is adapted to communicate with a wireless beacon attached to an object of interest. The wireless beacon may include a Bluetooth Beacon. The control signals are preferably responsive to inputs from the beacon such that the three dimensional direction cues provided to the user are dependent on the tracked location of the beacon.

[0016] In some embodiments, the microcontroller is programmed with one or more pre-set training modules that provide a predetermined sequence of three dimensional direction cues to the user.

[0017] In some embodiments, the three dimensional direction cue conveys to the user a direction of a target, a distance to the target and a direction of hand movement. The target may be a real target or a virtually perceived target.

BRIEF DESCRIPTION OF THE FIGURES

[0018] Example embodiments of the disclosure will now be described, by way of example only, with reference to the accompanying drawings in which:

Figure 1 is a schematic perspective view of a tactile feedback device according to a first embodiment being worn on a forearm of a user;

Figure 2 is a system diagram illustrating the primary components of the tactile feedback device of Figure 1 having integrated electronics components;

Figure 3 is a system diagram illustrating the primary components of a tactile feedback device according to a second embodiment having a separate electronics module;

Figure 4 is a schematic logic flow diagram illustrating logic flow between components of the tactile feedback device of Figures 1 and 2; and

Figure 5 is a system diagram illustrating the primary components of a tactile feedback device according to a third embodiment having sensors incorporated therein.

DESCRIPTION OF THE INVENTION

[0019] Embodiments of the present invention are described herein with reference to the application of training patients implanted with prosthetic vision devices to improve visuo-motor performance. However, it will be appreciated that embodiments of the present invention are also capable of being used to provide a range of arm guidance, motor training and visual adaptation in other scenarios such as:

• Low-vision training;

• Low vision aids;

• Assistive displays;

• Sensory substitution devices;

• Adaptation to altered visual environments (e.g. augmented reality, virtual reality);

• Sports training;

• On-the-job training and simulation such as for construction workers or surgeons;

• Tactile communication in low light/vision scenarios in industry and military applications; and

• Tactile directional cues and/or alerts in low-visibility scenarios for drivers and cyclists.

[0020] Furthermore, the embodiments described herein relate to a wearable tactile feedback device adapted to fit to a user’s forearm. However, it will be appreciated that, in other embodiments, the tactile feedback device may be adapted to fit to other body parts of a user such as hands, legs, torso, waist and forehead. In general, the tactile feedback device can be adapted for any three-dimensional guidance application.

System overview

[0021] Referring initially to Figures 1 and 2, there is illustrated a wearable tactile feedback device 100 for attachment to a body part of a user such as a user’s forearm 102. The device 100 comprises a body 104, in the form of a sleeve, adapted to fit to the forearm 102 of the user through a sleeving engagement. Device 100 also includes a plurality of vibration motors 106-121 attached to or embedded within the body 104 at spaced apart locations. As shown in Figure 2, device 100 includes a microcontroller 124 and related control circuitry 125 configured to send control signals to the plurality of vibration motors 106-121. The vibration motors are each responsive to respective ones of the control signals to vibrate at predetermined intensities/amplitudes, frequencies and durations to provide vibratory tactile feedback to the user in the form of three dimensional direction cues for movement of the body part. The significance of a three dimensional direction cue is described below.

[0022] Preferably, body 104 of device 100 comprises or is entirely formed of a compression fabric such as lycra, nylon or other known synthetic polymers having elastic qualities. This provides the advantage that the device is adaptable for wearing by different users. The vibration motors 106-121 are woven into the compression fabric or may be attached to an inner surface of the compression fabric by way of a suitable attachment means such as one or more clips, press studs, hooks or adhesive materials. In other embodiments, body 104 may be formed of other materials, such as non-compressible fabrics which may be wound into tight contact with the forearm 102 and secured in a similar manner to a bandage. In some embodiments, body 104 may comprise rigid or semi-rigid materials. However, it is preferable for body 104 to conform closely with the user’s forearm 102 so as to enhance the vibration interaction of vibration motors 106-121 with the user’s skin.

[0023] As illustrated in Figure 1, the vibration motors 106-121 are arranged in four linear arrays 126-129 disposed at circumferentially separated locations on the sleeve corresponding to positioning along a medial, dorsal, ventral and lateral side of the user’s forearm 102 respectively. When worn by the user, the arrays 126-129 extend longitudinally along the user’s forearm 102. This layout of vibration motors aids in providing three dimensional direction cues to the user. In other embodiments, different numbers and layouts of vibration motors may be used provided that a three-dimensional distribution of vibration motors is formed when the device is worn by the user. The choice of the number and position of motors in the design is based on the requirement to provide direction cues for forward and backward forearm (or other body part) movement.

[0024] The arrays 126-129 may be formed by weaving four vibration motors into a strip of fabric and subsequently weaving that strip into body 104. In order to prevent the diffusion of vibrations onto unintended stimulation sites via wires, the internal electrical connectivity may also be woven into the fabric using stainless steel conductive thread. This also helps to reduce the thickness of the sleeve and promotes maximum transfer of vibrations to the skin with minimal damping. To provide flexibility for fitting device 100 to all arm sizes, stretchable sleeves of multiple sizes may be designed with provision to interchangeably move the motor strips between sleeves or different positions within a sleeve using press stud buttons.

[0025] By way of example, vibration motors 106-121 may be ADA1201 model Vibrating Mini Motor Discs manufactured by Adafruit Industries. These devices have a diameter of about 10 mm, a thickness of about 2.7 mm, and operate at a voltage range of 2 V to 5 V and a frequency range of 73 Hz to 183 Hz. These types of vibration motors move an internal mass at high frequencies to generate a vibratory force that is parallel to the plane of the disc and along the user’s skin. Other types of vibration motors, such as cylindrical motors, may be used in place of or in conjunction with disc-type vibration motors. Cylindrical motors offer vibratory forces that are perpendicular to the user’s skin.

[0026] Within each array 126-129, vibration motors 106-121 are preferably spaced at a distance of 30 mm along forearm 102. Each array is preferably separated in the circumferential direction such that, in a stretched configuration during wearing by a user, the arrays are separated by a distance of about 25 mm. However, in other embodiments, particularly those employing different numbers of vibration motors, the motors may be disposed at different distances from each other, preferably in the range of 25 mm to 30 mm.

[0027] With reference to Figure 2, microcontroller 124 is in electrical communication with each of the 16 vibration motors via the control circuitry 125 to provide independent control signals to each motor. Although not illustrated for simplicity, it will be appreciated that individual electrical connections such as insulated wires connect each vibration motor with corresponding electrical pins of microcontroller 124.

[0028] In device 100, microcontroller 124 and control circuitry 125 are embedded within or attached to body 104 to form a unitary module. By way of example, microcontroller 124 may be a Pro Mini microcontroller developed by Arduino. Microcontroller 124 may be connected to a 16-channel Pulse Width Modulation (PWM) signal controller as part of control circuitry 125. Suitable PWM signal controllers include those available from Adafruit Industries, which use the Inter-Integrated Circuit (I²C) communication protocol. In other embodiments, microcontroller 124, the PWM signal controller and other related control circuitry may be collocated on a single integrated circuit for reduced spatial footprint. In other embodiments, microcontroller 124 may represent any type of small integrated circuit or processor capable of being embedded within or attached to body 104, or, as described below, electrically or wirelessly connected to vibration motors 106-121.
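By way of illustration only, a minimal Arduino-style sketch of the control path outlined in paragraph [0028] is set out below. It assumes the 16-channel Adafruit PWM controller is a PCA9685 accessed over I²C through the Adafruit_PWMServoDriver library; the part number, library, channel assignment and duty/duration values are assumptions made for the purpose of the sketch and are not taken from the present disclosure.

```cpp
// Minimal sketch (illustrative only): an Arduino microcontroller driving one
// vibration motor channel through a 16-channel I2C PWM controller, assumed here
// to be a PCA9685 accessed via the Adafruit_PWMServoDriver library.
#include <Wire.h>
#include <Adafruit_PWMServoDriver.h>

Adafruit_PWMServoDriver pwm = Adafruit_PWMServoDriver();   // default I2C address 0x40

const uint8_t MOTOR_CHANNEL = 0;   // PWM channel wired to one motor driver input (assumed)

void setup() {
  pwm.begin();
  pwm.setPWMFreq(200);             // PWM carrier frequency in Hz (illustrative value)
}

// Drive one motor at a given duty (0-4095 on the 12-bit controller) for a duration.
void pulseMotor(uint8_t channel, uint16_t duty, uint16_t duration_ms) {
  pwm.setPWM(channel, 0, duty);    // on at count 0, off at 'duty'
  delay(duration_ms);
  pwm.setPWM(channel, 0, 4096);    // special value: channel fully off
}

void loop() {
  pulseMotor(MOTOR_CHANNEL, 3072, 500);   // ~75 % duty cycle for 500 ms
  delay(1000);
}
```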

[0029] As shown in Figure 2, device 100 may also include a wireless transceiver 131 for wirelessly communicating with external devices such as computers 133, servers 135 and mobile devices 137. Device 100 also includes a battery module 139 for providing power to microcontroller 124, control circuitry 125, vibration motors 106-121 and other components of device 100. Battery module 139 may include terminals for one or more batteries as well as a voltage regulator.

[0030] Referring now to Figure 3, there is illustrated an alternate embodiment device 300 in which microcontroller 124 is contained in a separate electronics module 302 that may be carried by the user in a pocket or situated nearby. Features common to both devices 100 and 300 are designated with the same reference numerals for simplicity and clarity. Electronics module 302 is preferably contained within a protective casing that is electrically insulating.

[0031] The separate electronics module 302 is electrically or wirelessly connected to body 104 by a local communications device 304 communicable with a corresponding communications device 306. In the case of a wired connection, communications devices 304 and 306 may include electrical ports such as Universal Serial Bus (USB) ports, Ethernet ports, RS232 ports, Universal Asynchronous Receiver/Transmitter (UART) ports or other types of ports supporting wired communications protocols. In the case of a wireless connection between body 104 and electronics module 302, communications devices 304 and 306 may be wireless communication devices that communicate wirelessly with each other by a wireless communications protocol such as Bluetooth, Wi-Fi or NFC.

[0032] Although illustrated as being part of the electronics module 302, it will be appreciated that, in other embodiments, battery module 139 may be separate to electronics module 302. Moreover, electronics module 302 may incorporate a modular design to accommodate scalability and future upgrades to include other components such as sensors (described below).

[0033] Referring now to Figure 4, there is illustrated a logic flow diagram showing how device 100 is controlled by an external computer 133. Microcontroller 124 is programmed and controlled via a UART based serial communication interface with computer 133. Microcontroller 124 converts the received computer commands to low level hardware instructions to allow the setting of vibration parameters and selection of motor groups (e.g. arrays 126-129).
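A minimal sketch of the UART command path of paragraph [0033] is shown below for illustration. The ASCII command format "motor,duty,duration" is a hypothetical example; the actual command set exchanged between computer 133 and microcontroller 124 is not specified in the present disclosure.

```cpp
// Illustrative sketch of the UART command path: the microcontroller parses a
// hypothetical ASCII command "motor,duty,duration" (e.g. "5,3072,500") and
// converts it into a low-level hardware action on the PWM controller.
#include <Wire.h>
#include <Adafruit_PWMServoDriver.h>

Adafruit_PWMServoDriver pwm = Adafruit_PWMServoDriver();

void pulseMotor(uint8_t channel, uint16_t duty, uint16_t duration_ms) {
  pwm.setPWM(channel, 0, duty);
  delay(duration_ms);
  pwm.setPWM(channel, 0, 4096);    // special value: channel fully off
}

void setup() {
  Serial.begin(9600);              // UART link to the external computer
  pwm.begin();
}

void loop() {
  if (Serial.available()) {
    String line = Serial.readStringUntil('\n');
    int c1 = line.indexOf(',');
    int c2 = line.indexOf(',', c1 + 1);
    if (c1 > 0 && c2 > c1) {
      uint8_t  motor    = (uint8_t)line.substring(0, c1).toInt();
      uint16_t duty     = (uint16_t)line.substring(c1 + 1, c2).toInt();
      uint16_t duration = (uint16_t)line.substring(c2 + 1).toInt();
      pulseMotor(motor, duty, duration);   // convert the command into a hardware action
    }
  }
}
```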

[0034] Microcontroller 124 is interfaced with a 16-channel PWM signal controller 141. Low current PWM signals from controller 141 are amplified to higher current signals using four-channel H-bridge motor driver integrated circuits (ICs) 143-146 to provide control signals to vibration motors 106-121. By way of example, ICs 143-146 may be L293D type driver ICs developed by Texas Instruments. This IC is capable of driving the motors with currents of up to 600 mA per channel at voltages from 4.5 V to 36 V.

[0035] Although illustrated as delivering control signals to respective motor arrays 126-129, ICs 143-146 may be configured to deliver control signals to other groups of vibration motors 106-121 within or between arrays 126-129. In general, control circuitry 125 provides capability for control signals to be delivered from microcontroller 124 independently to each vibration motor 106-121.

[0036] The I²C interface with microcontroller 124 allows multiple PWM controllers, such as PWM controllers 148 and 150, each with 16 channels, to be added without additional modification, and therefore provides for a scalable design for future applications. It will be appreciated that a greater number of PWM controllers may be implemented, each connecting corresponding motor drivers and arrays of vibration motors.

[0037] In other embodiments, device 100 is capable of operating independently of external computer 133, with control instructions stored in memory on, or accessible by, microcontroller 124.

Sensor feedback

[0038] Referring now to Figure 5, there is illustrated a further embodiment wearable tactile feedback device 500, which incorporates sensors 502 configured to sense the position or motion of forearm 102. Features of device 500 that are common to devices 100 and 300 are designated with the same reference numerals for simplicity and clarity.

[0039] Incorporating sensors 502 into device 500 allows for full feedback of the user’s movement while responding to three-dimensional direction cues issued by device 500. Sensors 502 may comprise one or more inertial measurement units and/or accelerometers, which sense movement of the user’s forearm 102 and/or vibration motors 106-121 and, in response, generate sensor signals for processing by microcontroller 124.

[0040] In addition, sensors 502 may comprise an image sensor such as a depth camera mounted at a known fixed position relative to the device and in electrical communication with microcontroller 124 and/or an external computer such as computer 133 to image and track the position of the user’s forearm 102 and/or vibration motors 106-121 in three-dimensional space. The images captured by the image sensor may be processed locally by microcontroller 124 or by a connected computer such as computer 133, which executes an object tracking algorithm to track the position of forearm 102 and/or vibration motors 106-121. The object tracking algorithm may include object recognition, including contour detection and shape recognition, or a machine learning trained detector to recognise the shape of device 500, the pattern of vibration motors 106-121 or a pattern adhered to device 500 or forearm 102 (such as a sticker pattern).
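A minimal contour-based tracking sketch of the kind referred to in paragraph [0040] is shown below, using OpenCV in C++. It assumes the sleeve, or a high-contrast sticker pattern adhered to it, appears as the largest bright region in the camera image; a practical system may instead use a depth camera and a machine learning trained detector as described above, so the thresholding approach and the camera index are assumptions for illustration only.

```cpp
// Illustrative contour-based tracker: segment the brightest blob, take the
// largest contour as the tracked marker, and report its image centroid.
#include <opencv2/opencv.hpp>
#include <algorithm>
#include <iostream>
#include <vector>

int main() {
  cv::VideoCapture cap(0);                       // camera index is an assumption
  cv::Mat frame, gray, mask;
  while (cap.read(frame)) {
    cv::cvtColor(frame, gray, cv::COLOR_BGR2GRAY);
    cv::threshold(gray, mask, 200, 255, cv::THRESH_BINARY);   // crude segmentation

    std::vector<std::vector<cv::Point>> contours;
    cv::findContours(mask, contours, cv::RETR_EXTERNAL, cv::CHAIN_APPROX_SIMPLE);
    if (contours.empty()) continue;

    // Take the largest contour as the tracked marker and compute its centroid.
    auto largest = std::max_element(contours.begin(), contours.end(),
        [](const std::vector<cv::Point>& a, const std::vector<cv::Point>& b) {
          return cv::contourArea(a) < cv::contourArea(b);
        });
    cv::Moments m = cv::moments(*largest);
    if (m.m00 > 0) {
      double cx = m.m10 / m.m00, cy = m.m01 / m.m00;
      // (cx, cy) would be fused with depth data to obtain a 3D forearm position.
      std::cout << "marker at " << cx << ", " << cy << std::endl;
    }
  }
  return 0;
}
```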

[0041] The one or more sensors 502 are adapted to generate sensor signals indicative of a position or motion of the forearm 102 and/or vibration motors 106-121. The sensor signals may include an absolute position or a relative position of the forearm 102 and/or vibration motors 106-121.

[0042] In some embodiments, microcontroller 124 is responsive to the sensor signals for generating the three-dimensional direction cues. As such, the sensor signals may provide feedback to augment the direction cues provided to the user. For example, if the user is partway through a cued arm movement, the sensor feedback to microcontroller 124 may prompt microcontroller 124 to issue revised control signals to vibration motors 106-121 to produce revised direction cues based on the user’s movement. This feedback can act to correct the user’s movement in real-time or near real-time.
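The feedback behaviour of paragraph [0042] may be sketched conceptually as a simple closed loop, as below. The helper functions readForearmPitch(), cueUp(), cueDown() and stopCues() are hypothetical placeholders for the sensor driver and the vibro-tactile cue routines, and the target angle and tolerance are illustrative values only.

```cpp
// Conceptual closed-loop sketch (illustrative only) of revising direction cues
// based on sensor feedback. The helper functions are hypothetical stubs standing
// in for the IMU/accelerometer driver and the motor cue routines.
float readForearmPitch() { return 0.0f; }   // stub: forearm pitch angle from sensors 502
void  cueUp()    { /* actuate dorsal array to cue upward movement */ }
void  cueDown()  { /* actuate ventral array to cue downward movement */ }
void  stopCues() { /* switch all motor channels off */ }

const float TARGET_PITCH_DEG = 30.0f;   // desired forearm orientation for the exercise
const float TOLERANCE_DEG    = 5.0f;

void setup() {}

void loop() {
  float error = TARGET_PITCH_DEG - readForearmPitch();
  if (error > TOLERANCE_DEG) {
    cueUp();                     // revised cue: continue moving up
  } else if (error < -TOLERANCE_DEG) {
    cueDown();                   // revised cue: overshoot, move back down
  } else {
    stopCues();                  // within tolerance: no further correction required
  }
  delay(50);                     // re-evaluate at ~20 Hz for near real-time correction
}
```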

[0043] In further embodiments, device 500 is adapted to communicate with a virtual reality (VR) module (not shown) having a display to be viewed by the user and wherein the control signals are responsive to inputs from the VR module such that the three-dimensional direction cues provided to the user are dependent on a virtual reality environment. The VR module may be controlled or monitored by microcontroller 124 and/or another computer such as computer 133.

[0044] In some embodiments, microcontroller 124 is programmed with one or more pre-set training modules that provide a predetermined sequence of three-dimensional direction cues to the user. These pre-set training modules can be used independently by users to practice movements at home in the absence of a trainer. Microcontroller 124 may also be programmed to execute one or more calibration routines for calibrating the vibration motors 106-121 to suit a particular user’s sensitivity. During these calibration routines, feedback from the user may be provided directly by way of a user interface (e.g. on computer 133) or indirectly via a clinician or trainer assisting the user. Feedback from the calibration routine may act to adjust the control signals to vibration motors 106-121 such that they are actuated with different intensities, frequencies or duty cycles.

[0045] Although not illustrated, microcontroller 124 includes memory such as Random-Access Memory (RAM), Read Only Memory (ROM) or Electrically Erasable Programmable Read Only Memory (EEPROM) to store data indicative of movement of forearm 102 as sensed by sensors 502. This stored data may be transmitted to other computers over the internet (e.g. via wireless transceiver 131) for monitoring by clinicians and the like.

[0046] In some embodiments, devices 100, 300 and 500 include a visual and/or audio interface to allow the user to receive visual and/or audio input and feedback such as instructions or context for the movement exercises to be performed.

Three-dimensional direction cues

[0047] The configuration of vibration motors in a three-dimensional array around forearm 102 allows a direction cue to convey (i) direction of a virtually perceived target, (ii) distance to the target and (iii) direction of hand movement. This is achieved by controlling the intensities/amplitudes, frequencies and durations of individual vibration motors in a predefined manner so as to produce a pattern of vibration signals to stimulate the forearm 102 to move in a desired direction/orientation.

[0048] Encoding of vibrotactile messages in the control signals to vibration motors 106-121 involves the control of four stimulus parameters: (i) location (or site of stimulation), (ii) intensity (or amplitude of vibration), (iii) frequency and (iv) duration (or timing). However, stimulus perception is dependent on several factors such as types and density of mechanoreceptors, innervation density of mechanoreceptors, and the possible absorption of vibrations by the underlying soft tissue. In view of the constantly improving profile of the electronic components and their probable limitations, it is important to test the capability of the electronics design to generate and deliver perceivable stimuli.
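The four stimulus parameters listed in paragraph [0048] map naturally onto a small data structure, sketched below. The structure, its field names and the example values are illustrative assumptions rather than part of the present disclosure.

```cpp
// Illustrative representation of a single vibrotactile stimulus in terms of the
// four parameters above: location, intensity, frequency and duration.
#include <stdint.h>

struct TactileStimulus {
  uint8_t  motor;         // location: index of the stimulation site (0-15 for 16 motors)
  uint16_t duty;          // intensity: PWM duty cycle, 0-4095 on a 12-bit controller
  uint16_t frequency_hz;  // nominal vibration frequency (e.g. 73-183 Hz for disc motors)
  uint16_t duration_ms;   // duration / timing of the stimulus
};

// A direction cue can then be expressed as an ordered sequence of such stimuli,
// here a sequential sweep along one four-motor array (example values only).
const TactileStimulus forwardCue[4] = {
  {0, 3072, 150, 200},
  {1, 3072, 150, 200},
  {2, 3072, 150, 200},
  {3, 3072, 150, 200},
};
```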

[0049] As mentioned above, embodiments of the present invention involve miniature disc type vibration motors which are controlled by a pulse width modulation (PWM) signal. In PWM control, the average power delivered to a motor is dependent upon the duty cycle, which proportionally affects the vibration frequency and amplitude. This limits the design in terms of generating a stimulus of constant frequency and amplitude, a limitation commonly experienced in recent designs using disc type vibration motors and PWM control.

[0050] In view of these design related boundary conditions, the control signals generated in devices 100, 300 and 500 are designed to deliver stimulus parameters that can be used to encode effective movement cues on the motor array arrangement. In particular, the direction cues can incorporate a perceived location of a virtually perceived target, an effect of location based on perceived intensity of vibration and perceived direction of motion based on discrimination of intensity levels of vibration at different motor locations at different times.

[0051] Simple exemplary forward/backward direction cues include actuation of a single array of vibration motors in a sequential manner to indicate a forward or backward direction. Simple exemplary translation cues include actuating a single vibration motor array for a predetermined time period, such as 1 second, to indicate a translational movement along a dorsal, medial, lateral or ventral direction. Simple exemplary rotation cues include sequential actuation of arrays of vibration motors in a desired direction. More complex cues involving more complex movement can combine actuation of vibration motors along an array or between arrays at different intensities and frequencies. In some embodiments, the frequency applied to a vibration motor is dependent on its location on forearm 102 as different regions of forearm 102 have different frequency sensitivity. Typical durations of the vibration motors in the movement cues are in the range of 100 ms to several seconds, depending on the complexity of the cue.
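The simple cue types described in paragraph [0051] may be sketched as follows, assuming the PCA9685 PWM controller of the earlier example. Channel assignments, duty cycles and timings are illustrative values, not values taken from the present disclosure.

```cpp
// Illustrative sketches of the simple cue types: a sequential forward/backward
// sweep along one array, a whole-array translation burst, and an array-to-array
// rotation sweep. Channel maps and timings are assumed example values.
#include <Wire.h>
#include <Adafruit_PWMServoDriver.h>

Adafruit_PWMServoDriver pwm = Adafruit_PWMServoDriver();
const uint8_t DORSAL[4]  = {0, 1, 2, 3},   MEDIAL[4]  = {4, 5, 6, 7},
              VENTRAL[4] = {8, 9, 10, 11}, LATERAL[4] = {12, 13, 14, 15};

// Forward/backward cue: walk along one array, e.g. wrist-to-elbow, one motor at a time.
void sequentialCue(const uint8_t array[4], uint16_t duty, uint16_t stepMs) {
  for (int i = 0; i < 4; i++) {
    pwm.setPWM(array[i], 0, duty);
    delay(stepMs);
    pwm.setPWM(array[i], 0, 4096);   // channel fully off
  }
}

// Translation cue: actuate a whole array simultaneously for ~1 second.
void translationCue(const uint8_t array[4], uint16_t duty) {
  for (int i = 0; i < 4; i++) pwm.setPWM(array[i], 0, duty);
  delay(1000);
  for (int i = 0; i < 4; i++) pwm.setPWM(array[i], 0, 4096);
}

// Rotation cue: sweep from array to array in the desired circumferential direction.
void rotationCue(uint16_t duty, uint16_t stepMs) {
  const uint8_t* order[4] = {DORSAL, LATERAL, VENTRAL, MEDIAL};
  for (int a = 0; a < 4; a++) {
    for (int i = 0; i < 4; i++) pwm.setPWM(order[a][i], 0, duty);
    delay(stepMs);
    for (int i = 0; i < 4; i++) pwm.setPWM(order[a][i], 0, 4096);
  }
}

void setup() { pwm.begin(); }
void loop()  { sequentialCue(DORSAL, 3072, 200); delay(2000); }
```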

[0052] Combinations of vibration intensities may be used to convey distance, alerts, urgency and motion errors. For example, consecutive stimuli of progressively increasing intensities may be presented to indicate an approaching target, and progressively decreasing intensities for receding targets. In PWM control, a change in duty cycle will produce a proportional change in the vibration frequency and amplitude of a motor and, in turn, its intensity. Additionally, a user’s ability to perceive different intensities contained in a pattern is also likely to be influenced by the sensory characteristics of the stimulation site. In the present invention, both these factors can be taken into consideration when designing tactile patterns involving more than one intensity, by controlling the frequency of vibration in combination with the amplitude (by controlling duty cycle). By way of example, intensity discrimination may be tested on a user to establish the ability to distinguish consecutively changing intensities in different sets of patterns generated using PWM signals made up of different intensities. This process allows for calibration of the device to suit a particular user.
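One possible distance-to-intensity mapping consistent with paragraph [0052] is sketched below as a helper function. The distance range, the duty cycle limits and the linear mapping are assumed example values.

```cpp
// Illustrative mapping of remaining target distance to PWM duty cycle, so that
// consecutive stimuli grow in intensity as the target is approached and fall as
// it recedes. Range and limits are arbitrary example values.
#include <stdint.h>

const float    MAX_DISTANCE_M = 0.60f;   // beyond this, use the weakest perceivable stimulus
const uint16_t MIN_DUTY = 1024;          // ~25 % of a 12-bit PWM range
const uint16_t MAX_DUTY = 4095;          // 100 %

uint16_t dutyForDistance(float distance_m) {
  if (distance_m < 0.0f) distance_m = 0.0f;
  if (distance_m > MAX_DISTANCE_M) distance_m = MAX_DISTANCE_M;
  float closeness = 1.0f - (distance_m / MAX_DISTANCE_M);     // 0 far ... 1 at the target
  return MIN_DUTY + (uint16_t)(closeness * (MAX_DUTY - MIN_DUTY));
}
```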

[0053] In a particular intensity discrimination test for calibration, two sets of vibro-tactile patterns can be created consisting of (i) three (duty cycle = 50%, 75%, 100%) and (ii) four (duty cycle = 25%, 50%, 75%, 100%) steps of consecutively changing intensities with a delay of 500 ms between intensity transitions. Each block consists of eight trials, with each pattern being presented once in random order to one of the four sides of the forearm. Each pattern may be played five times in a trial with a delay of 1000 ms between subsequent presentations. The user is instructed to attend to the number of perceivable changes in intensity. The user’s answer as to the number of intensities perceived (1, 2, 3, 4, more than 4, no change) should be recorded at the end of each trial.
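A sketch of how the two calibration pattern sets of paragraph [0053] could be generated is shown below, again assuming the PCA9685 PWM controller of the earlier examples. Randomisation of the stimulated forearm side across the eight trials, and the recording of user responses, are omitted for brevity.

```cpp
// Illustrative generation of the calibration patterns: three-step (50/75/100 % duty)
// and four-step (25/50/75/100 % duty) intensity sequences with 500 ms between
// transitions, each pattern played five times with 1000 ms between presentations.
#include <Wire.h>
#include <Adafruit_PWMServoDriver.h>

Adafruit_PWMServoDriver pwm = Adafruit_PWMServoDriver();

const uint16_t THREE_STEP[3] = {2048, 3072, 4095};        // 50 %, 75 %, 100 % of 4095
const uint16_t FOUR_STEP[4]  = {1024, 2048, 3072, 4095};  // 25 %, 50 %, 75 %, 100 %

void playPattern(uint8_t channel, const uint16_t* steps, uint8_t nSteps) {
  for (uint8_t i = 0; i < nSteps; i++) {
    pwm.setPWM(channel, 0, steps[i]);   // hold each intensity level
    delay(500);                         // 500 ms between intensity transitions
  }
  pwm.setPWM(channel, 0, 4096);         // channel fully off
}

void runTrial(uint8_t channel, const uint16_t* steps, uint8_t nSteps) {
  for (uint8_t rep = 0; rep < 5; rep++) {     // each pattern played five times per trial
    playPattern(channel, steps, nSteps);
    delay(1000);                              // 1000 ms between presentations
  }
}

void setup() { pwm.begin(); }
void loop() {
  runTrial(0, THREE_STEP, 3);   // the forearm side (channel) would be randomised across trials
  delay(5000);
  runTrial(0, FOUR_STEP, 4);
  delay(5000);
}
```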

[0054] Vibro-tactile direction cue patterns to convey the direction of intended motion are primarily produced by sequential stimulation of a set of motors to generate illusory sensations of apparent motion on the skin. The design of tactile patterns for indicating direction of motion is relatively complex and requires the electronics to support synchronised control of multiple device components and parameters. In some embodiments, devices 100, 300 and 500 can be configured to convey direction using veridical and amplitude modulation methods of direction encoding.

[0055] In a veridical presentation mode, a set of motors is sequentially activated for a defined duration to create a pattern of moving stimulus on the skin. Amplitude modulation is based on the funnelling effect and involves modulation of amplitude between successive stimulation sites such that one decreases while the other increases. While both methods are used to convey direction, encoding for amplitude modulation is relatively complex, involving simultaneous activation of multiple motors and control of parameters such as duration of stimulus (DOS) (the time for which a motor is active) and inter-stimulus onset interval (ISOI) (the time between activation of two successive motors). In comparison to the veridical mode, amplitude modulation has been shown to produce a smoother and more continuous perception of direction.
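An amplitude modulation (funnelling) sweep along one motor array, of the kind described in paragraph [0055], may be sketched as below. The linear cross-fade, the ISOI value and the channel assignments are illustrative assumptions; the PCA9685 PWM controller of the earlier examples is assumed.

```cpp
// Illustrative amplitude-modulated direction cue: the duty cycle of the active
// motor ramps down while the next motor ramps up, producing a funnelling
// illusion of continuous motion along the array.
#include <Wire.h>
#include <Adafruit_PWMServoDriver.h>

Adafruit_PWMServoDriver pwm = Adafruit_PWMServoDriver();
const uint8_t ARRAY[4] = {0, 1, 2, 3};   // one linear array along the forearm (assumed)

const uint16_t PEAK_DUTY = 4095;
const uint16_t ISOI_MS   = 300;          // inter-stimulus onset interval (example value)
const uint8_t  STEPS     = 20;           // cross-fade resolution

void crossFade(uint8_t fromMotor, uint8_t toMotor) {
  for (uint8_t s = 1; s <= STEPS; s++) {
    uint16_t rising  = (uint32_t)PEAK_DUTY * s / STEPS;
    uint16_t falling = PEAK_DUTY - rising;
    pwm.setPWM(fromMotor, 0, falling);
    pwm.setPWM(toMotor, 0, rising);
    delay(ISOI_MS / STEPS);
  }
  pwm.setPWM(fromMotor, 0, 4096);        // fully off once the hand-over is complete
}

void amplitudeModulatedCue() {
  pwm.setPWM(ARRAY[0], 0, PEAK_DUTY);
  for (int i = 0; i < 3; i++) crossFade(ARRAY[i], ARRAY[i + 1]);
  delay(ISOI_MS);
  pwm.setPWM(ARRAY[3], 0, 4096);
}

void setup() { pwm.begin(); }
void loop()  { amplitudeModulatedCue(); delay(3000); }
```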

Conclusions

[0056] It will be appreciated that embodiments of the invention described above provide a tactile feedback device for delivering vibro-tactile feedback to a user to facilitate technology assisted learning of reach tasks. Similar to guiding movements from an expert trainer, the tactile feedback device delivers complementary vibro-tactile cues to guide the user’s arm towards a visually or virtually perceived target to facilitate learning. The device can be used in a training setting, or users can use it independently at home, or with remote guidance from a rehab expert.

[0057] The device provides opportunities to explore new models for delivering training. Beyond prosthetic vision rehabilitation, the invention has the potential to be applied in a range of arm guidance and motor training scenarios such as low vision training or during adaptation to altered visual environments (e.g. augmented reality, VR).

[0058] The tactile feedback device can be pre-programmed and used by patients in the comfort of their home environment to train independently. The device can be further enhanced by interfacing with modern ICT devices so that remote rehab sessions can be delivered and monitored by experts. Additionally, the option to interface with ICT provides access to computer generated artificial interactive environments such as VR. Using VR, patients can be trained on a wider range of tasks, in safe environments, with lower infrastructure requirements. While these novel training models can increase opportunities for practice and improve functional outcomes for prosthetic vision patients and people with low and impaired vision, they will indirectly address some of the commonly encountered issues in low-vision rehabilitation such as:

• travel of patients to training facilities;

• device use; and

• adherence to training regimes, etc.

[0059] By means of providing complementary vibro-tactile information that guides the arm towards an object, the invention facilitates learning that helps users to make associations between a visually seen object and the movement that needs to be performed to reach it accurately. The tactile feedback device can be programmed with pre-set training modules that can be used independently by users to practice movements at home in the absence of a trainer.

[0060] Preliminary human testing of the present invention, addressing key perceptual questions about the device's intended use, has shown high recognition rates for interpreting the direction of the target and the direction in which the movement is to be performed.

[0061] It is anticipated that the tactile feedback device will help to evolve novel training delivery models including in-home training and telerehabilitation that are aimed at increasing functional outcomes for patients implanted with prosthetic vision devices.

Interpretation

[0062] Unless specifically stated otherwise, as apparent from the following discussions, it is appreciated that throughout the specification discussions utilizing terms such as "processing", "computing", "calculating", "determining", "analysing" or the like, refer to the action and/or processes of a computer or computing system, or similar electronic computing device, that manipulate and/or transform data represented as physical, such as electronic, quantities into other data similarly represented as physical quantities.

[0063] Reference throughout this specification to “one embodiment”, “some embodiments” or “an embodiment” means that a particular feature, structure or characteristic described in connection with the embodiment is included in at least one embodiment of the present disclosure. Thus, appearances of the phrases “in one embodiment”, “in some embodiments” or “in an embodiment” in various places throughout this specification are not necessarily all referring to the same embodiment. Furthermore, the particular features, structures or characteristics may be combined in any suitable manner, as would be apparent to one of ordinary skill in the art from this disclosure, in one or more embodiments.

[0064] As used herein, unless otherwise specified, the use of the ordinal adjectives "first", "second", "third", etc., to describe a common object merely indicates that different instances of like objects are being referred to, and is not intended to imply that the objects so described must be in a given sequence, either temporally, spatially, in ranking, or in any other manner.

[0065] In the claims below and the description herein, any one of the terms comprising, comprised of or which comprises is an open term that means including at least the elements/features that follow, but not excluding others. Thus, the term comprising, when used in the claims, should not be interpreted as being limitative to the means or elements or steps listed thereafter. For example, the scope of the expression a device comprising A and B should not be limited to devices consisting only of elements A and B. Any one of the terms including or which includes or that includes as used herein is also an open term that also means including at least the elements/features that follow the term, but not excluding others. Thus, including is synonymous with and means comprising.

[0066] It should be appreciated that in the above description of exemplary embodiments of the disclosure, various features of the disclosure are sometimes grouped together in a single embodiment, Fig., or description thereof for the purpose of streamlining the disclosure and aiding in the understanding of one or more of the various inventive aspects. This method of disclosure, however, is not to be interpreted as reflecting an intention that the claims require more features than are expressly recited in each claim. Rather, as the following claims reflect, inventive aspects lie in less than all features of a single foregoing disclosed embodiment. Thus, the claims following the Detailed Description are hereby expressly incorporated into this Detailed Description, with each claim standing on its own as a separate embodiment of this disclosure.

[0067] Furthermore, while some embodiments described herein include some but not other features included in other embodiments, combinations of features of different embodiments are meant to be within the scope of the disclosure, and form different embodiments, as would be understood by those skilled in the art. For example, in the following claims, any of the claimed embodiments can be used in any combination.

[0068] In the description provided herein, numerous specific details are set forth. However, it is understood that embodiments of the disclosure may be practiced without these specific details. In other instances, well-known methods, structures and techniques have not been shown in detail in order not to obscure an understanding of this description.

[0069] Similarly, it is to be noticed that the term coupled, when used in the claims, should not be interpreted as being limited to direct connections only. The terms "coupled" and "connected," along with their derivatives, may be used. It should be understood that these terms are not intended as synonyms for each other. Thus, the scope of the expression a device A coupled to a device B should not be limited to devices or systems wherein an output of device A is directly connected to an input of device B. It means that there exists a path between an output of A and an input of B which may be a path including other devices or means. "Coupled" may mean that two or more elements are either in direct physical, electrical or optical contact, or that two or more elements are not in direct contact with each other but yet still co-operate or interact with each other.

[0070] Embodiments described herein are intended to cover any adaptations or variations of the present invention. Although the present invention has been described and explained in terms of particular exemplary embodiments, one skilled in the art will realize that additional embodiments can be readily envisioned that are within the scope of the present invention.