

Title:
SYSTEM AND METHOD FOR GUIDING USER
Document Type and Number:
WIPO Patent Application WO/2022/243882
Kind Code:
A1
Abstract:
Disclosed is a system (100) and a method for guiding a user (800, 902) through an environment. The system comprises a sensing arrangement (102) and a wearable arrangement (104). The sensing arrangement comprises a time-of-flight sensor (106), a detection module (108). The time-of-flight sensor is configured to scan the environment in near real-time to generate a scanned view (800B, 800D, 800F, 800H, 900). The detection module is configured to detect one or more obstacles (804, 806, 808, 810, 812, 820, 822, 824, 904, 906, 908, 910, 916) in the scanned view. The wearable arrangement is adapted to be worn on a chest (802) by the user. The wearable arrangement comprises an arrangement of haptic actuators (110) configured to provide a feedback to the user based on the detected one or more obstacles.

Inventors:
MWENDA BRIAN MWITI (KE)
Application Number:
PCT/IB2022/054602
Publication Date:
November 24, 2022
Filing Date:
May 18, 2022
Assignee:
HOPE TECH PLUS LTD (KE)
International Classes:
A61H3/06
Foreign References:
US20180303702A1 (2018-10-25)
DE102008039153A1 (2010-02-25)
Claims:
CLAIMS

1. A system (100) for guiding a user through an environment, the system comprising:
a sensing arrangement (102) comprising:
a time-of-flight sensor (106) configured to scan the environment in near real-time to generate a scanned view (800B, 800D, 800F, 800H, 900);
a detection module (108) configured to detect one or more obstacles (804, 806, 808, 810, 812, 820, 822, 824, 904, 906, 908, 910, 916) in the scanned view; and
a wearable arrangement (104) adapted to be worn on a chest (802) by the user (800, 902), the wearable arrangement comprising an arrangement of haptic actuators (110) configured to provide a feedback to the user based on the detected one or more obstacles.

2. A system (100) according to claim 1, wherein the arrangement of the haptic actuators (110) is in the form of an array pattern in the wearable arrangement (104).

3. A system (100) according to claim 2, wherein the sensing arrangement (102) is located in a centre of the array pattern in the wearable arrangement (104).

4. A system (100) according to any one of claims 1-3, wherein the sensing arrangement (102) is configured to divide the scanned view (800B, 800D, 800F, 800H, 900) in near real-time into a grid pattern (814) corresponding to the array pattern of the arrangement of the haptic actuators (110), with each of the haptic actuators (202, 818A, 818B, 818C, 818D, 818E, 818F, 818G, 818H, 818I, 818J, 818K, 818L, 818M, 818N) in the arrangement of haptic actuators being associated with a corresponding grid (816A, 816B, 816C, 816D, 816E, 816F, 816G, 816H, 816I, 816J, 816K, 816L, 816M, 816N) in the grid pattern of the scanned view.

5. A system (100) according to claim 4, wherein each of the haptic actuators (202, 818A, 818B, 818C, 818D, 818E, 818F, 818G, 818H, 818I, 818J, 818K, 818L, 818M, 818N) in the arrangement of haptic actuators (110) is configured to be triggered to provide the feedback to the user (800, 902) when an obstacle of the one or more obstacles (804, 806, 808, 810, 812, 820, 822, 824, 904, 906, 908, 910, 916) is detected in the corresponding grid (816A, 816B, 816C, 816D, 816E, 816F, 816G, 816H, 816I, 816J, 816K, 816L, 816M, 816N) in the grid pattern (814) of the scanned view (800B, 800D, 800F, 800H, 900).

6. A system (100) according to claim 5, wherein the detection module (108) is further configured to determine a distance of each of the one or more detected obstacles (804, 806, 808, 810, 812, 820, 822, 824, 904, 906, 908, 910, 916) in the scanned view (800B, 800D, 800F, 800H, 900), wherein each of the haptic actuators (202, 818A, 818B, 818C, 818D, 818E, 818F, 818G, 818H, 818I, 818J, 818K, 818L, 818M, 818N) in the arrangement of haptic actuators (110) is configured to vary an intensity of the provided feedback to the user (800, 902) based on the distance of the obstacle of the one or more obstacles detected in the corresponding grid (816A, 816B, 816C, 816D, 816E, 816F, 816G, 816H, 816I, 816J, 816K, 816L, 816M, 816N) in the grid pattern (814) of the scanned view.

7. A system (100) according to any one of the preceding claims, further comprising at least one bracelet device (600) adapted to be worn on one of designated wrists by the user (800, 902), wherein the at least one bracelet device comprises a secondary actuator (612) configured to provide a secondary feedback to the user.

8. A system (100) according to claim 7 further comprising a navigation module configured to define a navigation path (826) for the user (800, 902), wherein the secondary feedback by the at least one bracelet device (600) is generated based on the defined navigation path.

9. A system (100) according to any one of the preceding claims, wherein the time-of-flight sensor (106) is an infrared camera (514).

10. A system (100) according to any one of the preceding claims, wherein the wearable arrangement (104) comprises an adjustable chest strap (204) adapted to be worn around the chest (802) by the user (800, 902).

11. A method for guiding a user (800, 902) through an environment, the method comprising:
scanning, using a time-of-flight sensor (106), the environment in near real-time to generate a scanned view (800B, 800D, 800F, 800H, 900);
detecting one or more obstacles (804, 806, 808, 810, 812, 820, 822, 824, 904, 906, 908, 910, 916) in the scanned view; and
providing a feedback to the user, using a wearable arrangement (104) adapted to be worn on a chest (802) by the user, by an arrangement of haptic actuators (110) therein, based on the detected one or more obstacles.

12. A method according to claim 11, further comprising:
dividing the scanned view (800B, 800D, 800F, 800H, 900) in near real-time into a grid pattern (814) corresponding to an array pattern of the arrangement of the haptic actuators in the wearable arrangement; and
associating each of the haptic actuators (202, 818A, 818B, 818C, 818D, 818E, 818F, 818G, 818H, 818I, 818J, 818K, 818L, 818M, 818N) in the arrangement of haptic actuators (110) with a corresponding grid (816A, 816B, 816C, 816D, 816E, 816F, 816G, 816H, 816I, 816J, 816K, 816L, 816M, 816N) in the grid pattern of the scanned view.

13. A method according to claim 12, further comprising triggering each of the haptic actuators (202, 818A, 818B, 818C, 818D, 818E, 818F, 818G, 818H, 818I, 818J, 818K, 818L, 818M, 818N) in the arrangement of haptic actuators (110) to provide a feedback to the user (800, 902) when an obstacle of the one or more obstacles (804, 806, 808, 810, 812, 820, 822, 824, 904, 906, 908, 910, 916) is detected in the corresponding grid (816A, 816B, 816C, 816D, 816E, 816F, 816G, 816H, 816I, 816J, 816K, 816L, 816M, 816N) in the grid pattern (814) of the scanned view (800B, 800D, 800F, 800H, 900).

14. A method according to claim 13, further comprising:
determining a distance of each of the one or more detected obstacles (804, 806, 808, 810, 812, 820, 822, 824, 904, 906, 908, 910, 916) in the scanned view (800B, 800D, 800F, 800H, 900); and
configuring each of the haptic actuators (202, 818A, 818B, 818C, 818D, 818E, 818F, 818G, 818H, 818I, 818J, 818K, 818L, 818M, 818N) in the arrangement of haptic actuators (110) to vary an intensity of the provided feedback to the user (800, 902) based on the distance of the obstacle of the one or more obstacles detected in the corresponding grid (816A, 816B, 816C, 816D, 816E, 816F, 816G, 816H, 816I, 816J, 816K, 816L, 816M, 816N) in the grid pattern (814) of the scanned view.

15. A method according to any one of claims 11-14, further comprising:
providing at least one bracelet device (600) adapted to be worn on one of designated wrists by the user (800, 902), wherein the at least one bracelet device comprises a secondary actuator (612) configured to provide a secondary feedback to the user;
defining a navigation path (826) for the user; and
generating the secondary feedback to be provided by the at least one bracelet device based on the defined navigation path.

Description:
SYSTEM AND METHOD FOR GUIDING USER

TECHNICAL FIELD

The present disclosure relates to aid devices for visually impaired users; and more specifically, to a system and a method for guiding such a visually impaired user through an environment.

BACKGROUND

There are millions of blind and visually impaired persons in the world. Such persons are limited in movement and may need assistance to perform basic tasks, such as navigating through an environment, crossing roads and the like. This reduces the participation of visually impaired people in society, both socially and economically.

Typically, visually impaired and blind people rely on the availability of caregivers and canes to move around. A wide variety of objects and hazards are found in the outside environment. The user may struggle to discern such objects using the conventional white cane. The white cane, when used as a main aid for mobility, comes with many limitations, such as detection of obstacles only up to a limited height; that is, the white cane may not be useful to detect obstacles above knee level. For example, if there is a table in the path of the user, the cane may pass between the table legs, under the tabletop. The white cane also has other shortcomings, such as detection of obstacles within a limited distance range, poor manoeuvrability, wear of the white cane with use and a stigma associated with white cane use. Since the use of white canes limits obstacle recognition feedback to low-level hazards, they may not help visually impaired and blind people efficiently. In order to mitigate such issues, smart canes have emerged as improvements to the traditional white cane. However, they too may not be able to detect objects above knee level. Apart from white canes, existing mobility aids include guide dogs. Guide dogs may be expensive and may not always be accessible, with less than one percent of visually impaired people owning guide dogs.

Therefore, in light of the foregoing discussion, there exists a need to overcome the aforementioned drawbacks associated with conventional mobility aids for blind and visually impaired people.

SUMMARY

An object of the present disclosure is to provide a system and a method for guiding a user through an environment. Another object of the present disclosure is to provide a solution that overcomes at least partially the problems encountered in the prior art.

In one aspect, an embodiment of the present disclosure provides a system for guiding a user through an environment, the system comprising: a sensing arrangement comprising: a time-of-flight sensor configured to scan the environment in near real-time to generate a scanned view; a detection module configured to detect one or more obstacles in the scanned view; and a wearable arrangement adapted to be worn on a chest by the user, the wearable arrangement comprising an arrangement of haptic actuators configured to provide a feedback to the user based on the detected one or more obstacles.

In one aspect, an embodiment of the present disclosure provides a method for guiding a user through an environment, the method comprising: scanning, using a time-of-flight sensor, the environment in near real-time to generate a scanned view; detecting one or more obstacles in the scanned view; and providing a feedback to the user, using a wearable arrangement adapted to be worn on a chest by the user, by an arrangement of haptic actuators therein, based on the detected one or more obstacles.

Embodiments of the present disclosure substantially eliminate or at least partially address the aforementioned problems in the prior art, and provide proper guidance to a visually impaired user through the environment.

Additional aspects, advantages, features and objects of the present disclosure will be made apparent from the drawings and the detailed description of the illustrative embodiments construed in conjunction with the appended claims that follow. It will be appreciated that features of the present disclosure are susceptible to being combined in various combinations without departing from the scope of the present disclosure as defined by the appended claims.

BRIEF DESCRIPTION OF DRAWINGS

Embodiments of the present disclosure will now be described, by way of example only, with reference to the following diagrams wherein:

FIG. 1 is a schematic illustration of a system for guiding a user through an environment, in accordance with an embodiment of the present disclosure;

FIG. 2 is a perspective view illustration of a system for guiding a user through an environment, in accordance with an embodiment of the present disclosure;

FIG. 3 is a partial exploded view illustration of the present system, in accordance with an embodiment of the present disclosure;

FIG. 4A is a perspective front view illustration of a sensing arrangement of the system of FIG. 2, in accordance with an embodiment of the present disclosure;

FIG. 4B is a perspective rear view illustration of the sensing arrangement, in accordance with an embodiment of the present disclosure;

FIG. 5 is an exploded view illustration of the sensing arrangement, in accordance with an embodiment of the present disclosure;

FIG. 6A is a perspective view illustration of a bracelet device of the present system, in accordance with an embodiment of the present disclosure;

FIG. 6B is a wireframe illustration of the bracelet device of FIG. 6A, in accordance with an embodiment of the present disclosure;

FIG. 7 is a perspective view illustration of a back cover for the sensing arrangement, in accordance with an embodiment of the present disclosure;

FIG. 8A is a depiction of the present system being worn by a user on his/her chest, in accordance with an embodiment of the present disclosure;

FIG. 8B is a depiction of a first exemplary scanned view as generated by the present system, in accordance with an embodiment of the present disclosure;

FIG. 8C is a depiction of the first exemplary scanned view divided into a grid pattern as generated by the present system, in accordance with an embodiment of the present disclosure;

FIG. 8D is a depiction of a second exemplary scanned view divided into the grid pattern as generated by the present system, in accordance with an embodiment of the present disclosure;

FIG. 8E is a depiction of the present system being worn by the user and providing a feedback according to detected one or more obstacles in the second exemplary scanned view of FIG. 8D, in accordance with an embodiment of the present disclosure;

FIG. 8F is a depiction of a third exemplary scanned view divided into the grid pattern as generated by the present system, in accordance with an embodiment of the present disclosure;

FIG. 8G is a depiction of the present system being worn by the user and providing a feedback according to the detected one or more obstacles in the third exemplary scanned view of FIG. 8F, in accordance with an embodiment of the present disclosure;

FIG. 8H is a depiction of a fourth exemplary scanned view divided into the grid pattern as generated by the present system, in accordance with an embodiment of the present disclosure;

FIG. 8I is a depiction of the present system being worn by the user and providing a feedback according to the detected one or more obstacles in the fourth exemplary scanned view of FIG. 8H, in accordance with an embodiment of the present disclosure;

FIG. 8J is a depiction of the fourth exemplary scanned view of FIG. 8H, showing a navigation path as generated by the present system, in accordance with an embodiment of the present disclosure;

FIG. 9 is a schematic illustration of a fifth exemplary scanned view, in accordance with an embodiment of the present disclosure; and

FIG. 10 is a flowchart illustrating steps of a method for guiding the user through the environment, in accordance with an embodiment of the present disclosure.

DETAILED DESCRIPTION OF EMBODIMENTS

The following detailed description illustrates embodiments of the present disclosure and ways in which they can be implemented. Although some modes of carrying out the present disclosure have been disclosed, those skilled in the art would recognize that other embodiments for carrying out or practicing the present disclosure are also possible.

In one aspect, an embodiment of the present disclosure provides a system for guiding a user through an environment, the system comprising: a sensing arrangement comprising: a time-of-flight sensor configured to scan the environment in near real-time to generate a scanned view; a detection module configured to detect one or more obstacles in the scanned view; and a wearable arrangement adapted to be worn on a chest by the user, the wearable arrangement comprising an arrangement of haptic actuators configured to provide a feedback to the user based on the detected one or more obstacles.

In one aspect, an embodiment of the present disclosure provides a method for guiding a user through an environment, the method comprising: scanning, using a time-of-flight sensor, the environment in near real-time to generate a scanned view; detecting one or more obstacles in the scanned view; and providing a feedback to the user, using a wearable arrangement adapted to be worn on a chest by the user by an arrangement of haptic actuators therein, based on the detected one or more obstacles.

It may be appreciated that blind and visually impaired persons may face challenges in executing basic tasks and may need assistance. This is mainly due to the presence of a wide variety of obstacles that may possibly cause accidents to visually impaired persons. The present disclosure relates to a system and a method for guiding a user through an environment. Herein, the user may be a blind and/or visually impaired person. The environment may be a spatial region around an area that the user wishes to traverse. The system and the method may help the user to traverse the area without needing any other assistance.

The system comprises a sensing arrangement. Herein, the sensing arrangement may be configured to scan the environment and detect one or more obstacles so as to provide a feedback about the obstacles to the user and hence may prevent the user from accidentally hitting the one or more obstacles. It may be appreciated that a wide variety of obstacles may be present in the environment. Depending on their height, the one or more obstacles may be classified into ground-level obstacles, low-level obstacles, middle-level obstacles and high-level obstacles. Obstacles lying at ground level, such as small pebbles, may be classified as ground-level obstacles. Obstacles whose height may be slightly above ground level, such as shrubs, bins and the like, may be classified as low-level obstacles. Obstacles whose height may be above ground level and about half the height of an average person, such as cars, may be classified as middle-level obstacles. Obstacles that may be above the height of the average person, such as overhead wires and trees, may be classified as high-level obstacles.
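Purely by way of illustration, and not as part of the claimed subject matter, the height-based classification described above may be sketched as a simple rule; the threshold values (0.05 metres, half and full average person height) are hypothetical assumptions chosen for this example and are not specified in the disclosure:

```python
# Illustrative sketch only: classify an obstacle by its height.
# The thresholds are hypothetical assumptions, not values from the disclosure.
def classify_obstacle(height_m, person_height_m=1.7):
    if height_m < 0.05:                  # e.g. small pebbles
        return "ground-level"
    if height_m < person_height_m / 2:   # e.g. shrubs, bins
        return "low-level"
    if height_m < person_height_m:       # e.g. cars
        return "middle-level"
    return "high-level"                  # e.g. overhead wires, trees
```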

The sensing arrangement comprises a time-of-flight sensor configured to scan the environment in near real-time to generate a scanned view. It may be appreciated that the time-of-flight sensor may be a time-of-flight camera that may scan the environment similar to the way bats scan the environment. Herein, the time-of-flight sensor may send a light beam into the environment. The light beam may hit the one or more obstacles and may return to the time-of-flight sensor. The time taken by the light beam to return to the time-of-flight sensor may be measured in order to determine a distance at which the one or more obstacles are located with respect to the time-of-flight sensor. It may be appreciated that, as the user traverses the environment, the distance of the one or more obstacles keeps changing. Hence, the time-of-flight sensor may scan the environment in near real-time, as the user moves forward, to generate the instant scanned view. It may be understood that, herein, the term “scanned view” may refer to a field of view captured by the time-of-flight sensor.

In an embodiment, the time-of-flight sensor is an infrared camera. It may be appreciated that the infrared camera may send infrared light beams, which are invisible to human eyes, in order to capture the scanned view. Herein, the infrared camera may send infrared light beams which may return to the infrared camera after hitting the one or more obstacles. The time taken by the infrared light beams to return to the infrared camera may be used to measure the distances at which the one or more obstacles are positioned from the infrared camera. Such a technique may be contemplated by a person skilled in the art and thus has not been described further for brevity of the present disclosure.
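As a minimal, non-limiting sketch of the round-trip principle described above, the one-way distance to an obstacle may be derived from the measured return time as half the product of the speed of light and the round-trip time:

```python
# Illustrative sketch of the time-of-flight principle: the light beam
# travels to the obstacle and back, so the one-way distance is c * t / 2.
SPEED_OF_LIGHT_M_S = 299_792_458.0

def tof_distance_m(round_trip_seconds):
    return SPEED_OF_LIGHT_M_S * round_trip_seconds / 2.0
```

For instance, a round-trip time of 20 nanoseconds corresponds to an obstacle roughly 3 metres away.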

The sensing arrangement comprises a detection module configured to detect one or more obstacles in the scanned view. Herein, the detection module may be a processor that may work on computer vision techniques, such as object detection techniques, object identification techniques and the like, to detect the one or more obstacles. The one or more obstacles may be poles, overhead wires, lamp posts, trees and the like present in the environment that may pose a potential threat to the user. In order to detect the one or more obstacles, the scanned view may be given as an input to an artificial intelligence (AI) model trained to detect the one or more obstacles. Various algorithms which implement computer vision techniques to detect the one or more obstacles are known, and thus have not been described herein in detail. It may be noted that the detection module may detect one or more obstacles, including those which are at about head height, in front of and at the ground level of the user, as the user moves forward in the environment. That is, the detection module may detect the ground-level obstacles, the low-level obstacles, the middle-level obstacles as well as the high-level obstacles.

The system comprises a wearable arrangement adapted to be worn on a chest by the user. Herein, the wearable arrangement may be in the form of a strap that may be worn on the chest of the user. The said strap for the wearable arrangement may be a soft thermoformed rubber over-moulded onto a sub-frame. This allows the present system to be worn by the user, and thus eliminates the need for the user to handle the system using his/her hands, which adds to the convenience and utility. Although the present disclosure has been described in terms of the wearable arrangement being worn on the chest by the user, it may be appreciated that the wearable arrangement may, alternatively, be worn slightly below the chest (e.g., as a waist belt or the like) or above it (e.g., as a neck collar or the like) without departing from the spirit and the scope of the present disclosure. In an embodiment, the wearable arrangement comprises an adjustable chest strap adapted to be worn around the chest by the user. The adjustable chest strap may include a buckle that may help the user to adjust the wearable arrangement according to his/her physique, so that the wearable arrangement is worn over the chest of the user firmly and does not fall off. In order to remove the wearable arrangement, the user may loosen the buckle. Again, such an arrangement may be contemplated by a person skilled in the art and thus has not been described in detail herein.

It may be noted that the sensing arrangement comprising the time-of-flight sensor and the detection module may be assembled together so as to form a unitary device. The sensing arrangement may include a front cover and a housing. The front cover of the sensing arrangement may be positioned on a front of the housing. The housing may support the time-of-flight sensor and the detection module. The wearable arrangement may include a cavity according to the shape of the housing of the sensing arrangement, to accommodate and support the sensing arrangement therein. In one or more examples, the housing of the sensing arrangement may snap-fit into the cavity of the wearable arrangement, to complete the present system for use. In one or more embodiments of the present disclosure, the cavity may comprise a first set of pins and a second set of pins. A back of the housing may include a third set of pins and a fourth set of pins. In order to removably fasten the sensing arrangement onto the wearable arrangement, the third set of pins and the fourth set of pins of the sensing arrangement may be received by the first set of pins and the second set of pins of the wearable arrangement.

It may be noted that the housing may also comprise a rechargeable battery having a sufficiently long battery life, such as a lithium polymer battery, to provide power to the sensing arrangement. In such cases, when the rechargeable battery has drained, the sensing arrangement may be removed from the wearable arrangement and plugged in for recharging. Once the rechargeable battery is charged, the sensing arrangement may be refastened to the wearable arrangement. The rechargeable battery may be selected to facilitate continuous use of the system for at least about a day's worth of use, i.e. typically about eight hours.

The wearable arrangement comprises an arrangement of haptic actuators configured to provide a feedback to the user based on the detected one or more obstacles. Herein, a haptic actuator may be a device that provides feedback in the form of a sense of touch by applying force or vibrations to the user. It may be appreciated that a haptic actuator may comprise two poles having some space in between and fastened together by means of a spring. A first pole may be stationary and may comprise a coil wound over it. A second pole may be able to move. The coil may be energised by providing a direct current (DC) voltage across it. Once the coil is energised, the magnetic field produced by the current flowing through the coil may attract the second pole, leading to a collision of the second pole with the first pole, which in turn may produce vibrations across the haptic actuator that may be relayed to the user. It may be appreciated that the produced vibration may depend on the magnitude of the DC voltage applied across the coil. The haptic actuators of the present disclosure may provide feedback about the one or more obstacles by vibrating. That is, if the one or more obstacles are detected, the haptic actuators may vibrate and hence provide the feedback to the user.

In an embodiment, the arrangement of the haptic actuators is in the form of an array pattern in the wearable arrangement. In an exemplary embodiment, twelve haptic actuators may be arranged in the form of an array pattern of 2 x 6, i.e. with two rows and six columns. The haptic actuators in the form of the array pattern may be arranged in such a way that, when the user wears the wearable arrangement on the chest, the haptic actuators may generally be located just below the height of the sensing arrangement and may be positioned from left to right across the chest.

In an embodiment, the sensing arrangement is located in a centre of the array pattern in the wearable arrangement. As discussed, the haptic actuators may be arranged in the form of the array pattern in the wearable arrangement. The sensing arrangement may be located in the centre of the array pattern. That is, if twelve haptic actuators are arranged in the array pattern having two rows and six columns, the sensing arrangement may be located in the centre of the array pattern such that, on each side of the sensing arrangement, two rows and three columns of the haptic actuators are present. This enables the scanned view to be generated along and across the centre of the field of view of the time-of-flight sensor. Moreover, fastening the sensing arrangement in the centre of the array may help in conveying the feedback to the user efficiently.

In an embodiment, the sensing arrangement is configured to divide the scanned view in near real-time into a grid pattern corresponding to the array pattern of the arrangement of the haptic actuators, with each of the haptic actuators in the arrangement of haptic actuators being associated with a corresponding grid in the grid pattern of the scanned view. As discussed, the time-of-flight sensor may scan the environment in near real-time to generate the scanned view. The sensing arrangement may then divide the scanned view into the grid pattern according to the array pattern of the haptic actuators. For example, if twelve haptic actuators are arranged in the array pattern having two rows and six columns, the sensing arrangement may divide the scanned view into a grid pattern with corresponding twelve grids arranged in two rows and six columns. Each haptic actuator of the array pattern may be associated with the corresponding grid in the grid pattern of the scanned view.
That is, for example, the haptic actuator in the first row and the first column of the array pattern may be associated with the grid in the first row and the first column of the grid pattern. Similarly, the other haptic actuators may also be associated with their corresponding grids.

In an embodiment, each of the haptic actuators in the arrangement of haptic actuators is configured to be triggered to provide the feedback to the user when an obstacle of the one or more obstacles is detected in the corresponding grid in the grid pattern of the scanned view. As discussed, each of the haptic actuators in the arrangement of haptic actuators may be associated with the corresponding grid in the grid pattern of the scanned view. Each of the grids may be analysed by the detection module to detect if any obstacle of the one or more obstacles is present therein. If any obstacle of the one or more obstacles is detected in a particular grid of the grid pattern, the corresponding haptic actuator may be triggered to vibrate. For example, if twelve haptic actuators are arranged in the array pattern having two rows and six columns, the sensing arrangement divides the scanned view into the grid pattern having two rows and six columns, and further, if a tree is detected in the grid lying in the first row and the first column of the grid pattern, the corresponding haptic actuator in the first row and the first column of the array pattern may be triggered. Thus, the user may be alerted that an obstacle is present in the corresponding direction, and may move in another direction so as to avoid hitting the obstacle.
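The association between grids and actuators described above may be sketched, purely illustratively, as a mapping from an obstacle's position in the scanned view to the index of the actuator to trigger. The 2 x 6 layout is taken from the example; the pixel-based view dimensions and coordinate convention are assumptions introduced for this sketch:

```python
# Illustrative sketch: map obstacle positions in the scanned view to the
# (row, column) of the haptic actuator to trigger. The 2 x 6 array layout
# follows the example; pixel coordinates are an assumption for this sketch.
def grid_index(x, y, view_w, view_h, rows=2, cols=6):
    row = min(int(y / view_h * rows), rows - 1)
    col = min(int(x / view_w * cols), cols - 1)
    return (row, col)

def actuators_to_trigger(obstacle_positions, view_w, view_h, rows=2, cols=6):
    # Each detected obstacle triggers the actuator of the grid it falls in.
    return {grid_index(x, y, view_w, view_h, rows, cols)
            for (x, y) in obstacle_positions}
```

For a hypothetical 640 x 480 view, an obstacle near the top-left corner maps to actuator (0, 0) and one near the bottom-right corner maps to actuator (1, 5).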
In an embodiment, the detection module is further configured to determine a distance of each of the one or more detected obstacles in the scanned view, wherein each of the haptic actuator in the arrangement of haptic actuators is configured to vary an intensity of the provided feedback to the user based on the distance of the obstacle of the one or more obstacles detected in the corresponding grid in the grid pattern of the scanned view. That is, the detection module may determine the distance of each of the one or more detected obstacles in the scanned view and may trigger the corresponding haptic actuators; and closer the one or more obstacle, the more will be the vibration of the haptic actuator. The intensity of the provided feedback may be varied by varying a pulse and a frequency of the vibration of the haptic actuator. This may help the user to deduce how far or how near an obstacle may be in the environment. If the obstacle is very near, the user may change his/her path immediately in order to avoid collision. In an example, twelve haptic actuators are arranged in the array pattern having two rows and six columns and the sensing arrangement divides the scanned view into the grid pattern with twelve grids arranged in two rows and six columns; and a pole is detected in the grid corresponding in the first row, first column of the grid pattern and further a tree is detected in the grid corresponding to the first row, fifth column; and if the measured distance of the pole is 3 meters and the measured distance of the tree is 6 meters, the haptic actuator corresponding to the first row, first column of the grid pattern may be triggered sooner and may trigger with greater intensity and higher frequency as compared to the haptic actuator corresponding to the first row, fifth column of the grid pattern. 
That is, the intensity and the frequency of vibration of the haptic actuator corresponding to the first row, first column of the grid pattern may be higher than those of the haptic actuator corresponding to the first row, fifth column of the array pattern. Thus, the distance of each of the one or more detected obstacles in the scanned view may be deduced from the feedback to the user. This may help the user to navigate around the detected one or more obstacles in the environment.
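One way to picture the distance-dependent intensity is the sketch below. It is an assumption-laden illustration, not the disclosed implementation: the 10-metre maximum range and the linear scaling are invented for the example, while the 3 m pole and 6 m tree distances mirror the example in the description.

```python
# Hypothetical sketch of distance-based feedback intensity: the closer the
# obstacle in a grid cell, the stronger the vibration. MAX_RANGE_M and the
# linear scaling are assumptions made only for this illustration.

MAX_RANGE_M = 10.0  # assumed maximum sensing range

def feedback_intensity(distance_m):
    """Return a vibration intensity in [0, 1]; nearer obstacles score higher."""
    if distance_m >= MAX_RANGE_M:
        return 0.0
    return round(1.0 - distance_m / MAX_RANGE_M, 2)

pole = feedback_intensity(3.0)  # pole at 3 meters
tree = feedback_intensity(6.0)  # tree at 6 meters
print(pole, tree)  # 0.7 0.4 -- the nearer pole gets the stronger vibration
```

A real device might map this intensity value onto both the vibration amplitude and the pulse frequency of the actuator, as the description suggests.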

In an embodiment, the system further comprises at least one bracelet device adapted to be worn on one of the designated wrists by the user, wherein the at least one bracelet device comprises a secondary actuator configured to provide a secondary feedback to the user. Herein, the at least one bracelet device may be in the form of an elastic and thermoformed wrist band that may be worn on one of the designated wrists. In an embodiment, a first bracelet device may be worn on a left wrist and a second bracelet device may be worn on a right wrist. The at least one bracelet device may comprise the secondary actuator. Herein, the secondary actuator may comprise a plurality of haptic actuators. For example, the secondary actuator may comprise four haptic actuators arranged in a row pattern over the at least one bracelet device. The secondary actuator may provide the secondary feedback to the user.

In one or more examples, the secondary feedback may be an indication of an immediate danger. For example, if a user wearing the at least one bracelet device on one of the designated wrists and the wearable arrangement on the chest is trying to cross a road and a vehicle, such as a car, approaches in front of the user, the secondary actuator on the at least one bracelet device and the haptic actuators arranged on the wearable arrangement may vibrate, giving an indication of the danger. The user may thus be saved from accidents. Hence, the at least one bracelet device may help in conveying information about the detected one or more obstacles with greater resolution. Alternatively, the secondary feedback may be configured to indicate other scenarios, such as when the user reaches a designated destination or the like, without any limitations.

In an embodiment, the system further comprises a navigation module configured to define a navigation path for the user, wherein the secondary feedback by the at least one bracelet device is generated based on the defined navigation path. It may be appreciated that the navigation module may help in navigation. In the present system, the navigation module may assess the scanned view to define the navigation path. For example, if a road forks towards the left and the right and the road towards the left is constricted and has less space, the navigation module may define the navigation path by selecting the road towards the right. Herein, if the user is wearing the first bracelet device on the left wrist and the second bracelet device on the right wrist, the secondary actuator on the second bracelet device may vibrate, indicating to the user that he/she has to turn to the right.

In the present embodiment, the navigation module may be connected to a global positioning system (GPS) of a cell phone. The user may set an initial point and a destination point on the GPS via a helper or by using voice commands. Once the initial point and the destination point are set, the navigation path from the initial point to the destination point may be sent from the cell phone to the navigation module. According to the navigation path, the secondary actuator on the respective bracelet device may vibrate. For example, if the user is wearing the first bracelet device on the left wrist and the second bracelet device on the right wrist, and the navigation path provides a turn towards the left, the secondary actuator on the first bracelet device may vibrate, providing an indication to the user that he/she has to turn to the left. In some embodiments, the sensing arrangement may include a back cover, such that an inner side of the back cover may be snapped or fastened to the back of the housing. An outer side of the back cover may include a clip that may be used to fasten the unitary device, formed by the sensing arrangement and the back cover, onto the clothing of the user. The unitary device may be removed from the clothing by removing the clip from the clothing.
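The routing of a turn instruction to the bracelet on the corresponding wrist can be sketched as below. This is a hypothetical illustration of the described behaviour; the function name and return strings are invented for the example, under the stated assumption that the first bracelet is worn on the left wrist and the second on the right.

```python
# Illustrative sketch (assumed logic): a turn direction from the navigation
# path selects the bracelet device on the matching wrist to vibrate.

def bracelet_for_turn(turn):
    """Map a turn direction from the navigation path to a bracelet device."""
    if turn == "left":
        return "first bracelet (left wrist)"
    if turn == "right":
        return "second bracelet (right wrist)"
    raise ValueError(f"unknown turn direction: {turn!r}")

print(bracelet_for_turn("left"))  # first bracelet (left wrist)
```

In use, the navigation module would emit such a direction at each decision point along the path, and the selected bracelet's secondary actuator would vibrate when the user reaches that point.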

It may be noted that the array of the haptic actuators arranged on the wearable arrangement to be worn on the chest and the secondary actuator on each of the at least one bracelet device adapted to be worn on one of the designated wrists may work together as a unit for guiding the user through the environment by giving the feedback and the secondary feedback to the user. However, if the user does not wish to wear the wearable arrangement on the chest, he/she may fasten just the sensing arrangement on his/her clothing using the back cover as described in the preceding paragraph and use the at least one bracelet device for the secondary feedback. In some examples, the user may only use the unitary device and the at least one bracelet device. Herein, the unitary device may be fastened to the clothing of the user and the at least one bracelet device may be worn on one of the designated wrists of the user. The unitary device and the at least one bracelet device may guide the user through the environment (with a lower resolution) by providing the secondary feedback.

It may be noted that, instead of using haptic actuators, in an embodiment, audio devices may also be used for providing the feedback and the secondary feedback. For example, a first audio device may be disposed in the wearable arrangement instead of the arrangement of haptic actuators and a second audio device may be disposed in the at least one bracelet device instead of the secondary actuator. Herein, based on whether the one or more detected obstacles lie in a near, a middle or a far range, the first audio device and the second audio device may be triggered for providing the feedback and the secondary feedback to the user. The present description also relates to the method for guiding the user through the environment as described above. The various embodiments and variants disclosed above apply mutatis mutandis to the method for guiding the user through the environment.
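The near/middle/far banding for the audio embodiment can be pictured as a simple threshold selection. The band limits below are assumptions chosen only for illustration; the description names the three ranges but does not fix their boundaries.

```python
# Hypothetical sketch of range-banded audio feedback: the detected obstacle's
# distance selects a near, middle or far audio cue. The band limits (in
# metres) are assumptions; the description only names the three ranges.

def audio_cue(distance_m, near_limit=2.0, middle_limit=5.0):
    """Pick an audio cue band for an obstacle at the given distance."""
    if distance_m <= near_limit:
        return "near"
    if distance_m <= middle_limit:
        return "middle"
    return "far"

print(audio_cue(1.5), audio_cue(4.0), audio_cue(8.0))  # near middle far
```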

In an embodiment, the method further comprises dividing the scanned view in near real-time into a grid pattern corresponding to an array pattern of the arrangement of the haptic actuators in the wearable arrangement and associating each of the haptic actuators in the arrangement of haptic actuators with a corresponding grid in the grid pattern of the scanned view.

In an embodiment, the method further comprises triggering each of the haptic actuators in the arrangement of haptic actuators to provide a feedback to the user when an obstacle of the one or more obstacles is detected in the corresponding grid in the grid pattern of the scanned view.

In an embodiment, the method further comprises determining a distance of each of the one or more detected obstacles in the scanned view and configuring each of the haptic actuators in the arrangement of haptic actuators to vary an intensity of the provided feedback to the user based on the distance of the obstacle of the one or more obstacles detected in the corresponding grid in the grid pattern of the scanned view.

In an embodiment, the method further comprises providing at least one bracelet device adapted to be worn on one of the designated wrists by the user, wherein the at least one bracelet device comprises a secondary actuator configured to provide a secondary feedback to the user, defining a navigation path for the user and generating the secondary feedback to be provided by the at least one bracelet device based on the defined navigation path.

The system and the method may guide the user through the environment efficiently by detecting the one or more obstacles of all types. The field of view of the time-of-flight sensor may be larger with its position at the chest level of the user due to the wearable arrangement, and hence, the detection module may detect the high level, the medium level, the low level and the ground level obstacles. Moreover, the system of the present disclosure may also relay the distance of the detected one or more obstacles to the user, thus enabling the user to make more informed choices while navigating. The at least one bracelet device may help in guiding the user through the navigation path and may also indicate immediate obstacles or the like. Furthermore, the system may be weatherproof and hence, its operation may not be affected in harsh weather such as rainfall. The present system in the form of a wearable device may be reliable, durable, robust and drop proof to a reasonable extent. The system is easy to use and the user may not need specific training, as such. Further, the present system in the form of a wearable device suits the users' needs, i.e. it is not obtrusive, yet is an indicator of their abilities.

DETAILED DESCRIPTION OF DRAWINGS

FIG. 1 is a schematic illustration of a system 100 for guiding a user through an environment, in accordance with an embodiment of the present disclosure. The system 100 includes a sensing arrangement 102 and a wearable arrangement 104. The sensing arrangement 102 includes a time-of-flight sensor 106 and a detection module 108. The time-of-flight sensor 106 is configured to scan the environment in near real-time to generate a scanned view. The detection module 108 is configured to detect one or more obstacles in the scanned view. The wearable arrangement 104 is adapted to be worn on a chest by the user. The wearable arrangement 104 includes an arrangement of haptic actuators 110 configured to provide a feedback to the user based on the detected one or more obstacles.

FIG. 2 is a perspective view illustration of the system 100 of FIG. 1, in accordance with an embodiment of the present disclosure. It may be observed from FIG. 2 that the arrangement of the haptic actuators 110 includes a plurality of haptic actuators, for example, haptic actuators 202 arranged in the form of an array pattern in the wearable arrangement 104. The sensing arrangement 102 is located in a centre of the array pattern in the wearable arrangement 104. The wearable arrangement 104 includes an adjustable chest strap 204 adapted to be worn around the chest of the user. The adjustable chest strap 204 is provided with a buckle 206 that may be used to adjust a length of the adjustable chest strap 204 according to a physique of the user so that the wearable arrangement 104 fits the chest of the user.

FIG. 3 is a partial exploded view illustration of the system 100 of FIG. 1, in accordance with an embodiment of the present disclosure. Herein, the sensing arrangement 102 is shown separated from the wearable arrangement 104. As illustrated, the wearable arrangement 104 includes a cavity 302 according to a shape of the sensing arrangement 102. The cavity 302 includes a first set of pins 304 and a second set of pins 306 that may be received by a third set of pins and a fourth set of pins of the sensing arrangement 102 (as discussed later).

FIG. 4A is a perspective front view illustration of the sensing arrangement 102, in accordance with an embodiment of the present disclosure. The sensing arrangement 102 includes a front cover 402 and a housing 404. The front cover 402 includes a lens 406. The housing 404 supports the time-of-flight sensor (similar to the time-of-flight sensor 106 of FIG. 1), a rechargeable battery and the detection module (similar to the detection module 108 of FIG. 1) inside thereof. The housing 404 also includes a power button 408 that may be used to switch on or off the sensing arrangement 102.

FIG. 4B is a perspective rear view illustration of the sensing arrangement 102, in accordance with an embodiment of the present disclosure. As illustrated, the housing 404 includes a third set of pins 410 and a fourth set of pins 412 on the rear side thereof. In the system 100, the pins 410, 412 may be received by the first set of pins 304 and the second set of pins 306 of the cavity 302 of FIG. 3, to form electrical contact between the sensing arrangement 102 and the wearable arrangement 104.

FIG. 5 is an exploded illustration of the sensing arrangement 102, in accordance with an embodiment of the present disclosure. The sensing arrangement 102 includes the front cover 402, a status light emitting diode (LED) light pipe 502, an array of infrared light emitting diodes 504, a printed circuit board (PCB) assembly 506, a rechargeable battery 508, and the housing 404. Herein, the status LED light pipe 502 may include a plurality of light emitting diodes (LEDs) that may be turned on so as to indicate that the sensing arrangement 102 is turned ON. The array of infrared light emitting diodes 504 may include a plurality of infrared light emitting diodes, such as, the infrared light emitting diodes 510 arranged around a circular aperture 512. The array of infrared light emitting diodes 504 may send an infrared beam into the environment. The printed circuit board (PCB) assembly 506 may provide electrical connection between various components. An infrared camera 514 is provided to scan the environment in near real-time to generate the scanned view by measuring a time taken by the infrared beam to return back after hitting the one or more obstacles. The rechargeable battery 508 may be a lithium polymer battery that may provide power to the array of infrared light emitting diodes 504, the infrared camera 514 and the status LED light pipe 502.

FIG. 6A is a perspective view illustration of a bracelet device 600 adapted to be worn on one of the designated wrists by the user, in accordance with an embodiment of the present disclosure. The bracelet device 600 includes a capacitance sensor 602 for menu controls. The capacitance sensor 602 may include a plurality of light emitting diodes (LEDs) such as, a first LED 604, a second LED 606 and a third LED 608 that may turn ON when the bracelet device 600 is turned ON. The bracelet device 600 may further include an adjustable wrist strap 610 that may be adjusted according to a girth of the designated wrist of the user.

FIG. 6B is a wireframe illustration of the bracelet device 600, in accordance with an embodiment of the present disclosure. The bracelet device 600 includes the capacitance sensor 602 and a secondary actuator 612. The secondary actuator 612 may include first haptic actuators 614 arranged in a row across a circular length of the bracelet device 600. The secondary actuator 612 may be configured to provide a secondary feedback to the user.
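The time-of-flight principle described for the infrared camera, in which distance follows from the round-trip time of the emitted beam, can be written out as a short sketch. The function name is hypothetical; only the physics (speed of light, halved round trip) is inherent to the technique.

```python
# Illustrative sketch of the time-of-flight principle used by the sensing
# arrangement: distance is derived from the round-trip time of the emitted
# infrared beam, halved because the beam travels out and back.

SPEED_OF_LIGHT = 299_792_458.0  # metres per second

def distance_from_round_trip(time_s):
    """Distance to the obstacle, given the beam's round-trip time in seconds."""
    return SPEED_OF_LIGHT * time_s / 2.0

# A beam returning after about 20 nanoseconds corresponds to roughly 3 metres:
print(round(distance_from_round_trip(20e-9), 2))
```

Practical time-of-flight sensors resolve such nanosecond-scale intervals with dedicated timing circuitry rather than general-purpose code; the sketch only conveys the arithmetic.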

FIG. 7 is a perspective view illustration of a back cover 700 for the sensing arrangement 102, in accordance with an embodiment of the present disclosure. The back cover 700 may be snapped or fastened to the housing 404 of the sensing arrangement 102 from the rear side thereof. The back cover 700 includes a clip 702 that may be used to removably fasten the sensing arrangement 102 to the clothing of the user. Herein, if the user does not want to wear the wearable arrangement on his/her chest, he/she may simply fasten the sensing arrangement 102 on the clothing and may navigate by means of the bracelet device 600.

FIG. 8A is a depiction of the present system 100 being worn by a user 800 on his/her chest 802, in accordance with an embodiment of the present disclosure. Herein, the adjustable chest strap 204 of the wearable arrangement 104 may be adjusted in its length by using the buckle 206 so that the wearable arrangement 104 is worn tightly on the chest 802 of the user 800. The sensing arrangement 102 may scan the environment in real-time to generate the scanned view.

FIG. 8B is a depiction of a first exemplary scanned view 800B, in accordance with an embodiment of the present disclosure. The first exemplary scanned view 800B may be obtained by scanning the environment. It may be observed from FIG. 8B that the first exemplary scanned view 800B includes one or more obstacles. The one or more obstacles may be a first person 804 and a second person 806 walking towards the user 800, a lamppost 808, a first tree 810 and a second tree 812.

FIG. 8C is a depiction of the first exemplary scanned view 800B divided into a grid pattern 814, in accordance with an embodiment of the present disclosure. The grid pattern 814 corresponds to the array pattern of the arrangement of the haptic actuators 110, with each of the haptic actuators in the arrangement of haptic actuators 110 being associated with a corresponding grid in the grid pattern 814 of the first exemplary scanned view 800B. Herein, the grid pattern 814 includes twelve grids arranged in two rows and six columns, corresponding to the arrangement of haptic actuators 110 of FIG. 1 which also includes twelve haptic actuators arranged in two rows and six columns on the wearable arrangement 104. As discussed, each of the haptic actuators 202 in the arrangement of haptic actuators 110 may be associated with the corresponding grid in the grid pattern 814 of the first exemplary scanned view 800B.

FIG. 8D is a depiction of a second exemplary scanned view 800D divided into the grid pattern 814, in accordance with an embodiment of the present disclosure. Herein, the obstacles including the first person 804 and the second person 806 walking towards the user 800 have come closer to the user 800 as compared to the first exemplary scanned view 800B. Such information is provided as feedback to the user 800, who may be walking towards the first person 804 and the second person 806, so that the user 800 does not accidentally hit the first person 804 and/or the second person 806. It may be observed from FIG. 8D that the first person 804 lies in grids 816A and 816B and the second person 806 lies in grids 816C and 816D.

FIG. 8E is a depiction of the present system 100 being worn by the user 800 and providing a feedback according to the detected one or more obstacles in the second exemplary scanned view 800D of FIG. 8D, in accordance with an embodiment of the present disclosure. As the first person 804 and the second person 806 walking towards the user 800 have come closer to the user 800 in the second exemplary scanned view 800D, haptic actuators 818A, 818B, 818C and 818D corresponding to grids 816A, 816B, 816C and 816D respectively, are triggered to provide the feedback to the user 800.

FIG. 8F is a depiction of a third exemplary scanned view 800F divided into the grid pattern 814, in accordance with an embodiment of the present disclosure. Herein, the one or more obstacles detected by the detection module (such as, the detection module 108 of FIG. 1) includes a lamppost 820. It may be observed from FIG. 8F that the lamppost 820 lies in grids 816E and 816F.

FIG. 8G is a depiction of the present system 100 being worn by the user 800 and providing a feedback according to the detected one or more obstacles in the third exemplary scanned view 800F of FIG. 8F, in accordance with an embodiment of the present disclosure. Herein, the haptic actuators 818E and 818F corresponding to the grids 816E and 816F respectively are triggered to provide the feedback to the user 800.

FIG. 8H is a depiction of a fourth exemplary scanned view 800H divided into the grid pattern 814, in accordance with an embodiment of the present disclosure. Herein, the one or more obstacles detected by the detection module (such as, the detection module 108 of FIG. 1) include a lamppost 822 and a tree 824. The lamppost 822 lies in grids 816I and 816J and the tree 824 lies in grids 816K, 816L, 816M and 816N. The detection module may determine a distance of each of the one or more detected obstacles in the fourth exemplary scanned view 800H. A first distance of the lamppost 822 from the user 800 is determined as 3 meters and a second distance of the tree 824 is determined as 6 meters. Thus, the lamppost 822 is closer to the user than the tree 824 and the feedback to the user 800 may be sent accordingly in order to give an idea of the first distance and the second distance.

FIG. 8I is a depiction of the present system 100 being worn by the user 800 and providing a feedback according to the detected one or more obstacles in the fourth exemplary scanned view 800H of FIG. 8H, in accordance with an embodiment of the present disclosure. Herein, haptic actuators 818I and 818J corresponding to the grids 816I and 816J are triggered with greater intensity so as to indicate that the lamppost 822 is closer to the user 800, as compared to haptic actuators 818K, 818L, 818M and 818N corresponding to the grids 816K, 816L, 816M and 816N, which are triggered with lesser intensity so as to indicate that the tree 824 is at a farther distance from the user 800.

FIG. 8J is a depiction of the fourth exemplary scanned view 800H showing a navigation path 826, in accordance with an embodiment of the present disclosure. Based on the detected one or more obstacles, which are the lamppost 822 and the tree 824, a navigation module may be configured to define the navigation path 826 for the user 800 of FIG. 8A. According to the navigation path 826, the secondary feedback by the at least one bracelet device (such as, the bracelet device 600 of FIG. 6A and FIG. 6B) is generated. As seen from FIG. 8J, the navigation path 826 indicates a turn towards the left ahead of the tree 824. Hence, the at least one bracelet device 600 worn on a left wrist of the user (not shown) may vibrate when the user reaches that point in order to indicate that the user should turn towards the left.

FIG. 9 is a schematic illustration of a fifth exemplary scanned view 900, in accordance with an embodiment of the present disclosure. Herein, a user 902 is wearing the system (similar to the system 100 of FIG. 1) on his/her chest to enable the system to generate the fifth exemplary scanned view 900. The one or more obstacles detected in the fifth exemplary scanned view 900 include a first obstacle 904 which may be a signboard, a second obstacle 906 which may be a traffic cone, a third obstacle 908 which may be a bin, a fourth obstacle 910 which may be a shrub, a fifth obstacle 912 which may be a pole, a sixth obstacle 914 which may be a signboard, and a seventh obstacle 916 which may be a tree. A virtual line 918 is shown touching the ground and indicates a ground level, a virtual line 920 is shown slightly above the ground and indicates a low level, a virtual line 922 is shown relatively higher above the ground and indicates a middle level, and a virtual line 924 is shown far above the ground and indicates a high level. Depending on a height of the one or more obstacles, the obstacles may be classified into a ground level obstacle, a low level obstacle, a middle level obstacle and a high level obstacle. In the illustrated example, the second obstacle 906 is below the virtual line 920 and hence, may be categorized as a ground level obstacle. Similarly, the fourth obstacle 910 may be categorized as a low level obstacle, the third obstacle 908 and the fifth obstacle 912 may be categorized as middle level obstacles, and the first obstacle 904, the sixth obstacle 914 and the seventh obstacle 916 may be categorized as high level obstacles. Hence, it may be observed that the system 100 detects all types of obstacles.
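The height-based classification against the four virtual lines can be sketched as a threshold lookup. The threshold heights below are assumptions made only for illustration; the description defines the virtual lines qualitatively, not numerically.

```python
# Illustrative sketch of the height-based obstacle classification described
# above. The threshold heights (in metres) are assumptions for this example;
# the description only places the four virtual lines qualitatively.

LEVELS = [(0.3, "ground level"), (0.8, "low level"), (1.5, "middle level")]

def classify_obstacle(top_height_m):
    """Classify an obstacle by the height of its top above the ground."""
    for threshold, level in LEVELS:
        if top_height_m <= threshold:
            return level
    return "high level"  # anything above the middle-level virtual line

print(classify_obstacle(0.25))  # ground level, e.g. a traffic cone
print(classify_obstacle(2.5))   # high level, e.g. a signboard or tree
```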

FIG. 10 is a flowchart illustrating steps of a method 1000 for guiding the user through the environment, in accordance with an embodiment of the present disclosure. The method includes, at step 1002, scanning the environment in near real-time to generate the scanned view. Herein, the scanning is done using the time-of-flight sensor. The method includes, at step 1004, detecting one or more obstacles in the scanned view. The method includes, at step 1006, providing the feedback to the user. Herein, the feedback is based on the detected one or more obstacles and is provided via the arrangement of haptic actuators of the wearable arrangement adapted to be worn on the chest by the user.

Modifications to embodiments of the present disclosure described in the foregoing are possible without departing from the scope of the present disclosure as defined by the accompanying claims. Expressions such as "including", "comprising", "incorporating", "have", "is" used to describe and claim the present disclosure are intended to be construed in a non-exclusive manner, namely allowing for items, components or elements not explicitly described also to be present. Expressions such as "may" and "can" are used to indicate optional features, unless indicated otherwise in the foregoing. Reference to the singular is also to be construed to relate to the plural.