

Title:
VIRTUAL MOBILITY FOR REMOTE ISSUE RESOLUTION
Document Type and Number:
WIPO Patent Application WO/2018/235096
Kind Code:
A1
Abstract:
The present disclosure relates to a system (100) for remote issue resolution comprising at least one kit (200) having components designed for mounting on the person of at least one unskilled operator (A) and at least one backend setup (300) adapted to be used by the in-house expert(s) (B) and optionally, the third-party expert(s) (C). The components of the kit (200) and the backend setup (300) are in communicable connection over at least one computer network (400) and are adapted to work together to facilitate the expert(s) to be virtually present at the area having issue via the interface of the unskilled operator (A) or optionally via at least one local unskilled operator-resource (D) carrying the kit (200), and to resolve the issue. The present disclosure also provides a method for remote issue resolution by employing the afore-stated system (100).

Inventors:
PURANDARE YATIN (IN)
Application Number:
PCT/IN2018/050397
Publication Date:
December 27, 2018
Filing Date:
June 15, 2018
Assignee:
PURANDARE YATIN (IN)
International Classes:
G06Q10/06; G06F9/44; G06F15/16
Domestic Patent References:
WO2017027110A12017-02-16
Foreign References:
US9110581B22015-08-18
Attorney, Agent or Firm:
MEHENDALE, Rujuta (IN)
Claims:
I Claim:

1. A system (100) for remote issue resolution comprising:

a. at least one kit (200) having components designed for mounting on the person of at least one unskilled operator (A), and adapted to be influenced by at least one in-house expert (B) and optionally, at least one third-party expert (C); said kit (200) comprising at least one component selected from the group consisting of:

i. at least one wearable utility assembly (200a) having components adapted to comprehensively capture particulars about at least one area having issue; and

ii. at least one companion device (200b) adapted to function as the user interface for at least one of the components of said wearable utility assembly (200a); and

b. at least one backend setup (300) adapted to be used by said in-house expert(s) (B) and optionally, said third-party expert(s) (C); said backend setup (300) comprising at least one component selected from the group consisting of:

i. at least one remote issue resolution server apparatus (300a); and

ii. at least one backend electronic device (300b), wherein the components of said kit (200) and said backend setup (300) are in communicable connection over at least one computer network (400) and are adapted to work together to facilitate said in-house expert(s) (B) and optionally said third-party expert(s) (C) to be virtually present at the area having issue via the interface of said unskilled operator (A) or optionally via at least one local unskilled operator-resource (D) carrying said kit (200), and to resolve said issue.

2. The system as claimed in claim 1, wherein said wearable utility assembly (200a) comprises at least one component selected from the group consisting of:

a. at least one head-mounted eyepiece (202) comprising at least one visual feed capturing module (204) adapted to capture the visual feed of the surroundings of the unskilled operator (A), at least one audio feed capturing module (206) adapted to capture the audio feed of the surroundings of the unskilled operator (A), at least one audio data disseminating module (208) adapted to convey pre-selected audio data relayed by the remote issue resolution server apparatus (300a) to the unskilled operator (A), at least one visual data projecting module (210) adapted to project pre-selected visual data, relayed by the remote issue resolution server apparatus (300a), in front of the unskilled operator (A) and at least one eyepiece communicator (212) adapted to communicate the feed captured by the eyepiece (202) to at least one assembly controller (214) and to communicate the data received from the assembly controller (214) to the eyepiece (202);

b. at least one sensor-probe (216) accompanied by at least one sensor-probe controller (218), said sensor-probe (216) being at least one selected from the group consisting of flow sensor, pressure sensor, temperature sensor, vibrations sensor, pH sensor, humidity sensor, pollution sensor, oxygen sensor, multimeter, oscilloscope, torque wrench, screwdriver(s), spanner(s), distance measurement probe, Vernier calliper(s), Spirit Level aligner with measurement and inclination measurement and adapted to record data of the area having issue, said data being in at least one form selected from the group consisting of live data form and stored time series data form, and said sensor-probe controller (218) adapted to carry out at least one task selected from the group consisting of communicating the data recorded by the sensor-probe(s) (216) to said assembly controller (214), signal conditioning, linearizing and calibrating;

c. at least one adjustable intensity illumination apparatus (220) adapted to illuminate the surroundings of said unskilled operator (A) as per the requirement;

d. at least one protective headgear (222) accompanied by at least one specialized visual feed capturing module (224) adapted to capture the visual feed of the surroundings of the unskilled operator (A);

e. at least one assembly battery (226) adapted to act as the power source for the components of said assembly (200a);

f. at least one assembly controller (214) adapted to control the functioning of at least one of the components of the wearable utility assembly (200a), said assembly controller (214) comprising at least one component selected from the group consisting of at least one assembly computer readable medium; at least one assembly processor; at least one assembly bus; at least one assembly memory and at least one assembly communication module adapted to enable the components of the assembly to communicate with said remote issue resolution server apparatus (300a); and

g. at least one assembly mounting aid (224) adapted to facilitate the mounting of the components of the wearable utility assembly (200a) on the person of the unskilled operator (A).

3. The system as claimed in claim 1, wherein said companion device (200b) comprises at least one component selected from the group consisting of at least one companion processor; at least one companion bus; at least one companion memory; at least one companion communication module; at least one companion input module; at least one companion output module; at least one companion mounting aid adapted to facilitate the mounting of the companion device on the person of the unskilled operator (A) and at least one companion computer readable medium comprising at least one head-mounted eyepiece control module adapted to carry out at least one task selected from the group consisting of manipulating the volume of the audio of the eyepiece (202) and manipulating the projected visuals of the eyepiece (202).

4. The system as claimed in claim 1, wherein said remote issue resolution server apparatus (300a) comprises at least one server apparatus processor (302); at least one server apparatus bus (304); at least one server apparatus memory (306); at least one server apparatus communication module (308) and at least one server apparatus computer readable medium (310), said server apparatus computer readable medium (310) comprising at least one module selected from the group consisting of:

a. at least one user data management module (312) adapted to manage the data associated with the users of said system (100); said user data management module (312) comprising at least one component selected from the group consisting of at least one third-party expert unit (314) adapted to facilitate the creation of a database of third-party experts (C) belonging to a plurality of technical domains, at least one local unskilled operator-resource unit (316) adapted to facilitate the creation of a database of local unskilled operator-resources (D) capable of carrying out the role of the unskilled operator (A); at least one user profile unit (318) adapted to create and maintain an editable profile of the users of said system (100) and at least one user subscription unit (320) adapted to facilitate the users of the system (100) to subscribe onto said system (100);

b. at least one control systems data procurement module (322) adapted to obtain data from control system(s) (324) associated with the area having issue, said data being in at least one form selected from the group consisting of live data form and stored time series data form;

c. at least one wearable utility assembly control module (326) comprising at least one component selected from the group consisting of at least one illumination apparatus control unit (328) adapted to facilitate the adjustment of the intensity of the illumination apparatus (220), at least one specialized visual feed capturing module control unit (330) adapted to facilitate the adjustment of the focus and zooming function of the specialized visual feed capturing module (224) and at least one sensor-probe control unit (332) adapted to facilitate the prompting of at least one sensor-probe (216) to record data;

d. at least one expert interaction module (334) adapted to facilitate interaction between the experts of said system (100) by carrying out at least one task selected from the group consisting of simultaneous viewing of the feed and data captured by the components of the wearable utility assembly (200a), simultaneous viewing of at least one media, communicating, files sharing, real-time editing of shared files, audio-video conferencing, whiteboard conferencing and expert interaction recording, translating the audio of at least one expert participating in the interaction, controlling the electronic device(s) (300b) of other expert(s) participating in the interaction, sharing the screen(s) of other expert(s) participating in the interaction, accessing the database of third-party experts (C) to identify at least one third-party expert (C) having expertise in the field of the issue, accessing the database of local unskilled operator-resources (D) to identify at least one local unskilled operator-resource to perform the role of the unskilled operator (A) and making payment; and

e. at least one expert interaction data management module (336) adapted to facilitate the management of the data resulting from the expert interaction, said expert interaction data management module (336) comprising at least one component selected from the group consisting of at least one live feed manipulating unit (338) adapted to facilitate the expert(s) to view and manipulate the live feed captured by the head-mounted eyepiece (202) and the specialized visual feed capturing module (224) of the wearable utility assembly (200a) during expert interaction, while simultaneously backing up the unmanipulated live feed into the server apparatus (300a); at least one visual data selection unit (340) adapted to facilitate the selection of at least a part of the visual data generated during the expert interaction and the conveyance of said selected data to the visual data projecting module (210) of the head-mounted eyepiece (202) and at least one audio data selection unit (342) adapted to facilitate the selection of at least a part of the audio data generated during the expert interaction and the conveyance of said selected data to the audio data disseminating module (208) of the head-mounted eyepiece (202).

5. The system as claimed in claim 1, wherein said backend electronic device (300b) comprises at least one component selected from the group consisting of at least one backend device processor; at least one backend device bus; at least one backend device memory; at least one backend device communication module; at least one backend device photo capture module; at least one backend device video capture module; at least one backend device audio capture module; at least one backend device input module and at least one backend device output module; and at least one backend device computer readable medium comprising at least one online unit adapted to enable the expert(s) using the backend electronic device to execute the functionality of the expert interaction module, the expert interaction data management module and the wearable utility assembly control module; and at least one offline unit adapted to enable the expert(s) using the backend electronic device to execute the functionality of the user data management module and the control system data procurement module.

6. A method for remote issue resolution comprising:

a. virtually transporting at least one in-house expert (B), optionally accompanied by at least one third-party expert (C), to at least one area having issue by means of at least one backend setup (300) comprising at least one remote issue resolution server apparatus (300a) and at least one backend electronic device (300b), via the interface of at least one unskilled operator (A) carrying at least one kit (200) designed to be mounted on the person of the unskilled operator (A), said kit (200) comprising at least one wearable utility assembly (200a) and at least one companion device (200b) adapted to function as the user interface of the components of the wearable utility assembly (200a); and

b. resolving said issue by said in-house expert (B), optionally accompanied by said third-party expert (C), by means of said backend setup (300) with the intervention of said unskilled operator (A) carrying said kit,

wherein said method for remote issue resolution is facilitated by said backend setup (300) and said kit (200) being in communicable connection over at least one computer network (400) and by the ability of the components of the kit (200) to comprehensively capture particulars about the area having issue and to be influenced to action by at least one in-house expert (B) and optionally, at least one third-party expert (C).

7. The method as claimed in claim 6, further includes at least one step selected from the group consisting of creating and maintaining editable profile(s) of the user(s) by means of at least one user profile unit (318) of at least one user data management module (312) of said remote issue resolution server apparatus (300a) and at least one offline unit of said backend electronic device (300b); facilitating the users to subscribe onto at least one system (100) for remote issue resolution employing said method by means of at least one subscription unit (320) of said user data management module (312) of said remote issue resolution server apparatus (300a) and said offline unit of said backend electronic device (300b); creating at least one database of third-party experts belonging to a plurality of technical domains by means of at least one third-party expert unit (314) of said user data management module (312) of said remote issue resolution server apparatus (300a) and said offline unit of said backend electronic device (300b) and creating at least one database of local resources capable of carrying out the role of said unskilled operator (A) by means of at least one local unskilled operator-resource unit (316) of said user data management module (312) of said remote issue resolution server apparatus (300a) and said offline unit of said backend electronic device (300b).

8. The method as claimed in claim 6, wherein said step of virtually transporting the in-house expert (B), optionally accompanied by the third-party expert (C) is optionally preceded by the steps of accessing, by the expert(s), said database of local resources to identify at least one local unskilled operator-resource (D) for performing the role of said unskilled operator (A), by means of at least one component selected from the group consisting of said offline unit of said backend electronic device (300b) and said local unskilled operator-resource unit (316) of said user data management module (312) of said remote issue resolution server apparatus (300a) and assigning the role to said resource (D).

9. The method as claimed in claim 6, wherein said step of virtually transporting the in-house expert (B), optionally accompanied by the third-party expert (C) comprises at least one task selected from the group consisting of capturing the visual feed of the surroundings of the unskilled operator (A) using at least one visual feed capturing module (204) of at least one head-mounted eyepiece (202) of said wearable utility assembly (200a) and relaying said visual feed to said expert(s), by means of at least one component selected from the group consisting of at least one eyepiece communicator (212) of said head-mounted eyepiece (202), at least one assembly controller (214) of said wearable utility assembly (200a), at least one expert interaction module (334) of said remote issue resolution apparatus (300a), at least one live feed manipulating unit (338) of at least one expert interaction data management module (336) of said remote issue resolution apparatus (300a) and at least one online unit of said backend electronic device (300b); and capturing the audio feed of the surroundings of the unskilled operator (A) using at least one audio feed capturing module (206) of said head-mounted eyepiece (202) of said wearable utility assembly (200a) and relaying the audio feed to said expert(s), by means of at least one component selected from the group consisting of said eyepiece communicator (212) of said head-mounted eyepiece (202), said assembly controller (214) of said wearable utility assembly (200a), said live feed manipulating unit (338) of said expert interaction data management module (336) of said remote issue resolution apparatus (300a) and said online unit of said backend electronic device (300b).

10. The method as claimed in claim 6, wherein said step of resolving said issue comprises at least one task selected from the group consisting of:

a. obtaining data from control system(s) (324) associated with the area having issue by means of at least one control systems data procurement module (322) of said remote issue resolution server apparatus (300a) and said offline unit of said backend electronic device (300b); said data being in at least one form selected from the group consisting of live data form and stored time series data form;

b. facilitating the expert(s) to influence the components of said wearable utility assembly (200a) by carrying out at least one task selected from the group consisting of adjusting the intensity of at least one illumination apparatus (220) of said wearable utility assembly (200a), by means of at least one illumination apparatus control unit (328) of at least one wearable utility assembly control module (326) of said remote issue resolution server apparatus (300a), said assembly controller (214) of said wearable utility assembly (200a) and said online unit of said backend electronic device (300b); adjusting the focus and zooming function of at least one specialized visual feed capturing module (224) of said wearable utility assembly (200a), by means of at least one specialized visual feed capturing module control unit (330) of said wearable utility assembly control module (326) of said remote issue resolution server apparatus (300a), said assembly controller (214) of said wearable utility assembly (200a) and said online unit of said backend electronic device (300b); and prompting at least one sensor-probe (216) of said wearable utility assembly (200a) to record data of the area having issue by means of at least one component selected from the group consisting of at least one sensor-probe controller (218) of said wearable utility assembly (200a), said assembly controller (214) of said wearable utility assembly (200a), at least one sensor-probe control unit (332) of said wearable utility assembly control module (326) of said remote issue resolution server apparatus (300a) and said online unit of said backend electronic device (300b), said data being in at least one form selected from the group consisting of live data form and stored time series data form;

c. facilitating interaction between the experts of said system (100) by carrying out at least one task selected from the group consisting of simultaneous viewing of the feed and data generated by the components of the wearable utility assembly (200a); simultaneous viewing of at least one media, manipulating said feed while simultaneously backing up the unmanipulated live feed into the server apparatus (300a); communicating; files sharing; real-time editing of shared files; audio-video conferencing; whiteboard conferencing and expert interaction recording; translating the audio of at least one expert participating in the interaction, controlling the electronic device(s) of other expert(s) participating in the interaction, sharing the screen(s) of other expert(s) participating in the interaction and making payment, said tasks being carried out by means of at least one component selected from the group consisting of said assembly controller (214) of said wearable utility assembly (200a), said expert interaction module (334) of said remote issue resolution server apparatus (300a), said expert interaction data management module (336) of said remote issue resolution server apparatus (300a), said live feed manipulating unit (338) of said expert interaction data management module (336) of said remote issue resolution server apparatus (300a) and said online unit of said backend electronic device (300b);

d. optionally, accessing said database of third-party experts to identify and include in the steps of claim 10, at least one suitable third-party expert (C), by means of at least one component selected from the group consisting of said expert interaction module (334) of said remote issue resolution server apparatus (300a), said third-party expert unit (314) of said user data management module (312) of said remote issue resolution server apparatus (300a), said offline unit of said backend electronic device (300b) and said online unit of said backend electronic device (300b);

e. facilitating the selection of at least a part of the visual data generated during the step of claim 10(g) and conveying said pre- selected data to at least one visual data projecting module (210) of the head-mounted eyepiece (202), by means of at least one component selected from the group consisting of said assembly controller (214) of said wearable utility assembly (200a), at least one visual data selection unit (340) of said expert interaction data management module (336) of said remote issue resolution server apparatus (300a), said expert interaction module (334) and said online unit of said backend electronic device (300b);

f. facilitating the selection of at least a part of the audio data generated during the step of claim 10(g) and conveying said pre- selected data to at least one audio data disseminating module (208) of the head- mounted eyepiece (202), by means of at least one component selected from the group consisting of said assembly controller (214) of said wearable utility assembly (200a), at least one audio data selection unit (342) of said expert interaction data management module (336) of said remote issue resolution server apparatus (300a), said expert interaction module (334) and said online unit of said backend electronic device (300b);

g. capturing the changing visual feed of the surroundings of the unskilled operator (A) using said visual feed capturing module (204) of said head-mounted eyepiece (202) of said wearable utility assembly (200a) and relaying said visual feed to said expert(s), by means of at least one component selected from the group consisting of said eyepiece communicator (212) of said head-mounted eyepiece (202), said assembly controller (214) of said wearable utility assembly (200a), said expert interaction module (334) of said remote issue resolution apparatus (300a), said live feed manipulating unit (338) of said expert interaction data management module (336) of said remote issue resolution apparatus (300a) and said online unit of said backend electronic device (300b); and

h. capturing the changing audio feed of the surroundings of the unskilled operator (A) using said audio feed capturing module (206) of said head-mounted eyepiece (202) of said wearable utility assembly (200a) and relaying said audio feed to said expert(s), by means of at least one component selected from the group consisting of said eyepiece communicator (212) of said head-mounted eyepiece (202), said assembly controller (214) of said wearable utility assembly (200a), said live feed manipulating unit (338) of said expert interaction data management module (336) of said remote issue resolution apparatus (300a) and said online unit of said backend electronic device (300b).

11. The method as claimed in claim 6, further includes at least one step selected from the group consisting of the unskilled operator (A) manipulating the audio relayed by the audio data disseminating module (208) of the head-mounted eyepiece (202) by means of at least one eyepiece control module of at least one companion computer readable medium of said companion device (200b) and the unskilled operator (A) manipulating the visual data relayed by the visual data projecting module (210) of the head-mounted eyepiece (202) by means of said eyepiece control module of said companion computer readable medium of said companion device (200b).

12. The method as claimed in claim 6, wherein the steps of claim 10 are carried out repeatedly and in a variable sequence until the issue is resolved.

13. A kit (200) for remote issue resolution comprising:

a. at least one wearable utility assembly (200a) having components adapted to comprehensively capture particulars about at least one area having issue; and

b. at least one companion device (200b) adapted to function as the user interface of at least one component of said wearable utility assembly (200a), wherein the components of said kit (200) are designed for mounting on the person of at least one unskilled operator (A) and upon communicable connection with at least one backend setup (300) over at least one computer network (400), are adapted to be influenced by at least one in-house expert (B) and optionally, at least one third-party expert (C) and to facilitate said in-house expert (B) and optionally said third-party expert (C) to be virtually present at the area having issue via the interface of said unskilled operator (A) or optionally via at least one local unskilled operator-resource (D), and to resolve said issue.

14. The kit as claimed in claim 13, wherein said wearable utility assembly (200a) comprises at least one component selected from the group consisting of:

a. at least one head-mounted eyepiece (202) comprising at least one visual feed capturing module (204) adapted to capture the visual feed of the surroundings of the unskilled operator (A), at least one audio feed capturing module (206) adapted to capture the audio feed of the surroundings of the unskilled operator (A), at least one audio data disseminating module (208) adapted to convey pre-selected audio data relayed by the remote issue resolution server apparatus (300a) to the unskilled operator (A), at least one visual data projecting module (210) adapted to project pre-selected visual data, relayed by the remote issue resolution server apparatus (300a), in front of the unskilled operator (A) and at least one eyepiece communicator (212) adapted to communicate the feed captured by the eyepiece (202) to at least one assembly controller (214) and to communicate the data received from the assembly controller (214) to the eyepiece (202);

b. at least one sensor-probe (216) accompanied by at least one sensor-probe controller (218), said sensor-probe (216) being at least one selected from the group consisting of flow sensor, pressure sensor, temperature sensor, vibrations sensor, pH sensor, humidity sensor, pollution sensor, oxygen sensor, multimeter, oscilloscope, torque wrench, screwdriver(s), spanner(s), distance measurement probe, Vernier calliper(s), Spirit Level aligner with measurement and inclination measurement and adapted to record data of the area having issue, said data being in at least one form selected from the group consisting of live data form and stored time series data form, and said sensor-probe controller (218) adapted to carry out at least one task selected from the group consisting of communicating the data recorded by the sensor-probe(s) (216) to said assembly controller (214), signal conditioning, linearizing and calibrating;

c. at least one adjustable intensity illumination apparatus (220) adapted to illuminate the surroundings of said unskilled operator (A) as per the requirement;

d. at least one protective headgear (222) accompanied by at least one specialized visual feed capturing module (224) adapted to capture the visual feed of the surroundings of the unskilled operator (A);

e. at least one assembly battery (226) adapted to act as the power source for the components of said assembly (200a);

f. at least one assembly controller (214) adapted to control the functioning of at least one of the components of the wearable utility assembly (200a), said assembly controller (214) comprising at least one component selected from the group consisting of at least one assembly computer readable medium; at least one assembly processor; at least one assembly bus; at least one assembly memory and at least one assembly communication module adapted to enable the components of the assembly to communicate with said remote issue resolution server apparatus (300a); and

g. at least one assembly mounting aid (224) adapted to facilitate the mounting of the components of the wearable utility assembly (200a) on the person of the unskilled operator (A).

15. The kit as claimed in claim 13, wherein said companion device (200b) comprises at least one component selected from the group consisting of at least one companion processor; at least one companion bus; at least one companion memory; at least one companion communication module; at least one companion input module; at least one companion output module; at least one companion mounting aid adapted to facilitate the mounting of the companion device on the person of the unskilled operator (A) and at least one companion computer readable medium comprising at least one head-mounted eyepiece control module adapted to carry out at least one task selected from the group consisting of manipulating the volume of the audio of the eyepiece (202) and manipulating the projected visuals of the eyepiece (202).

16. The kit as claimed in claim 13, further comprising at least one carrier (228) adapted to house the components of the kit (200).

Description:
TITLE: VIRTUAL MOBILITY FOR REMOTE ISSUE RESOLUTION

FIELD

The present disclosure relates to virtual mobility for remote issue resolution. Particularly, the present disclosure relates to a system, method and kit for remote issue resolution.

DEFINITIONS

Without departing from the conventional meaning of the word, the term 'issue', for the purpose of the present disclosure, is to be interpreted as any undesirable circumstance. From the standpoint of industry, the term 'issue' is to be interpreted as any unwanted irregularity in the functioning of any setup.

The term 'area having issue', for the purpose of the present disclosure, is to be interpreted as the physical surroundings of the 'issue'. Illustration 1: if there is an issue in the 'pulper' of a paper manufacturing setup, the pulper is the area having issue. Moreover, the entire paper manufacturing setup including the factory premises can be construed to mean the area having issue. Illustration 2: if a road accident has occurred in a particular place, the area having issue can be construed to include the vehicle(s), the road as well as the neighborhood.

The term 'unskilled operator', for the purpose of the present disclosure, is to be interpreted as an entity - human or machine, that is unskilled, untrained and has no knowledge about the technical domain of the issue and consequently, is incapable of using independent discretion to resolve the issue.

The term 'in-house expert', for the purpose of the present disclosure, is to be interpreted as an entity, human or machine, associated with the area having issue and capable of adding value to the effort of issue resolution. For the purpose of the present disclosure, the in-house expert is said to be located remotely. Illustration: if there is an issue in the factory of company X, the personnel employed in the R&D wing of the company X will be referred to as in-house experts.

The term 'third-party expert', for the purpose of the present disclosure, is to be interpreted as an entity, human or machine, not having any association with the area having issue and capable of adding value to the effort of issue resolution. For the purpose of the present disclosure, the third-party expert is said to be located remotely. Illustration: if there is an issue in the factory of company X and an independent consultant is taken on board to resolve the issue, said consultant will be referred to as a third-party expert. Without departing from the conventional meaning, the phrase 'located remotely' for the purpose of the present disclosure, is to be interpreted as present away from the area having issue.

The term 'remotely located expert(s)', when used in the present disclosure, is to be interpreted as the in-house expert and optionally the third-party expert. Without departing from the conventional meaning, the phrase 'virtually present' or 'virtual transporting', for the purpose of the present disclosure, is to be interpreted as the technological phenomenon which makes a person feel, or gives the person the appearance of, being present at a place other than their true location.

The term 'local unskilled operator-resource', for the purpose of the present disclosure, is to be interpreted as an entity, human or machine, located near the area having issue and capable of carrying out the function of the unskilled operator.

Without departing from the conventional meaning, the term 'user' for the purpose of the present disclosure, is to be interpreted as an entity that uses the system and method of the present disclosure. Particularly, the term user encompasses the unskilled operator, the local unskilled operator-resource, the third-party expert and the in-house expert.

Without departing from the conventional meaning, the term 'holographic projector' for the purpose of the present disclosure, is to be interpreted as a projector that uses holograms rather than graphic images to produce projected pictures in thin air.

Without departing from the conventionally acceptable meaning, the term 'computer readable medium' for the purpose of the present disclosure, is to be interpreted as the medium in which the instructions for carrying out the method of the present disclosure and for effective functioning of the system of the present disclosure are stored. The computer readable medium, in one embodiment, is an integral component. The computer readable medium, in another embodiment, is communicatively coupled to the hardware with which it is associated by means of a communication technique selected from the group consisting of wired communication and wireless communication. The computer readable medium is of at least one type selected from the group consisting of volatile memory, non-volatile memory, application specific integrated circuit (ASIC) including the logic configured to perform the steps of the present disclosure, random access memory, magnetic memory such as hard disc and floppy disc, solid state drive and flash memory, local database and remote database. For the purpose of the present disclosure, volatile memory is the memory that depends upon power to store information. For the purpose of the present disclosure, non-volatile memory is the memory that does not depend upon power to store information.

Without departing from the conventionally acceptable meaning, the term 'processor' for the purpose of the present disclosure, is to be interpreted as the hardware that executes the instructions stored on the afore-stated computer readable medium. The processor, in one embodiment, includes an embedded system. The processor, in another embodiment, includes an application specific integrated circuit (ASIC) having embedded program instructions.

Without departing from the conventionally acceptable meaning, the term 'bus' for the purpose of the present disclosure, is to be interpreted as the hardware that upon connection with a processor, facilitates communication between the processor, the computer network and the other components of the system.

Without departing from the conventionally acceptable meaning, the term 'memory' for the purpose of the present disclosure, is to be interpreted as the hardware that acts as a platform for persistent storage of data such as photos, videos, audios and e-files.

Without departing from the conventionally acceptable meaning, the term 'communication unit' for the purpose of the present disclosure, is to be interpreted as the hardware that facilitates customized communication protocols. The communication unit, in another embodiment, facilitates de facto communication standards selected from the group consisting of Ethernet, IEEE 802.11 series, IEEE 802.15 series and Wireless USB. The communication unit, in yet another embodiment, facilitates telecommunication standards selected from the group consisting of GPRS, CDMA2000, TD-SCDMA, LTE, LTE-Advanced and WiMAX standards. The communication unit optionally facilitates customized or de facto multimedia encoding, decoding and compression instructions.

Without departing from the conventionally acceptable meaning, the term 'video capture module' for the purpose of the present disclosure, is to be interpreted as the hardware that is used for capturing moving images on electronic media or videos and includes a buffer memory for obtaining and generating image frames of the video. Without departing from the conventionally acceptable meaning, the term 'photo capture module' for the purpose of the present disclosure, is to be interpreted as the hardware that is used for capturing photographs.

Without departing from the conventionally acceptable meaning, the term 'audio capture module' for the purpose of the present disclosure, is to be interpreted as the hardware (audio recorder) that is used to capture audio feed or sounds and includes audio editing software.

Without departing from the conventionally acceptable meaning, the term 'input module' for the purpose of the present disclosure, is to be interpreted as the hardware that functions as the user interface and is at least one selected from the group consisting of a keyboard, a mouse, a display screen or other input means to receive the user's input.

Without departing from the conventionally acceptable meaning, the term 'output module' for the purpose of the present disclosure, is to be interpreted as the hardware that renders the output.

BACKGROUND

Typically, businesses and industries wish to operate at the best possible efficiency to maximize productivity and, consequently, profits; any downtime or inefficiency erodes such profits and can push the business into a survival crisis. Most industries work 24x7, 365 days a year, and their main concern is to keep equipment uptime at a maximum and any issues to a minimum. Downtime and inefficiencies, however, are inevitable considering the dynamic conditions in which businesses and industries operate. Issues such as stoppages, breakdowns and quality-check failures are common and must be addressed and resolved immediately to restart operations and reduce losses. Resolving such issues requires the presence of all the in-house experts, as well as any third-party experts, at the area having issue. This, however, is not practical, as the costs of transportation, accommodation and the like, at such a frequency, are prohibitively high. Furthermore, the availability of the experts depends on their schedules, and coordinating everyone's schedule is extremely difficult. The conventional techniques of issue resolution are thus associated with multiple disadvantages: the availability of the right resource at the right time, the willingness of the resource to travel on site to offer such services, the time required to reach the site, the cost of travel, and the personal and health strain that travel places on the expert - factors that cumulatively compromise operations until service resumes and cause huge financial losses. The business suffers bilaterally, and unfortunately the financial losses usually go unaccounted.

The afore-stated problem exists not only in industries but also in day-to-day situations where there is an issue but no expert is present at the area having issue, at the right time, to resolve it. Support from remotely based experts can be obtained using telephones, e-mails or even instant messaging applications. However, in all the afore-mentioned techniques, the scope and extent of communication is limited, as these are one-way and not live means of communication. Other audio-video conferencing solutions can also be used; however, they lack the complete and comprehensive data collection mechanism required for decision making. Furthermore, the afore-mentioned solutions make the presence of skilled manpower at the area having issue, to coordinate with the remotely based expert, mandatory. Arranging for such skilled manpower at the area having issue is also not always possible due to various limitations.

The inventor of the present disclosure has envisaged a solution to the afore-stated quandary which makes the process of issue resolution time- and cost-efficient and highly effective.

OBJECTS

It is an object of the present disclosure to provide a system (100) for remote issue resolution.

It is another object of the present disclosure to provide a method for remote issue resolution.

It is yet another object of the present disclosure to provide a kit (200) for remote issue resolution.

It is still another object of the present disclosure to provide a system (100), method and kit (200) for remote issue resolution which is cost effective.

It is yet another object of the present disclosure to provide a system (100), method and kit (200) for remote issue resolution which is time effective.

It is still another object of the present disclosure to provide a system (100), method and kit (200) for remote issue resolution which has a low environmental footprint.

SUMMARY

The present disclosure provides a system (100) for remote issue resolution comprising at least one kit (200) having components designed for mounting on the person of at least one unskilled operator (A), and adapted to be influenced by at least one in-house expert (B) and optionally, at least one third-party expert (C) and at least one backend setup (300) adapted to be used by said in-house expert(s) (B) and optionally, said third-party expert(s) (C) wherein the components of said kit (200) and said backend setup (300) are in communicable connection over at least one computer network (400) and are adapted to work together to facilitate said in-house expert(s) (B) and optionally said third-party expert(s) (C) to be virtually present at the area having issue via the interface of said unskilled operator (A) or optionally via at least one local unskilled operator-resource (D) carrying said kit (200), and to resolve said issue. The kit (200) comprises at least one wearable utility assembly (200a) having components adapted to comprehensively capture particulars about at least one area having issue and at least one companion device (200b) adapted to function as the user interface for at least one of the components of said wearable utility assembly (200a). The backend setup (300) comprises at least one remote issue resolution server apparatus (300a) and at least one backend electronic device (300b). The present disclosure further provides a method for remote issue resolution by employing the afore-stated system (100).

BRIEF DESCRIPTION OF THE DRAWINGS

The objectives and advantages of the proposed invention will be more clearly understood from the following description, taken in conjunction with the accompanying drawings, wherein:

Figure 1 illustrates a non-limiting embodiment of the kit (200) of issue resolution of the present disclosure;

Figure 2a illustrates a non-limiting representation of the head-mounted eyepiece (202) of the wearable utility assembly (200a) of the kit (200) of the present disclosure;

Figure 2b illustrates a non-limiting representation of the front view of the protective headgear (222) accompanied by at least one specialized visual feed capturing module (224) of the wearable utility assembly (200a) of the kit (200) of the present disclosure;

Figure 2c illustrates a non-limiting representation of the side view of the protective headgear (222) accompanied by at least one specialized visual feed capturing module (224) of the wearable utility assembly (200a) of the kit (200) of the present disclosure;

Figure 3 illustrates a non-limiting representation of the unskilled operator (A) carrying the components of the kit (200) of the present disclosure;

Figure 4 illustrates a non-limiting embodiment of the system (100) of issue resolution of the present disclosure;

Figure 5 illustrates another non-limiting embodiment of the system (100) of issue resolution of the present disclosure;

Figure 6 illustrates a non-limiting embodiment of the components of the remote issue resolution server apparatus (300a) of the backend setup (300) of the present disclosure; and

Figure 7 illustrates a non-limiting embodiment of the method of issue resolution of the present disclosure.

DETAILED DESCRIPTION

The foregoing objects of the invention are accomplished, and the problems and shortcomings associated with the prior art techniques and approaches are overcome, by the proposed invention as described below in the preferred embodiment.

In accordance with one aspect, the present disclosure provides a system (100) for remote issue resolution. The system (100) of the present disclosure comprises at least one kit (200) specially designed for use by at least one unskilled operator (A) and at least one backend setup (300) adapted to be used by at least one expert located remotely, wherein the components of the kit (200) and the backend setup (300) are in communicable connection over at least one computer network (400) (illustrated in Figures 4 and 5).
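
By way of a non-limiting illustration only, and not as part of the claimed subject matter, the following Python sketch shows one possible way the kit (200) and the backend setup (300) could exchange messages over the computer network (400). The use of JSON over a plain TCP socket, as well as the function names and message fields, are assumptions made purely for illustration; the disclosure does not prescribe any particular transport or message format.

```python
# Non-limiting connectivity sketch (an assumption, not the disclosed
# implementation): the kit (200) and the backend setup (300) exchanging JSON
# messages over a plain TCP socket standing in for the computer network (400).
import json
import socket
import threading
import time


def backend_server(host: str = "127.0.0.1", port: int = 8400) -> None:
    # Backend setup (300): accept one kit connection and send an instruction.
    with socket.create_server((host, port)) as srv:
        conn, _ = srv.accept()
        with conn:
            particulars = json.loads(conn.recv(4096).decode())
            print("backend received:", particulars)
            conn.sendall(json.dumps({"instruction": "move closer"}).encode())


def kit_client(host: str = "127.0.0.1", port: int = 8400) -> None:
    # Kit (200): report a captured particular and print the expert's reply.
    with socket.create_connection((host, port)) as conn:
        conn.sendall(json.dumps({"sensor": "temperature", "value": 81.5}).encode())
        print("kit received:", json.loads(conn.recv(4096).decode()))


if __name__ == "__main__":
    server = threading.Thread(target=backend_server, daemon=True)
    server.start()
    time.sleep(0.3)  # give the server a moment to start listening
    kit_client()
    server.join(timeout=2)
```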

The system (100) is specifically designed to address the constraints involved in making experts available at the area having issue, at the right time. Assuming that an expert would not always be available at the area having issue, the system (100) provides a kit (200) which not only enables remotely located expert(s) to be virtually present at the area having issue at any given time, but also empowers them to diagnose and resolve the issue in a fast as well as comprehensive manner - via the interface of the unskilled operator (A). By means of the kit (200), the system (100) of the present disclosure facilitates the capturing of a wide range of particulars about the area having issue, to aid the remotely located expert(s) to better resolve the issue. Even further, the system (100) of the present disclosure provides databases of third-party experts in case the in-house experts (B) do not have the expertise to resolve the issue. The system (100) also provides the technology to patch the third-party experts (C) with the in-house experts (B) and the unskilled operators (A) to resolve the issue. As the components of the kit (200) are under the control of the remotely located expert(s), characteristically, unlike most of the popular solutions in this area, the system (100) of the present disclosure does not require the manpower at the area of issue to be skilled in the technical domain of the issue. The unskilled operator (A) merely acts as a puppet and follows the instructions of the remotely located expert(s). Thus, the present system (100) works perfectly well, and is in fact specifically designed, to cater to the situation of skilled manpower not being available at the area having issue to coordinate with the remotely located expert(s). Moreover, the system (100) of the present disclosure provides databases of local resources capable of carrying out the role of unskilled operators (A) - in case even unskilled operators are not available at the area having issue. By providing the afore-mentioned technology, the inventor of the present disclosure aims at decreasing the down-time of machines and services and saving companies and businesses from financial losses.

The kit (200) of the present system has components designed for mounting on the person of the unskilled operator (A), leaving the hands and legs of the operator free to move around and manipulate (make changes, record data) in the area having issue (as illustrated in Figure 3). Furthermore, as the operator (A) is unskilled and therefore incapable of using their own discretion to resolve the issue, the components of the kit (200) are adapted to be influenced by at least one in-house expert (B) and optionally, at least one third-party expert (C), as necessary. The kit (200) comprises at least one component selected from the group consisting of at least one wearable utility assembly (200a) and at least one companion device (200b) (illustrated in Figure 1).

The wearable utility assembly (200a) comprises at least one component selected from the group consisting of at least one head-mounted eyepiece (202), at least one sensor-probe (216) accompanied by at least one sensor-probe controller (218), at least one adjustable intensity illumination apparatus (220), at least one protective headgear (222) accompanied by at least one specialized visual feed capturing module (224), at least one assembly battery (226), at least one assembly controller (214) and at least one assembly mounting aid (224). The head-mounted eyepiece (202) (illustrated in Figure 2a) comprises at least one visual feed capturing module (204), at least one audio feed capturing module (206), at least one audio data disseminating module (208), at least one visual data projecting module (210) and at least one eyepiece communicator (212). The visual feed capturing module (204) is adapted to capture the visual feed of the surroundings of the unskilled operator (A). In one embodiment, the visual feed capturing module (204) is a camera. The audio feed capturing module (206) is adapted to capture the audio feed of the surroundings of the unskilled operator (A). The audio feed of the surroundings of the unskilled operator (A) includes the voice of the unskilled operator (A) and the sounds emitting from the surroundings of the unskilled operator (A) including the area having issue. In one embodiment, the audio feed capturing module (206) is a microphone. The audio data disseminating module (208) is adapted to convey pre-selected audio data relayed by the backend setup (300) to the unskilled operator (A). In one embodiment, the audio data disseminating module (208) is a speaker. The visual data projecting module (210) is adapted to project pre-selected visual data relayed by the backend setup (300) in front of the unskilled operator (A). In one embodiment, the visual data projecting module (210) is a holographic projector. In another embodiment, the projected visual data is visible solely to the unskilled operator (A). The eyepiece communicator (212) is adapted to communicate the feed captured by the eyepiece (202) to the assembly controller (214) and to communicate the data received from the assembly controller (214) to the eyepiece (202). The remotely located expert(s), using the audio data disseminating module (208) and the visual data projecting module (210), instruct the unskilled operator (A) to move in the direction or to the area of which they wish to see the visual feed and hear the audio feed. In one embodiment, to help the unskilled operator (A) in understanding the requirement, the expert(s) may project certain explanatory visuals using the backend setup (300) and the visual data projecting module (210).
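
The following is a minimal, non-limiting sketch, assuming a simple queue-based relay, of how the eyepiece communicator (212) might pass captured feed to the assembly controller (214) and return pre-selected data to the eyepiece (202); the class, method and field names are hypothetical and not taken from the disclosure.

```python
# Non-limiting illustrative sketch: a hypothetical queue-based eyepiece
# communicator (212) bridging the eyepiece (202) and the assembly controller
# (214). All names here are assumptions for illustration only.
import queue
from dataclasses import dataclass
from typing import Optional


@dataclass
class Frame:
    kind: str       # "visual" or "audio"
    payload: bytes  # raw captured data


class EyepieceCommunicator:
    """Bridges the eyepiece (202) and the assembly controller (214)."""

    def __init__(self) -> None:
        self.to_controller: "queue.Queue[Frame]" = queue.Queue()
        self.to_eyepiece: "queue.Queue[Frame]" = queue.Queue()

    def send_feed(self, frame: Frame) -> None:
        # Feed captured by modules (204)/(206) is forwarded to the controller.
        self.to_controller.put(frame)

    def receive_data(self) -> Optional[Frame]:
        # Pre-selected data relayed back for modules (208)/(210), if any.
        try:
            return self.to_eyepiece.get_nowait()
        except queue.Empty:
            return None


if __name__ == "__main__":
    comm = EyepieceCommunicator()
    comm.send_feed(Frame(kind="visual", payload=b"\x00\x01"))
    comm.to_eyepiece.put(Frame(kind="audio", payload=b"expert instruction"))
    print(comm.receive_data())
```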

The sensor-probe (216) (illustrated in Figure 1) is at least one selected from the group consisting of flow sensor, pressure sensor, temperature sensor, vibrations sensor, pH sensor, humidity sensor, pollution sensor, oxygen sensor, multimeter, oscilloscope, torque wrench, screwdriver(s), spanner(s), distance measurement probe, Vernier calliper(s), Spirit Level aligner with measurement and inclination measurement. The sensor-probe (216) is adapted to record data of the area having issue in accordance with the instructions given by the remotely located expert(s). The data is in at least one form selected from the group consisting of live data form and stored time series data form. The sensor-probe controller (218) is adapted to carry out at least one task selected from the group consisting of communicating the data recorded by the sensor-probe(s) (216) to the assembly controller (214), signal conditioning, linearizing and calibrating. In one embodiment, the expert(s) instruct the unskilled operator (A) to measure the oscillation of industrial equipment having issue using the oscilloscope. In another embodiment, the expert(s) give instructions to the IoT temperature sensor, using the backend setup (300), the sensor-probe controller (218) and the assembly controller (214), to send the temperature time series data.
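
A minimal, purely illustrative sketch of the sensor-probe controller (218) tasks named above (signal conditioning, linearizing, calibrating and communicating the recorded data onward to the assembly controller (214)) is given below; the first-order calibration model, the coefficient values and the callback name are assumptions for illustration only.

```python
# Illustrative sketch only: a hypothetical sensor-probe controller (218) that
# conditions, linearizes and calibrates raw readings from a sensor-probe (216)
# before forwarding them towards the assembly controller (214).
from typing import Callable, List


class SensorProbeController:
    def __init__(self, gain: float, offset: float,
                 forward: Callable[[List[float]], None]) -> None:
        self.gain = gain          # calibration gain (assumed)
        self.offset = offset      # calibration offset (assumed)
        self.forward = forward    # sends data on to the assembly controller

    def condition(self, raw: List[float]) -> List[float]:
        # Simple signal conditioning: clamp obviously out-of-range samples.
        return [min(max(r, 0.0), 1e6) for r in raw]

    def linearize_and_calibrate(self, raw: List[float]) -> List[float]:
        # First-order linearization/calibration: y = gain * x + offset.
        return [self.gain * r + self.offset for r in raw]

    def record(self, raw: List[float]) -> None:
        # Live or stored time-series data is conditioned, calibrated and relayed.
        self.forward(self.linearize_and_calibrate(self.condition(raw)))


if __name__ == "__main__":
    ctrl = SensorProbeController(gain=0.1, offset=-2.0, forward=print)
    ctrl.record([25.0, 26.5, 1e9])  # e.g. temperature samples with one outlier
```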

The adjustable intensity illumination apparatus (220) (illustrated in Figure 1) is adapted to illuminate the surroundings of the unskilled operator (A). As per the lighting conditions in the area having issue, the intensity of the illumination apparatus (220) can be adjusted so that a clear view of the area having issue is available to the expert(s) located remotely. Characteristically, the intensity of the illumination apparatus (220) can be adjusted only by the remotely located expert(s). The adjustable intensity illumination apparatus (220) is mounted on a mounting aid (224) that is to be worn by the unskilled operator (A). In one embodiment, the mounting aid (224) is a jacket to be worn by the unskilled operator (A). In another embodiment, the mounting aid (224) is a belt to be worn by the unskilled operator (A).

The protective headgear (222) is accompanied by at least one specialized visual feed capturing module (224) (illustrated in Figures 2b and 2c) adapted to capture the visual feed of the surroundings of the unskilled operator (A). In one embodiment, the protective headgear (222) is a helmet. The specialized visual feed capturing module (224), in one embodiment, is a camera. The specialized visual feed capturing module (224) has a higher capacity, or frames per second (fps), and is used for capturing a more specialized type of visual feed. The specialized visual feed capturing module (224) is adapted to be adjusted for the zooming and focusing functions by the remotely located expert(s). The assembly battery (226) (illustrated in Figure 1) is adapted to act as the power source for all the components of the assembly (200a), as necessary.
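
Below is a small, non-limiting sketch of a wrapper for the specialized visual feed capturing module (224), restricted to the remotely adjustable zooming and focusing functions described above; the value ranges and method names are assumptions, not part of the disclosure.

```python
# Illustrative sketch only: a hypothetical interface to the specialized visual
# feed capturing module (224), exposing just the zoom and focus adjustments
# that the remotely located expert(s) can make. Ranges are assumptions.
class SpecializedCaptureModule:
    def __init__(self) -> None:
        self.zoom = 1.0    # assumed 1.0x .. 10.0x zoom
        self.focus = 0.5   # assumed normalized focus position, 0.0 .. 1.0

    def set_zoom(self, level: float) -> float:
        # Issued remotely via the control unit (330) of the backend setup (300).
        self.zoom = max(1.0, min(10.0, level))
        return self.zoom

    def set_focus(self, position: float) -> float:
        self.focus = max(0.0, min(1.0, position))
        return self.focus


if __name__ == "__main__":
    cam = SpecializedCaptureModule()
    print(cam.set_zoom(4.0), cam.set_focus(0.8))
```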

The assembly controller (214) (illustrated in Figure 1) is adapted to control the functioning of at least one of the components of the wearable utility assembly (200a). The assembly controller (214) comprises at least one component selected from the group consisting of at least one assembly computer readable medium; at least one assembly processor; at least one assembly bus; at least one assembly memory and at least one assembly communication module. The assembly communication module is adapted to enable the components of the wearable utility assembly to communicate with the backend setup (300) and consequently with the remotely based experts. The assembly controller (214), in one embodiment, is a single board computer.
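
The following non-limiting sketch illustrates one way the assembly controller (214) could dispatch commands received from the backend setup (300) to the components of the wearable utility assembly (200a); the command names, handler registration scheme and message format are hypothetical assumptions for illustration only.

```python
# Purely illustrative sketch: a hypothetical command dispatcher standing in for
# the assembly controller (214), routing remote-expert commands received over
# the computer network (400) to the relevant wearable components.
from typing import Callable, Dict


class AssemblyController:
    def __init__(self) -> None:
        self._handlers: Dict[str, Callable[[dict], None]] = {}

    def register(self, command: str, handler: Callable[[dict], None]) -> None:
        # e.g. "set_illumination", "adjust_zoom", "record_sensor" (assumed names)
        self._handlers[command] = handler

    def on_backend_message(self, message: dict) -> None:
        # Messages arrive via the assembly communication module; unknown
        # commands are ignored rather than raising an error.
        handler = self._handlers.get(message.get("command", ""))
        if handler:
            handler(message.get("params", {}))


if __name__ == "__main__":
    controller = AssemblyController()
    controller.register("set_illumination",
                        lambda p: print(f"illumination -> {p['intensity']}%"))
    controller.on_backend_message(
        {"command": "set_illumination", "params": {"intensity": 80}})
```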

The assembly mounting aid (224) is adapted to facilitate the mounting of the components of the wearable utility assembly (200a) on the person of the unskilled operator (A). The components of the wearable utility assembly (200a) are specifically adapted to comprehensively capture a wide range of the particulars about the area having issue to aid the remotely located expert(s) to better resolve the issue. Furthermore, since the operator (A) is unskilled, the components of the wearable utility assembly (200a) are adapted to be influenced or controlled by the remotely located expert(s) to aid in their effort of issue resolution.

The companion device (200b) of the kit (200) (illustrated in Figure 1) is adapted to function as the user interface for at least one of the components of the wearable utility assembly (200a) and comprises at least one component selected from the group consisting of at least one companion processor; at least one companion bus; at least one companion memory; at least one companion communication module; at least one companion input module; at least one companion output module; at least one companion mounting aid and at least one companion computer readable medium. The companion mounting aid is adapted to facilitate the mounting of the companion device on the person of the unskilled operator (A). The companion computer readable medium comprises at least one head-mounted eyepiece control module that is adapted to carry out at least one task selected from the group consisting of manipulating the volume of the audio of the eyepiece (202) and manipulating the projected visuals of the eyepiece (202). The term 'manipulating the volume' is to be interpreted as increasing and decreasing the volume. The term 'manipulating the projected visuals' is to be interpreted as pausing, increasing the speed, decreasing the speed, annotating on, stopping and capturing and reviewing at least one pre-determined part of the feed. The companion device (200b), in one embodiment, is an electronic computing device selected from the group consisting of mobile phones, smart phones, laptop computer machines, desktop computer machines, tablet computer machines, wearable computing machines and personal digital assistants. The companion device (200b), in another embodiment, is a specialized electronic device adapted to carry out the aforementioned functions.
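
By way of a non-limiting illustration only, the following Python sketch (with hypothetical names) shows the kind of volume and projected-visual manipulation an eyepiece control module on the companion device (200b) could expose to the unskilled operator (A):

    class EyepieceControl:
        """Illustrative stand-in for the head-mounted eyepiece control module."""
        def __init__(self):
            self.volume = 5              # 0..10, audio of the eyepiece (202)
            self.playback_rate = 1.0     # speed of the projected visuals
            self.paused = False
            self.annotations = []

        def change_volume(self, delta: int) -> None:
            self.volume = max(0, min(10, self.volume + delta))   # increase or decrease

        def set_playback(self, rate: float = 1.0, paused: bool = False) -> None:
            self.playback_rate, self.paused = rate, paused       # pause, speed up or slow down

        def annotate(self, note: str) -> None:
            self.annotations.append(note)                        # overlaid on the projected visuals

    # Example: ctl = EyepieceControl(); ctl.change_volume(+2); ctl.set_playback(rate=0.5)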

The kit (200) also comprises at least one carrier (228) adapted to house the components of the kit (200). The backend setup (300) of the present disclosure is adapted to be used by in-house expert(s) (B) and optionally by third-party expert(s) (C). The backend setup (300) comprises at least one component selected from the group consisting of at least one remote issue resolution server apparatus (300a) and at least one backend electronic device (300b) (illustrated in Figures 4 and 5). The remote issue resolution server apparatus (300a) comprises at least one server apparatus processor (302), at least one server apparatus bus (304), at least one server apparatus memory (306), at least one server apparatus communication module (308) and at least one server apparatus computer readable medium (310) (illustrated in Figure 6). The server apparatus computer readable medium (310) comprises at least one module selected from the group consisting of at least one user data management module (312), at least one control systems data procurement module (322), at least one wearable utility assembly control module (326), at least one expert interaction module (334) and at least one expert interaction data management module (336).
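
By way of a non-limiting illustration only, the following Python sketch (with hypothetical names) shows how the modules listed for the server apparatus computer readable medium (310) could be composed behind a single server object:

    class RemoteIssueResolutionServer:
        """Illustrative composition of the server apparatus modules (312), (322), (326), (334) and (336)."""
        def __init__(self, user_data, control_data, assembly_control,
                     expert_interaction, interaction_data):
            self.user_data = user_data                      # user data management module (312)
            self.control_data = control_data                # control systems data procurement module (322)
            self.assembly_control = assembly_control        # wearable utility assembly control module (326)
            self.expert_interaction = expert_interaction    # expert interaction module (334)
            self.interaction_data = interaction_data        # expert interaction data management module (336)

        def modules(self) -> dict:
            return {
                "user_data_management": self.user_data,
                "control_systems_data_procurement": self.control_data,
                "wearable_utility_assembly_control": self.assembly_control,
                "expert_interaction": self.expert_interaction,
                "expert_interaction_data_management": self.interaction_data,
            }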

The user data management module (312) is adapted to manage the data associated with the users of the present system (100). The user data management module (312) comprises at least one component selected from the group consisting of at least one third-party expert unit (314), at least one local unskilled operator-resource unit (316), at least one user profile unit (318) and at least one user subscription unit (320). The third-party expert unit (314) is adapted to facilitate the creation of a database of third-party experts belonging to a plurality of technical domains. In case the in-house expert (B) is not equipped with the knowledge to resolve the issue, the database of third-party experts can be referred to and taken on board in the effort of issue resolution. The third-party experts (C) are experts belonging to multifarious technical domains such as engineering, legal, insurance and the like. The local unskilled operator-resource unit (316) is adapted to facilitate the creation of a database of local unskilled operator-resources (D) capable of carrying out the role of the unskilled operator (A), in cases where unskilled operators (A) are not available at the area having issue. The user profile unit (318) is adapted to create and maintain an editable profile of the users of the system (100). The editable profile includes all the information relating to the users, optionally accompanied by at least one media selected from the group consisting of photos, videos, audios and e-files. The user subscription unit (320) is adapted to facilitate the users of the system (100) to subscribe onto the system (100).
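
By way of a non-limiting illustration only, the following Python sketch (with hypothetical field names not taken from the disclosure) shows the kind of records a user data management module such as (312) could keep for third-party experts, local unskilled operator-resources, editable profiles and subscriptions:

    from dataclasses import dataclass, field
    from typing import Dict, List

    @dataclass
    class UserProfile:
        user_id: str
        name: str
        role: str                                            # e.g. "third_party_expert", "local_operator_resource"
        domains: List[str] = field(default_factory=list)     # e.g. ["engineering", "legal", "insurance"]
        media: List[str] = field(default_factory=list)       # references to photos, videos, audios, e-files
        subscribed: bool = False

    class UserDataManagement:
        """Illustrative stand-in for the user data management module (312)."""
        def __init__(self):
            self.profiles: Dict[str, UserProfile] = {}

        def register(self, profile: UserProfile) -> None:    # role of the user profile unit (318)
            self.profiles[profile.user_id] = profile

        def subscribe(self, user_id: str) -> None:           # role of the user subscription unit (320)
            self.profiles[user_id].subscribed = True

        def find_third_party_experts(self, domain: str) -> List[UserProfile]:   # third-party expert unit (314)
            return [p for p in self.profiles.values()
                    if p.role == "third_party_expert" and domain in p.domains]

        def find_local_resources(self) -> List[UserProfile]:                    # local unskilled operator-resource unit (316)
            return [p for p in self.profiles.values() if p.role == "local_operator_resource"]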

The control systems data procurement module (322) is adapted to obtain data from control system(s) (324) associated with the area having issue. Typically, the control system(s) (324) referred to herein are to be interpreted as Supervisory Control and Data Acquisition (SCADA) systems. The data obtained from the control systems (324) is in at least one form selected from the group consisting of live data form and stored time series data form.
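
By way of a non-limiting illustration only, the following Python sketch shows a procurement call that can return either a live value or a stored time series; the SCADA client interface (read_current, read_history) is an assumption for illustration, not a disclosed API:

    from typing import Iterable, Optional, Tuple

    def procure(scada_client, tag: str, live: bool = True,
                window: Optional[Tuple[str, str]] = None) -> Iterable:
        """Yield either the current value or historical samples for a SCADA tag."""
        if live:
            yield scada_client.read_current(tag)                        # live data form
        else:
            start, end = window                                         # stored time series data form
            yield from scada_client.read_history(tag, start=start, end=end)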

The wearable utility assembly control module (326) facilitates the remotely based expert(s) to influence the components of the wearable utility assembly (200a). The wearable utility assembly control module (326) comprises at least one component selected from the group consisting of at least one illumination apparatus control unit (328), at least one specialized visual feed capturing module control unit (330) and at least one sensor-probe control unit (332). The illumination apparatus control unit (328) is adapted to facilitate the adjustment of the intensity of the illumination apparatus (220). The specialized visual feed capturing module control unit (330) is adapted to facilitate the adjustment of the focus and zooming function of the specialized visual feed capturing module (224). The sensor-probe control unit (332) is adapted to facilitate the prompting of at least one sensor-probe (216) to record data.

The expert interaction module (334) is adapted to facilitate interaction between the experts of the system (100) by carrying out at least one task selected from the group consisting of simultaneous viewing of the feed and data captured by the components of the wearable utility assembly (200a), simultaneous viewing of at least one media, communicating amongst one another, file sharing, real-time editing of shared files, audio-video conferencing, whiteboard conferencing, expert interaction recording, translating the audio of at least one expert participating in the interaction, controlling the electronic device(s) of other expert(s) participating in the interaction, sharing the screen(s) of other expert(s) participating in the interaction, accessing the database of third-party experts to identify at least one third-party expert (C) having expertise in the field of the issue, accessing the database of local unskilled operator-resources to identify at least one local unskilled operator-resource (D) to perform the role of the unskilled operator (A) and making payment. The term media used herein before includes photos, videos, audios and e-files. The task of making payment entails facilitating the making of payment to the third-party expert (C) or the local unskilled operator-resource (D), as necessary. The expert interaction module (334) facilitates the experts to resolve the issue as a joint effort, without leaving their locations, which may prove to be time and cost efficient. Further, the task of simultaneous viewing of the feed captured by the components of the wearable utility assembly (200a), specifically the head-mounted eyepiece (202), does not include showcasing the webcam feed of the remotely located expert(s); thereby significantly optimizing the bandwidth. Even further, the technology of the present disclosure facilitates the users to carry out the tasks of simultaneous viewing of the feed and data captured by the components of the wearable utility assembly (200a), communicating amongst one another and whiteboard conferencing simultaneously, by facilitating the three user interfaces to be shown together on the output unit of the backend electronic device (300b) of the backend setup (300).

The expert interaction data management module (336) is adapted to facilitate management of the data resulting from the expert interaction. The expert interaction data management module (336) comprises at least one component selected from the group consisting of at least one live feed manipulating unit (338), at least one visual data selection unit (340) and at least one audio data selection unit (342). The live feed manipulating unit (338) is adapted to facilitate the expert(s) to view and manipulate the live feed captured by the head-mounted eyepiece (202) and the specialized visual feed capturing unit (224) during expert interaction, while simultaneously backing up the unmanipulated live feed into the server apparatus (300a). The term 'manipulate' is to be interpreted as pausing, increasing the speed, decreasing the speed, annotating on, stopping and capturing and reviewing at least one pre-determined part of the feed. The annotated feed can be re-transmitted back to the unskilled operator (A) along with documents, instructions and work flows to instruct, guide and resolve the issue without the expert(s) being present at the area having issue.
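
By way of a non-limiting illustration only, the following Python sketch (with hypothetical names) shows how a live feed manipulating unit such as (338) could archive every incoming frame unmodified while serving a separately paused, slowed or annotated view to the expert(s):

    from collections import deque

    class LiveFeedManipulatingUnit:
        """Illustrative stand-in for the live feed manipulating unit (338)."""
        def __init__(self):
            self.archive = []            # unmanipulated backup kept on the server apparatus (300a)
            self.view = deque()          # frames currently shown to the expert(s)
            self.paused = False
            self.rate = 1.0              # playback speed of the expert-side view
            self.annotations = {}        # frame index -> annotation text

        def ingest(self, frame) -> None:
            self.archive.append(frame)   # always back up the raw feed
            if not self.paused:
                self.view.append(frame)

        def pause(self, paused: bool = True) -> None:
            self.paused = paused

        def set_rate(self, rate: float) -> None:
            self.rate = rate

        def annotate(self, frame_index: int, note: str) -> None:
            self.annotations[frame_index] = note   # the annotated feed can be re-transmitted to the operator (A)
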
The visual data selection unit (340) is adapted to facilitate the selection of at least a part of the visual data generated during the expert interaction and the conveyance of the selected data to the visual data projecting module (210) of the head-mounted eyepiece (202). In the cases where the expert(s) wish to explain or show certain technicalities to the unskilled operator (A), the technicalities are selected and sent to the assembly controller (214) and further to the eyepiece communicator (212) to be projected in front of the unskilled operator (A) via the head-mounted eyepiece (202). The audio data selection unit (342) is adapted to facilitate the selection of at least a part of the audio data generated during the expert interaction and the conveyance of the selected data to the audio data disseminating module (208) of the head-mounted eyepiece (202). In the cases where the expert(s) wish to convey some audio files to the unskilled operators (A) or wish to give verbal instructions to the unskilled operators (A), the audio data selection unit (342) facilitates the same.

The backend electronic device (300b) is adapted to be used by at least one in-house expert (B) and optionally by at least one third-party expert (C). The backend electronic device (300b) is a computing device selected from the group consisting of mobile phones, smart phones, laptop computer machines, desktop computer machines, tablet computer machines, wearable computing machines and personal digital assistants. The backend electronic device (300b) comprises at least one component selected from the group consisting of at least one backend device processor; at least one backend device bus; at least one backend device memory; at least one backend device communication module; at least one backend device photo capture module; at least one backend device video capture module; at least one backend device audio capture module; at least one backend device input module; at least one backend device output module and at least one backend device computer readable medium. The backend device computer readable medium comprises at least one online unit adapted to enable the expert(s) using the backend electronic device (300b) to execute the functionality of the expert interaction module (334), the expert interaction data management module (336) and the wearable utility assembly control module (326); and at least one offline unit adapted to enable the expert(s) using the backend electronic device (300b) to execute the functionality of the user data management module (312) and the control systems data procurement module (322).
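
By way of a non-limiting illustration only, the following Python sketch (with hypothetical action names) shows how a backend electronic device (300b) could route an expert's action to either its online unit or its offline unit:

    ONLINE_ACTIONS = {"expert_interaction", "expert_interaction_data_management",
                      "wearable_utility_assembly_control"}
    OFFLINE_ACTIONS = {"user_data_management", "control_systems_data_procurement"}

    def route_action(action: str, payload: dict, online_unit, offline_unit):
        if action in ONLINE_ACTIONS:
            return online_unit.execute(action, payload)     # requires a live session with the kit (200)
        if action in OFFLINE_ACTIONS:
            return offline_unit.execute(action, payload)    # works against server-side records
        raise ValueError(f"unknown action: {action}")

Here the execute() methods of the two units are assumed interfaces used only to illustrate the online/offline split described above.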

The present disclosure, in another aspect, provides a method for remote issue resolution (distinctive characteristics highlighted in Figure 7). The method comprises at least one step selected from the group consisting of virtually transporting the in-house expert(s) (B), optionally accompanied by the third-party expert(s) (C), to the area(s) having issue via the interface of the unskilled operator(s) (A) and resolving the issue by the in-house expert(s) (B), optionally accompanied by the third-party expert(s) (C), with the intervention from the unskilled operator(s) (A). Typically, the method of the present disclosure is carried out by means of the afore-described system (100) of issue resolution comprising the backend setup (300) and the kit (200) in communicable connection over at least one computer network (400). The backend setup (300) comprises at least one component selected from the group consisting of at least one remote issue resolution server apparatus (300a) and at least one backend electronic device (300b). The kit (200) comprises at least one component selected from the group consisting of at least one wearable utility assembly (200a) and at least one companion device (200b).

As a part of the method, editable profile(s) of the user(s) of the system (100) are created and maintained by means of the user profile unit of the user data management module (312) of the remote issue resolution server apparatus (300a) and the offline unit of the backend electronic device (300b). The users are further subscribed onto the present system (100) by means of the subscription unit of the user data management module (312) of the remote issue resolution server apparatus (300a) and the offline unit of the backend electronic device (300b). Database(s) of third-party experts belonging to a plurality of technical domains are also created by means of the third-party expert unit (314) of the user data management module (312) of the remote issue resolution server apparatus (300a) and the offline unit of the backend electronic device (300b). In case the in-house expert (B) is not equipped with the knowledge to resolve the issue, the database of third-party experts can be referred to and taken on board in the effort of issue resolution. The third-party experts (C) are experts belonging to multifarious technical domains such as engineering, legal, insurance and the like. Similarly, database(s) of local resources capable of carrying out the role of the unskilled operator (A) are created by means of the local unskilled operator-resource unit (316) of the user data management module (312) of the remote issue resolution server apparatus (300a) and the offline unit of the backend electronic device (300b). In cases where unskilled operators (A) are not available at the area having issue, the database(s) of local resources capable of carrying out the role of the unskilled operator (A) can be referred to and the identified local unskilled operator-resource (D) can be taken on board.

In cases where an unskilled operator (A) is unavailable at the area having issue, the step of virtually transporting the remotely based expert(s) is preceded by the steps of accessing, by the expert(s), the database of local resources to identify at least one local unskilled operator-resource (D) for performing the role of the unskilled operator (A), by means of at least one component selected from the group consisting of the offline unit of the backend electronic device (300b) and the local unskilled operator-resource unit (316) of the user data management module (312) of the remote issue resolution server apparatus (300a). The identified local resource is then assigned the role of local unskilled operator-resource (D) and provided with the kit (200) of the present disclosure.

The step of virtually transporting the remotely located expert(s) to the area having issue is carried out by means of the backend setup (300), via the interface of the unskilled operator (A) carrying the kit (200) designed to be mounted on the person of the unskilled operator (A). The remote issue resolution server apparatus (300a), the backend electronic device(s) (300b) and the kit (200) work together to facilitate the remotely located expert(s) to be virtually present at the area having issue at any given time. The step of virtually transporting comprises at least one task selected from the group consisting of capturing the visual feed of the surroundings of the unskilled operator (A) and relaying the visual feed to the remotely located expert(s); and capturing the audio feed of the surroundings of the unskilled operator (A) and relaying the audio feed to the remotely located expert(s). A cumulative effect of the two afore-mentioned tasks causes the remotely located expert(s) to be virtually present at the area having issue. The step of capturing the visual feed is carried out using the visual feed capturing module (204) of the head-mounted eyepiece (202) of the wearable utility assembly (200a) and the step of relaying the visual feed to the remotely located expert(s) is carried out by means of at least one component selected from the group consisting of the eyepiece communicator (212) of the head-mounted eyepiece (202), the assembly controller (214) of the wearable utility assembly (200a), the expert interaction module (334) of the remote issue resolution apparatus (300a), the live feed manipulating unit (338) of the expert interaction data management module (336) of the remote issue resolution apparatus (300a) and the online unit of the backend electronic device (300b). The step of capturing the audio feed is carried out using the audio feed capturing module (206) of the head-mounted eyepiece (202) of the wearable utility assembly (200a) and the step of relaying the audio feed to the remotely located expert(s) is carried out by means of at least one component selected from the group consisting of the eyepiece communicator (212) of the head-mounted eyepiece (202), the assembly controller (214) of the wearable utility assembly (200a), the live feed manipulating unit (338) of the expert interaction data management module (336) of the remote issue resolution apparatus (300a) and the online unit of the backend electronic device (300b).
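
By way of a non-limiting illustration only, the following Python sketch shows the capture-and-relay loop underlying the step of virtually transporting the expert(s); capture_video, capture_audio and relay are assumed callables standing in for the eyepiece modules and the assembly controller, not disclosed interfaces:

    import itertools

    def virtual_transport(capture_video, capture_audio, relay, max_steps=None):
        """Continuously relay the operator's audio-visual surroundings to the backend."""
        steps = itertools.count() if max_steps is None else range(max_steps)
        for _ in steps:
            frame = capture_video()                        # visual feed of the operator's surroundings
            chunk = capture_audio()                        # audio feed of the operator's surroundings
            relay({"video": frame, "audio": chunk})        # relayed towards the remotely located expert(s)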

The step of resolving the issue by the in-house expert (B), optionally accompanied by the third-party expert (C), is carried out by means of the backend setup (300) with the intervention from the unskilled operator (A) carrying the kit (200). The step of resolving the issue comprises at least one task selected from the group consisting of obtaining data from control system(s) (324) associated with the area having issue, facilitating the remotely located expert(s) to influence the components of the wearable utility assembly (200a), facilitating interaction between the remotely located expert(s) of the system (100), optionally, accessing the database of third-party experts to identify and include in the interaction at least one suitable third-party expert (C), facilitating the selection of at least a part of the visual data generated during the interaction and conveying the pre-selected visual data to the unskilled operator (A), facilitating the selection of at least a part of the audio data generated during the interaction and conveying the pre-selected audio data to the unskilled operator (A), capturing the changing visual feed of the surroundings of the unskilled operator (A) and relaying said visual feed to the remotely located expert(s) and capturing the changing audio feed of the surroundings of the unskilled operator (A) and relaying said audio feed to the remotely located expert(s). A cumulative effect of the afore-mentioned tasks, carried out repeatedly and in a variable sequence until the issue is resolved, causes the remotely located expert(s) to resolve the issue.

The task of obtaining data from the control system(s) (324) associated with the area having issue is carried out by means of the control systems data procurement module (322) of the remote issue resolution server apparatus (300a) and the offline unit of the backend electronic device (300b). The data procured is in at least one form selected from the group consisting of live data form and stored time series data form. The carrying out of the afore-mentioned task ensures that the remotely located expert(s) have access to a wide spectrum of data relating to the area having issue; thereby ensuring that the remotely located expert(s) resolve the issue in a comprehensive way.

The task of facilitating the remotely located expert(s) to influence the components of the wearable utility assembly (200a) is a characteristic of the present method. As the operator (A) present at the area having issue is unskilled, the remotely located expert(s) have unfettered access to the components of the wearable utility assembly (200a) for gathering the maximum possible data relating to the area having issue; thereby further ensuring that the issue resolution happens in a comprehensive way. The task of facilitating the remotely located expert(s) to influence the components of the wearable utility assembly (200a) comprises at least one sub-task selected from the group consisting of adjusting the intensity of the illumination apparatus (220) of the wearable utility assembly (200a), adjusting the focus and zooming function of the specialized visual feed capturing module (224) of the wearable utility assembly (200a) and prompting the sensor-probe (216) of the wearable utility assembly (200a) to record data of the area having issue. The sub-task of adjusting the intensity of the illumination apparatus (220) of the wearable utility assembly (200a) is carried out by means of at least one component selected from the group consisting of the illumination apparatus control unit (328) of the wearable utility assembly control module (326) of the remote issue resolution server apparatus (300a), the assembly controller (214) of the wearable utility assembly (200a) and the online unit of the backend electronic device (300b). The sub-task of adjusting the focus and zooming function of the specialized visual feed capturing module (224) of the wearable utility assembly (200a) is carried out by means of the specialized visual feed capturing module control unit (330) of the wearable utility assembly control module (326) of the remote issue resolution server apparatus (300a), the assembly controller (214) of the wearable utility assembly (200a) and the online unit of the backend electronic device (300b). The sub-task of prompting the sensor-probe (216) of the wearable utility assembly (200a) to record data of the area having issue is carried out by means of at least one component selected from the group consisting of the sensor-probe controller (218) of the wearable utility assembly (200a), the assembly controller (214) of the wearable utility assembly (200a), the sensor-probe control unit (332) of the wearable utility assembly control module (326) of the remote issue resolution server apparatus (300a) and the online unit of the backend electronic device (300b), the data being in at least one form selected from the group consisting of live data form and stored time series data form.

The task of facilitating interaction between the remotely located expert(s) is carried out by means of at least one sub-task selected from the group consisting of simultaneous viewing of the feed and data generated by the components of the wearable utility assembly (200a); simultaneous viewing of at least one media; manipulating the feed while simultaneously backing up the unmanipulated live feed into the server apparatus (300a); communicating; file sharing; real-time editing of shared files; audio-video conferencing; whiteboard conferencing; expert interaction recording; translating the audio of at least one expert participating in the interaction; controlling the electronic device(s) of other expert(s) participating in the interaction; sharing the screen(s) of other expert(s) participating in the interaction and making payment. The afore-mentioned sub-tasks are carried out by means of at least one component selected from the group consisting of the assembly controller (214) of the wearable utility assembly (200a), the expert interaction module (334) of the remote issue resolution server apparatus (300a), the expert interaction data management module (336) of the remote issue resolution server apparatus (300a), the live feed manipulating unit (338) of the expert interaction data management module (336) of the remote issue resolution server apparatus (300a) and the online unit of the backend electronic device (300b). The task of making payment entails facilitating the making of payment to the third-party expert (C) or the local unskilled operator-resource (D), as necessary. As a virtue of the afore-mentioned task, a plurality of experts located in distant geographies can collaborate with one another to diagnose and resolve the issue in an efficient and comprehensive way.

Further, the technique of collaboration of the present method is associated with unique technical advancements specified herein after. The task of manipulating the feed while simultaneously backing up the unmanipulated live feed into the server apparatus (300a) allows the users to pause, increase the speed, decrease the speed, annotate on, stop and capture and review at least one pre-determined part of the feed as per their requirement, without wiping out the unmanipulated live feed, which remains available for further reference. The task of simultaneous viewing of the feed captured by the components of the wearable utility assembly (200a), specifically the head-mounted eyepiece (202), does not include showcasing the webcam feed of the remotely located expert(s); thereby significantly optimizing the bandwidth. The technology of the present disclosure facilitates the users to carry out the tasks of simultaneous viewing of the feed and data captured by the components of the wearable utility assembly (200a), communicating amongst one another and whiteboard conferencing simultaneously, by facilitating the three user interfaces to be shown together on the output unit of the backend electronic device (300b) of the backend setup (300); thereby cumulatively making the present method time and cost efficient.
In case the in-house expert (B) is not equipped to resolve the issue at hand, the present method includes an additional step of identifying and including at least one suitable third-party expert (C) in the effort of issue resolution, by means of at least one component selected from the group consisting of the expert interaction module (334) of the remote issue resolution server apparatus (300a), the third-party expert unit (314) of the user data management module (312) of the remote issue resolution server apparatus (300a), the offline unit of the backend electronic device (300b) and the online unit of the backend electronic device (300b). With the contribution of a third-party expert (C) having specialized technical training, the interaction session can be more fruitful, and the issue resolution can be faster. The task of facilitating the selection of at least a part of the visual data generated during expert interaction and conveying the pre-selected data to at least one visual data projecting module (210) of the head-mounted eyepiece (202), is carried out by means of at least one component selected from the group consisting of the assembly controller (214) of the wearable utility assembly (200a), the visual data selection unit (340) of the expert interaction data management module (336) of the remote issue resolution server apparatus (300a), the expert interaction module (334) and the online unit of the backend electronic device (300b). In the cases where the remotely located expert(s) wish to explain or show certain technicalities to the unskilled operator (A), the technicalities can be selected and sent to the assembly controller (214) and further to the eyepiece communicator (212) to be projected in front of the unskilled operator (A) via the head-mounted eyepiece (202). The task of facilitating the selection of at least a part of the audio data generated during expert interaction and conveying the pre-selected data to at least one audio data disseminating module (208) of the head-mounted eyepiece (202), is carried out by means of at least one component selected from the group consisting of the assembly controller (214) of the wearable utility assembly (200a), the audio data selection unit (342) of the expert interaction data management module (336) of the remote issue resolution server apparatus (300a), the expert interaction module (334) and the online unit of the backend electronic device (300b). In the cases where the remotely located expert(s) wish to convey some audio files to the unskilled operators (A) or wish to give verbal instructions to the unskilled operators (A), the audio data selection unit (342) facilitates the same.

The task of capturing the changing visual feed of the surroundings of the unskilled operator (A) is carried out using the visual feed capturing module (204) of the head-mounted eyepiece (202) of the wearable utility assembly (200a) and that of relaying the visual feed to the expert(s), is carried out by means of at least one component selected from the group consisting of the eyepiece communicator (212) of the head-mounted eyepiece (202), the assembly controller (214) of the wearable utility assembly (200a), the expert interaction module (334) of the remote issue resolution server apparatus (300a), the live feed manipulating unit (338) of the expert interaction data management module (336) of the remote issue resolution apparatus (300a) and the online unit of the backend electronic device (300b). Thus, by means of the aforementioned task, the changing and updated visual feed of the area having issue is made available to the remotely located expert(s) for further manipulations and diagnosis.

The task of capturing the changing audio feed of the surroundings of the unskilled operator (A) is carried out using the audio feed capturing module (206) of the head-mounted eyepiece (202) of the wearable utility assembly (200a) and that of relaying the audio feed to the expert(s), by means of at least one component selected from the group consisting of the eyepiece communicator (212) of the head-mounted eyepiece (202), the assembly controller (214) of the wearable utility assembly (200a), the live feed manipulating unit (338) of the expert interaction data management module (336) of the remote issue resolution apparatus (300a) and the online unit of the backend electronic device (300b). Thus, by means of the afore-mentioned task, the changing and updated audio feed of the area having issue is made available to the remotely located expert(s) for further manipulations and diagnosis.

Further, the unskilled operator (A) may manipulate the audio relayed by the audio data disseminating unit (208) of the head-mounted eyepiece (202) by means of at least one eyepiece control module of at least one companion computer readable medium of said companion device (200b), and may manipulate the visual data relayed by the visual data projecting unit (210) of the head-mounted eyepiece (202) by means of said eyepiece control module of said companion computer readable medium of said companion device (200b), depending upon the unskilled operator's (A) requirement.

The method of the present disclosure is thus specifically designed to address the constraints involved in making experts available at the area having issue at the right time. Assuming that an expert would not always be available at the area having issue, the method not only enables remotely located expert(s) to be virtually present at the area having issue at any given time, but also empowers them to diagnose and resolve the issue in a fast as well as comprehensive manner via the interface of the unskilled operator (A). The present method further enables onboarding third-party experts (C) and local unskilled operator-resources (D), as need be, thereby saving significant cost and time involved in the effort of issue resolution.

In accordance with yet another aspect, the present disclosure provides a kit (200) (illustrated in Figure 1) for remote issue resolution comprising the wearable utility assembly (200a) adapted to be mounted on the person of the unskilled operator (A) and at least one companion device (200b) adapted to function as the user interface for the components of the wearable utility assembly (200a). By means of the kit (200), the system (100) of the present disclosure facilitates the capturing of a wide range of particulars about the area having issue, to aid the remotely located expert(s) to better resolve the issue. As the components of the kit (200) are under the control of the remotely located expert(s), characteristically, unlike most of the popular solutions in this area, the system (100) and method of the present disclosure do not require the manpower at the area of issue to be skilled in the technical domain of the issue. The unskilled operator (A) merely acts as a puppet and follows the instructions of the remotely located expert(s). Thus, the present system (100) and method work well, and indeed are specifically designed, to cater to the situation of skilled manpower not being available at the area having issue to coordinate with the remotely located expert(s). By providing the afore-mentioned technology, the inventor of the present disclosure aims at decreasing the down-time of the machines and services and saving companies and businesses from financial losses.

The system (100), method and kit (200) of the present disclosure are thus advantageous as they facilitate efficient and interactive knowledge transfer, backed by virtual presence at any required location, enabling the expert(s) to remotely see and hear the issue to be resolved and to act fast by guiding the on-site person to take the required actions without being personally present at the location.

The system (100), method and kit (200) of the present disclosure find applications in multifarious areas such as Remote Manufacturing, Remote Project Management, Warehousing and Logistics, Remote Insurance Claim Settlements to save idle time of vehicles on roads, Last Mile Service Providers and the like. Characteristically, the system, method and kit of the present disclosure can be used in any situation where an issue exists, but an expert to resolve the issue is not present nearby.

Representative embodiment 1:

Issue:

Company X supplied a complete Bar Rolling Mill in Yemen in the year 2010. The cost of the project was 200 million euros. The mill was erected and was ready for commissioning by the end of 2011. The equipment supplied by company X was specialized and needed experts to start the mill and deliver the contractual performance. This required company X to depute 6 of its experts, along with a team of engineers, at the site in Yemen to start the mill. The duration of such an operation was to be around 2 to 3 months.

Civil war broke out in Yemen by mid-2011 and company X had to issue travel restrictions to its staff travelling to Yemen. The mill remained uncommissioned until 2017.

Issue resolution using conventional means:

Company X was unsuccessful in resolving the issue completely. Consequently, the investment came to a standstill and there was a huge loss of money. The investors who had previously invested in the mill went bankrupt as company X could not commission the mill.

Issue resolution using the present system and method:

With the system and method of the present disclosure, company X used the customer's local unskilled workforce and commissioned the mill without sending its experts and team from Germany to Yemen. The system of the present disclosure facilitated the company X experts to be virtually present at the mill, without leaving their office desktops. Further, a local team of unskilled personnel acted as unskilled operators and provided a live feed of the mill area, along with the data captured by the wearable utility components and control systems, which helped the experts to solve the problem remotely and to deliver the required performance by the end of March 2012.

Time efficiency: The system and method of the present disclosure resolved the issue in 1 year; whereas the conventional techniques were not able to resolve the issue even in 6 years.

Economic significance: Monetary loss of 25 million dollars and business value loss of 100 million dollars could have been avoided.

Representative embodiment 2:

Issue:

Company A purchased a Wire Rod Mill from Company B, Sweden, in 2008. The mill was erected and commissioned in India by the end of 2009. After about 3 years of operation, in 2012, a unique problem started in the finished product manufactured through this mill that needed an expert from the machine supplier to understand and resolve. The problem was related to the quality of the material produced, which caused a reduction in the selling value of the material, in goodwill with the clients and in customer satisfaction. Since this was a running line, it was important that the problem be solved on an urgent basis to avoid devaluation, loss of revenue and loss of goodwill.

The expert qualified to resolve the problem was not available for 45 days due to his prior commitments. The expert travelled business class, stayed in 5-star hotels and charged 900 euros per day as consultation charges. No other expert was available as there had been attrition in the staff due to the recession. After his arrival, the expert resolved the issue in around 3 hours by looking into the process and performance, listening to the various noise levels of the machine and checking various process parameters captured by the automation system.

Issue resolution using conventional means:

Company A had to spend hundreds of thousands of euros to get the issue resolved, not to mention the loss of goodwill with their customers. The rebuilding of the company's credentials took a lot of time and money.

How the present system and method would have helped avert the crisis: With the system and method of the present disclosure, the expert could have been virtually transported to the area of issue to collect, see and hear the required symptoms and could have guided the skilled/unskilled manpower present at the area to take the corrective steps in less than 3 hours.

Company A would have got its process reinstated in the shortest span of time. Company A would have avoided financial losses and loss of goodwill in the market. Company A would also have saved the hefty expenditure of getting the expert to the area having issue.

Time efficiency: The system and method of the present disclosure could have resolved the issue in under 3 hours, as opposed to the couple of months that were required to resolve the issue.

Economic significance: Monetary loss of INR 5 crore and business value loss of INR 50 crore could have been avoided.

The embodiments described herein above are non-limiting. The foregoing descriptive matter is to be interpreted merely as an illustration of the concept of the proposed invention and it is in no way to be construed as a limitation. Description of terminologies, concepts and processes known to persons acquainted with technology has been avoided to preclude beclouding of the afore-stated embodiments.

TECHNICAL ADVANTAGES AND ECONOMIC SIGNIFICANCE

The technical advantages and economic significance of the system (100), method and kit (200) for remote issue resolution include but are not limited to:

• precludes the need for the expert to be at the area having issue;

• offers a system for identifying and seeking help from third-party experts for effecting issue resolution;

• offers a system for identifying and allotting local unskilled resources in case of unavailability of manpower;

• reduces downtime by right escalation and control;

• improves productivity by getting guidance from experts and increasing the value and growth of business;

• facilitates creation of a local work force for effecting employment creation and reduction in access time;

• creates highest value in the business.

The foregoing objects of the invention are accomplished, and the problems and shortcomings associated with prior art techniques and approaches are overcome by the proposed invention described in the present embodiment. Detailed descriptions of the preferred embodiment are provided herein; however, it is to be understood that the proposed invention may be embodied in various forms. Therefore, specific details disclosed herein are not to be interpreted as limiting, but rather as a basis for the claims and as a representative basis for teaching one skilled in the art to employ the proposed invention in virtually any appropriately detailed system, structure, or matter.

The embodiments of the invention as described above, and the processes disclosed herein will suggest further modification and alterations to those skilled in the art. Such further modifications and alterations may be made without departing from the scope of the invention.