


Title:
LOCATING INDUSTRIAL ASSETS USING VIRTUAL THREE-DIMENSIONAL ENVIRONMENT
Document Type and Number:
WIPO Patent Application WO/2022/156991
Kind Code:
A1
Abstract:
A method and system of locating industrial assets using a virtual three-dimensional environment is disclosed. In one embodiment, the method includes storing a plurality of position coordinates of at least one device (104A) in an industrial plant (112). Then, the method comprises generating a three-dimensional model of the industrial plant (112). The method comprises determining a first location in the generated three-dimensional model, which maps to the physical location of the at least one device (104A). The method further comprises controlling a display device (206) to display a first marker at the first location in the generated three-dimensional model. The first marker at the first location of the generated three-dimensional model is indicative of the physical location of the at least one device (104A) in the industrial plant (112).

Inventors:
DAS SOUMADEEP (IN)
NAHA JAYDEEP (IN)
WEUSTINK JAN (DE)
Application Number:
PCT/EP2021/087320
Publication Date:
July 28, 2022
Filing Date:
December 22, 2021
Assignee:
SIEMENS ENERGY GLOBAL GMBH & CO KG (DE)
International Classes:
G05B23/02
Foreign References:
US 2020/0175765 A1 (2020-06-04)
US 2018/0130260 A1 (2018-05-10)

Claims

1. A method (600) of locating industrial assets using a virtual three-dimensional environment, the method comprising: i) storing (606), by a processor (202) in a memory (204), a plurality of position coordinates of at least one device (104A and 104B) deployable in the industrial plant (112), wherein the plurality of position coordinates are indicative of a physical location of the at least one device (104A and 104B); ii) generating (612), by the processor (202), a three-dimensional model (114C, 306) of the industrial plant (112); iii) determining (614), by the processor (202), a first location in the generated three-dimensional model of the industrial plant (112), which maps to the physical location of the at least one device (104A and 104B); iv) receiving (616), by the processor (202), from the at least one device (104A and 104B), dynamic process data associated with at least one process executed in the industrial plant (112); and v) controlling (618), by the processor (202), a display device (206) to display: a. a first marker (114A and 114B) at the first location in the generated three-dimensional model, wherein the first marker (114A and 114B) at the first location of the generated three-dimensional model is indicative of the physical location of the at least one device (104A and 104B) in the industrial plant (112); and b. the received dynamic process data captured by the at least one device (104A and 104B).

2. The method (600) according to claim 1, further comprising: receiving (602), by the processor (202), from a depth sensor (302A), depth data associated with at least one portion of the industrial plant (112), wherein the depth data is captured by the depth sensor (302A) from the at least one portion of the industrial plant (112); and receiving (602), by the processor (202), from an image capture device (302B), at least two images of the at least one portion of the industrial plant (112), wherein the at least two images are captured by the image capture device (302B) from the at least one portion of the industrial plant (112).

3. The method (600) according to claim 2, further comprising stitching (608), by the processor (202), the at least two images to generate a stitched image, wherein the three-dimensional model of the industrial plant (112) is generated based on the stitched image and the received depth data.

4. The method (600) according to claim 1, further comprising receiving (610), by the processor (202), a Computer-Aided-Design based model of the industrial plant (112), wherein the Computer-Aided-Design based model comprises a plurality of physical attributes of the at least one device (104A and 104B), and the three-dimensional model of the industrial plant (112) is generated based on the received Computer-Aided-Design based model of the industrial plant (112) and the plurality of physical attributes of the at least one device (104A and 104B) of the industrial plant (112).

5. The method (600) according to claim 1, further comprising: determining (622), by the processor (202), a current location of a user in the industrial plant (112); determining (624), by the processor (202), a second location within the three-dimensional model, which maps to the current location of the user; generating (626), by the processor (202), a navigation path from the second location to the first location associated with the at least one device (104A and 104B); and controlling (628), by the processor (202), the display device (206) to display directions for the user to travel from the current location to the physical location of the at least one device (104A and 104B), based on the generated navigation path.

6. The method (600) according to claim 1, further comprising: determining (640), by the processor (202), a severity level associated with a current event in the industrial plant (112) based on the received dynamic process data; and controlling (640), by the processor (202), the display device (206) to display information associated with the severity level associated with the current event.

7. The method (600) according to claim 1, further comprising: predicting (642), by the processor (202), an upcoming event in the industrial plant (112), based on an analysis of the received dynamic process data; and controlling (644), by the processor (202), the display device (206) to display information associated with the predicted upcoming event in the industrial plant (112).

8. The method (600) according to claim 1, further comprising: detecting (646), by the processor (202), a current location of a vehicle (502); determining (648), by the processor (202), a second location within the generated three-dimensional model, which maps to the current location of the vehicle (502); generating (650), by the processor (202), a navigation path (504) from the second location to the first location of the at least one device (104A and 104B), based on the detection of the current location of the vehicle (502); and controlling (652), by the processor (202), the vehicle (502) to travel from the current location of the vehicle (502) to the physical location of the at least one device (104A and 104B), based on the generated navigation path (504).

9. The method (600) according to claim 1, further comprising: receiving (630), by the processor (202) from an electronic device (106), an image (118C) of a portion of the industrial plant (112), wherein the image (118C) is captured by the electronic device (106); mapping (632), by the processor (202), a portion of the three-dimensional model to the portion of the industrial plant (112); determining (634), by the processor (202), that the first location in the three-dimensional model is present within the mapped portion of the three-dimensional model; and superposing (636), by the processor (202), a second marker (118A and 118B) on the received image (118C), based on the determination.

10. A non-transitory computer-readable medium encoded with computer-readable instructions thereon that, when executed by a computer, causes the computer to perform a method (600) comprising: i) storing (606), by a processor (202) in a memory (204), a plurality of position coordinates of at least one device (104A and 104B) in an industrial plant (112), wherein the plurality of position coordinates is indicative of a physical location of the at least one device (104A and 104B); ii) generating (612), by the processor (202), a three-dimensional model of the industrial plant (112); iii) determining (614), by the processor (202), a location in the generated three-dimensional model, which maps to the physical location of the at least one device (104A and 104B); iv) receiving (616), by the processor (202), from the at least one device (104A and 104B), dynamic process data associated with at least one process executed by the industrial plant (112); and v) controlling (618), by the processor (202), a display device (206) to display: a. a marker (114A) at the location in the generated three-dimensional model, wherein the marker at the location of the generated three-dimensional model is indicative of the physical location of the at least one device (104A and 104B) in the industrial plant (112); and b. the received dynamic process data captured by the at least one device (104A and 104B).

11. A system for locating industrial assets using a virtual three-dimensional environment, the system comprising: i) a processor (202); and ii) a memory (204) coupled to the processor, wherein the memory (204) comprises machine readable instructions, that when executed by the processor (202), cause the processor (202) to execute method steps claimed in any of the claims 1 to 9.

Description

LOCATING INDUSTRIAL ASSETS USING VIRTUAL THREE-DIMENSIONAL ENVIRONMENT

The present disclosure relates to the field of industrial automation systems, and more particularly relates to a system and method of locating industrial assets using a virtual three-dimensional environment.

Recent years have witnessed a rise in the number of ways of monitoring process data in an industrial plant. The process data, which may be associated with at least one process executed in the industrial plant, may be captured by one or more sensor devices located inside the industrial plant. The one or more sensor devices may detect, based on the captured process data, a fault which has occurred in the industrial plant.

When the fault is detected, a technician may have to reach a location of the detected fault in order to check the severity of the fault and take action. However, the industrial plant may be dispersed over a vast geographical area. Therefore, the technician may face difficulty in locating a source of the detected fault in the industrial plant. Conventionally, the technician may use a paper-based navigation tool, such as a map, to locate the one or more sensor devices which may have detected the fault, and thereby locate the source of the detected fault. In some cases, the technician may use a radio device to receive directions from another person to reach the source of the detected fault. Locating the detected fault with either the paper-based navigation tool or the radio device may be a labor-intensive and time-consuming process. Furthermore, any time delay in locating the detected fault may cause the industrial plant to shut down. In some other cases, the detected fault may cause a health hazard in the industrial plant. In such cases, the technician may have to be warned to evacuate the industrial plant immediately. Any time delay in warning the technician may potentially lead to loss of life and property.

Further limitations and disadvantages of conventional and traditional approaches will become apparent to one of skill in the art, through comparison of described systems with some aspects of the present disclosure, as set forth in the remainder of the present application and with reference to the drawings.

In light of the above, there is a need for a method and system for locating industrial assets using a virtual three-dimensional environment associated with the industrial plant.

Therefore, it is the object of the present disclosure to provide a system and method of locating industrial assets using a virtual environment, substantially as shown in and/or described in connection with at least one of the figures, as set forth more completely in the claims. This object is solved by a method of locating industrial assets using a virtual three-dimensional environment, a non-transitory computer-readable medium encoded with computer-readable instructions, and a system for locating industrial assets using a virtual three-dimensional environment.

An aspect of the present disclosure is achieved by a method of locating industrial assets using a virtual three-dimensional environment. The method comprises storing, by a processor in a memory, a plurality of position coordinates of at least one device in an industrial plant. The processor may be one of an X86-based processor, a Reduced Instruction Set Computing (RISC) processor, an Application-Specific Integrated Circuit (ASIC) processor, a Complex Instruction Set Computing (CISC) processor, and/or other processors. Examples of implementation of the memory may include, but are not limited to, Random Access Memory (RAM), Read Only Memory (ROM), Electrically Erasable Programmable Read-Only Memory (EEPROM), a Hard Disk Drive (HDD), a Solid-State Drive (SSD), a CPU cache, and/or a Secure Digital (SD) card. The industrial plant may be at least one of a power plant, a factory, or a warehouse. The at least one device may be at least one of a sensor device, an actuator device, or a status indicator device. The plurality of position coordinates is indicative of a physical location of the at least one device. The plurality of position coordinates may include at least one of Global Positioning System (GPS) based coordinates or Global Navigation Satellite System (GLONASS) based coordinates of the physical location of the at least one device.
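
By way of illustration only, the stored position coordinates may be organised as a simple device registry keyed by an asset tag. The Python sketch below is a minimal, non-limiting example; the class, field names, and coordinate values are invented for the illustration and are not part of the disclosure.

```python
from dataclasses import dataclass

@dataclass
class DeviceRecord:
    """Hypothetical record for one device (e.g. 104A) in the plant (112)."""
    tag_id: str         # asset tag identifier
    latitude: float     # GPS/GLONASS latitude of the physical location
    longitude: float    # GPS/GLONASS longitude of the physical location
    elevation_m: float  # height above an assumed plant reference datum

# In-memory registry standing in for the memory (204); values are invented.
device_registry = {
    "104A": DeviceRecord("104A", 52.3905, 13.0645, 4.2),
    "104B": DeviceRecord("104B", 52.3911, 13.0652, 7.8),
}
```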

The method may comprise generating, by the processor, a three-dimensional model of the industrial plant. Advantageously, the generated three-dimensional model of the industrial plant may be useful to a user as an indoor navigation guide within the industrial plant. Advantageously, the generated three-dimensional model of the industrial plant may be used by the user as a virtual training simulator for training new workers.

The method may comprise determining, by the processor, a first location in the generated three-dimensional model of the industrial plant, which maps to the physical location of the at least one device. The method may further comprise receiving, by the processor, from the at least one device, dynamic process data associated with at least one process executed by the industrial plant. Examples of the at least one process may comprise a manufacturing process, a chemical process, a thermal process, an additive process, and an electrical process. Examples of the dynamic process data may include temperature data, pressure data, throughput data, voltage data, and reaction rate data.

The method may further comprise controlling, by the processor, a display device to display a first marker at the first location in the generated three-dimensional model and the received dynamic process data captured by the at least one device. The first marker at the first location of the generated three-dimensional model is indicative of the physical location of the at least one device in the industrial plant. The first marker may be a visual marker, a color-based marker, or an animation-based marker. Advantageously, when the captured dynamic process data and the first marker are presented to a user in the industrial plant, the reliance of the user on paper-based navigation tools, such as maps, may be minimized. Therefore, the user may easily locate the at least one device in the industrial plant. Consequently, the amount of labor and time required to maintain the industrial plant may be minimized. Advantageously, the display of the captured dynamic process data may help the user to perform remote operation and maintenance of the industrial plant.

The method may comprise receiving, by the processor, from a depth sensor, depth data associated with at least one portion of the industrial plant. The depth sensor may be a laser-based depth sensor, an ultrasound-based depth sensor, an image-based depth sensor, or an infrared-light-based depth sensor. The depth data is captured by the depth sensor from the at least one portion of the industrial plant. The method may further comprise receiving, by the processor from an image capture device, at least two images of the at least one portion of the industrial plant. The at least two images are captured by the image capture device from the at least one portion of the industrial plant. The depth sensor and the image capture device may be comprised in a hand-held device, such as a smartphone, or in an autonomous vehicle, such as an unmanned aerial vehicle (UAV). The method may further comprise stitching, by the processor, the at least two images to generate a stitched image. The three-dimensional model of the industrial plant is generated further based on the stitched image and the received depth data. In one example, the processor may generate the three-dimensional model of the industrial plant using image data and the depth data received from the hand-held electronic device. In another example, the processor may generate the three-dimensional model of the industrial plant using the depth data and the image data received from the autonomous vehicle. Thus, advantageously, the processor may generate the three-dimensional model of the industrial plant even if a floor plan of the industrial plant is unavailable. Advantageously, the generated three-dimensional model may enable quick detection of assets within the industrial plant.
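
As a non-limiting illustration of the stitching step only, the following sketch uses OpenCV's high-level stitcher to combine two overlapping images of a plant portion. The file names are placeholders, and the subsequent fusion of the stitched image with the depth data into a three-dimensional model (for example by photogrammetry) is not shown here.

```python
import cv2

# Placeholder paths for two overlapping images captured by the image
# capture device from one portion of the industrial plant.
images = [cv2.imread("plant_view_1.jpg"), cv2.imread("plant_view_2.jpg")]

# High-level panorama stitcher; feature detection, matching and blending
# are handled internally by OpenCV.
stitcher = cv2.Stitcher_create()
status, stitched = stitcher.stitch(images)

if status == cv2.Stitcher_OK:
    cv2.imwrite("stitched_portion.jpg", stitched)
else:
    print(f"Stitching failed with status code {status}")

# The stitched image would then be combined with the depth data from the
# depth sensor (e.g. via photogrammetry) to generate the model.
```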

The method may comprise receiving, by the processor, a Computer-Aided-Design based model of the industrial plant. The Computer-Aided-Design based model may comprise a plurality of physical attributes of the at least one device. The three-dimensional model of the industrial plant is generated based on the received Computer-Aided-Design based model of the industrial plant and the plurality of physical attributes of the at least one device of the industrial plant. Advantageously, the processor is enabled to generate the three-dimensional model of the industrial plant in a case where the Computer-Aided-Design based model of the industrial plant is available.

The method may comprise determining, by the processor, a current location of a user in the industrial plant. The method may further comprise determining, by the processor, a second location within the three-dimensional model which maps to the current location of the user in the industrial plant. The method may comprise generating, by the processor, a navigation path from the second location to the first location of the at least one device. The method may comprise controlling, by the processor, the display device to display directions for the user to travel from the current location to the physical location of the at least one device, based on the generated navigation path. Advantageously, the user may be directed towards the physical location of the at least one device without use of the paper-based navigation tools. Advantageously, time delays incurred for the user to reach the physical location are minimized. The user may be able to promptly reach the physical location of the at least one device. Advantageously, a fault detected by the at least one device may be promptly rectified by the user.
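
One possible way to realise the navigation-path generation is a shortest-path search over a walkway graph of the plant. The sketch below uses the networkx library; the graph, node names, and edge weights are illustrative assumptions rather than data from the disclosure.

```python
import networkx as nx

# Hypothetical walkway graph of the plant: nodes are waypoints in the
# three-dimensional model, edge weights are walking distances in metres.
plant_graph = nx.Graph()
plant_graph.add_weighted_edges_from([
    ("gate", "hall_A", 40.0),
    ("hall_A", "hall_B", 25.0),
    ("hall_A", "device_104A", 18.0),
    ("hall_B", "device_104A", 30.0),
])

def navigation_path(user_node: str, device_node: str) -> list:
    """Waypoints from the user's mapped location (second location)
    to the device's mapped location (first location)."""
    return nx.shortest_path(plant_graph, source=user_node,
                            target=device_node, weight="weight")

print(navigation_path("gate", "device_104A"))
# ['gate', 'hall_A', 'device_104A'] -> rendered as on-screen directions
```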

The method may comprise determining, by the processor, a severity level of a current event in the industrial plant, based on the received dynamic process data. The current event may be an occurrence of an error, an occurrence of an anomaly, or an occurrence of an accident during the execution of the at least one process in the industrial plant. The severity level of the current event is at least one of an alert, a warning, and a fault associated with the current event. The severity level may further comprise health hazard information associated with the current event. The health hazard information may comprise information on whether the current event has caused a health hazard in the industrial plant. The method may further comprise controlling, by the processor, the display device to display information associated with the severity level. Thus, the severity level is presented to a user via the display device. Advantageously, the user is informed of the severity level of the current event without having to reach the physical location of the at least one device. Advantageously, the presentation of the health hazard information to the user may enable the user to avoid visiting the industrial plant in case the current event has caused the health hazard in the industrial plant. Thus, the presentation of the severity level may improve a health level and a safety level of the user. Advantageously, the presentation of the severity level may reduce a number of visits of the user to the industrial plant.
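
A minimal sketch of how a severity level could be derived from the dynamic process data is given below, assuming simple per-quantity thresholds. The threshold values and level names are illustrative only and are not taken from the disclosure.

```python
# Illustrative thresholds for a temperature reading (degrees Celsius).
ALERT_LIMIT = 250.0    # first severity level: alert
WARNING_LIMIT = 320.0  # second severity level: warning
FAULT_LIMIT = 370.0    # third severity level: fault

def severity_level(temperature_c: float) -> str:
    """Map one dynamic process reading to a severity level."""
    if temperature_c >= FAULT_LIMIT:
        return "fault"        # could be displayed e.g. as "XXX"
    if temperature_c >= WARNING_LIMIT:
        return "warning"      # could be displayed e.g. as "XX"
    if temperature_c >= ALERT_LIMIT:
        return "alert"        # could be displayed e.g. as "X"
    return "normal"

print(severity_level(380.0))  # -> "fault"
```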

The method may comprise predicting, by the processor, an upcoming event in the industrial plant, based on an analysis of the received dynamic process data. The predicted upcoming event may be a predicted occurrence of an error, a predicted occurrence of an anomaly, or a predicted occurrence of an accident during the execution of the at least one process in the industrial plant. The method may comprise controlling, by the processor, the display device to display information associated with the predicted upcoming event in the industrial plant. Advantageously, a technician may note the predicted upcoming event and take corrective measures to affect an outcome of the predicted upcoming event, without having to reach the physical location of the at least one device. Advantageously, the display of the information associated with the predicted upcoming event may enable the technician to plan maintenance operations before an occurrence of the predicted upcoming event. The display of the information associated with the predicted upcoming event may further enable the technician to improve utilization of various assets, resources, and equipment within the industrial plant.
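
The prediction of an upcoming event may, for instance, be based on a trend extrapolation of the received dynamic process data. The sketch below fits a linear trend with NumPy and estimates when a reading would cross a fault threshold; the sample values and threshold are invented, and this is a simplified stand-in for whatever analytics would actually be used.

```python
import numpy as np

# Hypothetical temperature samples (degrees Celsius) taken once per minute.
minutes = np.arange(10)
readings = np.array([310, 314, 317, 322, 325, 330, 333, 338, 341, 346])

# Fit a linear trend: reading ~= slope * minute + intercept.
slope, intercept = np.polyfit(minutes, readings, deg=1)

FAULT_LIMIT = 370.0
if slope > 0:
    minutes_to_fault = (FAULT_LIMIT - readings[-1]) / slope
    print(f"Fault threshold predicted in ~{minutes_to_fault:.1f} minutes")
else:
    print("No upcoming fault predicted from the current trend")
```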

The method may comprise detecting, by the processor, a current location of a vehicle. The vehicle may be an autonomous land-based vehicle, an autonomous air-based vehicle, or a robotic vehicle. The method may comprise determining, by the processor, a second location within the generated three-dimensional model of the industrial plant, which maps to the current location of the vehicle. The method may further comprise generating, by the processor, a navigation path from the second location to the first location of the at least one device, based on the detection of the current location of the vehicle. The method may further comprise controlling, by the processor, the vehicle to travel from the current location of the vehicle to the physical location of the at least one device, based on the generated navigation path. Advantageously, the vehicle may reach the physical location of the at least one device to perform automated maintenance of the industrial plant. Advantageously, the method may improve automated maintenance of the industrial plant.

The method may comprise receiving, by the processor from an electronic device, an image of a portion of the industrial plant, wherein the image is captured by the electronic device. The electronic device may be a laptop, a smartphone, a tablet computer, or an augmented reality based heads-up display (AR HUD). The method may comprise mapping, by the processor, a portion of the three-dimensional model to the portion of the industrial plant. The method may comprise determining, by the processor, that the first location in the three-dimensional model is present within the mapped portion of the three-dimensional model. The method may comprise superposing, by the processor, a second marker on the received image, based on the determination. Advantageously, the superposition of the second marker may enable the processor to provide an augmented reality-based experience to a user.
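
For the augmented-reality style superposition, a second marker can be drawn onto the received image once the device's first location has been projected to pixel coordinates. The OpenCV sketch below assumes the pixel position is already known; the projection itself (camera pose estimation) is omitted, and the file name and coordinates are placeholders.

```python
import cv2

# Image of a plant portion captured by the electronic device; path is a placeholder.
frame = cv2.imread("captured_portion.jpg")

# Assumed pixel coordinates of the first location after projecting the
# three-dimensional model into the camera view (not computed here).
marker_px = (420, 310)

# Superpose a second marker and an illustrative severity label on the image.
cv2.circle(frame, marker_px, 18, (0, 0, 255), 3)
cv2.putText(frame, "XXX", (marker_px[0] + 24, marker_px[1]),
            cv2.FONT_HERSHEY_SIMPLEX, 0.8, (0, 0, 255), 2)

cv2.imwrite("annotated_portion.jpg", frame)
```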

The object of the present disclosure is also achieved by a non-transitory computer-readable medium encoded with computer-readable instructions thereon that, when executed by a computer, causes the computer to perform a method comprising storing, by a processor in a memory, a plurality of position coordinates of at least one device in an industrial plant, wherein the plurality of position coordinates is indicative of a physical location of the at least one device. The method may comprise generating, by the processor, a three-dimensional model of the industrial plant.

The method may comprise determining, by the processor, a location in the generated three-dimensional model which maps to the physical location of the at least one device. The method may comprise receiving, by the processor, from the at least one device, dynamic process data associated with at least one process executed by the industrial plant. The method may comprise controlling, by the processor, a display device to display a marker at the location in the generated three-dimensional model, wherein the marker at the location of the generated three-dimensional model is indicative of the physical location of the at least one device in the industrial plant, and the received dynamic process data captured by the at least one device.

The object of the present disclosure is also achieved by a system for locating industrial assets using a virtual three-dimensional environment, the system comprising a processor and a memory coupled to the processor, wherein the memory comprises machine readable instructions that, when executed by the processor, cause the processor to perform the method steps mentioned above.

These and other features and advantages of the present disclosure may be appreciated from a review of the following detailed description of the present disclosure, along with the accompanying figures in which like reference numerals refer to like parts throughout.

The present disclosure is further described hereinafter with reference to illustrated embodiments shown in the accompanying drawings, in which:

FIG. 1 is a schematic representation of a network environment for implementation of a system of locating industrial assets using a virtual three-dimensional environment, in accordance with a first embodiment of the disclosure;

FIG. 2 is a block diagram that illustrates an exemplary electronic device for locating industrial assets using a virtual three-dimensional environment, in accordance with a first embodiment of the disclosure;

FIG. 3 is a block diagram that illustrates an exemplary implementation of generation of a three-dimensional model of an industrial plant, in accordance with a first embodiment of the disclosure;

FIG. 4 is a block diagram that illustrates an exemplary implementation of prediction of the upcoming events in an industrial plant, in accordance with a first embodiment of the disclosure;

FIG. 5 is a block diagram that illustrates an exemplary implementation of navigation of a vehicle to a first physical location of a first sensor device, in accordance with a first embodiment of the disclosure; and

FIGs. 6A to 6E represent a process flowchart illustrating a detailed method of locating industrial assets using a virtual three-dimensional environment, in accordance with a first embodiment of the disclosure.

Various embodiments are described with reference to the drawings, wherein like reference numerals are used to refer to like elements throughout. In the following description, for the purpose of explanation, numerous specific details are set forth in order to provide a thorough understanding of one or more embodiments. It may be evident that such embodiments may be practiced without these specific details.

FIG. 1 is a schematic representation of a network environment 100 for implementation of a system for locating industrial assets using a virtual three-dimensional environment, in accordance with a first embodiment of the disclosure. The network environment 100 may include a first electronic device 102, a first sensor device 104A, a second sensor device 104B, a second electronic device 106, and a server 108. The first electronic device 102, the first sensor device 104A, the second sensor device 104B, the second electronic device 106, and the server 108 may be communicatively coupled to each other via a communication network 110. The first sensor device 104A and the second sensor device 104B may be located inside an industrial plant 112. In one example, the industrial plant 112 may be an industrial setup such as a manufacturing facility, a power plant, or a warehouse. In another example, instead of the industrial plant 112, the network environment 100 may be implemented in a commercial building such as a school, a hospital, or a shopping complex. The first sensor device 104A and the second sensor device 104B may comprise sensors and/or status indicators located inside various machines, equipment, computing devices, and actuators of the industrial plant 112. The first sensor device 104A and the second sensor device 104B may be configured to capture dynamic process data associated with at least one process which is occurring in the industrial plant 112. Examples of the at least one process may comprise a manufacturing process, a chemical process, a thermal process, an additive process, and an electrical process. Examples of the dynamic process data may include temperature data, pressure data, vibration data, throughput data, manufacturing data, voltage data, and reaction rate data associated with the at least one process.

The first electronic device 102 and the second electronic device 106 may comprise suitable circuitry, interfaces, and/or code that may be configured to localize industrial assets in the industrial plant 112 using the virtual three-dimensional environment. Examples of the first electronic device 102 and the second electronic device 106 may include, but are not limited to, a personal computer, a tablet computer, a smartphone, a laptop, a computer workstation, an augmented reality based device, a computing device, a server, a human-machine interface (HMI) unit, and/or other consumer electronic (CE) devices. In one example, the first electronic device 102 and the second electronic device 106 may comprise a location sensor such as a global positioning system (GPS) sensor. The first electronic device 102 and the second electronic device 106 may further comprise a plurality of other sensors such as gyroscopes, accelerometers, altimeters, and electronic compasses. The first electronic device 102 and the second electronic device 106 may further comprise at least one image capture device such as a camera and at least one sound capture device such as a microphone. The first sensor device 104A and the second sensor device 104B may comprise suitable circuitry, interfaces, and/or code that may be configured to capture the dynamic process data associated with the at least one process that is executed in the industrial plant 112. Examples of the first sensor device 104A and the second sensor device 104B may include, but are not limited to, a thermometer device, a seismic sensor device, a vibration sensor, a pressure gauge, a voltmeter, a microphone, an infrared sensor, and a camera.

The server 108 may comprise suitable circuitry, interfaces, and/or code that may be configured to store information associated with a tag identifier and a plurality of position coordinates of the first sensor device 104A and the second sensor device 104B. The tag identifier may include information for asset identification of the first sensor device 104A and the second sensor device 104B. The plurality of position coordinates may indicate a first physical location of the first sensor device 104A and a second physical location of the second sensor device 104B in a real-world environment. The plurality of position coordinates may be coordinates of a geo-spatial coordinate system such as a Global Positioning System (GPS) or a Global Navigation Satellite System (GLONASS). The server 108 may further comprise a computer-aided-design based model of the industrial plant 112. The computer-aided-design based model of the industrial plant 112 may comprise information associated with a plurality of locations and a plurality of physical attributes of the first sensor device 104A and the second sensor device 104B.

The communication network 110 may include a communication medium through which the first electronic device 102, the first sensor device 104A, the second sensor device 104B, the second electronic device 106, and the server 108 may communicate with each other. Examples of the communication network 110 may include, but are not limited to, a telecommunication network such as a Public Switched Telephone Network (PSTN), a 2G network, a 3G network, a 4G network, the Internet, a cloud network, a Wireless Fidelity (Wi-Fi) network, a Local Area Network (LAN), and/or a Metropolitan Area Network (MAN). Various devices in the network environment 100 may be configured to connect to the communication network 110, in accordance with various wired and wireless communication protocols. Examples of such wired and wireless communication protocols may include, but are not limited to, at least one of a Transmission Control Protocol and Internet Protocol (TCP/IP), User Datagram Protocol (UDP), Hypertext Transfer Protocol (HTTP), File Transfer Protocol (FTP), ZigBee, EDGE, IEEE 802.11, IEEE 802.11b, IEEE 802.11g, IEEE 802.11n, and/or any other IEEE 802.11 protocol, multi-hop communication, wireless access point (AP), device to device communication, cellular communication protocols, Light-Fidelity (Li-Fi), Internet-of-Things (IoT) network, or Bluetooth (BT) communication protocols, or a combination or variants thereof.

The first electronic device 102, the first sensor device 104A, the second sensor device 104B, the second electronic device 106, and the server 108 may be implemented as Internet-of-Things (IoT) devices which may be part of an IoT-cloud platform. The IoT-cloud platform may be a cloud infrastructure capable of providing cloud-based services such as data storage services, data analytics services, and data visualization services, based on the captured dynamic process data. The IoT-cloud platform may be part of a public cloud or a private cloud. The IoT-cloud platform may enable data scientists/vendors to provide one or more software applications or a firmware as a service, thereby eliminating a need for software maintenance, upgrading, and backup by one or more users. The one or more software applications may be a full application or a software patch. In some embodiments, the one or more software applications may comprise an analytical application for performing data analytics on the captured dynamic process data. For example, the one or more software applications may include an application for down-sampling of time-series data, filtering time-series data based on thresholds, performing Fast Fourier transforms on vibration data, filtering frequencies which indicate anomalies, performing linear regression and trend prediction, local classification using support vector machine classifiers, neural network or deep learning classifiers, and performing stream analytics. Examples of the firmware may include, but are not limited to, Programmable Logic Controller (PLC) firmware, Human Machine Interface (HMI) firmware, firmware for motor drivers, and firmware for robots.
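
As one concrete, non-limiting illustration of the analytics mentioned above, the sketch below performs a fast Fourier transform on a vibration signal and keeps only frequency components whose magnitude exceeds a threshold. The sampling rate, synthetic signal, and threshold are invented for the example.

```python
import numpy as np

SAMPLE_RATE_HZ = 1000  # assumed sampling rate of a vibration sensor
t = np.arange(0, 1.0, 1.0 / SAMPLE_RATE_HZ)

# Synthetic vibration signal: 50 Hz machine vibration plus a weak 180 Hz
# component that might indicate an anomaly, plus measurement noise.
signal = (np.sin(2 * np.pi * 50 * t)
          + 0.3 * np.sin(2 * np.pi * 180 * t)
          + 0.05 * np.random.randn(t.size))

spectrum = np.abs(np.fft.rfft(signal))
freqs = np.fft.rfftfreq(signal.size, d=1.0 / SAMPLE_RATE_HZ)

# Keep frequencies whose magnitude exceeds an illustrative threshold.
THRESHOLD = 50.0
peaks = freqs[spectrum > THRESHOLD]
print("Dominant frequency components (Hz):", peaks)
```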

The first electronic device 102 and the second electronic device 106 may be implemented as an exemplary electronic device. Internal components of the exemplary device are explained with reference to FIG. 2. FIG. 2 is explained in conjunction with elements from FIG. 1.

Referring to FIG. 2, there is shown an exemplary electronic device 200 for locating industrial assets in the industrial plant 112 using the virtual three-dimensional environment. The electronic device 200 is at least one of the first electronic device 102 and the second electronic device 106.

The electronic device 200 may include one or more processors, such as a processor 202, a memory 204, a display device 206, an input device 208, and a network interface 210. The network interface 210 may be configured to communicate with one or more devices among the first electronic device 102, the first sensor device 104A, the second sensor device 104B, the second electronic device 106, and the server 108, via the communication network 110.

The processor 202 may comprise suitable logic, circuitry, interfaces, and/or code that may be configured to execute a set of instructions stored in the memory 204. The processor 202 may be implemented based on a number of processor technologies known in the art. Examples of the processor 202 may be an X86-based processor, a Reduced Instruction Set Computing (RISC) processor, an Application-Specific Integrated Circuit (ASIC) processor, a Complex Instruction Set Computing (CISC) processor, and/or other processors.

The memory 204 may comprise suitable logic, circuitry, and/or interfaces that may be configured to store a set of instructions executable by the processor 202. The memory 204 may be configured to store operating systems and associated applications. The memory 204 may be further configured to store information associated with the tag identifier and the plurality of position coordinates of the first sensor device 104A and the second sensor device 104B. The memory 204 may further comprise the computer-aided-design based model of the industrial plant 112. The computer-aided-design based model of the industrial plant 112 may comprise information associated with the plurality of locations and the plurality of physical attributes of the first sensor device 104A and the second sensor device 104B. The memory 204 may further comprise an application repository for storing software and firmware, and a data store for storing asset models associated with the first sensor device 104A and the second sensor device 104B. The memory 204 may be further configured to store IoT data models and a visualization database for storing visualization templates for the dynamic process data. Examples of implementation of the memory 204 may include, but are not limited to, Random Access Memory (RAM), Read Only Memory (ROM), Electrically Erasable Programmable Read-Only Memory (EEPROM), a Hard Disk Drive (HDD), a Solid-State Drive (SSD), a CPU cache, and/or a Secure Digital (SD) card.

The display device 206 may comprise suitable circuitry, interfaces, and/or code that may be configured to display information to a user. Examples of the display device 206 may include, but are not limited to, a Liquid Crystal Display (LCD), a Light Emitting Diode (LED) display, a holographic display device, and/or a projector device.

The input device 208 may comprise suitable circuitry, interfaces, and/or code that may be configured to receive information from the user. Examples of the input device 208 may include, but are not limited to, a keyboard, a camera, a microphone, a computer mouse, a trackpad, and a joystick.

The network interface 210 may comprise suitable logic, circuitry, interfaces, and/or code that may be configured to establish communication between the first electronic device 102, the first sensor device 104A, the second sensor device 104B, the second electronic device 106, and the server 108, via the communication network 110. The network interface 210 may be implemented by use of various known technologies to support wired or wireless communication of the electronic device 200 with the communication network 110. The network interface 210 may include, but is not limited to, an antenna, a radio frequency (RF) transceiver, one or more amplifiers, a tuner, one or more oscillators, a digital signal processor, a coder-decoder (CODEC) chipset, a subscriber identity module (SIM) card, and/or a local buffer. The network interface 210 may communicate via wireless communication with networks, such as the Internet, an Intranet, and/or a wireless network, such as a cellular telephone network, a wireless local area network (LAN), and/or a metropolitan area network (MAN). The wireless communication may use any of a plurality of communication standards, protocols, and technologies, such as Global System for Mobile Communications (GSM), Enhanced Data GSM Environment (EDGE), wideband code division multiple access (W-CDMA), Long Term Evolution (LTE), code division multiple access (CDMA), time division multiple access (TDMA), Bluetooth, Wireless Fidelity (Wi-Fi) (such as IEEE 802.11, IEEE 802.11b, IEEE 802.11g, IEEE 802.11n, and/or any other IEEE 802.11 protocol), voice over Internet Protocol (VoIP), light fidelity (Li-Fi), Wi-MAX, a protocol for email, instant messaging, and/or Short Message Service (SMS). The network interface 210 may further comprise a global system for mobile (GSM) module.

An operation of the system for locating the industrial assets in the industrial plant 112 using the virtual three-dimensional environment is explained in detail with reference to FIG. 1. Referring back to FIG. 1, in operation, the first electronic device 102 may be configured to receive, from a depth sensor and an image capture device, depth data and image data respectively, for generation of a three-dimensional model of the industrial plant 112. An exemplary implementation of generation of the three-dimensional model based on the depth data and image data is explained in detail with reference to FIG. 3. FIG. 3 is explained in conjunction with elements from FIG. 1 and FIG. 2.

Referring to FIG. 3, there is shown an autonomous vehicle 302 in the industrial plant 112. The autonomous vehicle 302 may be communicatively coupled to the first electronic device 102 via the communication network 110. The autonomous vehicle 302 may comprise a depth sensor 302A such as a light detection and ranging (LIDAR) device and an image capture device 302B such as a camera. The depth sensor 302A of the autonomous vehicle 302 may be configured to capture the depth data from a field of view 302C of the autonomous vehicle 302. In other words, the depth data is captured from at least one portion of the industrial plant 112 which is within the field of view 302C of the autonomous vehicle 302. In one example, the captured depth data may comprise laser scanning data and may be represented as a laser scanning point cloud 304A. Further, the image capture device 302B may be configured to capture a plurality of images 304B from the field of view 302C of the autonomous vehicle 302. The first electronic device 102 may be configured to receive the captured depth data and the plurality of images 304B from the depth sensor 302A and the image capture device 302B respectively.

In one embodiment, the first electronic device 102 may be further configured to analyze the plurality of images 304B with a plurality of image processing algorithms. The plurality of image processing algorithms may include at least one of an object recognition algorithm and an optical character recognition algorithm. The first electronic device 102 may be further configured to detect a plurality of fiducial markers (308A and 308B) from the plurality of images 304B, based on the analysis. The plurality of fiducial markers (308A and 308B) are physical markers which may be imprinted on a plurality of portions of the industrial plant 112, in order to mark a plurality of physical locations of a plurality of sensor devices such as the first sensor device 104A and the second sensor device 104B. The first electronic device 102 may be further configured to determine the plurality of position coordinates associated with the first sensor device 104A and the second sensor device 104B, based on the detection of the plurality of fiducial markers (308A and 308B) in the plurality of images 304B. The determined plurality of position coordinates may be indicative of the first physical location of the first sensor device 104A and the second physical location of the second sensor device 104B. The first physical location and the second physical location may indicate locations of the first sensor device 104A and the second sensor device 104B respectively, in a real-world environment. The first electronic device 102 may be configured to store, in the memory 204, the determined plurality of position coordinates associated with the first sensor device 104A and the second sensor device 104B.
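
If the fiducial markers were, for example, ArUco tags, their detection could be sketched as below using OpenCV's aruco module (shipped with opencv-contrib-python). The dictionary choice and image path are assumptions for illustration only; newer OpenCV releases may require the cv2.aruco.ArucoDetector class instead of the function shown.

```python
import cv2

# One of the plurality of images captured by the image capture device;
# the file name is a placeholder.
image = cv2.imread("plant_scan_frame.jpg")
gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)

# Assumes the fiducial markers are ArUco tags from a 4x4, 50-marker dictionary.
aruco_dict = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50)
corners, ids, _rejected = cv2.aruco.detectMarkers(gray, aruco_dict)

if ids is not None:
    for marker_id, marker_corners in zip(ids.flatten(), corners):
        centre = marker_corners[0].mean(axis=0)  # pixel centre of the tag
        print(f"Fiducial marker {marker_id} detected at pixel {centre}")
```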

The first electronic device 102 may be further configured to stitch together the plurality of images 304B to generate a stitched image. The first electronic device 102 may be further configured to stitch together the plurality of images 304B based on the depth data received from the depth sensor 302A. The first electronic device 102 may be further configured to generate the three-dimensional model of the industrial plant 112, based on an application of a photogrammetry technique on the stitched image. A view 124 depicts a perspective view of the generated three-dimensional model of the industrial plant 112. The generated three-dimensional model of the industrial plant 112 exists in a virtual three-dimensional environment. In another example, the depth sensor 302A and the image capture device 302B may be housed in a handheld device such as a smartphone. In one example, the autonomous vehicle 302 may be configured to control the depth sensor 302A and the image capture device 302B to scan an entirety of the industrial plant 112 to generate the depth data and the plurality of images 304B. The first electronic device 102 may be configured to generate a meshed model of the industrial plant 112, based on the depth data and the plurality of images 304B. The first electronic device 102 may be further configured to generate the three-dimensional model based on the generated meshed model. The generated three-dimensional model may comprise the plurality of locations of the first sensor device 104A and the second sensor device 104B. A perspective view of the generated three-dimensional model is depicted in view 306.

In another example, the first electronic device 102 may be configured to receive the computer-aided-design based model of the industrial plant 112 from the server 108. In such a case, the first electronic device 102 may be configured to generate the three-dimensional model of the industrial plant 112 based on the computer-aided-design based model of the industrial plant 112. The computer-aided-design based model may comprise the plurality of locations and the plurality of physical attributes associated with the first sensor device 104A and the second sensor device 104B. The first electronic device 102 may be configured to generate the three-dimensional model of the industrial plant 112 based on the plurality of locations and the plurality of physical attributes associated with the first sensor device 104A and the second sensor device 104B. The generated three-dimensional model of the industrial plant 112 may further comprise a tag identifier associated with each of the first sensor device 104A and the second sensor device 104B. It is noted that the IoT-cloud platform may enable a technician to access one or more data items associated with each of the first sensor device 104A and the second sensor device 104B, by use of the tag identifier associated with each of the first sensor device 104A and the second sensor device 104B.

Referring back to FIG. 1, the first electronic device 102 may be configured to determine a first and a second location in the generated three-dimensional model, based on the first physical location and the second physical location of the first sensor device 104A and the second sensor device 104B respectively. The first location in the generated three-dimensional model may map to the first physical location of the first sensor device 104A in the industrial plant 112. The second location in the generated three-dimensional model may map to the second physical location of the second sensor device 104B in the industrial plant 112.

In one example, the first location and the second location may be determined by the first electronic device 102 based on an execution of a correlation operation between the plurality of position coordinates associated with the first sensor device 104A and the second sensor device 104B, and one or more coordinates of a coordinate system of the generated three-dimensional model of the industrial plant 112.
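
The correlation between the geo-spatial coordinates and the model's own coordinate system can, for plant-sized areas, be approximated by an equirectangular projection around a plant origin. The sketch below shows this simplification; the origin, device coordinates, and function name are invented for the example and are not taken from the disclosure.

```python
import math

# Assumed geographic origin of the plant's model coordinate system.
ORIGIN_LAT, ORIGIN_LON = 52.3900, 13.0640
EARTH_RADIUS_M = 6_371_000.0

def gps_to_model_xy(lat: float, lon: float) -> tuple:
    """Map GPS coordinates to local model x/y in metres
    (equirectangular approximation, adequate for plant-sized sites)."""
    x = (math.radians(lon - ORIGIN_LON)
         * math.cos(math.radians(ORIGIN_LAT)) * EARTH_RADIUS_M)
    y = math.radians(lat - ORIGIN_LAT) * EARTH_RADIUS_M
    return x, y

# First location of the first sensor device in the model, from its GPS fix.
print(gps_to_model_xy(52.3905, 13.0645))
```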

The first electronic device 102 may be further configured to receive the dynamic process data captured by the first sensor device 104A and the second sensor device 104B. The first electronic device 102 may be further configured to control the display device 206 to display a first marker at the first location in the generated three-dimensional model and a second marker at the second location of the generated three-dimensional model. The first marker may be indicative of the first physical location of the first sensor device 104A. The second marker may be indicative of the second physical location of the second sensor device 104B. Further, the first electronic device 102 may be further configured to control the display device 206 to display the dynamic process data captured by the first sensor device 104A and the second sensor device 104B. In one example, the first sensor device 104A may be configured to continuously capture the dynamic process data in real-time. In such a case, the display device 206 may be controlled to display the dynamic process data in at least one of real-time or near real-time. In another example, the dynamic process data may be a media stream such as a live audio stream or a live video stream, and the first sensor device 104A may be an image sensor or a microphone which is configured to continuously capture the live media stream as the dynamic process data in real-time. In such a case, the display device 206 may be controlled to display the media stream in at least one of real-time or near real-time.

The first electronic device 102 may be further configured to receive a current location of a user in the industrial plant 112. In one example, the first electronic device 102 may receive the current location from a location sensor in a wearable electronic device worn by the user. The first electronic device 102 may be configured to determine a third location in the generated three-dimensional model, which maps to the current location of the user.

In one example, the first electronic device 102 may be configured to generate a first navigation path from the third location in the generated three-dimensional model to the first location in the generated three-dimensional model. Further, the first electronic device 102 may be configured to control the display device 206 to display directions for the user to travel from the current location of the user to the first physical location of the first sensor device 104A, based on the generated first navigation path. The first electronic device 102 may be configured to generate the first navigation path based on a shortest path algorithm. The first electronic device 102 may be further configured to continuously track the current location of the user and thereby execute live path tracking of the user based on the generated first navigation path.

Further, the second electronic device 106 may be configured to capture an image from a field-of-view 116 of a camera in the second electronic device 106. The captured image may comprise at least one portion of the industrial plant 112, which is within the field-of-view 116 of the second electronic device 106. The first electronic device 102 may be configured to receive the captured image from the second electronic device 106. The first electronic device 102 may be further configured to map the at least one portion of the industrial plant 112 to at least one portion of the generated three-dimensional model, based on an analysis of the captured image. The first electronic device 102 may be further configured to determine that the at least one mapped portion of the generated three-dimensional model contains the first location associated with the first sensor device 104A. The first electronic device 102 may be configured to superpose at least one marker on the captured image, in a case where the first location is determined to be inside the at least one mapped portion of the generated three-dimensional model.

The first electronic device 102 may be further configured to detect an occurrence of a current event in the industrial plant 112, based on an analysis of the dynamic process data captured by the first sensor device 104A and the second sensor device 104B. The current event may be an occurrence of an error, an occurrence of an anomaly, or an occurrence of an accident during the execution of the at least one process in the industrial plant 112. The first electronic device 102 may be further configured to detect a location of occurrence of the current event, based on the first location associated with the first sensor device 104A and the second location associated with the second sensor device 104B. The location of occurrence may be determined by the first electronic device 102 as a first set of position coordinates in a real-world environment and a second set of position coordinates in a virtual world environment.

The first electronic device 102 may be further configured to determine a severity level of the current event based on the captured dynamic process data. The severity level may be one of a first severity level, a second severity level, and a third severity level. The first severity level may indicate an alert for the at least one process. The second severity level may indicate a warning about the at least one process. The third severity level may indicate presence of a fault in the at least one process. The first electronic device 102 may be further configured to control the display device 206 to display information associated with the determined severity level. In one example, the severity level may be displayed on the at least one marker which is superposed on the captured image. In one example, the first electronic device 102 controls the display device 206 to display "X" to indicate the first severity level. The first electronic device 102 controls the display device 206 to display "XX" to indicate the second severity level. The first electronic device 102 may be further configured to control the display device 206 to display "XXX" to indicate the third severity level. The first electronic device 102 may be further configured to determine that the occurrence of the current event has caused a health hazard in the industrial plant 112. In one example, the severity level further comprises information associated with the health hazard caused by the occurrence of the current event. In another example, the first electronic device 102 may be further configured to control the display device 206 to display "XXX" to indicate presence of the health hazard in the industrial plant 112.

A first view 114 in FIG. 1 illustrates a user interface of the first electronic device 102, in a first exemplary scenario. In the first exemplary scenario, the first sensor device 104A and the second sensor device 104B may be temperature sensors. The captured dynamic process data may comprise a first reading of 230 °C and a second reading of 380 °C. Further, in the exemplary scenario, the first electronic device 102 has detected the occurrence of the current event in the industrial plant 112. Further, in the exemplary scenario, the first electronic device 102 has determined the severity level of the current event as the third severity level. In the first exemplary scenario, the first view 114 comprises a first marker 114A to indicate the first physical location of the first sensor device 104A and a second marker 114B to indicate the second physical location of the second sensor device 104B. The first marker 114A and the second marker 114B are located in a perspective view 114C of the generated three-dimensional model of the industrial plant 112. The first view 114 further comprises the dynamic process data, which is the first reading of 230 °C and the second reading of 380 °C, in a banner 114D of the user interface. The first view 114 further comprises a third marker 114E at the third location to indicate the current location of the user.

Similarly, a second view 118 in FIG. 1 illustrates a user interface of the second electronic device 106, in the first exemplary scenario. The second view 118 comprises a first superposed marker 118A to indicate the first physical location of the first sensor device 104A and a second superposed marker 118B to indicate the second physical location of the second sensor device 104B. In the first exemplary scenario, the second electronic device 106 captures an image 118C from the field-of-view 116. The first superposed marker 118A and the second superposed marker 118B may be superposed on the captured image 118C. Further, the first superposed marker 118A comprises "XXX" to indicate the occurrence of the current event with the third severity level.

The first electronic device 102 may be further configured to predict an upcoming event in the industrial plant 112, based on an analysis of the captured dynamic process data. The predicted upcoming event may be a predicted occurrence of an error, a predicted occurrence of an anomaly, or a predicted occurrence of an accident during the execution of the at least one process in the industrial plant 112. The first electronic device 102 may be further configured to determine a first set of locations of the industrial plant 112, where the upcoming event is predicted to occur. The first set of locations are within the real-world environment. The first electronic device 102 may be further configured to determine a second set of locations in the three-dimensional model, which maps to the first set of locations of the industrial plant 112. The second set of locations are in the virtual world environment. The first electronic device 102 may be further configured to control the display device 206 to display information associated with the predicted upcoming event in the industrial plant 112. An exemplary implementation of prediction of the upcoming events in the industrial plant 112 is explained in detail with reference to FIG. 4. FIG. 4 is explained in conjunction with elements from FIG. 1, FIG. 2, and FIG. 3.
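One possible, purely illustrative realization of such a prediction is a linear trend fitted to recent readings of a single sensor and extrapolated forward; the disclosure does not prescribe a particular algorithm, and the threshold value and helper name below are assumptions (Python):

    import numpy as np

    def predict_threshold_crossing(timestamps, readings, fault_at):
        # Fit a linear trend to recent readings and extrapolate the time at which
        # the (hypothetical) fault threshold would be crossed; None if not rising.
        slope, intercept = np.polyfit(np.asarray(timestamps, dtype=float),
                                      np.asarray(readings, dtype=float), 1)
        if slope <= 0:
            return None
        return (fault_at - intercept) / slope

    # Readings taken every 60 seconds rise towards a 400 °C threshold.
    print(predict_threshold_crossing([0, 60, 120, 180], [230, 260, 300, 345], 400.0))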

Referring to FIG. 4, there is shown a top view 400 of the generated three-dimensional model of the industrial plant 112. The top view 400 comprises a plurality of markers (402A, 402B, and 402C) to indicate the second set of locations in the three-dimensional model of the industrial plant 112. The second set of locations in the three-dimensional model are indicative of the first set of locations in the industrial plant 112, where the upcoming event is predicted to occur.

Referring back to FIG. 1, the first electronic device 102 may be further configured to receive location data associated with a vehicle. The vehicle may be an autonomous land-based vehicle, an autonomous air-based vehicle, or a robotic vehicle. The first electronic device 102 may be further configured to detect a current location of the vehicle with respect to the industrial plant 112, based on the received location data. The first electronic device 102 may be further configured to determine a fourth location in the generated three-dimensional model, which maps to the current location of the vehicle with respect to the industrial plant 112. The first electronic device 102 may be further configured to generate a second navigation path from the fourth location of the vehicle to the first location of the first sensor device 104A. The first electronic device 102 may be further configured to control the vehicle to travel from the current location of the vehicle to the first physical location of the first sensor device 104A. The second navigation path may be generated by use of a shortest path algorithm. The first electronic device 102 may be further configured to continuously track the current location of the vehicle and thereby execute live path tracking of the vehicle based on the generated second navigation path.
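As a non-limiting sketch, the shortest path algorithm referred to above may, for example, be Dijkstra's algorithm over a graph of plant waypoints; the graph, node names, and edge costs below are hypothetical (Python):

    import heapq

    def shortest_path(graph, start, goal):
        # Dijkstra's algorithm over a waypoint graph {node: [(neighbor, cost), ...]}.
        queue = [(0.0, start, [start])]
        visited = set()
        while queue:
            cost, node, path = heapq.heappop(queue)
            if node == goal:
                return path
            if node in visited:
                continue
            visited.add(node)
            for neighbor, edge_cost in graph.get(node, []):
                if neighbor not in visited:
                    heapq.heappush(queue, (cost + edge_cost, neighbor, path + [neighbor]))
        return None  # goal not reachable from start

    plant_graph = {
        "dock": [("aisle_1", 5.0)],
        "aisle_1": [("sensor_104A", 3.0)],
        "sensor_104A": [],
    }
    print(shortest_path(plant_graph, "dock", "sensor_104A"))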

An exemplary implementation of navigation of the vehicle to the first physical location of the first sensor device 104A is explained in detail with reference to FIG. 5. FIG. 5 is explained in conjunction with elements from FIG. 1, FIG. 2, FIG. 3, and FIG. 4.

With reference to FIG. 5, there is shown a robotic vehicle 502 in the industrial plant 112. The robotic vehicle 502 may be communicatively coupled to the first electronic device 102 via the communication network 110. The robotic vehicle 502 may comprise a location sensor such as a GPS sensor and an Inertial Measurement Unit (IMU) sensor. The robotic vehicle 502 may be configured to transmit location data to the first electronic device 102. The first electronic device 102 may be further configured to detect a current location of the robotic vehicle 502 in the industrial plant 112, based on the location data. The first electronic device 102 may be further configured to determine the fourth location in the generated three-dimensional model, which maps to the current location of the robotic vehicle 502 with respect to the industrial plant 112. The first electronic device 102 may be further configured to generate the second navigation path 504 from the fourth location of the vehicle to the first physical location of the first sensor device 104A. The first electronic device 102 may be further configured to control the robotic vehicle 502 to travel from the current location of the robotic vehicle 502 to the first physical location of the first sensor device 104A. The first electronic device 102 may be further configured to control the robotic vehicle 502 based on the generated second navigation path 504. The first electronic device 102 may be further configured to enable a user to control the robotic vehicle 502 remotely, through teleoperation.

FIGs. 6A-6E comprise a process flowchart 600 illustrating a detailed method of locating industrial assets using a virtual three-dimensional environment, in accordance with the first embodiment of the disclosure. FIG. 6 is explained in conjunction with elements from FIGs. 1-5.

At step 602, the depth data and the image data are received from the depth sensor 302A and the image capture device 302B, respectively. The processor 202 may be configured to receive the depth data and the image data from the depth sensor 302A and the image capture device 302B, respectively. The depth data is captured from the at least one portion of the industrial plant 112, which is within the field of view 302C of the autonomous vehicle 302. The image data (i.e. the plurality of images 304B) is captured from the at least one portion of the industrial plant 112, which is within the field of view 302C of the autonomous vehicle 302. The processor 202 may be further configured to analyze the plurality of images 304B with the plurality of image processing algorithms. The processor 202 may be further configured to detect the plurality of fiducial markers (308A and 308B) from the plurality of images 304B, based on the analysis.
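A minimal sketch of such marker detection, assuming ArUco-type fiducial markers and the opencv-contrib "aruco" module (OpenCV 4.7+ API), is given below; the marker dictionary, file name, and function name are placeholders and not specified in the disclosure (Python):

    import cv2

    def detect_fiducials(image_path):
        # Detect ArUco-style fiducial markers in one captured image and return
        # their ids and corner pixel coordinates for later position estimation.
        gray = cv2.imread(image_path, cv2.IMREAD_GRAYSCALE)
        dictionary = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50)
        detector = cv2.aruco.ArucoDetector(dictionary, cv2.aruco.DetectorParameters())
        corners, ids, _rejected = detector.detectMarkers(gray)
        return ids, corners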

At step 604, the plurality of position coordinates associated with the first sensor device 104A and the second sensor device 104B may be determined. The processor 202 may be further configured to determine the plurality of position coordinates associated with the first sensor device 104A and the second sensor device 104B, based on the detection of the plurality of fiducial markers in the plurality of images 304B.

At step 606, the plurality of position coordinates associated with the first sensor device 104A and the second sensor device 104B may be stored in the memory 204. The processor 202 may be configured to store, in the memory 204, the plurality of position coordinates associated with the first sensor device 104A and the second sensor device 104B.

At step 608, the plurality of images 304B are stitched together to generate the stitched image. The processor 202 may be configured to stitch together the plurality of images 304B to generate the stitched image. The processor 202 may stitch together the plurality of images 304B based on the depth data received from the depth sensor 302A. The generated three-dimensional model of the industrial plant 112 exists in a virtual three-dimensional environment.
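As an illustration only, a feature-based stitcher such as the one bundled with OpenCV could produce the stitched image; the disclosed method additionally uses the depth data from the depth sensor 302A, which this sketch omits (Python):

    import cv2

    def stitch_images(image_paths):
        # Feature-based panorama stitching as a stand-in for the depth-assisted
        # stitching described in step 608.
        images = [cv2.imread(p) for p in image_paths]
        stitcher = cv2.Stitcher_create()
        status, panorama = stitcher.stitch(images)
        if status != cv2.Stitcher_OK:
            raise RuntimeError("stitching failed with status %d" % status)
        return panorama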

At step 610, the computer-aided-design model of the industrial plant 112 is received. The processor 202 may be configured to receive the computer-aided-design model of the industrial plant 112 from the server 108.

At step 612, the three-dimensional model of the industrial plant 112 may be generated. The processor 202 may be configured to generate the three-dimensional model of the industrial plant 112 based on at least one of the stitched image and the computer-aided-design model of the industrial plant 112.

At step 614, the first location and the second location may be determined in the generated three-dimensional model. The processor 202 may be configured to determine the first location and the second location in the generated three-dimensional model, based on the plurality of position coordinates associated with the first sensor device 104A and the second sensor device 104B respectively. The first location in the generated three-dimensional model may map to the first physical location of the first sensor device 104A in the industrial plant 112. The second location in the generated three-dimensional model may map to the second physical location of the second sensor device 104B in the industrial plant 112.
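A minimal sketch of such a mapping, assuming a homogeneous 4x4 registration transform between the plant coordinate frame and the model coordinate frame (the transform values and helper name below are hypothetical), is (Python):

    import numpy as np

    def world_to_model(world_xyz, T_world_to_model):
        # Apply a 4x4 homogeneous transform (rotation, translation, scale obtained
        # when the model was registered to the plant) to one (x, y, z) point.
        p = np.append(np.asarray(world_xyz, dtype=float), 1.0)
        m = np.asarray(T_world_to_model, dtype=float) @ p
        return m[:3] / m[3]

    T = np.eye(4)                    # hypothetical registration transform
    T[:3, 3] = [-12.0, -4.5, 0.0]    # plant origin offset relative to the model
    print(world_to_model([25.0, 9.0, 1.8], T))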

At step 616, the dynamic process data captured by the first sensor device 104A and the second sensor device 104B may be received. The processor 202 may be further configured to receive the dynamic process data captured by the first sensor device 104A and the second sensor device 104B.

At step 618, the display device 206 may be controlled to display the first marker at the first location in the generated three-dimensional model and the second marker at the second location of the generated three-dimensional model. The processor 202 may be further configured to control the display device 206 to display the first marker at the first location in the generated three-dimensional model and the second marker at the second location of the generated three-dimensional model, to indicate the first physical location and the second physical location respectively.

At step 620, the display device 206 may be controlled to display the dynamic process data captured by the first sensor device 104A and the second sensor device 104B. The processor 202 may be further configured to control the display device 206 to display the dynamic process data captured by the first sensor device 104A and the second sensor device 104B.

At step 622, the current location of the user may be determined. The processor 202 may be further configured to determine the current location of a user in the industrial plant 112. In one example, the processor 202 may receive the current location from a location sensor of a wearable electronic device worn by the user.

At step 624, the third location in the generated three-dimensional model may be determined. The processor 202 may be configured to determine the third location in the generated three-dimensional model, which maps to the current location of the user.

At step 626, the first navigation path may be generated between the third location and the first location. The processor 202 may be configured to generate a first navigation path from the third location in the generated three-dimensional model to the first location in the generated three-dimensional model.

At step 628, the display device 206 may be controlled to display directions for the user to travel from the current location of the user to the first physical location of the first sensor device 104A. Further, the processor 202 may be configured to generate the directions for the user to travel from the current location of the user to the first physical location of the first sensor device 104A, based on the generated first navigation path, and to control the display device 206 to display the generated directions.

At step 630, the captured image may be received from the second electronic device 106. The processor 202 may be configured to receive the captured image from the second electronic device 106. The captured image may comprise the at least one portion of the industrial plant 112.

At step 632, the at least one portion of the industrial plant 112 may be mapped to at least one portion of the generated three-dimensional model, based on the analysis of the captured image. The processor 202 may be further configured to map the at least one portion of the industrial plant 112 to at least one portion of the generated three-dimensional model, based on the analysis of the captured image.

At step 634, the at least one mapped portion of the generated three-dimensional model is determined to contain the first location associated with the first sensor device 104A. The processor 202 may be further configured to determine that the first location associated with the first sensor device 104A is present inside the at least one mapped portion of the generated three-dimensional model.

At step 636, the at least one marker is superposed on the captured image. The processor 202 may be configured to superpose the at least one marker on the captured image, in a case where the first location is determined to be present inside the at least one mapped portion of the generated three-dimensional model.
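A possible sketch of this superposition step, assuming the camera pose (rvec, tvec) and intrinsics were estimated during the mapping of step 632, is shown below; the drawing style, colors, and parameter names are illustrative only (Python):

    import cv2
    import numpy as np

    def superpose_marker(image, point_3d, rvec, tvec, camera_matrix, severity_text):
        # Project one mapped 3-D location into the captured image and draw a cross
        # marker together with its severity indicator (e.g. "XXX").
        pts, _ = cv2.projectPoints(np.float32([point_3d]), rvec, tvec,
                                   camera_matrix, np.zeros(5))
        u, v = [int(c) for c in pts.reshape(2)]
        cv2.drawMarker(image, (u, v), (0, 0, 255), cv2.MARKER_CROSS, 30, 2)
        cv2.putText(image, severity_text, (u + 10, v - 10),
                    cv2.FONT_HERSHEY_SIMPLEX, 0.8, (0, 0, 255), 2)
        return image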

At step 638, the occurrence of the current event in the industrial plant 112 may be detected. The processor 202 may be further configured to detect the occurrence of the current event in the industrial plant 112, based on the analysis of the captured dynamic process data.

At step 640, the severity level of the current event in the industrial plant 112 may be determined, and the information associated with the severity level of the current event may be displayed on the display device 206. The processor 202 may be further configured to determine the severity level of the current event based on the captured dynamic process data. The processor 202 may be further configured to control the display device 206 to display information associated with the severity level associated with the current event.

At step 642, the occurrence of the upcoming event in the industrial plant 112 may be predicted. The processor 202 may be further configured to predict the upcoming event in the industrial plant 112, based on the analysis of the captured dynamic process data.

At step 644, the display device 206 may be controlled to display information associated with the predicted upcoming event in the industrial plant 112. The processor 202 may be further configured to control the display device 206 to display information associated with the predicted upcoming event in the industrial plant 112.

At step 646, the current location of the vehicle is determined. The processor 202 may be further configured to detect the current location of the vehicle with respect to the industrial plant 112.

At step 648, the fourth location in the generated three-dimensional model, which maps to the current location of the vehicle, may be determined. The processor 202 may be further configured to determine a fourth location in the generated three-dimensional model, which maps to the current location of the vehicle with respect to the industrial plant 112.

At step 650, the second navigation path from the fourth location of the vehicle to the first location of the first sensor device 104A may be generated. The processor 202 may be further configured to generate the second navigation path from the fourth location of the vehicle to the first location of the first sensor device 104A.

At step 652, the vehicle may be controlled to travel from the current location of the vehicle to the first physical location of the first sensor device 104A. The processor 202 may be further configured to control the vehicle to travel from the current location of the vehicle to the first physical location of the first sensor device 104A, based on the generated second navigation path.

The present disclosure can take the form of a computer program product comprising program modules accessible from a computer-usable or computer-readable medium storing program code for use by or in connection with one or more computers, processors, or instruction execution systems. For the purpose of this description, a computer-usable or computer-readable medium can be any apparatus that can contain, store, communicate, propagate, or transport the program for use by or in connection with the instruction execution system, apparatus, or device. The medium can be an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system (or apparatus or device). Propagation mediums in and of themselves, as signal carriers, are not included in the definition of a physical computer-readable medium. Examples of a physical computer-readable medium include a semiconductor or solid-state memory, magnetic tape, a removable computer diskette, random access memory (RAM), a read-only memory (ROM), a rigid magnetic disk, and optical disks such as compact disk read-only memory (CD-ROM), compact disk read/write, and DVD. Both processors and program code for implementing each aspect of the technology can be centralized or distributed (or a combination thereof), as known to those skilled in the art.

The present disclosure may be realized in hardware, or a combination of hardware and software. The present disclosure may be realized in a centralized fashion, in at least one computer system, or in a distributed fashion, where different elements may be spread across several interconnected computer systems. A computer system or other apparatus adapted to carry out the methods described herein may be suitable. A combination of hardware and software may be a general-purpose computer system with a computer program that, when loaded and executed, may control the computer system such that it carries out the methods described herein. The present disclosure may be realized in hardware that comprises a portion of an integrated circuit that also performs other functions.

While the present disclosure has been described in detail with reference to certain embodiments, it should be appreciated that the present disclosure is not limited to those embodiments. In view of the present disclosure, many modifications and variations would present themselves to those skilled in the art without departing from the scope of the various embodiments of the present disclosure, as described herein. The scope of the present disclosure is, therefore, indicated by the following claims rather than by the foregoing description. All changes, modifications, and variations coming within the meaning and range of equivalency of the claims are to be considered within their scope. All advantageous embodiments claimed in method claims may also apply to system/apparatus claims.




 