Title:
SYSTEM AND METHOD FOR WELD PATH GENERATION
Document Type and Number:
WIPO Patent Application WO/2020/076922
Kind Code:
A1
Abstract:
Embodiments of the present disclosure are directed towards a robotic system and method. The system may include a robot and a three dimensional sensor device associated with the robot configured to scan a welding area and generate a scanned welding area. The system may include a processor configured to receive the scanned welding area and to generate a three dimensional point cloud based upon, at least in part the scanned welding area. The processor may be further configured to perform processing on the three dimensional point cloud in a two-dimensional domain. The processor may be further configured to generate one or more three dimensional welding paths and to simulate the one or more three dimensional welding paths.

Inventors:
CHANG CHU-YIN (US)
LIMONE BRETT L (US)
POLIMENI JR (US)
ENGLISH JAMES D (US)
Application Number:
PCT/US2019/055353
Publication Date:
April 16, 2020
Filing Date:
October 09, 2019
Assignee:
TERADYNE INC (US)
International Classes:
B23K9/16; B23K9/127; B23K37/02
Foreign References:
US20180065204A12018-03-08
CN107931830A2018-04-20
CN104400265A2015-03-11
JP2005271103A2005-10-06
JPH1177308A1999-03-23
US5999642A1999-12-07
US20100152870A12010-06-17
EP1188510A22002-03-20
CN106141374A2016-11-23
US6757587B12004-06-29
US7680300B22010-03-16
US8301421B22012-10-30
US8408918B22013-04-02
US8428781B22013-04-23
US9357708B22016-06-07
US20150199458A12015-07-16
US20160321381A12016-11-03
US20180060459A12018-03-01
Other References:
See also references of EP 3863791A4
Attorney, Agent or Firm:
WHITTENBERGER, Mark H. et al. (US)
Claims:
CLAIMS

What is claimed is:

1. A system comprising:

a robot;

one or more three dimensional sensor devices associated with the robot configured to scan a welding area and generate a scanned welding area; and

one or more processors configured to receive the scanned welding area and to generate a three dimensional point cloud based upon, at least in part the scanned welding area, the one or more processors further configured to perform processing on the three dimensional point cloud in a two-dimensional domain, the one or more processors further configured to generate one or more three dimensional welding paths and to simulate the one or more three dimensional welding paths.

2. The system of claim 1, wherein processing the three-dimensional point cloud in a two-dimensional domain includes at least one of binary image thinning of a welding path or smoothing a welding path based upon, at least in part, a moving average filter.

3. The system of claim 1, wherein processing the three-dimensional point cloud in a two-dimensional domain includes analysis of one or more environmental or robot constraints.

4. The system of claim 1, wherein simulating the one or more three dimensional welding paths includes verifying feasibility of the one or more three dimensional welding paths.

5. The system of claim 4, wherein if the one or more three dimensional welding paths are not verified, the one or more processors further configured to re-simulate after altering a position of a welding table or a welding part.

6. The system of claim 1, wherein performing processing on the three dimensional point cloud in a two-dimensional domain includes converting the three dimensional point cloud to a height field.

7. The system of claim 6, wherein performing processing on the three dimensional point cloud in a two-dimensional domain includes applying a local minimum filter to the height field.

8. The system of claim 2, wherein binary image thinning includes removing one or more branch nodes, wherein a branch node corresponds to a pixel having more than two neighbors.

9. The system of claim 1, further comprising: displaying a graphical user interface configured to allow a user to select a filter size, select a point cloud for processing, visualize a weld path result or save the one or more three-dimensional welding paths.

10. The system of claim 1, wherein the one or more processors are configured to use one or more extra degrees of freedom to maintain a welding formation with respect to gravity as a welding process is being performed.

11. A method comprising:

providing a robot; scanning a welding area using one or more three dimensional sensor devices associated with the robot to generate a scanned welding area;

receiving the scanned welding area at one or more processors;

generating a three dimensional point cloud based upon, at least in part the scanned welding area;

processing the three dimensional point cloud in a two-dimensional domain to generate one or more three dimensional welding paths; and

simulating the one or more three dimensional welding paths.

12. The method of claim 11, wherein processing the three-dimensional point cloud in a two-dimensional domain includes at least one of binary image thinning of a welding path or smoothing a welding path based upon, at least in part, a moving average filter.

13. The method of claim 11, wherein processing the three-dimensional point cloud in a two-dimensional domain includes analysis of one or more environmental or robot constraints.

14. The method of claim 11, wherein simulating the one or more three dimensional welding paths includes verifying feasibility of the one or more three dimensional welding paths.

15. The method of claim 14, wherein if the one or more three dimensional welding paths are not verified, the one or more processors further configured to re-simulate after altering a position of a welding table or a welding part.

16. The method of claim 11, wherein performing processing on the three dimensional point cloud in a two-dimensional domain includes converting the three dimensional point cloud to a height field.

17. The method of claim 16, wherein performing processing on the three dimensional point cloud in a two-dimensional domain includes applying a local minimum filter to the height field.

18. The method of claim 12, wherein binary image thinning includes removing one or more branch nodes, wherein a branch node corresponds to a pixel having more than two neighbors.

19. The method of claim 11, further comprising: displaying a graphical user interface configured to allow a user to select a filter size, select a point cloud for processing, visualize a weld path result or save the one or more three-dimensional welding paths.

20. The method of claim 11, further comprising: using one or more processors to maintain a welding formation with respect to gravity as a welding process is being performed wherein maintaining the welding formation is based upon, at least in part, one or more extra degrees of freedom associated with the robot.

Description:
SYSTEM AND METHOD FOR WELD PATH GENERATION

CROSS-REFERENCE TO RELATED APPLICATION

[0001] This application claims the benefit of U.S. Utility Application No. 16/159,197, filed on October 12, 2018, entitled “System and Method for Weld Path Generation”, the content of which is incorporated herein by reference.

TECHNICAL FIELD

[0002] The invention generally relates to robotics and more specifically to a system and method for generating a weld path.

BACKGROUND

[0003] Robots and robotic systems are being used in more and more industrial applications. For example, robots have been used to assist in numerous welding tasks. These types of systems generally require a computer aided design (“CAD”) model in order to generate the most accurate and efficient welding path. These existing systems are also directed towards scanning a particular area of a part and do not scan the entire possible welding area.

SUMMARY

[0004] In one or more embodiments of the present disclosure, a system is provided. The system may include a robot and one or more three dimensional sensor devices associated with the robot configured to scan a welding area and generate a scanned welding area. The system may further include one or more processors configured to receive the scanned welding area and to generate a three dimensional point cloud based upon, at least in part the scanned welding area, the one or more processors further configured to perform processing on the three dimensional point cloud in a two-dimensional domain, the one or more processors further configured to generate one or more three dimensional welding paths and to simulate the one or more three dimensional welding paths.

[0005] One or more of the following features may be included. Processing the three-dimensional point cloud in a two-dimensional domain may include at least one of binary image thinning of a welding path or smoothing a welding path based upon, at least in part, a moving average filter. Processing the three-dimensional point cloud in a two-dimensional domain may also include analysis of one or more environmental or robot constraints. Simulating the one or more three dimensional welding paths may include verifying feasibility of the one or more three dimensional welding paths. If the one or more three dimensional welding paths are not verified, the one or more processors may be further configured to re-simulate after altering a position of a welding table or a welding part. Performing processing on the three dimensional point cloud in a two-dimensional domain may include converting the three dimensional point cloud to a height field. Performing processing on the three dimensional point cloud in a two-dimensional domain may include applying a local minimum filter to the height field. Binary image thinning may include removing one or more branch nodes, wherein a branch node corresponds to a pixel having more than two neighbors. The system may further include a graphical user interface configured to allow a user to select a filter size, select a point cloud for processing, visualize a weld path result or save the one or more three-dimensional welding paths. The system may use one or more processors to maintain a welding formation with respect to gravity as a welding process is being performed. Maintaining the welding formation may be based upon, at least in part, one or more extra degrees of freedom associated with the robot.

[0006] In another embodiment of the present disclosure, a method is provided. The method may include providing a robot and scanning a welding area using one or more three dimensional sensor devices associated with the robot to generate a scanned welding area. The method may further include receiving the scanned welding area at one or more processors and generating a three dimensional point cloud based upon, at least in part the scanned welding area. The method may also include processing the three dimensional point cloud in a two-dimensional domain to generate one or more three dimensional welding paths and simulating the one or more three dimensional welding paths.

[0007] One or more of the following features may be included. Processing the three-dimensional point cloud in a two-dimensional domain may include at least one of binary image thinning of a welding path or smoothing a welding path based upon, at least in part, a moving average filter. Processing the three-dimensional point cloud in a two-dimensional domain may also include analysis of one or more environmental or robot constraints. Simulating the one or more three dimensional welding paths may include verifying feasibility of the one or more three dimensional welding paths. If the one or more three dimensional welding paths are not verified, the one or more processors may be further configured to re-simulate after altering a position of a welding table or a welding part. Performing processing on the three dimensional point cloud in a two-dimensional domain may include converting the three dimensional point cloud to a height field. Performing processing on the three dimensional point cloud in a two-dimensional domain may include applying a local minimum filter to the height field. Binary image thinning may include removing one or more branch nodes, wherein a branch node corresponds to a pixel having more than two neighbors. The method may allow a user to select a filter size, select a point cloud for processing, visualize a weld path result or save the one or more three-dimensional welding paths. The method may include using one or more processors to maintain a welding formation with respect to gravity as a welding process is being performed wherein maintaining the welding formation is based upon, at least in part, one or more extra degrees of freedom associated with the robot.

BRIEF DESCRIPTION OF THE DRAWINGS

[0008] For a better understanding of the nature and objects of the present disclosure, reference is made to the following detailed description taken in conjunction with the following drawings, in which:

[0009] FIG. 1 is a block diagram of a weld path generation robotic system, according to an embodiment of the present disclosure;

[0010] FIG. 2 is a graphical user interface showing multiple degrees of freedom of a weld path generation robotic system, according to an embodiment of the present disclosure;

[0011] FIG. 3 is a flowchart of a weld path generation robotic method, according to an embodiment of the present disclosure;

[0012] FIG. 4 is another flowchart of a weld path generation robotic method, according to an embodiment of the present disclosure;

[0013] FIG. 5 is a diagram showing aspects of a weld path generation robotic method, according to an embodiment of the present disclosure;

[0014] FIG. 6 is a scanner for use with a weld path generation robotic method, according to an embodiment of the present disclosure;

[0015] FIG. 7 is a diagram showing aspects of a weld path generation robotic method, according to an embodiment of the present disclosure;

[0016] FIG. 8 is a diagram showing aspects of a weld path generation robotic method, according to an embodiment of the present disclosure;

[0017] FIG. 9 is a diagram showing aspects of a weld path generation robotic method, according to an embodiment of the present disclosure;

[0018] FIG. 10 is a diagram showing aspects of a weld path generation robotic method, according to an embodiment of the present disclosure;

[0019] FIG. 11 is a diagram showing aspects of a weld path generation robotic method, according to an embodiment of the present disclosure;

[0020] FIG. 12 is a diagram showing aspects of a weld path generation robotic method, according to an embodiment of the present disclosure;

[0021] FIG. 13 is a diagram showing aspects of a weld path generation robotic method, according to an embodiment of the present disclosure;

[0022] FIG. 14 is a diagram showing aspects of a weld path generation robotic method, according to an embodiment of the present disclosure;

[0023] FIG. 15 is a diagram showing aspects of a weld path generation robotic method, according to an embodiment of the present disclosure; and

[0024] FIG. 16 is a graphical user interface of a weld path generation system, according to an embodiment of the present disclosure.

DETAILED DESCRIPTION

[0025] Embodiments of the subject application may include concepts from U.S. Patent No. 6,757,587, U.S. Patent No. 7,680,300, U.S. Patent No. 8,301,421, U.S. Patent No. 8,408,918, U.S. Patent No. 8,428,781, U.S. Patent No. 9,357,708, U.S. Publication No. 2015/0199458, U.S. Publication No. 2016/0321381, and U.S. Publication No. 2018/0060459, the entire contents of each of which are incorporated herein by reference.

[0026] Referring now to FIG. 1, an embodiment of a robotic system 100 for use in generation of one or more welding paths is provided. System 100 may include a plurality of components, portions of which may be designed for a particular application and/or task. The first component of the system may include a software system 102 for adding new processes to a database 104. Once built, database 104 may be reused by operators in the field or remotely. Operators may select an element from the database 104 using a graphical user interface 106 for execution by the control software 108 as is shown in FIG. 1. Procedures for the particular application and/or task (e.g. welding, robotic assembly, etc.) may be added to database 104 by experts offline. This database 104 may be used with a graphical user interface 106 and tasking software online to develop each procedure for each task. The software modules may include, but are not limited to training, tasking, and performance of the particular task, etc. All of this may be used to control the manner of operation of robotic hardware 110. Robotic hardware 110 may respond to the controls received from control software 108 and may include one or more sensing devices, scanners, and/or other various devices as is discussed in further detail hereinbelow.

[0027] In a robotic system, the phrase “degrees of freedom” may refer to specific, defined modes in which a mechanical device or system can move. The number of degrees of freedom may be equal to the total number of independent displacements or aspects of motion. For example, a six degrees of freedom (“6DOF”) scenario may refer to the freedom of movement of a rigid body in three-dimensional space. Specifically, the body may be free to change position as forward/backward (surge), up/down (heave), left/right (sway) translation in three perpendicular axes, combined with changes in orientation through rotation about three perpendicular axes, often termed yaw (normal axis), pitch (transverse axis), and roll (longitudinal axis). In contrast, placing a point in space may correspond to three degrees of freedom, specifying a distance between two points on different links is one degree of freedom, etc.

[0028] In some embodiments, the phrase “robotic system”, as used herein, may include a system of one, two, and/or any number of robots. In this way, an entire robotic system DOF may refer to the sum of the DOFs on each of the individual robots. This may include one DOF for each single-axis joint, and six DOF for a free-moving base. For example, for a robotic system that includes two robots, one having 6DOF and the other having 5DOF, the available entire robotic system degrees of freedom may be 11DOF.

[0029] For example, and referring now to FIG. 2, a 3D rendering from a graphical user interface 200 depicting a welding tool having five degrees of freedom contrasted with a welding tool having six degrees of freedom, full 3D rotation and orientation, is provided. The difference between the two is that the five degrees of freedom may be found by relaxing rotation about one axis. This particular model was tuned to optimize a scan and weld process. This involved configuring the bounding volumes around the scanner tool, as well as aligning the system and base primary frames to make the tool paths and point clouds appear in the correct locations. The tool offsets were also configured based on how the tool paths were created from the point clouds. The paths were created along the bottom of the seams, so an offset from the tip was configured so there were no collisions. As shown in FIG. 2, the arm may be constrained using two different constraint sets, with each one using a different tool offset, one for scanning the part and one for welding. The degrees of freedom set for scanning uses a six degree of freedom frame, and the welding set uses a five degree of freedom frame that allows the torch to rotate freely about its tip. This allows scanning of the part first, then relaxing the degrees of freedom for the weld paths, which are more difficult for the robot to achieve given its workspace envelope.
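
By way of illustration only, the two constraint sets described above might be represented as simple data records, one per tool, each carrying its own tool offset and the number of constrained degrees of freedom. The class name, field names, and offset values in the following Python sketch are hypothetical and do not reflect the actual control software.

```python
from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass
class ToolConstraintSet:
    """Illustrative container for one of the two constraint sets above."""
    name: str
    tool_offset_m: Tuple[float, float, float]  # offset from the arm flange to the working point
    constrained_dof: int                       # 6 = full pose, 5 = one rotation left free
    free_rotation_axis: Optional[str] = None   # axis about which rotation is relaxed, if any

# Scanning constrains the full six degree of freedom frame of the scanner tool.
scan_set = ToolConstraintSet("scan", (0.0, 0.0, 0.10), 6)

# Welding relaxes rotation about the torch tip axis, leaving five constrained DOF.
weld_set = ToolConstraintSet("weld", (0.0, 0.0, 0.02), 5, free_rotation_axis="z")

print(scan_set)
print(weld_set)
```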

[0030] In some embodiments, and referring now to FIG. 3, an embodiment depicting a flowchart having a number of operations consistent with a weld path generation process is provided. As will be discussed in further detail below, weld path generation process 10 may include scanning a welding area and generating 302 point cloud data. The process may further include converting 304 the point cloud data to a height field and applying 306 a local minimum filter. The process may also include performing 308 binary image thinning and branch removal. The established height field may be converted 310 back to a three dimensional space prior to simulation 312.
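
The following Python sketch outlines the flow of FIG. 3 end to end. Library routines (scipy's maximum filter over a 21-pixel window, roughly matching the k = 10 kernel described later, and skimage.morphology.skeletonize) stand in here for the directional local minimum filter, thinning, and branch removal steps detailed below; the 0.5 mm grid resolution and 1.25 mm depth threshold are the example values given later in this description, and the handling of empty grid cells is a simplification.

```python
import numpy as np
from scipy import ndimage
from skimage.morphology import skeletonize

def weld_path_from_cloud(points_mm: np.ndarray, grid_res: float = 0.5,
                         depth_thresh: float = 1.25) -> np.ndarray:
    """Sketch of the FIG. 3 flow: point cloud -> height field -> seam mask ->
    thinned path -> back to 3D.  Library filters stand in for the filter and
    branch-removal steps that the text describes in more detail later."""
    # 1. Bin the (N, 3) cloud, in millimeters, onto an XY grid of height values.
    xy_min = points_mm[:, :2].min(axis=0)
    ij = np.floor((points_mm[:, :2] - xy_min) / grid_res).astype(int)
    height = np.full(tuple(ij.max(axis=0) + 1), np.nan)
    height[ij[:, 0], ij[:, 1]] = points_mm[:, 2]     # last sample per cell

    # 2. Seam candidates: cells noticeably lower than their neighborhood
    #    (a simplification of the directional local-minimum filter below).
    filled = np.where(np.isnan(height), np.nanmax(height), height)
    seam = (ndimage.maximum_filter(filled, size=21) - filled) > depth_thresh

    # 3. Thin the seam regions to one-pixel-wide curves.
    path_mask = skeletonize(seam)

    # 4. Map path pixels back into 3D space.
    r, c = np.nonzero(path_mask)
    xy = np.stack([r, c], axis=1) * grid_res + xy_min
    return np.column_stack([xy, filled[r, c]])
```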

[0031] In some embodiments, and referring now to FIG. 4, another embodiment consistent with a weld path generation process is provided. The method may include providing 402 one or more robots or robotic systems and scanning 404 a welding area using one or more three dimensional sensor devices associated with the one or more robots to generate a scanned welding area. The system may include a processor configured to receive 406 the scanned welding area and to generate 408 a three dimensional point cloud based upon, at least in part the scanned welding area. The processor may be further configured to perform processing 410 on the three dimensional point cloud in a two-dimensional domain. The processor may be further configured to generate one or more three dimensional welding paths and to simulate 412 the one or more three dimensional welding paths.

[0032] Referring now to FIG. 5, a diagram 500 showing an embodiment consistent with weld path generation process is provided. Diagram 500 depicts an example of laser scanning and subsequent weld path generation. In some embodiments, a laser line scanner 502 may be used to pre-scan the welding area. In this particular example, scanner 502 may be mounted to the end of a robotic system. Weld path generation process may calculate the weld path and volume and provide that information as an input to the on-line process as is discussed below. The right portion of diagram 500 shows the resulting point cloud 504 that may be generated using this particular approach, here, having a resolution of 0.2mm.

[0033] Referring now to FIG. 6, an example of a scanning system 600 that may be used in accordance with weld path generation process is provided. For example, a laser line scanner may be used for both pre-scanning the welding area and also for on-line weld groove tracking. The laser line scanner may be mounted on the end effector of the robot and may be positioned to look a few centimeters ahead of the welding area to avoid the arc light. It may be used to correct any weld path deviation away from the pre-scanned data.

[0034] As shown in FIG. 6, laser line scanner 602 may be configured to operate in conjunction with a TIG torch 604 and rod feeder 606. This scanning system is provided merely by way of example as numerous scanners may be used in accordance with weld path generation process. For example, either two dimensional visible sensors or thermal sensors may be used to monitor the welding process. As such, the visible sensor may be used to record the entire process. The thermal information can be used to prevent overheating of the surrounding area. One related aspect of TIG welding involves keeping the filler rod in the puddle. This may be monitored using a conductivity sensor to sense the conductivity between the filler rod and the metal. When the filler rod is in the puddle, the rod and the metal form a closed circuit; otherwise, the circuit is open. To prevent over feeding the filler rod, the feeding force may be monitored. If the filler rod is fed too quickly, it may hit the metal and the force may increase. By monitoring the feeding force via a force sensor and the conductivity between the filler rod and the metal, the filler rod can be maintained in the puddle.

[0035] In some embodiments, scanner 502, 602 may be configured to pre-scan the area and find the weld path for offline planning and seam tracking for use in an online process. During the pre-scanning process, the robot or robotic system may be used to move the laser scanner linearly and collect points for the point cloud. The resolution of the scanning may be user specified and in this case, a 0.25mm spacing was used.
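
A minimal sketch of the linear pre-scan is shown below. The robot.move_to() and scanner.read_profile() calls are hypothetical placeholders for the actual hardware interfaces; only the 0.25 mm spacing is taken from the description above.

```python
import numpy as np

def prescan_weld_area(robot, scanner, start_xyz_mm, end_xyz_mm, spacing_mm=0.25):
    """Move a line scanner linearly from start to end, stepping every 0.25 mm
    (the spacing used in the text), and stack the returned profiles into one
    point cloud.  `robot.move_to()` and `scanner.read_profile()` are
    hypothetical placeholders for the actual hardware drivers."""
    start = np.asarray(start_xyz_mm, dtype=float)
    end = np.asarray(end_xyz_mm, dtype=float)
    n_steps = int(np.linalg.norm(end - start) / spacing_mm) + 1

    cloud = []
    for pose in np.linspace(start, end, n_steps):
        robot.move_to(pose)                               # step the scanner along the seam
        cloud.append(np.asarray(scanner.read_profile()))  # (M, 3) points per scan line
    return np.vstack(cloud) if cloud else np.empty((0, 3))
```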

[0036] In some embodiments, weld path generation process may be configured to create a weld path from the point cloud. For example, the point cloud may be converted to a height field. Then points that are local minimums may be identified. This may be followed by a thinning process and branch removal so the path becomes well defined. The points on the path may then be used to calculate a path with normal and tangent directions defined.

[0037] Referring now to FIG. 7, an example showing a conversion from a three dimensional point cloud to a height field is provided. In this particular example, converting from the point cloud to the height field may be fairly efficient. The point cloud may be converted to a two dimensional array with the height as the value in the entries of the array. In this case the resolution of the grids may be 0.5mm and, accordingly, the resulting array size is 506 x 451. This approach may work well for most cases if the weld path is visible from one side. The data is essentially converted into a two dimensional image and sent to an image processing tool. In situations where the weld path may not be visible from one side, such as welding to join two pipes, a different scanning pattern may be needed.
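
The conversion may be sketched as follows. The text does not say how several points falling in the same 0.5 mm grid cell should be resolved, so keeping the lowest height per cell is an assumption of this sketch.

```python
import numpy as np

def point_cloud_to_height_field(points_mm: np.ndarray, grid_res: float = 0.5):
    """Convert an (N, 3) point cloud, in millimeters, to a 2D array whose
    entries hold the height of each 0.5 mm grid cell.  Keeping the lowest
    height per occupied cell is assumed here."""
    xy_min = points_mm[:, :2].min(axis=0)
    ij = np.floor((points_mm[:, :2] - xy_min) / grid_res).astype(int)

    height = np.full(tuple(ij.max(axis=0) + 1), np.inf)
    # np.minimum.at keeps the minimum z seen for every occupied cell.
    np.minimum.at(height, (ij[:, 0], ij[:, 1]), points_mm[:, 2])
    height[np.isinf(height)] = np.nan             # cells with no samples
    return height, xy_min

# A roughly 253 mm x 225 mm scan at 0.5 mm resolution yields an array on the
# order of the 506 x 451 entries mentioned above.
```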

[0038] Referring now to FIG. 8, an example showing a local minimum filter being applied to the height field two dimensional array is provided. The result is shown as a binary image. As discussed above, and in order to identify the seam in the scanned data, a local minimum filter may be applied. Given the kernel size k, number of angles and threshold t, the filter may traverse through all of the points on the two dimensional array to see if the height difference from the center of the kernel to the edge exceeds the threshold. FIGS. 8-9 show an example with k = 10, number of angles = 8 and threshold = 1.25mm. It can be seen that this is a fairly effective filter to select the local minimum from scanned data. The only drawback is that the resulting weld path is relatively thick.
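
A direct reading of the filter described above may be sketched as follows. Probing one edge sample per angle at radius k, and requiring the depth test to hold in every sampled direction, are interpretations of the text rather than details it specifies.

```python
import numpy as np

def local_minimum_filter(height: np.ndarray, k: int = 10,
                         n_angles: int = 8, threshold: float = 1.25) -> np.ndarray:
    """Flag pixels whose height lies at least `threshold` below the kernel
    edge, using the parameters from the text (k = 10, 8 angles, 1.25 mm).
    One edge sample per angle at radius k, tested in every direction, is an
    assumption about the exact filter."""
    rows, cols = height.shape
    angles = np.linspace(0.0, 2.0 * np.pi, n_angles, endpoint=False)
    offsets = np.stack([np.round(k * np.sin(angles)),
                        np.round(k * np.cos(angles))], axis=1).astype(int)

    seam = np.zeros_like(height, dtype=bool)
    for r in range(k, rows - k):
        for c in range(k, cols - k):
            center = height[r, c]
            if np.isnan(center):
                continue
            edges = height[r + offsets[:, 0], c + offsets[:, 1]]
            # Seam pixels sit at the bottom of the groove, below every edge sample.
            if not np.any(np.isnan(edges)) and np.all(edges - center > threshold):
                seam[r, c] = True
    return seam
```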

[0039] In some embodiments, weld path generation process may utilize binary image thinning techniques for reducing the thickness of a region without changing the length. In contrast, image erosion or dilation may end up changing the width and length at the same time. In this particular example, the end result of the thinning process provides regions of the image having a width of one pixel. As can be seen from FIG. 10, the resulting image blocks may not be ready to be used as welding paths; there are branches in the region that need to be removed.
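
Because the description does not name a particular thinning algorithm, the sketch below simply uses skimage.morphology.thin as a stand-in to reduce the seam regions to one-pixel-wide curves.

```python
import numpy as np
from skimage.morphology import thin

def thin_seam_mask(seam: np.ndarray) -> np.ndarray:
    """Reduce the thick seam regions left by the local-minimum filter to
    one-pixel-wide curves without shortening them.  skimage.morphology.thin
    is used as a stand-in for the unnamed thinning technique."""
    return thin(seam.astype(bool))

# Small synthetic check: a 3-pixel-thick bar thins to a single-pixel line of
# essentially the same length.
bar = np.zeros((9, 20), dtype=bool)
bar[3:6, 2:18] = True
print(thin_seam_mask(bar).astype(int))
```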

[0040] In some embodiments, and with regard to branch removal, the goal is to have each image block include two leaf nodes and no branch node. As used herein, a leaf node is referred to as a pixel that has only one neighbor. A branch node is referred to as a pixel having more than two neighbors. The branch removal process may include, but is not limited to, creating a neighbor map, which registers each pixel with a code indicating the presence of the 8 neighbors. This also creates a list of the leaf nodes and a list of the branch nodes. The branch removal process may then traverse through all leaf nodes, removing branches that are shorter than a predefined threshold. The branch removal process further includes updating the neighbor map and visiting all of the branch nodes and removing the ones that do not affect connectivity. The process may then apply the thinning methodology and proceed back to the creation of the neighbor map operation until the image stops changing.

[0041] Referring again to FIG. 10, an example of the branch removal process is provided. In this case, branch nodes are color coded as blue, which indicates that this pixel has more than two neighbors. Branch pixels are color coded as green and will be removed. Finally, the path created on the height field may be converted back to the 3D space and displayed side by side with the original point cloud as shown in FIG. 11.
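
A simplified sketch of the spur-pruning portion of this procedure is shown below. It walks inward from each leaf pixel and deletes short branches that terminate at a branch pixel; the connectivity check and the re-thinning pass described above are omitted, and the length threshold is an assumed value.

```python
import numpy as np
from scipy.ndimage import convolve

KERNEL_8 = np.array([[1, 1, 1],
                     [1, 0, 1],
                     [1, 1, 1]])

def neighbor_counts(mask: np.ndarray) -> np.ndarray:
    """8-connected neighbor count for every pixel of a binary skeleton."""
    return convolve(mask.astype(int), KERNEL_8, mode="constant") * mask

def prune_short_branches(mask: np.ndarray, max_branch_len: int = 10) -> np.ndarray:
    """Walk inward from each leaf pixel (one neighbor) and delete the walked
    pixels if a branch pixel (more than two neighbors) is reached within
    `max_branch_len` steps.  A simplified sketch of the step described above."""
    mask = mask.astype(bool).copy()
    changed = True
    while changed:
        changed = False
        counts = neighbor_counts(mask)
        for leaf in zip(*np.nonzero(counts == 1)):
            if not mask[leaf]:
                continue                      # already removed in this pass
            walked, (r, c) = [], leaf
            while counts[r, c] <= 2 and len(walked) <= max_branch_len:
                walked.append((r, c))
                # step to the next unvisited skeleton neighbor, if any
                nbrs = [(r + dr, c + dc)
                        for dr in (-1, 0, 1) for dc in (-1, 0, 1)
                        if (dr or dc)
                        and 0 <= r + dr < mask.shape[0]
                        and 0 <= c + dc < mask.shape[1]
                        and mask[r + dr, c + dc]
                        and (r + dr, c + dc) not in walked]
                if not nbrs:
                    break
                r, c = nbrs[0]
            if counts[r, c] > 2 and len(walked) <= max_branch_len:
                for p in walked:              # delete the short spur
                    mask[p] = False
                changed = True
    return mask
```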

[0042] In some embodiments, after the branch removal process, each image block may now have exactly two leaf nodes and no branch node. It is possible to connect the pixels in the block from one leaf node to the other without any ambiguity. Additional information such as the normal direction of the weld as well as the tangential direction of the path may be needed for orienting the weld head. In some cases it may be desirable to position the weld head normal to the surface and aligned with the tangential direction of the path. The tangential direction may be calculated as

t_i = (p_{i+k} - p_{i-k}) / || p_{i+k} - p_{i-k} ||

[0043] where p_i is the i-th point of the path. The filter size k helps to smooth the otherwise noisy tangent calculation. The normal vector may then be calculated as

n_i = (v_up - (v_up · t_i) t_i) / || v_up - (v_up · t_i) t_i ||

[0044] where v_up is the up direction unit vector. FIG. 12 shows an example of a conversion from three dimensional points to the path. Normal and tangential information may be added for orientation of the weld head. In this example of the converted path, blue indicates the normal and red indicates the tangent.
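
A short Python sketch of this orientation step is provided below. It computes the tangent as a central difference over a window of k points and the normal by making the up vector orthogonal to the tangent; because the equations were not legible in the published text as reproduced here, both the formulas above and this sketch should be read as a consistent interpretation of the description rather than the patent's exact expressions.

```python
import numpy as np

def path_orientation(path: np.ndarray, k: int = 5, v_up=(0.0, 0.0, 1.0)):
    """Compute a tangent and a normal for every point of an ordered weld path.
    Central-difference tangent over a window of k points; normal formed by
    projecting the up vector orthogonal to the tangent.  An interpretation of
    the description, not the patent's verbatim formulas."""
    v_up = np.asarray(v_up, dtype=float)
    n_pts = len(path)
    tangents = np.zeros((n_pts, 3))
    normals = np.zeros((n_pts, 3))
    for i in range(n_pts):
        a, b = max(0, i - k), min(n_pts - 1, i + k)
        t = path[b] - path[a]                  # smoothed finite difference
        t = t / (np.linalg.norm(t) + 1e-12)
        n = v_up - np.dot(v_up, t) * t         # component of v_up orthogonal to t
        n = n / (np.linalg.norm(n) + 1e-12)
        tangents[i], normals[i] = t, n
    return tangents, normals
```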

[0045] In some embodiments, weld path generation process may utilize one or more graphical user interfaces to allow for user control of the process. For example, if jitter is identified in the weld path, the process may allow for the application of a moving average filter and a corresponding graphical user interface for smoothing the path. By applying a moving average filter of size 7, it can be seen that the path is much smoother after filtering. In the example shown in FIG. 16, a user may specify the filter size, start processing the point cloud, display the result, then save the path to a file.
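
A moving average of size 7 applied per coordinate may be sketched as follows; the treatment of the path endpoints is an assumption, as the description does not specify it.

```python
import numpy as np

def smooth_path(path_xyz: np.ndarray, window: int = 7) -> np.ndarray:
    """Apply a moving average of the given window (size 7 in the text) to each
    coordinate of the ordered weld path.  Shrinking the window near the two
    ends of the path is an assumed choice of edge handling."""
    half = window // 2
    n = len(path_xyz)
    out = np.empty_like(path_xyz, dtype=float)
    for i in range(n):
        lo, hi = max(0, i - half), min(n, i + half + 1)
        out[i] = path_xyz[lo:hi].mean(axis=0)
    return out

# Example: synthetic jitter on a straight path is visibly reduced after filtering.
raw = np.column_stack([np.arange(60.0), np.zeros(60), np.random.normal(0, 0.1, 60)])
print(np.abs(raw[:, 2]).mean(), "->", np.abs(smooth_path(raw)[:, 2]).mean())
```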

[0046] In some embodiments, weld path generation process may be configured to use one or more extra degrees of freedom to help control the weld pool formation relative to gravity. Here, the focus is on the parts being held by the robot. For example, one robot/mechanism may be configured to hold the weld head while another holds the part. Accordingly, the combined motion may allow for the weld head to move along the weld path and/or maintain the weld pool formation against the direction of gravity.

[0047] It should be noted that although many of the embodiments included herein are directed towards generating the weld path with data obtained from scanning prior to the welding, other operations are also possible without departing from the scope of the present disclosure. For example, the extra degrees of freedom for helping against gravity may be used during the welding process. In some embodiments, this may be determined during the planning, for example, before the welding as well. Accordingly, embodiments of weld path generation process may use one or more processors to maintain a welding formation with respect to gravity as a welding process is being performed. Maintaining the welding formation may be based upon, at least in part, one or more extra degrees of freedom associated with the robot.

[0048] Additionally and/or alternatively, additional sensors and a laser scanner used for visual servoing (e.g., fine tuning the path in anticipation that the path may deviate from the planned path due to environment changes) may be used as well.

[0049] As will be appreciated by one skilled in the art, aspects of the present disclosure may be embodied as a system, method or computer program product. Accordingly, aspects of the present disclosure may take the form of an entirely hardware embodiment, an entirely software embodiment (including firmware, resident software, micro-code, etc.) or an embodiment combining software and hardware aspects that may all generally be referred to herein as a “circuit,” “module” or “system.” Furthermore, aspects of the present disclosure may take the form of a computer program product embodied in one or more computer readable medium(s) having computer readable program code embodied thereon.

[0050] Any combination of one or more computer readable medium(s) may be utilized. The computer readable medium may be a computer readable signal medium or a computer readable storage medium. A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. More specific examples (a nonexhaustive list) of the computer readable storage medium would include the following: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or Flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the context of this document, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device.

[0051] A computer readable signal medium may include a propagated data signal with computer readable program code embodied therein, for example, in baseband or as part of a carrier wave.

[0052] Such a propagated signal may take any of a variety of forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A computer readable signal medium may be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, RF, etc., or any suitable combination of the foregoing.

[0053] Computer program code for carrying out operations for aspects of the present disclosure may be written in any combination of one or more programming languages, including an object oriented programming language such as Smalltalk, C++ or the like and conventional procedural programming languages, such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer or entirely on the remote computer or server. In the latter scenario, the remote computer may be connected to the user's computer through any type of network, including a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet Service Provider).

[0054] Aspects of the present disclosure are described below with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems) and computer program products according to embodiments of the disclosure. It will be understood that each block of the flowchart illustrations and/or block diagrams, and combinations of blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer program instructions.

These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks.

[0055] These computer program instructions may also be stored in a computer readable medium that can direct a computer, other programmable data processing apparatus, or other devices to function in a particular manner, such that the instructions stored in the computer readable medium produce an article of manufacture including instructions which implement the function/act specified in the flowchart and/or block diagram block or blocks.

[0056] The computer program instructions may also be loaded onto a computer, other programmable data processing apparatus, or other devices to cause a series of operational steps to be performed on the computer, other programmable apparatus or other devices to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide processes for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks.

[0057] The flowchart and block diagrams in the Figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present disclosure. In this regard, each block in the flowchart or block diagrams may, but not always, represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems that perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.

[0058] The terminology used herein is for the purpose of describing particular implementations only and is not intended to be limiting of the disclosure. As used herein, the singular forms "a", "an" and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It will be further understood that the terms "comprises" and/or "comprising," when used in this specification, specify the presence of stated features, integers, steps (not necessarily in a particular order), operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps (not necessarily in a particular order), operations, elements, components, and/or groups thereof.

[0059] The corresponding structures, materials, acts, and equivalents of all means or step plus function elements that may be in the claims below are intended to include any structure, material, or act for performing the function in combination with other claimed elements as specifically claimed. The description of the present disclosure has been presented for purposes of illustration and description, but is not intended to be exhaustive or limited to the disclosure in the form disclosed. Many modifications, variations, substitutions, and any combinations thereof will be apparent to those of ordinary skill in the art without departing from the scope and spirit of the disclosure. The implementation(s) were chosen and described in order to best explain the principles of the disclosure and the practical application, and to enable others of ordinary skill in the art to understand the disclosure for various implementation(s) with various modifications and/or any combinations of implementation(s) as are suited to the particular use contemplated.

[0060] Having thus described the disclosure of the present application in detail and by reference to implementation(s) thereof, it will be apparent that modifications, variations, and any combinations of implementation(s) (including any modifications, variations, substitutions, and combinations thereof) are possible without departing from the scope of the disclosure defined in the appended claims.