Title:
ROBOT CELL SETUP SYSTEM AND PROCESS
Document Type and Number:
WIPO Patent Application WO/2020/231319
Kind Code:
A1
Abstract:
A robot cell setup system and process comprising interacting system components: an articulated vision-guided robot (11; 12), a robot controller (21), and a cell generator (23). A robot edge computer (20) is configured to formulate instructions for robot operation within the robot cell based on a cell description file (25) of computer readable format imported from the cell generator (23), and to implement these instructions as robot control code in the robot controller (21). The cell description file (25) is confirmed or modified based on real-time feedback in the form of image data provided from the vision-guided robot.

Inventors:
FLORDAL OSKAR (SE)
Application Number:
PCT/SE2020/050493
Publication Date:
November 19, 2020
Filing Date:
May 14, 2020
Assignee:
UNIBAP AB (SE)
International Classes:
B25J9/16; B25J13/08; G05B19/4061
Foreign References:
US20190047149A12019-02-14
US20170361461A12017-12-21
KR20110095700A2011-08-25
KR101919463B12019-02-08
CN108029340A2018-05-15
CN109048926A2018-12-21
Attorney, Agent or Firm:
PATENTFIRMAN HENRIK FRANSSON AB (SE)
Claims:
CLAIMS

1. A robot cell setup system comprising interacting system components:

- a cell generator (23) comprising a programmable computer or processor programmed to generate cell description files (25) of computer readable format,

- a robot edge computer (20) comprising a programmable computer or processor programmed to formulate instructions for robot operation based on a cell description file (25) imported from the cell generator (23), and to implement these instructions as robot control code in a robot controller (21),

- a robot controller (21) comprising a programmable computer or processor programmed to execute robot control code,

- an articulated robot (11; 12) controllable in at least three axes of motion (X, Y, Z), the robot carrying image acquisition means (19) for vision guidance,

- wherein the robot (11; 12) is operative for feeding image data (19) to the cell generator (23) or the robot edge computer (20), and wherein based on said image data, the cell generator (23) is operative for verification or modification of the cell description file (25).

2. The system of claim 1, comprising image processing software integrated with the robot.

3. The system of claim 1 or 2, wherein the robot edge computer (20) comprises software designed to apply image data as input to at least one of a movement planner programme and a work planner programme installed in the robot edge computer software.

4. The system of any previous claim, wherein the cell generator (23) is configured to run a neural network algorithm for robot training, based on CAD-drawings and product specifications imported to the cell generator.

5. The system of any previous claim, comprising a visualization generator (27) producing a graphic presentation of the robot cell of the cell description file (25).

6. The system of any previous claim, comprising a construction plans generator (27) for building installation of the robot cell based on the cell description file (25).

7. A robot cell setup process comprising:

- providing a cell generator (23) programmed to generate cell description files (25) of computer readable format,

- providing a robot edge computer (20) programmed to formulate instructions for robot operation based on a cell description file (25) imported from the cell generator (23), and to implement these instructions as robot control code in a robot controller (21),

- providing a robot controller (21) programmed to execute robot control code,

- providing an articulated robot (11; 12) controllable in at least three axes of motion (X, Y, Z), equipping the robot with image acquisition means (19) for vision guidance,

- feeding image data acquired by the robot to the cell generator (23), and operating the cell generator for verification or modification of the cell description file (25) based on said image data.

8. The process of claim 7, further comprising:

- generating a digital map of a robot cell area in a factory,

- defining a robot location (Z) within the robot cell area,

- calculating effective robot range with regard to physical factory constraints and robot load limits,

- determining pickup-, work- and delivery locations within the robot range and with regard to processed product specifications,

- generating a layout of the robot cell including means for feeding products to and from the robot cell,

- compiling the above data in the cell description file (25).

9. The process of claim 7 or 8, comprising the steps of:

- providing 2D or 3D image acquisition means (19) and image processing software integrated with the robot (11; 12),

- applying image data as input to at least one of a movement planner programme and a work planner programme installed in the robot edge computer (20) software.

10. The process of any of claims 7-9, comprising:

- defining the sequential steps of an automated production process based on CAD-drawings of products, product specifications, or digital 2D or 3D representations of products (24),

- determining the spatial locations, in the robot’s local coordinate system, of pickup position (6), manufacturing/assembly position (13) and delivery position (9) of processed products,

- choosing the relevant tool (16; 17) for the robot work (assembly, manufacture, inspection etc.), and

- training the robot in a neural network (26) against feedback provided by the image acquisition means (19) of the vision guided robot.

11. The process of any of claims 7-10, wherein compilation of the robot cell description file (25) includes processing of digitized descriptions of the following process and product parameters:

• physical constraints in the form of available space, robot cell limits and location according to factory plans and concurrent production,

• feed of products to and from the robot cell,

• coordination in time and space with nearby robot cells and robots,

• specification of work and choice of tools required for the steps of production in the robot cell.

12. The process of any of claims 7-11, wherein the cell generator (23) software contains executable programmes configured to receive and process the following import data:

• numerical factory data and digitized factory layout drawings,

• data on concurrent production in nearby robot cells,

• CAD drawing files,

• product specifications,

• digital 2D or 3D representation of processed products,

and to generate at least one of the following export data:

• robot cell description files in computer readable format,

• graphic presentation of robot cells,

• robot cell manufacturing files,

• digital 2D or 3D representation of processed products,

• robot work instructions,

• robot training algorithms,

• time schedules and factory production integration files,

• production statistics.

13. An automation process comprising:

- setting up of a robot cell (1; 2) by implementation of a system according to any of claims 1 to 6 or a process according to any of claims 7 to 12,

- installing the robot cell physically in a factory taking into account one or some of the following requisites:

• physical constraints in the form of available space, robot cell limits and location according to factory plans and concurrent production,

• way of feeding products to and from the robot cell,

• coordination in time and space with nearby robot cells and robots,

• specified work and tools required for the steps of production in the robot cell, and

- training the robot in a neural network (26) by comparison of CAD-drawings of products with digital representations of the physical products and their location in space using vision-based robot control (19).

14. The automation process of claim 13, wherein the digital representation of a physical product and its location in space is produced by 2D or 3D image acquisition means and image processing software integrated with the robot.

15. A computer programme product storable on a computer usable medium containing instructions for a processor of a cell generator or robot edge computer to execute the process of any of claims 7 to 12.

16. The computer programme product of claim 15 provided at least in part over a data transfer network, such as Ethernet or Internet.

17. A computer readable medium, characterized in that it contains a computer programme product according to claim 15.

Description:
Robot cell setup system and process

TECHNICAL FIELD OF THE INVENTION

The present invention relates to a system and a process designed for configuration of robot cells for an automated industrial process. The invention also relates to an automation process implementing embodiments of the robot cell configuration system and process of the invention.

BACKGROUND AND PRIOR ART

In this context, a robot is an industrial robot, which can be described as an automatically controlled multipurpose manipulator, programmable in three or more axes of motion, which can be either fixed in position or mobile for operation in industrial automation applications. The system and process of the present invention can be implemented on any robot system that fits this definition, including but not limited to Articulated Robots, SCARA Robots, and Cartesian or Parallel Robots.

Industrial robots can be used and programmed for various operations such as machining, assembly, picking, packaging etc. In operation the robot moves a tool or robot hand in repetitive cycles between pickup, manipulation and release, each manoeuvre performed at defined positions within the working range of the robot. The positions for pickup, manipulation and release, as well as the feed of objects for handling by the robot, are subjects that need consideration in the task of organizing a physical working area for the robot, a robot cell.

The concept of configuring and setting up a robot cell is, however, not limited to physical parameters only, but also involves non-physical parameters and instructions that control the movements of the robot and the feed of objects inside or outside the robot cell.

Conventionally, automation takes significant time to achieve, is usually costly, and is strongly tied to a single or a few variants of a product. For illustration of the background to the present invention, a typical traditional model for setting up a robot cell for automated production in a factory facility comprises the following steps:

• drawing up a requirements specification for the work to be done in the robot cell,

• requesting quotations on physical cell construction cost from one or several system integrators (persons or companies),

• producing a virtual robot cell that is adapted to the physical constraints of the factory and to the current product flow,

• installing the robot cell and associated systems in the factory.

The robot cell is usually maintained in the factory until the end of life of the product, at which point the robot cell needs to be re-configured for a new product. Overall, this means that automation is not viable for small series of products or for products having a short life span. Previous attempts to improve the flow of robot cell setup mostly aim at cost savings, e.g. by making robot programming easier.

SUMMARY OF THE INVENTION

It is an object of the present invention to provide a system and a corresponding process designed for creation of robot cells with a minimum of human intervention.

It is another object to provide an automation process wherein embodiments of the robot cell setup system and process of the invention are implemented.

The first mentioned object is met by a system as defined in appended claim 1 or by a process as defined in appended claim 7.

The second mentioned object is met by the automation process defined in appended claim 13.

The present invention is implemented in connection with a robot system wherein robots are capable of identification of objects and determination of object positions within the working range of the robot. This capability involves computing capacity paired with a dedicated sensing technology comprising image acquisition means carried on the robot.

In order to meet the object of the invention a robot cell setup system is provided, comprising a set of interacting system components:

• a cell generator comprising a programmable computer or processor programmed to generate cell description files of computer readable format,

• a robot edge computer comprising a programmable computer or processor programmed to formulate instructions for robot operation based on a cell description file imported from the cell generator, and to implement these instructions as robot control code in a robot controller,

• a robot controller comprising a programmable computer or processor programmed to execute robot control code,

• an articulated robot controllable in at least three axes of motion, the robot carrying image acquisition means for vision guidance,

• wherein the robot is operative for feeding image data to the cell generator, and wherein based on said image data, the cell generator is operative for verification or modification of the cell description file.

This way, the present invention provides automation in robot cell creation on a basic and elemental level.

In other words, figuratively speaking, the present invention provides automation of the automation setup. The cell generator and the vision-guided robot thus make it possible to go directly from the requirement specification for a processed product to creation of the automation cell from scratch (or to reuse/modify a previously created robot cell, if appropriate).

The robot control comprises 2D or 3D imaging. 2D and 3D imaging can be accomplished by use of single lens or double lens digital cameras and associated software.

Briefly, 2D and 3D image acquisition and analysis rely on software designed to translate captured images into computable data which form the basis for real-time decisions made by the robot system. This data may include, e.g., object identification data, object location, object classification data, data defining relative motion between objects, relative motion between the robot system and the environment, or other data.

The 2D and 3D image acquisition and analysis software provides a digital representation of a physical product or item and its location in space, which can be compared with a CAD-drawing file of the same product or item.

In one embodiment the robot cell setup system comprises image acquisition means and image processing software integrated with the robot.

The robot edge computer software is in one embodiment designed to apply image data as input to at least one of a movement planner programme and a work planner programme installed in the robot edge computer software.

In one embodiment, the cell generator is configured to run a neural network algorithm for robot training, based on CAD-drawings and product specifications imported to the cell generator.

In one embodiment, the system comprises a visualization generator which provides a graphic presentation of the robot cell of the cell description file.

In a further embodiment, the system comprises a construction plans generator for building installation of the robot cell physically in the factory, based on the cell description file.

In a process aspect of the invention, a robot cell setup process comprises:

• providing a cell generator programmed to generate cell description files of computer readable format,

• providing a robot edge computer programmed to formulate instructions for robot operation based on a cell description file imported from the cell generator, and to implement these instructions as robot control code in a robot controller,

• providing a robot controller programmed to execute robot control code,

• providing an articulated robot controllable in at least three axes of motion, equipping the robot with image acquisition means for vision guidance,

• feeding image data acquired by the robot to the cell generator, and operating the cell generator for verification or modification of the cell description file based on said image data.

On a more detailed scale, one embodiment of the robot cell setup process comprises the following steps:

• generating a digital map of a robot cell area in a factory,

• defining a robot location within the robot cell area,

• calculating effective robot range with regard to physical factory constraints and robot load limits,

• determining pickup-, work- and delivery locations within the robot range and with regard to product specifications,

• generating a layout of the robot cell including means for feeding products to and from the robot cell,

• compiling the above data in the cell description file.

In one embodiment, the robot cell setup process comprises:

• providing 2D or 3D image acquisition means and image processing software integrated with the robot, and

• applying image data as input to at least one of a movement planner programme and a work planner programme installed in the robot edge computer software.

Another embodiment of the robot cell setup process comprises:

• defining the sequential steps of an automated production process based on CAD-drawings of products, product specifications, or digital 2D or 3D representations of products,

• determining the spatial locations, in the robot’s local coordinate system, of pickup position, manufacturing/assembly position and delivery position of processed products,

• choosing the relevant tool for the robot work (assembly, manufacture, inspection etc.), and

• training the robot in a neural network against feedback provided by the image acquisition means (19) of the vision-guided robot.

In one embodiment of the robot cell setup process, compilation of the cell description file includes processing of digitized descriptions of the following process and product parameters:

• physical constraints in the form of available space, robot cell limits and location according to factory plans and concurrent production,

• feed of products to and from the robot cell,

• coordination in time and space with nearby robot cells and robots,

• specification of work and choice of tools required for the steps of production in the robot cell.

In one embodiment of the robot cell setup process, the cell generator software contains executable programmes configured to receive and process the following import data:

• numerical factory data and digitized factory layout drawings,

• data on concurrent production in nearby robot cells,

• CAD drawing files,

• product specifications,

• digital 2D or 3D representation of processed products,

• digital 2D or 3D representation of physical constraints in the robot cell area,

and to generate at least one of the following export data:

• robot cell description files in computer readable format,

• graphic presentation of robot cells,

• robot cell manufacturing files,

• digital 2D or 3D representation of processed products,

• robot work instructions,

• robot training algorithms,

• time schedules and factory production integration files,

• production statistics.

In an automation aspect of the invention, embodiments of the above robot cell setup system and process can be implemented in an automation process comprising:

• creating a robot cell, including formulation of robot control code by means of computing software and a vision guided robot controllable in at least three axes of motion,

• installing the robot cell physically in a factory taking into account one or some of the following requisites:

• physical constraints in the form of available space, robot cell limits and location according to factory plans and concurrent production,

• way of feeding products to and from the robot cell,

• coordination in time and space with nearby robot cells and robots,

• specified work and tools required for the steps of production in the robot cell, and

• training the robot in a neural network by comparison of CAD-drawings of products with digital representations of the physical products and their location in space using the vision-based robot control.

The present invention also relates to a computer programme product storable on a computer usable medium containing instructions for a processor of a cell generator or robot edge computer to execute the inventive process.

The computer programme product may be provided at least in part over a data transfer network, such as Ethernet or Internet.

The present invention further relates to a computer readable medium which contains the computer programme product.

SHORT DESCRIPTION OF THE DRAWINGS

Additional details and further embodiments of the invention are defined in the subordinated claims, and explained in the following detailed description of the invention with reference made to the accompanying drawings:

Fig. 1 is a schematic overview illustrating a pair of robot cells in a factory installation,

Fig. 2 is a block diagram illustrating system components and process flow in an automated robot cell configuration system and process.

DETAILED DESCRIPTION OF PREFERRED EMBODIMENTS

Fig. 1 shows two robot cells 1 and 2 operating in parallel. Each robot cell is served by a pair of supplying conveyors 3 and 4, and a discharge conveyor 5. The supplying conveyors 3 and 4 are each associated with a pickup table 6 and off-pushers 7 and 8 at a turning end of the conveyors. The discharge conveyors 5 are each associated with a delivery table 9 and an on-pusher 10 at their respective turning ends. Optical or other sensors (not shown) can be used to trigger the off- and on-pushers.

In this connection it should be pointed out that, instead of the conveyor feed illustrated in Fig. 1, the processed products may be picked by a robot from a pallet or from a bin, if appropriate.

In the robot cells 1 and 2, a robot 11 or 12 is operable and programmed to pick an item or product from the pickup table 6 and place it on a work table 13 for a value-adding operation, before the item or product is placed on the delivery table 9 for discharge. The value-adding operation can be any kind of assembling, machining, painting, inspection, testing etc. In the illustrated example, the robot cells 1 and 2 may be suitable for assembly of two or more incoming product parts into a composite, singular outgoing item.

The robots 11 and 12 are articulated robots, driven and programmed for turning motion about at least three axes X, Y and Z in an orthogonal coordinate space. Each robot has a robot arm 14 carrying at its end a robot hand 15. The robot arm 14 is extendable to reach the pickup, delivery and work tables by turning about the axes X, Y and Z. Axis Z represents both a centre around which the robot and robot arm 14 can rotate in a horizontal plane and the point of location of the robot in the robot cell. The axes X provide pivoting in vertical planes, while the axis Y provides rotation in any arbitrarily inclined plane parallel with the X axes. Naturally, the robot 11 or 12 may include other pivot axes for additional mobility, such as in the robot hand. The robot hand 15 is arranged for docking with different kinds of tools for grasping, drilling, probing, measuring etc. In Fig. 1, the robots 11 and 12 have each put together an individual set of tools 16 and 17, chosen from a supply of tool boxes 18.

In operation, the robots 11 and 12 are each guided by a vision system 19 and a robot edge computer 20. In the drawing of Fig. 1, the robot edge computer 20 is illustrated as installed on board the robot, but it may alternatively be arranged separately, within or outside the robot cell. The vision system 19 may include 2D or 3D image acquisition means such as a single lens or double lens camera and associated image processing software.

The robot edge computer 20 communicates with a robot controller 21 via a wireless or cable network 22. Connected to the network 22 is a cell generator 23 comprising a processor and software designed to organize the robot cells 1 and 2 based on digitized product and factory information 24. The cell generator 23 may be installed on a cloud server or on a private server, as appropriate.

The block diagram of Fig. 2 shows a cell configuration system and process for a factory installation, substantially like the one illustrated in Fig. 1. Main components of the cell configuration system are the cell generator 23, the robot edge computer 20 and the robot controller 21. Each of the cell generator, the robot edge computer and the robot controller comprises data input and output functions, a hardware or emulated processor and memory, and programme-executing software.

The cell configuration system and process use different kinds of input data 24 in order to set up the robot cell and to generate operating instructions for the robot.

The input data to the cell generator 23 comprises, but is not necessarily limited to: numerical factory data and digitized factory layout drawings; data on concurrent production in nearby robot cells; CAD-produced drawings of products; and product specifications.

In the present invention, particularly, input to the cell generator 23 comprises digital 2D or 3D representation of physical items and constraints in the robot and cell area environment, provided as feedback from a vision-guided, seeing robot. In particular, attention is drawn to Fig. 1, from which it can be readily appreciated that the cell generator 23 communicates, by wireless or cable, with not just one robot but with two, several or all robots on the factory floor. Based on product and factory data paired with image data feedback from the robot(s), the cell generator 23 decides which robot, among the ones available, is the best choice for the new task with respect to current robot cell production, structural limitations and factory logistics (such as feed of material and products, electrical power, hydraulics, pneumatics, etc.). In making this decision the cell generator in practice also decides the location of the new/modified robot cell in the factory. From the various inputs disclosed and illustrated in the application, the cell generator calculates and organizes a new robot work cell through interaction with an intelligent vision-guided robot, but essentially without human intervention.

Robot cell description file

The input data is processed in the cell generator 23 which is a programmable computer or processor that generates computer readable output in the form of a robot cell description file 25 containing, inter alia: digital 2D or 3D representation of products; robot work instructions; robot training algorithms; time schedules and factory production integration files; graphic presentation of robot cells; robot cell manufacturing files; production statistics, or other case specific data as appropriate.

Creation of the cell description file 25 involves various portions of a cell generator software which may include some or all of the following dedicated programmes:

Process and work planner: a computer programme which can use CAD drawings and product specifications to make decisions on assembly or manufacture, choosing between tools and operations (fitting, drilling, welding etc.) and accounting for available production facilities such as access to power and supply of material;

Cell location planner: a computer programme which can use physical factory data and logistic data on concurrent production to make decisions on appropriate location and timing of the new robot cell within the factory;

Robot selection planner: a computer programme which can use robot specifications to decide which robot among those available is the best choice for operation in the new robot cell;

Product infeed/outfeed planner: a computer programme which can use all the above data, or the output from the above planner programmes, to regulate the flow of products, material and tools into and out from the new robot cell;

Product placement planner: a computer programme which can use image data returned from a vision-guided robot, in addition to CAD drawings and product specifications as well as output from the process and work planner programmes, to make decisions on product orientations at the pickup tables, work tables and delivery tables;

Robot movement planner: a computer programme which can use image data returned from a vision-guided robot in addition to input to the cell generator and/or output from the above planners, to generate movement patterns for the robot;

Robot training planner: a computer programme which can define neural networks and use image data returned from a vision-guided robot for deep learning and fine-tuning of robot movements and operations within the constructed robot cell.

In the present invention, several of the individual planner programmes of the cell generator 23 respond to input from image acquisition means 19 of a vision-guided robot. This is especially valid for the product placement planner, the robot movement planner and the robot training planner. However, image data returned from the robot may also serve as modifier to output from the other planner programmes of the cell generator.

The task of coding the different planners of the cell generator may lie within the ordinary skill of a computer programmer of industrial program code. In the present invention, the leverage of improvement over the prior art lies in the combination of planners and the resulting cell description file which, by real-time feedback from an intelligent vision-guided robot, provides a higher degree of automation in the configuration and setup of a robot cell.

If not all, at least some of the above planners will be involved in the creation of the cell description file 25, from which robot control code can be generated automatically through digital processing in the robot edge computer 20. Running the programmes installed in the cell generator 23 comprises a set of procedural steps leading towards the cell description file 25 (a condensed sketch in code follows the list):

• generating a digital map of a robot cell area in a factory,

• defining a robot location (Z) within the robot cell area,

• calculating effective robot range with regard to physical factory constraints and robot load limits,

• determining pickup-, work- and delivery locations within the robot range and with regard to processed product specifications,

• generating a layout of the robot cell including means for feeding processed products to and from the robot cell,

• compiling the above data in a cell description file of computer readable format,

• verification or modification of the cell description file based on real-time image data feedback provided by the vision guided robot.

Creation of the cell description file 25 comprises processing of digitized descriptions of at least one or some of the following parameters:

• physical constraints in the form of available space, robot cell limits and location according to factory plans and concurrent production,

• way of feeding products to and from the robot cell,

• coordination in time and space with nearby robot cells and robots,

• specification of work and choice of tools required for the steps of production in the new robot cell.

The cell generator 23 software contains executable programmes configured to receive and process import data such as:

• numerical factory data and digitized factory layout drawings,

• data on concurrent production in nearby robot cells,

• CAD drawing files,

• product specifications,

• digital 2D or 3D representation of processed products,

• digital 2D or 3D representation of physical constraints in the robot cell area,

and to generate at least one of the following export data:

• robot cell description files in computer readable format,

• graphic presentation of robot cells,

• robot cell manufacturing files,

• digital 2D or 3D representation of processed products,

• robot work instructions,

• robot training algorithms,

• time schedules and factory production integration files,

• production statistics.

One basic input to the cell generator is a CAD assembly file. The CAD assembly file is generated in a CAD programme that uses a design tool which determines the proper order of assembling the parts of the product, or defines other manipulation of the product to be done in the robot cell. The CAD assembly file can be generated under supervision by a human designer, or it can be generated automatically by simulation, such as by simulator reinforcement learning with trial and error, using feedback from its own actions until a satisfactory result is achieved.

The CAD assembly file provides computer readable data containing, inter alia, a set of constraints for calculation and generation of the cell description file, such as:

Grippers/gripper fingers required for assembly: can be defined using grasp planner software, which may be built on brute force simulation, possibly assisted by algorithms such as constraint solvers or deep learning. The grasp planner software will optimize for a low number of grippers and fingers by finding a set of possible gripper configurations for each part and then minimizing the required configurations by picking options that satisfy as many parts as possible (a sketch of this minimization follows these items). The grasp planner software may also identify the required machining tools for welding, drilling, deburring, etc., and determine the suitable size of machining tools, if appropriate;

Supplements to the robot cell: requisition of magnetic tables, shakers for screws etc., and other requisites that are unsuitable for picking by the robot;

Infeed/outfeed of products from the robot cell: the CAD assembly file defines which products/product parts need to be fed into and out from the robot cell. With the help of information on how parts are packaged, it is possible to calculate the necessary entrance and exit routes as well as to define ways of transport, such as conveyor belts, pallet drop-off points, or other ways of supplying products/product parts to the robot cell.

The cell generator software is configured to generate a robot cell layout that satisfies all incoming constraints and to select a robot that has sufficient range to reach all stations in the robot cell (or assign more robots/stations as needed). The robot cell layout forms a basis for making part lists, drawings for assembly, etc. In other words, the sum of the data imported to the cell generator is processed therein for generation of the cell description file, which, by its computer readable content, enables automatic generation of robot control code through the planner programmes of the robot edge computer.

Robot control code

The cell description file 25 is exported to the robot edge computer 20 and used by its software to create instructions for robot movements and work operations within the cell. These instructions are exported to the robot controller 21 in the form of robot control code. More precisely, the robot controller 21 is a programmable computer or processor configured to run software designed to execute robot control code. The robot control code takes the form of orthogonal x/y/z coordinates; polar coordinates in terms of distance, direction and elevation; speed of motion; timing and duration of holds and halts in the robot's moves and operations; or other data appropriate for the control of an automation robot, as is known per se.

Creation of the robot control code involves various portions of the robot edge computer software such as:

Movement planner: a computer programme which uses vision guidance 19 and/or a deep learning algorithm 26 in neural network training of the robot for fine-tuning of robot movements;

Work planner: a computer programme which uses vision guidance 19 and/or a deep learning algorithm 26 in neural network training of the robot for fine-tuning of robot tool operations.

Running the programmes installed in the robot edge computer 20 comprises a set of procedural steps leading towards a robot control code (here an assembly process is used as an example; a condensed sketch follows the list):

• reading the cell description file for spatial coordinates of the robot location and of the objects’ pick-up, work and delivery locations,

• reading the cell description file for the work specification and digital drawings,

• moving the robot hand to the pick-up location of object A,

• detecting the object’s position at the pick-up location,

• grasping object A and placing it at the assembly location,

• moving the robot hand to the pick-up location of object B,

• detecting the object’s position at the pick-up location,

• grasping object B and placing it with object A at the assembly location,

• visually scanning the resulting assembly and comparing a digital image with the work specification and digital drawings,

• if the assembly is successful, memorizing the robot movements as new robot control code,

• if the assembly is unsuccessful, correcting the robot movements based on the comparison of the assembled objects with the work specification and digital drawings, and running through the programme again until the assembly is successful.

The compilation of computer readable data in a cell description file as provided results in automation of the instructions and code for robot control, through computer processing in the robot edge computer, substantially without the need for human intervention.

In one embodiment, the robot edge computer 20 can be carried on-board the robot and integrated with 3D imaging and processing means 19. This embodiment provides unmatched flexibility in the configuration and setup of robot cells, and a fast and flexible learning process including visual feedback at the very end of the automation line.

Sensor- and vision-based robot control

In training, and in productive operation as well, vision guidance of robots can be applied to provide feedback to robot control. Depending on the nature of the products and production, sensor guidance through limit switches, proximity sensors, or other sensing technology such as laser or lidar (light detection and ranging) may additionally be applied if appropriate.

In vision-guided robot control, single or double lens cameras can be fitted on the robot, close to the robot hand 15, to capture a view whose centre is lined up with the robot arm 14. In the present invention, the vision-guided robot is additionally used in the process of setting up the robot cell. More precisely, the vision guidance system is utilized for gathering information on structural robot cell components and physical constraints in the robot’s environment, and for providing this information as input and feedback to the cell generator for creation or modification of the cell description file.

In a parallel way, the image data captured by the vision-guided robot may likewise be provided as input and feedback to the robot edge computer for creation or modification of the robot control code.

Thus, in accordance with embodiments of the invention, a process of automated configuration of a robot cell may also be summarized as follows:

• providing a vision-guided robot, programmable/reprogrammable in three or more axes of motion, the robot being served by a robot edge computer and vision-based robot control,

• defining, for the subject robot cell, the sequential steps of an automated production process based on CAD-drawings of products, product specifications, or digital 2D or 3D representations of products,

• determining the spatial locations, in the robot’s local coordinate system, of pickup position, manufacturing/assembly position and delivery position of processed products,

• choosing the relevant tool for the robot work (assembly, manufacture, inspection etc.), and

• training the robot in a neural network against feedback provided by the vision-based robot control.

In other words, the claimed invention is a system and a process for robot cell generation which, by two-way communication and interaction between a software cell generator and the image processing software of an intelligent vision-guided robot, not only organizes the work within the robot cell but also integrates the new robot cell with existing production in a factory, based on digital data on the product to be processed, the production process and the production facilities. In this perspective, the claimed invention is a highly dynamic and adaptable, global approach to robot cell design.

Neural network training

Generally speaking, machine learning or deep learning and training of a robot in a neural network may involve recognition of characteristic object elements in images captured by a digital camera on board the robot, and comparison of the spatial positions of these elements with the position coordinates of the same elements in a digitized representation of the object, in the robot’s coordinate system. Each robot cell, however, may require individually designed training algorithms, and a detailed description of any specific neural network for robot training in a specific robot cell will not be given in this disclosure, nor is one required. For persons skilled in the art of robot control, guidance can be found in the extensive literature on neural network training.

Finally, with reference to Fig. 2, reference number 27 indicates a generator for visual/graphic display of the robot cell, and/or a generator for cell manufacturing, i.e. for creating construction plans for the robot cell based on the data and information compiled in the cell description file 25.

In other words, disclosed herein is a system and a process to initiate the whole automation sequence from the product itself in a flexible process. This is done by taking, inter alia, CAD file(s) and informing the various parts of the system, as explained in the text and shown in the drawings. Combined with the fact that central parts of the automation itself are handled by a sensor- or vision-guided robot, this provides automation which is flexible, requires little or no subsequent external programming, can generate graphic representations and other sales material, and can be used to generate the automation cell itself or variations thereof.

The present invention may be implemented as software, hardware, or a combination thereof. The computers and processors applied in the system can be realized as hardware components, or in the form of emulated software components if appropriate. A computer program product or a computer program implementing the process or a part thereof comprises software or a computer program run on a general purpose or specially adapted computer, processor or microprocessor. The software includes computer program code elements or software code portions that make the computer perform the process. The program may be stored in whole or in part on, or in, one or more suitable computer readable media or data storage means such as a magnetic disk, CD-ROM or DVD disk, hard disk, magneto-optical memory storage means, in RAM or volatile memory, in ROM or flash memory, as firmware, on a data server, or on a cloud server. Such a computer program product or computer program can also be supplied via a network, such as the Internet.

It is to be understood that the embodiments described above and illustrated in the drawings are to be regarded only as non-limiting examples of the present invention and may be modified within the scope of the appended claims.

Accordingly, in the present invention, an intelligent vision-guided robot takes an active part in creation of a robot cell by identification of structural constraints in the robot’s environment and feeding this information to a cell generator which creates and adapts the robot cell design based, at least partially, on this feedback from the vision-guided robot.