

Title:
METHOD AND SYSTEM FOR PROGRAMMING A ROBOT
Document Type and Number:
WIPO Patent Application WO/2021/190769
Kind Code:
A1
Abstract:
A method for programming a robot comprises the steps of a) providing a 3D representation of workpieces to be handled by the robot, b) providing a 3D representation of a working environment comprising an initial position where each workpiece is to be seized by the robot, and a final position where the workpiece is to be installed by the robot, c) synthesizing and displaying a view of the working environment comprising an image of the workpieces at respective initial positions; d) enabling a user to select one of the displayed workpieces; e) identifying matching features of the selected workpiece and of the working environment which are able to cooperate to hold the workpiece in a final position in the working environment, and a skill by which the matching features can be brought to cooperate; f) based on the skill and on the final position, identifying an intermediate position from where applying the skill to the workpiece moves the workpiece to the final position; g) adding to a motion program for the robot a routine for moving the workpiece from its initial position to the intermediate position and for applying the skill to the workpiece at the intermediate position.

Inventors:
DAI FAN (DE)
LI NUO (DE)
DIX MARCEL (DE)
CAO DONGLIANG (DE)
Application Number:
PCT/EP2020/058868
Publication Date:
September 30, 2021
Filing Date:
March 27, 2020
Assignee:
ABB SCHWEIZ AG (CH)
International Classes:
B25J9/16
Other References:
SCHNEIDER S A ET AL: "EXPERIMENTAL OBJECT-LEVEL STRATEGIC CONTROL WITH COOPERATING MANIPULATORS", INTERNATIONAL JOURNAL OF ROBOTICS RESEARCH, SAGE SCIENCE PRESS, THOUSAND OAKS, US, vol. 12, no. 4, 1 August 1993 (1993-08-01), pages 338 - 350, XP000383995, ISSN: 0278-3649
YAO Y X ET AL: "A pragmatic system to support interactive assembly planning and training in an immersive virtual environment (I-VAPTS)", THE INTERNATIONAL JOURNAL OF ADVANCED MANUFACTURING TECHNOLOGY, SPRINGER, BERLIN, DE, vol. 30, no. 9-10, 9 December 2005 (2005-12-09), pages 959 - 967, XP019440735, ISSN: 1433-3015, DOI: 10.1007/S00170-005-0069-Y
Attorney, Agent or Firm:
BEETZ & PARTNER MBB (DE)
Claims:
Claims

1. A method for programming a robot, comprising the steps of a) providing a 3D representation of workpieces to be handled by the robot, b) providing a 3D representation of a working environment comprising an initial position where each workpiece is to be seized by the robot, and a final position where the workpiece is to be installed by the robot, c) synthesizing and displaying a view of the working environment comprising an image of the workpieces (9-14) at respective initial positions; d) enabling a user to select one of the displayed workpieces (9-14); e) identifying matching features (17) of the selected workpiece (9) and of the working environment which are able to cooperate to hold the workpiece (9) in a final position (9") in the working environment, and a skill by which the matching features can be brought to cooperate; f) based on the skill and on the final position (9"), identifying an intermediate position (9') from where applying the skill to the workpiece (9) moves the workpiece to the final position (9"); g) adding to a motion program for the robot a routine for moving the workpiece (9) from its initial position to the intermediate position and for applying the skill to the workpiece (9) at the intermediate position.

2. The method of claim 1, wherein step f) comprises displaying the workpiece (9) at the intermediate position.

3. The method of claim 2, wherein the method proceeds from step f) to step g) only after approval of the match by a user.

4. The method of claim 2 or 3, further comprising enabling the user to drag the image of the workpiece (9) to a desired position.

5. The method of claim 4, wherein the match is regarded as disapproved by a user if the user drags the image of the workpiece away from the intermediate position.

6. The method of any of the preceding claims, further comprising the step h) updating the working environment by including in it the workpiece at its final position.

7. The method of claim 6, wherein after step h) the method returns to step c).

8. The method of any of the preceding claims, comprising the preparatory step of deriving the 3D representation of the workpiece from CAD data.

9. The method of claim 8, wherein features of the workpiece (9) to be matched with a feature (17) of the working environment are identified in said CAD data.

10. The method of claim 8 or 9, wherein the CAD data comprise a 3D representation of the product to be assembled from the workpieces, and in step c) each workpiece (9-14) is displayed in the orientation it has in the product.

11. The method of any of the preceding claims, wherein the matching features are

- a projection and a recess that are engageable in a given direction and optionally have identical cross sections, and the associated skill is pushing the workpiece in the given direction; or

- male and female threads, and the associated skill is screwing; or

- plane surfaces, and the associated skill is placing the surfaces in contact, optionally accompanied by pressing and/or heating.

12. The method of any of the preceding claims, further comprising a step of identifying matching features of the workpiece and the working environment taking into account a skill specified by the user.

13. A computer system comprising a computer, a display and a coordinate input means, wherein the computer is programmed to carry out the method of any of claims 1 to 12 based on user input provided via the coordinate input means.

14. A computer program which, when carried out by a computer system, causes the computer system to carry out the method of any of claims 1 to 12.

Description:
Method and system for programming a robot

The present invention relates to a method for programming a robot, and to a system for carrying out the method.

Programming an industrial robot is a time-consuming task, especially for applications where several workpieces have to be assembled into a product.

Conventional CAD tools can provide very detailed information about workpieces that are to be assembled into a given product, but, due to the large variety of geometric features of different workpieces that might have to engage with each other in an assembly process, of CAD data formats, and of unknown parameters such as material properties, design tolerances etc., there is currently no system capable of deriving an assembly program for a robot directly from CAD data of the workpieces to be assembled.

In the automation industry, there are various software products to support programming industrial robots, such as ABB PowerPac. Such software can assist the user to define workspaces and work objects, and focuses on automatically generating paths for a robot processing a single stationary workpiece, e.g. by machining or welding, but provides only limited support for assembly processes that involve displacing workpieces.

It is an object of the present invention to provide a method which facilitates programming of assembly tasks to be carried out by a robot. The object is achieved by a method for programming a robot, comprising the steps of a) providing a 3D representation of at least one workpiece to be handled by the robot, b) providing a 3D representation of a working environment comprising an initial position where the workpiece is to be seized by the robot, and a final position where the workpiece is to be installed by the robot, c) synthesizing and displaying a view of the working environment comprising an image of the workpieces at respective initial positions; d) enabling a user to select one of the displayed workpieces; e) identifying matching features of the selected workpiece and of the working environment which are able to cooperate to hold the workpiece in a final position in the working environment, and a skill by which the matching features can be brought to cooperate; f) based on the skill and on the final position, identifying an intermediate position from where applying the skill to the workpiece moves the workpiece to the final position; g) adding to a motion program for the robot a routine for moving the workpiece from its initial position to the intermediate position and for applying the skill to the workpiece at the intermediate position.
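Read as a whole, steps c) to g) form a loop over the workpieces, with step h) updating the environment after each installation. A minimal sketch in Python; the class, the callable decomposition and all names are illustrative, not part of the claimed method:

```python
from dataclasses import dataclass

@dataclass
class Routine:
    """One motion-program entry: move the workpiece to an intermediate
    pose, then apply a joining skill there (steps f) and g))."""
    workpiece: str
    intermediate_pose: tuple
    skill: str

def program_assembly(workpieces, environment, select, find_match, plan_intermediate):
    """Steps c)-g), with user interaction and the matching and planning
    subroutines injected as callables (all hypothetical)."""
    program = []
    remaining = list(workpieces)
    while remaining:
        piece = select(remaining)                                    # step d)
        feature, skill, final_pose = find_match(piece, environment)  # step e)
        inter_pose = plan_intermediate(skill, final_pose)            # step f)
        program.append(Routine(piece, inter_pose, skill))            # step g)
        environment.append((piece, final_pose))  # step h): update environment
        remaining.remove(piece)
    return program
```

The environment is mutated after each workpiece, so later matching searches see earlier installations, mirroring the secondary working environments described below.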

In this method, what the user is required to do is to define the order in which the workpieces are assembled. The tasks of determining a routine by which the robot moves the selected workpiece to the intermediate position and of controlling the skill by which the robot brings it from the intermediate position to the final position can be automated.

Displaying the currently selected workpiece at the intermediate or final position can be helpful in that it enables the user to check whether the system is planning to install the workpiece at the position where it actually belongs. This is particularly relevant if there are several identical workpieces, and there is a possibility of installing one at a final position where it would block the subsequent installation of other workpieces.

Therefore the method should proceed from step f) to step g) only after approval of the match by a user.

If the method allows the user to drag the image of the workpiece to a desired position, this can help the method to identify a suitable intermediate position, assuming that the user is actually dragging the workpiece towards a position where it should be installed.

On the other hand, if the user drags the workpiece away from an intermediate position where it is currently displayed, it is evident that the user disapproves of this intermediate position and wishes the workpiece to be installed elsewhere.

When a final position has been determined for a first workpiece, the working environment should be updated by including in it the workpiece at its final position. Thus, when a second workpiece is selected by the user, the search for matching features of the second workpiece and of the working environment can automatically disregard the feature occupied by the first workpiece, and calculation of a path by which the robot can move the second workpiece from its initial to its intermediate position can take account of a contour of the working environment modified by addition of the first workpiece.
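The bookkeeping that excludes occupied features from later searches might be sketched as a simple filter (all names are illustrative):

```python
def available_features(environment_features, installed):
    """Exclude features already occupied by installed workpieces from
    subsequent matching searches. `installed` holds (workpiece, feature)
    pairs, recorded as each final position is fixed."""
    occupied = {feature for _piece, feature in installed}
    return [f for f in environment_features if f not in occupied]
```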

The 3D representation of the workpiece used for synthesizing the view and for finding matching features is preferably derived in a preparatory step from CAD data of the workpiece.

Finding features of the workpiece that might match features of the working environment can be facilitated if such features are labeled in the CAD data. Such a label may explicitly characterize the feature by the way in which it is supposed to connect to a matching feature of the working environment, or by a reference to a skill by which it is to be connected to its counterpart feature, i.e. by defining the feature to be e.g. a male or female thread, a welding surface, a plug, a socket or the like. Alternatively, the label may simply specify that the feature is expected to connect to some matching feature of the working environment, leaving to the computer system, or to a user seeing the feature displayed in the view of the working environment, the task of identifying the matching feature and a suitable skill, e.g. based on geometrical characteristics of the feature.
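Such labels might be represented as a simple table from feature label to joining skill; the labels and skill names below are invented for illustration:

```python
# Hypothetical mapping from CAD feature labels to joining skills.
# A label absent from the table only says "connects to something",
# leaving the skill to be inferred from geometry or from user input.
FEATURE_SKILLS = {
    "male_thread":   "screwing",
    "female_thread": "screwing",
    "plug":          "pushing",
    "socket":        "pushing",
    "weld_surface":  "welding",
}

def skill_for(feature_label):
    """Return the joining skill for a labeled feature, or None if the
    skill must be determined some other way."""
    return FEATURE_SKILLS.get(feature_label)
```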

If the CAD data comprise a 3D representation of the product to be assembled from the workpieces, the orientation of the workpieces in the product can be extracted from the CAD data. In that case the user's task can be simplified by displaying to him, in the view of the working environment, all workpieces in the orientation they are going to have in the assembled product.

Obviously, there can be as many different types of matching features as there are skills for joining workpieces, and in principle, the present invention is applicable to any of these. As an illustration, the matching features can be a projection and a recess that are engageable in a given direction; in that case, the associated skill would be pushing the workpiece in the given direction, and optionally, a projection and a recess can be regarded as matching if they have identical cross sections. Alternatively, the matching features can be male and female threads, in which case the associated skill is screwing; or plane surfaces, in which case the associated skill can be gluing, welding or the like.
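A hedged sketch of such a compatibility check, with the three feature pairs above encoded directly; the tuple encoding and all names are invented for illustration:

```python
def features_match(a, b):
    """Check whether two features can cooperate, per the pairs listed
    above. Features are (kind, cross_section) tuples."""
    kinds = {a[0], b[0]}
    if kinds == {"projection", "recess"}:
        return a[1] == b[1]   # engageable; identical cross sections assumed required
    if kinds == {"male_thread", "female_thread"}:
        return a[1] == b[1]   # same nominal thread size
    if kinds == {"plane_surface"}:
        return True           # plane on plane: gluing, welding or the like
    return False
```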

For a user who sees the workpiece in the synthesized view, a skill by which the workpiece is to be installed in the working environment is often immediately apparent. E.g. when the workpiece is a screw, it is obvious for a human user that it has to be screwed, and the only problem may be, in a complicated environment, to find the correct hole for the screw. Therefore, if the user specifies to the computer system the skill by which the workpiece is to be installed, this greatly reduces the system's choice of candidates for matching features, so that matching pairs of features of the workpiece and the working environment can be found much more quickly.
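One way to picture the reduction: if the user names the skill, the search space shrinks to the features joinable by that skill. A hypothetical sketch (the dictionary encoding of a feature is an assumption made for brevity):

```python
def candidate_features(environment_features, user_skill=None):
    """Restrict the matching search to features joinable by the skill the
    user specified; with no skill given, all features remain candidates."""
    if user_skill is None:
        return list(environment_features)
    return [f for f in environment_features if f["skill"] == user_skill]
```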

The invention can also be embodied in a computer system comprising a computer, a display and a coordinate input means, wherein the computer is programmed to carry out the method described above based on user input provided via the coordinate input means, or in a computer program which, when carried out by a computer system, causes the computer system to carry out the method.

Further features and advantages will become apparent from the subsequent description of embodiments thereof referring to the appended drawings.

Fig. 1 is a block diagram of a computer system;

Fig. 2-5 are views of a working environment generated by the computer system in the process of carrying out the method of the invention.

The computer system of the present invention comprises a general purpose computer 1 having a CPU 2, program and data storage 3, 4, a display 5 and a coordinate input device 6. Program storage 3 holds a program whose instructions enable the computer to carry out the method described below. Data storage 4 holds 3D representations, typically CAD data, of an initial working environment, of a product to be assembled and of the workpieces to be assembled into the product. These representations comprise all data that are needed for generating a realistic or at least unambiguously recognizable image of each workpiece on display 5. They further comprise detailed information on features of the workpieces by which these are to be connected to the environment or to each other, by which the computer can judge whether two such features can be connected to each other or not. A robot for which the system is to generate a program that will enable the robot to assemble the physical workpieces doesn't have to be part of the system.

In an elementary case, the initial working environment is a solid surface 7 such as a tabletop, and in a first step of the method, a first workpiece 8 is virtually fixed on said surface by the computer 1, whereby a secondary working environment is obtained. The computer 1 synthesizes a view of this secondary working environment and of some workpieces 9-14 that are not yet installed, as shown in Fig. 2, and shows it on display 5.

In this view, some of the virtual workpieces 9-14 are shown in an orientation in which their physical counterparts wouldn't be stable on the surface of the working environment. The reason is that the computer 1 derives from the 3D representation of the product to be assembled the orientation the workpieces 9-14 are going to have in this product, and displays them in this orientation. In this way, the way in which the workpieces might be installed is easier for a user to recognize from the view. For a human user, it is readily apparent that the workpieces 9-14 are of different types and will have to be joined to the workpiece 8 by different skills. In the present example, workpiece 8 has matching features for each one of workpieces 9-14; in a more complex scenario, there might be uninstalled workpieces for which there is no matching feature in the working environment yet, the feature being formed only in the process of installing other workpieces; in that case there will be workpieces in the view which cannot yet be installed, and the user has to select a workpiece which can.

For the assembly of a product, several workpieces of a same type, e.g. screws, may be required. In that case, several positions will be available in the working environment where a screw can be installed, but the computer 1 as a rule has no criteria by which to decide where a particular screw should go. This decision should be made by the user and input into the computer system as will be described below.

Workpiece 9 is a screw. The computer 1 can be made aware of the fact if, in the 3D representation mentioned above, the workpiece is explicitly labeled as a screw. Alternatively, the computer might be programmed to identify workpiece 9 as a screw based on its geometrical characteristics. Further alternatively, the information that workpiece 9 is a screw may be input by the user, for example when selecting it or in a preliminary step in which all workpieces 9-14 are successively characterized.

The user selects workpiece 9 in the usual way by placing a cursor 15 on it in the view on display 5, using coordinate input device 6, and pressing a key. When the workpiece 9 is selected, the image of the workpiece 9 will move as if attached to the cursor 15 when the user moves the cursor 15 further.

The coordinate input device 6 might be a 3D input device, colloquially referred to as a "space mouse", by which not only a coordinate triplet but also orientation angles of the workpiece in a coordinate system of the working environment can be specified. Preferably, simpler and cheaper input devices are used. For example, in the present case, means for specifying orientation angles can be dispensed with, either because the orientation of the workpieces displayed in the view doesn't have to be changed, or because, if a rotation should become necessary, the computer determines the rotation without requiring input from the user. Further, inputting merely two space coordinates can be sufficient, since the computer 1 can choose the third coordinate so that the workpiece is located immediately adjacent to a surface of the working environment that is shown in the view.
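The choice of the third coordinate can be pictured as a snap onto the topmost surface under the cursor. A simplified sketch, with the environment reduced to axis-aligned boxes purely for brevity:

```python
def snap_to_surface(x, y, surfaces):
    """Given 2D cursor coordinates, choose z so the workpiece rests
    immediately on the topmost environment surface under the cursor.
    `surfaces` is a list of (x_min, x_max, y_min, y_max, z_top) boxes,
    a stand-in for the real 3D environment model."""
    z_candidates = [z for (x0, x1, y0, y1, z) in surfaces
                    if x0 <= x <= x1 and y0 <= y <= y1]
    return (x, y, max(z_candidates, default=0.0))
```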

Suppose the user drags the screw towards a hole 16 of workpiece 8 (Fig. 3) using coordinate input device 6. Based on the 3D representation, the computer 1 checks whether the screw would fit in hole 16. In the affirmative, the user is made aware of the fact by e.g. the image of the screw flashing, changing its colour, or the like. If the user is aware that the screw 9 isn't supposed to go into hole 16, he will drag the screw further, and the image of the screw changes back to normal.

When the screw 9 is moved to the vicinity of hole 17, the system again detects that the screw might fit, and makes the user aware thereof. The user confirms that the screw 9 is to go into hole 17, e.g. by releasing or by pressing once more the key used earlier for selecting the workpiece.

Insertion of the physical screw 9 in hole 17 would require a screwing action by the robot. Based on the coordinates of the hole 17, the computer 1 calculates an intermediate position 9' (Fig. 4) from which the screw can be inserted in the hole 17, i.e. a position close to the surface of workpiece 8 in which the axes of the screw 9 and of the hole 17 are aligned. Then, it calculates a routine by which the robot can first move the physical screw from its initial position to said intermediate position adjacent the workpiece 8, and from there screw it in, and appends it to the working program for the robot.
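Assuming the hole's position and axis are known from the 3D data, the intermediate position can be sketched as a point backed off from the hole mouth along its axis. The clearance value and all names are illustrative; real pose math would also fix the tool orientation:

```python
import math

def intermediate_position(hole_position, hole_axis, clearance=0.01):
    """Place the workpiece on the hole axis, `clearance` units outside the
    hole, so that applying the skill (screwing, pushing) along the axis
    brings it to the final position."""
    norm = math.sqrt(sum(c * c for c in hole_axis))
    unit = tuple(c / norm for c in hole_axis)
    return tuple(p + clearance * u for p, u in zip(hole_position, unit))
```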

The position in the vicinity of hole 17 where the user has dragged the image of the screw and where the computer 1 detects that the screw might fit in hole 17 will generally not be identical to the above-mentioned intermediate position. Therefore, for making the user aware of a possible fit, the computer 1 can, in addition or as an alternative to the methods mentioned above, abruptly move the image of the screw (or any other workpiece which happens to be selected) from the position set by the user to the intermediate position 9'. Since the screw is thus moved with respect to the cursor 15 - in Fig. 4 it is actually detached from the cursor 15 - the user cannot fail to notice the displacement, even if small.
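This snap behaviour might be sketched as follows; the snap radius and all names are invented for illustration:

```python
def maybe_snap(cursor_pos, intermediate_pos, fit_detected, snap_radius=0.05):
    """If a possible fit is detected while the dragged image is near the
    intermediate position, jump the image there so the user notices the
    match even when the displacement is small."""
    if not fit_detected:
        return cursor_pos
    dist = sum((a - b) ** 2 for a, b in zip(cursor_pos, intermediate_pos)) ** 0.5
    return intermediate_pos if dist <= snap_radius else cursor_pos
```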

When the virtual screw 9 has been inserted in hole 17, thus reaching its final position 9" shown in Fig. 5, the hole 17 is no longer available for inserting a workpiece therein, and the presence of the head of the screw outside the hole 17 may have an influence on how other workpieces can be approached to the workpiece 8 and connected to others of its features. Therefore, a new secondary working environment is calculated which comprises not only workpiece 8, but also screw 9, and which will be used for processing the next workpiece selected by the user.

Workpiece 10 is a rectangular plug. Once this fact is recognized by the system, based on the stored 3D representation or from input by the user, the computer 1 begins to search the working environment for an appropriate socket. The process may be speeded up by the user selecting workpiece 10 and dragging it towards socket 18, thereby indicating to the computer 1 a region of the working environment where the final position of workpiece 10 might be found, and where a search for this final position should best begin. When the match between workpiece 10 and its associated feature, such as socket 18, in the working environment is recognized, the computer 1 autonomously calculates an intermediate position adjacent to the socket 18 in which longitudinal axes of the plug and the socket 18 are aligned, so that from the intermediate position the physical plug can be pressed into its final position in the socket 18 of physical workpiece 8 by a linear displacement of the robot, and the computer 1 places the image of workpiece 10 in said intermediate position in the view shown on display 5, so as to make the user aware of the match.

Based on the 3D representation, the computer 1 may be able to identify the position where a workpiece has to be installed in a very short time, or may even have identified it before the user has selected the workpiece. This is possible in particular if a workpiece, such as the plug, occurs just once in the product to be assembled. In such a case, the computer will move the image of the workpiece to its intermediate or final location in the very moment the workpiece is selected by the user.

Workpiece 11 is a clip. A human user will readily recognize that, of all features of workpiece 8, the clip can only go into hole 19. A computer will a priori not do so, for if only geometrical features are compared, it will regard the barbs 20 of the clip 11 as not fitting into hole 19. Here, explicitly labeling the workpiece 11 as an elastic clip, be it by a label included in the 3D representation or by user input, enables the system to disregard the barbs 20, to realize that a stem 21 of the clip would indeed fit the cross section of the hole 19, and to make the user aware of the fact in any of the ways described above. Based on this information, the system is further able to program the robot so that when the clip is moved from an intermediate position in front of hole 19 to its final position inside the hole, enough pressure is applied to deflect the barbs 20 so that they will enter the hole 19.
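The special treatment of the elastic clip amounts to an exception in the geometric fit test: the part is judged by its stem diameter, its barbs being deflectable. A hypothetical sketch, with workpieces and holes encoded as dictionaries for brevity:

```python
def fits_hole(workpiece, hole):
    """Geometric fit test with an exception for elastic parts: a part
    labeled 'elastic_clip' is judged by its stem diameter alone, since the
    barbs can be deflected by pressure during insertion."""
    if workpiece.get("label") == "elastic_clip":
        return workpiece["stem_diameter"] <= hole["diameter"]
    return workpiece["max_diameter"] <= hole["diameter"]
```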

Workpiece 13 is a cylindrical rod. Its selection by the user, dragging it to, and finally inserting it in hole 16 can be carried out according to the principles described above. However, the system cannot judge a priori from the geometrical characteristics of the workpiece 13 whether it is to be immobile after installation, or whether it is to be rotatably mounted. Again, such information has to be provided in the 3D representation of either workpiece 13 or workpiece 8, or to be input by the user. Depending on this information, computer 1 determines whether the robot program for mounting the rod includes a skill of e.g. soldering, ultrasonic or friction welding or the like in addition to that of pushing the rod into the hole 16.
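The resulting choice can be pictured as assembling a skill list from the insertion skill plus an optional fixation skill; the names are illustrative, and the fixation flag stands in for the information taken from the 3D representation or from user input:

```python
def mounting_skills(insertion_skill, fixation=None):
    """Build the skill sequence for mounting a part: a fixation skill
    (soldering, welding, ...) is appended only if the part is to be
    immobile; a rotatable mount needs the insertion skill alone."""
    skills = [insertion_skill]
    if fixation is not None:
        skills.append(fixation)
    return skills
```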

Reference numerals

1 computer

2 CPU

3 program storage

4 data storage

5 display

6 coordinate input device

7 solid surface

8-14 workpiece

15 cursor

16 hole

17 hole

18 socket

19 hole

20 barb

21 stem