Title:
METHODS, SYSTEMS AND COMPUTER PROGRAM PRODUCTS FOR SHAPE RECOGNITION BASED PROGRAMMING OF SEWING ROBOTS
Document Type and Number:
WIPO Patent Application WO/2018/044176
Kind Code:
A1
Abstract:
Methods, apparatuses and computer program products for carrying out automated sewing operations based on shape recognition of a workpiece. Described are shape based generation of sewing instructions, shape based programming of an automated sewing station, and update of sewing instructions during the sewing process.

Inventors:
GJELSTENLI TOR RONNY (NO)
RIKSHEIM TERJE (NO)
BLAKSTAD SVEIN EVEN (NO)
REVNE KENNETH (NO)
DRANSFELD SEBASTIAN (NO)
WETTERWALLD LARS ERIK (NO)
Application Number:
PCT/NO2017/050214
Publication Date:
March 08, 2018
Filing Date:
August 31, 2017
Assignee:
AMATEC AS (NO)
International Classes:
G05B19/401; B25J9/16; D05B19/02; D05B25/00
Foreign References:
US4998489 A (1991-03-12)
Other References:
DENGSHENG ZHANG; GUOJUN LU: "Review of shape representation and description techniques", PATTERN RECOGNITION, vol. 37, 2004
Attorney, Agent or Firm:
ZACCO NORWAY AS (NO)
Claims:
CLAIMS

1. A method for creating a library of sewing instructions for an automated sewing station, said sewing instructions being specific to respective workpieces, the method comprising: obtaining an image of a workpiece; processing the obtained image to create a shape representation; generating a sewing path with a defined relationship with the shape representation; and storing the shape representation together with the sewing path in a database.

2. A method according to claim 1, wherein the shape representation includes a set of points positioned along the edge of the shape.

3. A method according to claim 1 or 2, wherein the sewing path is represented as a set of points positioned with a predetermined distance from the edge of the shape.

4. A method according to one of the previous claims, further comprising: displaying at least one of the obtained image and the shape representation on a display of a computer; and receiving a set of points representative of the sewing path as user input generated with a pointing device operated in conjunction with said display.

5. A method according to one of the previous claims, further comprising: generating and storing additional sewing parameters in association with the shape representation.

6. A method according to one of the previous claims, wherein said image of the workpiece is obtained while the workpiece is positioned on a high contrasting surface.

7. A method for identifying a workpiece and providing sewing instructions to an automated sewing station, comprising: obtaining an image of a workpiece; processing the obtained image to create a shape representation; performing a search in a database containing a library of sewing instructions associated with shape representations to find a closest matching shape representation; upon finding the shape representation in the library of sewing instructions that is a closest match to the shape representation created from the obtained image of the workpiece, retrieving sewing instructions at least including a sewing path with a defined relationship with the closest matching shape representation; and transferring said sewing instructions to a controller computer configured to control a sewing operation in the automated sewing station.

8. A method according to claim 7, wherein said shape representations include sets of points positioned along the edge of the shape.

9. A method according to one of the claims 7 and 8, wherein said closest matching shape representation is determined based on a metric that produces a numerical value representing a measurement of difference between two shape representations.

10. A method according to one of the claims 7 through 9, wherein said image of the workpiece is obtained while the workpiece is positioned on a high contrasting surface.

11. A method for performing a sewing operation in an automated sewing station, comprising: identifying a workpiece that is delivered to the sewing station based on shape recognition; transferring sewing instructions associated with the identified workpiece from a database of sewing instructions associated with shape representations to a controller computer configured to control a sewing operation in the automated sewing station; controlling a robot to transfer said workpiece to a sewing machine; controlling a robot and said sewing machine to perform a sewing operation based on said sewing instructions associated with the identified workpiece; and removing said workpiece from the sewing machine upon completion of said sewing operation.

12. A method according to claim 11, further comprising: utilizing a camera to continuously obtain images of the sewing operation, said images including a view of a needle of the sewing machine and an edge of the workpiece; using the obtained images to measure the distance between the needle of the sewing machine and the edge of the workpiece; and upon determining that said distance deviates from a desired value with more than a predetermined threshold, updating the instructions to at least one of said robot and said sewing machine.

13. A method according to claim 11, wherein said automated sewing station is one of a plurality of sewing stations in a production cell, wherein each automated sewing station includes at least one sewing machine and one sewing robot, the method further comprising: selecting one of said plurality of sewing stations; controlling a service robot to perform said transfer of said workpiece to the sewing machine that is part of the selected one of said plurality of sewing stations; and controlling the sewing robot and sewing machine that are included in said selected one of said plurality of sewing stations to perform said sewing operation.

14. A workstation for creating a library of sewing instructions for an automated sewing station, said sewing instructions being specific to respective workpieces, the workstation comprising: a camera (102) configured to capture images of workpieces (111); a computer (112) configured to receive images from said camera (102), generate shape representations and sewing paths (305) from images of workpieces (111) and store respective ones of said shape representations together with an associated sewing path (305) in a database.

15. A workstation according to claim 14, further comprising a display (116) and a pointing device (117); and wherein said computer (112) is further configured to display at least one of images received from said camera (102) and said generated shape representations, and receive user input from said pointing device (117) representing coordinates defining points that are to be included in said sewing paths (305).

16. A workstation according to claim 14 or 15, further comprising a contrasting surface (101); and wherein said camera (102) is directed towards said contrasting surface (101) and said workpieces (111) are placed on said contrasting surface (101) when said images are obtained by the camera (102).

17. A workstation according to claim 16, wherein said contrasting surface (101) is a light table.

18. A workstation according to one of the claims 14 to 17, wherein said shape representation includes a set of points positioned along the edge of the shape.

19. A workstation according to one of the claims 14 to 18, wherein said sewing path is represented as a set of points positioned with a predetermined distance from the edge of the shape.

20. An automated sewing station, comprising: a first camera (102) configured to capture images of workpieces (111); an image processing computer (112) configured to

- receive an image from said camera (102),

- generate a shape representation from said received image of a workpiece (111),

- perform a search in a database containing a library of sewing instructions associated with shape representations to find a closest matching shape representation, and

- upon finding the shape representation in the library of sewing instructions that is a closest match to the shape representation created from the obtained image of the workpiece, retrieve sewing instructions at least including a sewing path with a defined relationship with the closest matching shape representation; a controller computer (113) configured to receive said sewing instructions from said image processing computer (112); a sewing machine (103); and a robot (104); wherein said controller computer (113) is further configured to control a sewing operation in the automated sewing station by controlling said sewing machine (103) and said robot (104) based on said sewing instructions.

21. An automated sewing station according to claim 20, wherein said shape representation includes a set of points positioned along the edge of the shape.

22. An automated sewing station according to one of the claims 20 and 21, wherein said sewing path is represented as a set of points positioned with a predetermined distance from the edge of the shape.

23. An automated sewing station according to one of the claims 20 to 22, further comprising: a second camera (115) configured to continuously obtain images of the sewing operation including a needle of the sewing machine (103) and an edge of the workpiece (111); an edge tracking computer (114) configured to receive images from said second camera (115) and process said received images to determine a distance between the needle of the sewing machine (103) and the edge of the workpiece (111) and to provide the result of said determining to the controller computer (113); wherein said controller computer (113) is further configured to update the sewing instructions based on a predetermined rule and said determined distance.

24. An automated sewing station according to one of the claims 20 to 23, wherein said image processing computer (112) and said controller computer (113) are implemented as respective software modules executed by one or more processors that are part of the same computer system.

25. An automated sewing station according to one of the claims 20 to 24, wherein said image processing computer (112) and said controller computer (113) are implemented as respective computer systems.

26. An automated sewing station according to one of the claims 20 to 25, further comprising a contrasting surface (101); and wherein said camera (102) is directed towards said contrasting surface (101).

27. An automated sewing station according to one of the claims 20 to 26, wherein said contrasting surface (101) is a light table.

28. A production cell, comprising: a first camera (102) connected to an image processing computer (112) for identification of workpieces (111) delivered to said production cell; a master controller computer (113') configured to receive sewing instructions from a database based on an identification of a workpiece (111) by said image processing computer (112); a service robot (104'); and a plurality of automated sewing stations including:

- a sewing machine (103),

- a sewing robot (104), and

- a sewing station controller computer (113); wherein said master controller computer (113') is further configured to

- select one of said plurality of automated sewing stations,

- control said service robot (104') to transfer said workpiece (111) to the selected automated sewing station, and

- transfer said received sewing instructions to the sewing station controller computer (113) of the selected automated sewing station; and said sewing station controller computer (113) is configured to control the sewing operation at said selected automated sewing station.

29. A production cell according to claim 28, wherein said automated sewing stations further include a second camera (115) connected to an edge tracking computer (114) for determining a distance between a needle of the sewing machine (103) and an edge of the workpiece (111); and wherein said sewing station controller computer (113) is further configured to receive said determined distance from said edge tracking computer (114) and to update the sewing instructions based on a predetermined rule and said determined distance.

30. A production cell according to claim 28 or 29, wherein said image processing computer (112), master controller computer (113'), sewing station controller computers (113) and sewing station edge tracking computers (114) are implemented as respective software modules installed on one or more computer systems.

31. Computer program product comprising a computer readable storage medium carrying computer readable program code enabling one or more computers to operate in accordance with any one of the claims 1 to 13.

Description:
METHODS, SYSTEMS AND COMPUTER PROGRAM PRODUCTS FOR SHAPE RECOGNITION BASED PROGRAMMING OF SEWING ROBOTS

TECHNICAL FIELD

[0001] The present invention relates to flexible robotic sewing, and in particular to a method and a system for flexible generation and implementation of sewing paths.

BACKGROUND

[0002] The sewing industry has to a large extent been outsourced to low-cost countries over the last decades. This is the case for the clothing industry as well as for the sewing of upholstery for the furniture industry. While similar trends have been somewhat countered by the introduction of robotic process automation with respect to industrial processing of hard or firm materials like metals and wood, development of robotic processing of soft and pliable materials like textile, hides, foam rubber, etc. has to a large extent not been successful.

[0003] In order for automated sewing stations to be able to operate efficiently and with sufficient quality, and also with a flexibility that enables one workstation to handle different tasks, further development is necessary. If a robotic workstation can only handle one or a limited set of tasks, investment in such workstations can only be justified for large production volumes, and not for smaller series and production made to order. Challenges associated with the reconfiguration and reprogramming of robotic sewing stations for smaller production series must therefore be met. Furthermore, the fact that pieces of fabric and leather are soft and pliable means that their shape may change during a sewing operation, so a preprogrammed sewing path may have to be updated during the sewing operation.

[0004] It is therefore desirable to develop sewing stations that can easily be reconfigured to handle new tasks and that can do so cost efficiently even for relatively small production series.

SUMMARY OF THE DISCLOSURE

[0005] In order to meet some of these requirements and provide methods and equipment that can perform automated sewing operations in an efficient and economically viable manner, the present invention has been conceived and developed.

[0006] According to a first aspect of the invention a method is provided for creating a library of sewing instructions for an automated sewing station. The sewing instructions are specific to respective workpieces, meaning that they are determined by the shape of the workpiece, and that workpieces of different shape will be subject to different sewing processes. The method includes such steps as obtaining an image of a workpiece, processing the obtained image to create a shape representation, generating a sewing path with a defined relationship with the shape representation, and storing the shape representation together with the sewing path in a database. In that way sewing instructions are associated with shapes and they can be retrieved based on an identification of the shape with which they are associated.

[0007] In some embodiments the shape representation includes a set of points positioned along the edge of the shape. This set of points can be connected by line segments to create a polygon representation of the shape. However, other shape representations are known in the art and may be used in other embodiments of the invention.
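To make this concrete, a minimal Python sketch of such a polygon-based shape representation might look as follows. The class name and helper method are illustrative assumptions, not taken from the application:

```python
from dataclasses import dataclass
from math import hypot

@dataclass
class ShapeRepresentation:
    """A contour represented as vertices sampled along the workpiece edge."""
    points: list[tuple[float, float]]  # (x, y) in table coordinates, ordered along the edge

    def perimeter(self) -> float:
        """Sum the straight line segments connecting adjacent vertices (closed polygon)."""
        n = len(self.points)
        return sum(
            hypot(self.points[(i + 1) % n][0] - self.points[i][0],
                  self.points[(i + 1) % n][1] - self.points[i][1])
            for i in range(n)
        )
```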

[0008] The sewing path can be represented as a set of points positioned with a predetermined distance from the edge of the shape. This set of points may be used to generate a set of line segments or, for example, curve splines. The resulting curve can be used to control the sewing robots, as will be described in further detail below. However, in some embodiments of the invention the sewing curve may be defined implicitly, as a predefined distance from the edge of the workpiece.
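One simple way to derive such offset points from the polygon representation, assuming a counter-clockwise contour, is to move each edge point inward along an averaged normal. The sketch below is illustrative only; a production implementation would need to handle self-intersections at sharp corners:

```python
from math import hypot

def offset_path(points, distance):
    """Offset a closed polygon inward by `distance` using averaged edge normals."""
    n = len(points)
    path = []
    for i in range(n):
        # Direction of the edge through point i, averaged over its two neighbours.
        (x0, y0), (x1, y1) = points[i - 1], points[(i + 1) % n]
        dx, dy = x1 - x0, y1 - y0
        length = hypot(dx, dy) or 1.0
        # Left normal points inward for a counter-clockwise contour.
        nx, ny = -dy / length, dx / length
        x, y = points[i]
        path.append((x + nx * distance, y + ny * distance))
    return path
```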

[0009] Some embodiments of the invention may allow manual input from an operator to define the sewing path. In these embodiments, the method may include displaying at least one of the obtained image and the shape representation generated from the image on a display of a computer and receiving a set of points representative of the sewing path as user input generated with a pointing device operated in conjunction with the display.

[0010] The method may further include generation and storage of additional sewing parameters in association with the shape representation. Such parameters may include stitch length, gathering rate etc., and they may be received as user input, or generated based on default values and rules.

[0011] In order to efficiently obtain an image that can easily be analyzed for shape representation, the image of the workpiece is in some embodiments obtained while the workpiece is positioned on a high contrasting surface, for example a light table.

[0012] According to a second aspect of the invention a method is provided for identifying a workpiece and providing sewing instructions to an automated sewing station. Such a method may include obtaining an image of a workpiece and processing the obtained image to create a shape representation, much like the corresponding steps in the method according to the first aspect. A search may then be performed in a database containing a library of sewing instructions associated with shape representations in order to find a closest matching shape representation. The library of sewing instructions associated with shape representations may have been generated using a method corresponding to the method of the first aspect described above.

[0013] Upon finding the shape representation in the library of sewing instructions that is a closest match to the shape representation created from the obtained image of the workpiece, sewing instructions at least including a sewing path with a defined relationship with the closest matching shape representation can be retrieved and transferred to a controller computer configured to control a sewing operation in the automated sewing station.

[0014] A number of methods for performing shape based searches are known in the art, and the best method may depend on the method chosen for shape representation. In some embodiments, the closest matching shape representation is determined based on a metric that produces a numerical value representing a measurement of difference between two shape representations.
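One simple metric of this kind, given two contours resampled to the same number of corresponding points, is the sum of point-to-point distances after centering. The following is a hedged Python sketch using NumPy, with illustrative names; it is translation invariant, and rotation would need extra alignment (for example testing cyclic shifts of the point ordering):

```python
import numpy as np

def shape_distance(a: np.ndarray, b: np.ndarray) -> float:
    """Crude difference metric between two contours, each an (N, 2) array of
    points resampled to the same count and ordering. Both are translated to
    their centroid before comparison, making the metric translation invariant."""
    a_centered = a - a.mean(axis=0)
    b_centered = b - b.mean(axis=0)
    return float(np.linalg.norm(a_centered - b_centered, axis=1).sum())
```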

[0015] According to a third aspect of the invention, a method is provided for performing a sewing operation in an automated sewing station. Such a method may include identifying a workpiece that is delivered to the sewing station based on shape recognition, and transferring sewing instructions associated with the identified workpiece from a database of sewing instructions associated with shape representations to a controller computer configured to control a sewing operation in the automated sewing station. These initial steps of the method may correspond to the method of the second aspect of the invention described above. A robot may be controlled to transfer the received workpiece from the area where it was delivered to the sewing station (which may be the area where it was positioned for identification) to a sewing machine, and a robot and the sewing machine may then be controlled to perform a sewing operation based on said sewing instructions associated with the identified workpiece. Upon completion of the sewing operation, the workpiece may be removed from the sewing machine. Thus the sewing machine will be ready to receive a new workpiece.

[0016] In some embodiments of the invention, the method further includes utilization of a camera to continuously obtain images of the sewing operation including a needle of the sewing machine and an edge of the workpiece and use of the obtained images to measure the distance between the needle of the sewing machine and the edge of the workpiece. Upon determining that the distance deviates from a desired value by more than a predetermined threshold, the instructions to at least one of said robot and said sewing machine may be updated. The distance may be used directly in order to bring the seam gradually closer to the desired distance to the edge, or it may be used in an intermediate calculation of an angle between the sewing direction and the direction of the edge of the workpiece.
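In sketch form, such a threshold-gated correction could be as simple as the following Python function; the threshold and gain values are illustrative assumptions:

```python
def seam_correction(measured_mm: float, desired_mm: float,
                    threshold_mm: float = 0.5, gain: float = 0.3) -> float:
    """Return a lateral correction (mm) for the robot, or 0.0 if the seam is
    within tolerance. A gain below 1 brings the seam gradually back to the
    desired edge distance instead of overshooting."""
    error = measured_mm - desired_mm
    if abs(error) <= threshold_mm:
        return 0.0
    return -gain * error
```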

[0017] In some embodiments the automated sewing station is one of a plurality of automated sewing stations in a production cell, wherein each automated sewing station includes at least one sewing machine and one sewing robot. The method may then include selection of one of said plurality of sewing stations, controlling a service robot to perform the transfer of the workpiece to the sewing machine that is part of the selected one of the plurality of sewing stations, and controlling the sewing robot and sewing machine that are included in the selected sewing station to perform the sewing operation.

[0018] According to a fourth aspect of the invention, a workstation for creating a library of sewing instructions for an automated sewing station is provided. This workstation may be used to implement or perform a method corresponding to the method according to the first aspect of the invention described above, and again the sewing instructions are specific to respective workpieces. Such a workstation may include a camera configured to capture images of workpieces, a computer configured to receive images from the camera, generate shape representations and sewing paths from images of workpieces and store respective ones of said shape representations together with an associated sewing path in a database.

[0019] The shape representation may include a set of points positioned along the edge of the shape, and the sewing path may be represented as a set of points positioned with a predetermined distance from the edge of the shape.

[0020] A workstation according to this aspect of the invention may in some embodiments include a display and a pointing device; the computer may then be further configured to display at least one of images received from said camera and the generated shape representations, and receive user input from the pointing device representing coordinates defining points that are to be included in the sewing paths.

[0021] In some embodiments the workstation may include a contrasting surface, and the camera may then be directed towards the contrasting surface. The workpieces may then be placed on the contrasting surface when the images are obtained by the camera.

[0022] According to a fifth aspect of the invention, an automated sewing station is provided. The automated sewing station may be used to perform methods corresponding to the methods of the second and/or the third aspect of the invention described above. In a first embodiment such an automated sewing station may comprise a first camera configured to capture images of workpieces, an image processing computer configured to receive an image from said camera, generate a shape representation from said received image of a workpiece, perform a search in a database containing a library of sewing instructions associated with shape representations to find a closest matching shape representation, and upon finding the shape representation in the library of sewing instructions that is a closest match to the shape representation created from the obtained image of the workpiece, retrieve sewing instructions at least including a sewing path with a defined relationship with the closest matching shape representation. A controller computer is configured to receive the sewing instructions from the image processing computer, and the sewing station also includes a sewing machine and a robot. The controller computer is configured to control a sewing operation in the automated sewing station by controlling the sewing machine and the robot based on the sewing instructions.

[0023] In some embodiments of the invention an automated sewing station according to the fifth aspect may further include a second camera configured to continuously obtain images of the sewing operation including the needle of the sewing machine and the edge of the workpiece. An edge tracking computer is configured to receive images from the second camera and process the received images to determine a distance between the needle of the sewing machine and the edge of the workpiece and to provide the result of this determination to the controller computer. The controller computer is configured to update the sewing instructions based on a predetermined rule and the determined distance.

[0024] In some embodiments the image processing computer and the controller computer are implemented as respective software modules executed by one or more processors that are part of the same computer system. However, the image processing computer and the controller computer may equally well be implemented as respective computer systems.

[0025] Like the workstation according to the fourth aspect of the invention described above, the automated sewing station may include a contrasting surface, and the camera may then be directed towards the contrasting surface. The workpieces may then be placed on the contrasting surface when the images are obtained by the camera. The contrasting surface, which may, for example, be a light table, may also serve as the delivery area for workpieces that are delivered to the workstation.

[0026] According to a sixth aspect of the invention a production cell is provided. The production cell substantially corresponds to an automated sewing station according to the fifth aspect, but in the production cell several automated sewing stations share certain common components, primarily the components responsible for identifying workpieces, providing the controller computers with corresponding sewing instructions and distributing work between the several automated sewing stations.

[0027] A production cell may include a first camera connected to an image processing computer for identification of workpieces delivered to the production cell, a master controller computer configured to receive sewing instructions from a database based on an identification of a workpiece by the image processing computer, a service robot and a plurality of automated sewing stations. Each sewing station may include a sewing machine, a sewing robot, and a sewing station controller computer. The master controller computer is configured to select one of the plurality of automated sewing stations, control the service robot to transfer the workpiece to the selected automated sewing station, and also transfer the received sewing instructions to the sewing station controller computer of the selected automated sewing station. The sewing station controller computer is configured to control the sewing operation at said selected automated sewing station.

[0028] In some embodiments of a production cell according to the sixth aspect of the invention the automated sewing stations further include a second camera connected to an edge tracking computer for determining a distance between the needle of the sewing machine and an edge of the workpiece. The sewing station controller computer may then be further configured to receive the determined distance from the edge tracking computer and to update the sewing instructions based on a predetermined rule and said determined distance.

[0029] In a production cell according to this aspect of the invention, all computers may be implemented as separate systems. However, the image processing computer, master controller computer, sewing station controller computers and sewing station edge tracking computers may equally well be implemented as respective software modules installed on one or more computer systems.

BRIEF DESCRIPTION OF THE DRAWINGS

[0030] FIG. 1 is an illustration of a sewing station according to the invention;

[0031] FIG. 2 is a flow chart illustrating a process of generating shape representations and sewing instructions and storing them in a shape library or database;

[0032] FIG. 3 shows an exemplary user interface for generating a sewing path from a shape image or shape representation shown on a display;

[0033] FIG. 4 is a flow chart illustrating a process of identifying a workpiece by shape and programming an automated sewing station with corresponding sewing instructions;

[0034] FIG. 5A and 5B are flow charts illustrating a process of performing a sewing operation on a workpiece; and

[0035] FIG. 6 is an illustration of a production cell with a plurality of sewing stations.

DETAILED DESCRIPTION

[0036] In the following description various examples and embodiments of the invention are set forth in order to provide the skilled person with a more thorough understanding of the invention. The specific details described in the context of the various embodiments and with reference to the attached drawings are not intended to be construed as limitations. Rather, the scope of the invention is defined in the appended claims.

[0037] In the exemplary embodiments, various features and details are shown in combination. The fact that several features are described with respect to a particular example should not be construed as implying that those features by necessity have to be included together in all embodiments of the invention. Conversely, features that are described with reference to different embodiments should not be construed as mutually exclusive. As those with skill in the art will readily understand, embodiments that incorporate any subset of features described herein and that are not expressly interdependent have been contemplated by the inventor and are part of the intended disclosure. Explicit description of all such embodiments would, however, not contribute to the understanding of the principles of the invention, and consequently some permutations of features have been omitted for the sake of simplicity.

[0038] The drawings illustrate mechanical devices such as robots and sewing machines. These illustrations are not necessarily to scale, and do not attempt to show the exact mechanical configuration of the various components or how they should be dimensioned, supported or attached. Instead, they primarily illustrate the functional relationship between the components and are intended to facilitate understanding of this functionality.

[0039] Reference is first made to FIG. 1, which shows an overview of a sewing station including a light table 101, a registration camera 102, a sewing machine 103, a sewing robot 104, and a work surface table 105. The sewing robot 104 has a manipulation arm 106 with a workpiece gripper 107. The arm 106 is capable of being controllably moved up and down in the Z direction. In addition it can be rotated about the vertical axis, the Z-axis, by way of a motor 108, and moved linearly in the horizontal X and Y directions by way of rails 109, 110. The workpiece gripper 107 may simply be a piston with a lower surface that is pressed down on the workpiece 111 such that the workpiece 111 will follow the workpiece gripper's movements due to higher friction between the workpiece 111 and the workpiece gripper 107 than between the workpiece 111 and the surface upon which it is placed. Other alternatives may introduce claws or pincers or vacuum, enabling additional manipulation including lifting of the workpiece 111 from the surface.

[0040] While this example includes only one sewing machine 103 and one sewing robot 104, other embodiments of the invention may include multiple sewing machines and multiple robots in various configurations, as will be explained in further detail below.

[0041] When a new workpiece 111 with a particular geometric shape is positioned on the light table 101 it is viewed from above by the shape registration camera 102 and an image of the shape against the contrasting background is registered. The light table 101 provides a bright background, ensuring that there will be a high contrast between the image of the workpiece 111 and the background. As part of the preprocessing, or preparation, phase, the shape of the workpiece 111 is registered and stored in an image processing computer 112. This image can later be used to recognize workpieces 111 that have been placed on the light table, and also to provide the path the robot 104 should follow when sewing, as will be described in further detail below.

[0042] The sewing path generated from the image of the workpiece 111 is transferred to controller computer 113. The controller computer 113 controls the movement of the robot 104 and it also controls the sewing machine 103. In order to enable the sewing robot 104 to adapt to changes in the geometry of the workpiece 111 during the sewing operation, for example caused by gathering seams or by pressure from the presser foot of the sewing machine, an edge tracking computer 114 is connected to an edge detection camera 115. The edge detection camera 115 continuously sends an image of the needle of the sewing machine 103 and the workpiece to the edge tracking computer 114. The edge tracking computer 114 measures the angle and distance between the edge of the workpiece 111 and the needle of the sewing machine 103 and provides the results to the controller computer 113. The controller computer 113 may then adjust the sewing path in order to compensate for any deviation from the predetermined distance from the edge of the workpiece 111 to the seam caused by deformation of the workpiece 111 or by any other reason.

[0043] Some embodiments of the invention may include a display 116 and a pointing device 117 enabling manual user input for generation or adjustment of the sewing path.

[0044] The directions of movement for the robot 104 will be described based on a system of coordinates where the X-axis and Y-axis define the horizontal plane, the Z-axis is the vertical direction, and rotation will be referred to as the A-axis. The rotational axis may, but does not have to, be exactly in the Z direction, and in some embodiments additional degrees of freedom may be introduced, as those with skill in the art will readily understand.

[0045] The robot 104 manipulates the workpiece 111 by moving the manipulator arm 106 in the X and Y direction along the rails 109, 110 until it is immediately above the workpiece 111. The arm 106 is then extended downward until the workpiece 111 is held between the surface of the light table 101 and the workpiece gripper 107 at the end of the manipulator arm 106. The surfaces of the light table 101 and the work surface table 105 have a sufficiently low friction to allow the workpiece to follow the movement of the workpiece gripper 107 when the manipulator arm is moved in its extended position. In this manner the workpiece can be moved to any position on the work surface that is reachable by the manipulator arm 106. The A-axis motor 108 makes it possible to rotate the workpiece 111 around a vertical axis.

[0046] The first phase of operations, before a new workpiece 111 can be subjected to a sewing process, is to create a representation of its geometrical shape in a library of registered shapes in the image processing computer 112. The images stored in the image processing computer 112 may serve two purposes. They are used to identify workpieces during production, and they can be used to generate the sewing path associated with that type of workpiece.

[0047] A number of features of the robot 104, the rails 109, 110, and the manipulator arm 106 will not be described in detail since they are well known and understood by those with skill in the art, who may choose from a number of options, including electrical motors, servos, hydraulics and pneumatics. In the embodiment illustrated in FIG. 1 rail 109 may be moved in the Y-direction for example on an additional rail inside the robot 104 and pulled by belts or by rack and pinion. Similarly, rail 110 may be pulled in the X-direction by a belt or by rack and pinion inside rail 109. The manipulator arm 106 may be moved up and down telescopically, for example using hydraulics or pneumatics, or by rack and pinion.

[0048] Reference is now made to FIG. 2, which is a flow chart showing an example of how a process of registering a new workpiece in the library of workpieces can be performed in some embodiments of the invention. It should be noted that while this example will assume that the registration of a new workpiece is performed using the light table 101, camera 102 and image processing computer 112 of the production cell, this does not have to be the case. Registration of components as well as generation of sewing paths may equally well be performed on a workstation particularly adapted for this purpose. Such a special purpose workstation may, but does not have to, include all of the production equipment such as a sewing machine 103, a sewing robot 104, and a controller computer 113. Whether sewing paths and the library of geometric shapes are generated on special purpose equipment or on equipment that is part of a production cell, they may be transferred to other production cells to be used there. Consequently, it is not necessary to generate shapes and paths on each production cell.

[0049] In a first step 201 the workpiece is placed on a high contrasting surface such as light table 101. In a following step 202 a camera 102, which is capable of obtaining undistorted images of the shape of the workpiece, for instance by being positioned directly above the contrasting surface, obtains one or more images of the workpiece. The image is then forwarded to an image processing computer 112 in step 203.

[0050] In step 204 the image is processed in the image processing computer 112 and a representation of the shape of the workpiece is generated. A number of methods for representation of shapes are known in the art. Generally they can be classified as contour-based and region-based, and as global or structural. In contour-based methods shape features are extracted only from the contour and not from the rest of the shape region. In structural methods the shape is represented not as a whole, but as segments or sections referred to as primitives. In some embodiments of the invention the contour-based, structural method of representing shape segments as polygons is used, but other methods are consistent with the principles of the invention. An overview of various methods is presented in "Review of shape representation and description techniques" by Dengsheng Zhang and Guojun Lu, published in Pattern Recognition 37, 2004, and hereby incorporated by reference. In the exemplary embodiments discussed below the shape is represented as the sum of edges connecting vertices (or points, or nodes). The vertices are represented as the coordinates of respective points along the edge of a shape and the edges are represented as straight lines between two adjacent points.
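As an illustration of how such a polygonal contour might be extracted from a camera image, consider the following Python sketch. It is a minimal example assuming OpenCV 4 and a dark workpiece on a bright light table; the function name and threshold values are illustrative, not taken from the application:

```python
import cv2

def extract_shape(image_path: str, epsilon_frac: float = 0.002):
    """Extract a polygonal contour of a dark workpiece on a bright background."""
    gray = cv2.imread(image_path, cv2.IMREAD_GRAYSCALE)
    # The light table gives a bright background, so a fixed inverse threshold works.
    _, mask = cv2.threshold(gray, 128, 255, cv2.THRESH_BINARY_INV)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    workpiece = max(contours, key=cv2.contourArea)  # largest blob = workpiece
    # Approximate the contour with a polygon; epsilon scales with the perimeter.
    epsilon = epsilon_frac * cv2.arcLength(workpiece, True)
    polygon = cv2.approxPolyDP(workpiece, epsilon, True)
    return polygon.reshape(-1, 2)  # (N, 2) array of vertex coordinates
```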

[0051] An alternative method to the process described in steps 201 through 204 is to receive the same input data as that which is used to control a cutting machine used to cut the workpiece from a larger piece of e.g. cloth or hide.

[0052] The generated shape representation is stored in a library of workpiece shapes in step 204. This library is searchable and can be used to identify workpieces during production, as will be described in further detail below.

[0053] In a next step 205 a sewing path is generated. Various methods can be used for this step. In some embodiments the path is generated manually from the image of the workpiece obtained in step 202. The image, or a processed version of the image, for example one with enhanced contrast, or even a synthetic image based on the shape representation generated in step 204, is presented on a display of a monitor which is part of the image processing computer 112, and an operator uses a pointing device such as a computer mouse to mark control points along the sewing path. The path can then be generated as line or curve segments between each control point, as illustrated in the sketch below. In most cases it will be sufficient to generate the line segments as straight lines, but in some embodiments they may be generated as curves, for example as polygons or B-splines.

[0054] In alternative embodiments the sewing path is generated automatically from the shape representation generated in step 204. The sewing path may then be generated as a path parallel to and a predetermined distance from the edge of the shape. It is, of course, also possible to use one shape representation method or algorithm for generation of the shape representation and a different method or algorithm for generating the sewing path.

[0055] Some embodiments may define the sewing path implicitly, for example by specifying a starting point, an ending point and a distance from the edge of the workpiece 111. When the present description refers to the sewing path and does not explicitly describe how the path is defined, all alternatives described above as well as substantially equivalent variations are intended to be included. As will be described in further detail below, the sewing path may also be defined both by an explicit path definition and a rule, for example distance to the edge of the workpiece, and the rule may then be used to handle deformation of the workpiece during sewing, something that will require deviation from the predefined sewing path.
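As a hypothetical illustration of turning operator-marked control points into straight segments the controller can follow, the sketch below densifies a list of control points into a polyline with a fixed step; the function name and the 2 mm step are assumptions:

```python
from math import hypot

def densify_path(control_points, step_mm: float = 2.0):
    """Expand operator-marked control points into a dense polyline,
    inserting intermediate points every `step_mm` along each straight segment."""
    dense = []
    for (x0, y0), (x1, y1) in zip(control_points, control_points[1:]):
        length = hypot(x1 - x0, y1 - y0)
        steps = max(1, int(length // step_mm))
        dense.extend(
            (x0 + (x1 - x0) * t / steps, y0 + (y1 - y0) * t / steps)
            for t in range(steps)
        )
    dense.append(control_points[-1])
    return dense
```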

[0056] In a final step 206 the sewing path is stored in the shape library in a manner that associates it with the shape representation. The exact structure and organization of the database that constitutes the shape library is not essential. The important point is that it should be possible to obtain a corresponding sewing path description based on an identification of a shape representation of a workpiece.
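For concreteness, one possible (purely illustrative) organization of such a library is a single relational table in which the shape representation is the search key and the sewing path and parameters are the payload; the schema and function below are assumptions, not part of the application:

```python
import json
import sqlite3

def store_entry(db: sqlite3.Connection, shape_points, sewing_path, parameters):
    """Store one library entry: shape representation as key, path and
    sewing parameters as the payload retrieved after a successful match."""
    db.execute(
        "CREATE TABLE IF NOT EXISTS shape_library "
        "(id INTEGER PRIMARY KEY, shape TEXT, path TEXT, params TEXT)"
    )
    db.execute(
        "INSERT INTO shape_library (shape, path, params) VALUES (?, ?, ?)",
        (json.dumps(shape_points), json.dumps(sewing_path), json.dumps(parameters)),
    )
    db.commit()
```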

[0057] Reference is now made to FIG. 3, which shows an exemplary user interface for generating the sewing path from a shape image or shape representation shown on a display. The example is somewhat simplified in that it does not show tools, controls or other user interface elements that are unnecessary for this description.

[0058] The user interface is shown as a viewport 301 that may be anything from the entirety of a display screen to the inside of one of a plurality of windows. Such a window will typically include a frame with borders and widgets giving access to various menus, tools and other functions, but the chrome part of the window is not included in the drawing.

[0059] The user interface includes a ruler display element 302 which helps the operator determine distances in the displayed image. The ruler scales according to zoom level and is therefore capable of showing correct distances independently of whether the user zooms into or out of the image.

[0060] A mouse pointer element 303 may be displayed as an arrow, which is a well-known convention in the art. The display further shows a representation of the shape of a workpiece 311. As already mentioned this representation may be the image captured by camera 102, it may be a processed version of that image, for example with enhanced contrast, or it may be based on the shape representation generated and stored in the workpiece library.

[0061] By moving the mouse pointer 303, positioning it a predetermined distance from the edge of the workpiece 311, and providing user input for example in the form of a mouse click, the operator can create a mark or point 304. In some embodiments of the invention the mouse pointer changes appearance, for example to a cross cursor (also known as precision cursor or crosshair cursor). In the example illustrated in FIG. 3 straight lines are generated to connect adjacent points 304, and collectively these straight lines represent the sewing path 305. As mentioned above, curves other than straight lines may also be used in some embodiments of the invention. It should be noted that the points 304 can be positioned relatively far apart along sections of the workpiece edge that are relatively straight, while along sections with a sharper curvature the points 304 must be much closer to each other.

[0062] The resulting sewing path curve 305 may now be stored in association with its corresponding workpiece shape as described above. The sewing path may be stored simply as a set of coordinates representing the points 304, since the rules for generating the lines or curve segments between the points are known and can be repeated. However, it may also be necessary to associate the sewing path with one or more points of reference on the workpiece.

[0063] In some embodiments the curve representing the sewing path alone may represent sufficient input to the controller computer 113. The controller computer 113 may be programmed to map the sewing path curve 305 to the workpiece shape in accordance with general rules such as a predetermined distance from the edge of the workpiece and shape comparisons. However, it may be more efficient to include additional data with the description of the path, for example a common reference point (e.g. the center-of-mass or a specific corner of the workpiece) and defined directional axes, as well as a starting point for the sewing process. Other production related data may also be included. Examples include sewing parameters such as stitch length, gathering rate etc. All this information is associated with the corresponding shape such that it is possible to retrieve it based on an identification of the shape.

[0064] Reference is now made to FIG. 4, which is a flowchart illustration of a process of receiving a workpiece for sewing, identifying the workpiece based on the information in the shape library, loading the sewing path and associated parameters and performing the sewing operation.

[0065] In a first step 401 a workpiece is placed on a high contrasting surface such as the light table 101. This step is similar to the first step of the registration process, but now the purpose is not to register the shape of the workpiece, but to identify the workpiece based on its shape.

[0066] An image is obtained by a camera 102 mounted above the light table 101 in step 402, again in a manner similar to the corresponding step of the registration process, and the image is transferred to the image processing computer 112 for shape recognition in step 403.

[0067] In step 404 the image processing computer 112 generates a shape representation using the same method as has been used to generate the shape representations stored in the shape library as described above. The image processing computer 112 then searches for the closest match to this shape in the shape library. For example, in embodiments where the shapes are represented by polygons defined in terms of the coordinates of a set of vertices, a metric can be defined to measure the difference between two polygons as a distance (e.g. the sum of the distances between the positions of corresponding points) or as a cumulative measure of the angles through which a polygonal curve turns. These and other methods of polygon matching, or shape recognition in general, are well known in the art and are often part of commercially available computer vision systems.
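Under whichever metric is chosen, the library search reduces to a nearest-neighbour scan. The following hypothetical Python sketch illustrates the idea; the entry layout and names are assumptions, and the metric could be one like the point-distance sketch shown earlier:

```python
def find_closest(query_shape, library, metric):
    """Linear scan over the shape library; returns the entry whose stored
    shape minimizes the metric against the query shape. For large libraries
    a coarse pre-filter (e.g. on area or perimeter) would narrow the scan."""
    best = min(library, key=lambda entry: metric(query_shape, entry["shape"]))
    return best  # the entry carries the associated sewing path and parameters
```

In practice one would likely also reject matches whose metric value exceeds a threshold, so that an unregistered workpiece is flagged rather than sewn with the wrong instructions.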

[0068] When the shape has been identified, the sewing path and other parameters stored in association with the identified shape, as described with reference to FIG. 2 above, can be retrieved from the shape library in step 405, and these parameters can be transferred to the controller computer 113 in step 406.

[0069] Finally, in step 407, the sewing path and the other parameters that have been received by the controller computer 113 are loaded and the appropriate sewing operation can be performed.

[0070] With reference to FIG. 5A, a description of the sewing operation will now be given. FIG. 5A is a flowchart presenting a number of steps that may be performed in exemplary embodiments of the invention. It should be noted that the embodiment illustrated in FIG. 1 and the sewing process described below include only one sewing machine and one robot. In other embodiments of the invention several sewing machines may be included in one production cell. Such embodiments may include one sewing robot per sewing machine and one service robot configured to move workpieces from the light table to the sewing machine. Some of the steps described as performed by the robot in the following description may then be performed by the service robot while others may be performed by the sewing robot.

[0071] The following description assumes that the process of identifying the workpiece and loading the sewing instructions described above has already been performed. It is, however, consistent with the principles of the invention to perform at least parts of these processes in parallel, except, of course, that information provided by the recognition process must be received by and loaded by the controller computer 113 before steps in the sewing process that rely on this information can actually be performed.

[0072] In a first step 501 of the sewing process, the position of the workpiece on the light table 101 is determined by the image processing computer 112 based on the image received from the camera 102. This position is sent to the controller computer 113. In step 502 the controller computer 113 instructs the robot 104 to move the workpiece to the sewing machine 103. In production cells including several sewing machines this step may also include a determination of which sewing machine should receive the workpiece.

[0073] In step 503 the position of the workpiece is adjusted to the correct starting position. This is the position where the needle of the sewing machine is directly above the point on the workpiece 111 from where the sewing path begins according to the data received from the shape library. The workpiece shall also be correctly oriented, as determined by rotation of the workpiece gripper 107, i.e. the position on the A-axis.

[0074] In step 504 the sewing operation is performed in accordance with the parameters from the shape library as received from the image processing computer 112. The sewing operation will be described in further detail below.

[0075] After the sewing operation is finished, the controller computer 113 instructs the robot 104 to remove the finished workpiece. The robot 104 may be instructed to move the workpiece 111 to a specific place for removal or temporary storage, but such details are not illustrated in the drawing.

[0076] FIG. 5B is a flowchart illustrating further details of an exemplary method for performing the sewing operation in step 504. It should be noted that while this flowchart illustrates the process as one cycle of consecutive steps, several of the steps may be performed in parallel or may be reinitiated without waiting for the previous cycle to complete.

[0077] In a first step 5041, an image is obtained with the edge detection camera 115. This step is performed continuously, and as soon as they are available, images are transferred to the edge tracking computer 114 in step 5042. When an image is received by the edge tracking computer 114 it is processed in step 5043 in order to provide information making it possible to determine whether the current sewing position and direction are consistent with the programmed sewing path, for example by measuring the distance from the needle of the sewing machine 103 to the edge of the workpiece 111 and the angle of the edge of the workpiece 111 with respect to the sewing direction or direction of motion of the workpiece.
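One simple way to obtain these two measurements from an image, assuming the needle position in the image is known from calibration and the visible edge is approximately straight in the small camera view, is to fit a line to the detected edge pixels. The Python sketch below is illustrative only; the names and the straight-line assumption are not from the application:

```python
import numpy as np

def needle_edge_geometry(edge_points: np.ndarray, needle_xy: tuple[float, float]):
    """Estimate (distance, angle) between the needle and the workpiece edge.

    `edge_points` is an (N, 2) array of edge pixels near the needle. A line
    y = m*x + c is fitted by least squares, so the edge must not be vertical
    in image coordinates (a rotated fit would handle that case)."""
    m, c = np.polyfit(edge_points[:, 0], edge_points[:, 1], 1)
    x0, y0 = needle_xy
    # Perpendicular distance from the needle to the fitted line.
    distance = abs(m * x0 - y0 + c) / np.hypot(m, 1.0)
    angle = np.arctan2(m, 1.0)  # edge angle relative to the image x-axis
    return distance, angle
```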

[0078] In step 5044 the results obtained by the edge tracking computer 114 are transferred to the controller computer 113 for further processing. The controller computer 113 compares the received values with the programmed sewing path and determines, in step 5045, whether an adjustment is necessary. If not, the sewing process continues in accordance with the programmed sewing path (which may involve adjustment of the robot axis if the programmed sewing path is curved). The process is repeated for the next image obtained by the edge detection camera 115, starting again with step 5041. As noted above, step 5041 is not necessarily initiated by the completion of step 5045, and processing of the next image may already be in progress.

[0079] If it is determined in step 5045 that adjustment of the sewing path is necessary, the necessary adjustments are calculated in step 5046, and in step 5047 the controller computer instructs the robot 104 accordingly. If the adjustments also require an update of the instructions to the sewing machine 103 such instructions are also generated and transferred in this step. The process is then repeated for the next image obtained by the edge detection camera 115 as described above.

[0080] The mathematics and system design used to determine and execute the adjustment of the sewing path can be based on well-known control theory, for example in the form of a closed loop system (feedback control system).

[0081] In some embodiments, the speed with which the workpiece is moving is measured either by processing of consecutive images from the edge detection camera 115 or by a separate speed sensor (not shown). If this information is available the calculation of necessary adjustments and resulting update of the sewing machine 103 and robot 104 may take it into consideration in order to synchronize the speed of the robot 104 and the sewing machine 103.

[0082] When the sewing operation has concluded the process shown in FIG. 5B can be terminated and step 505 in FIG. 5A can be performed.
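As a hedged illustration of the closed-loop idea in paragraph [0080], a discrete PI controller over the measured edge distance could look like the sketch below; the class name and gains are illustrative, not taken from the application:

```python
class SeamController:
    """Discrete PI controller for the lateral seam offset: a minimal
    stand-in for the feedback control system mentioned above."""

    def __init__(self, kp: float = 0.4, ki: float = 0.05):
        self.kp, self.ki = kp, ki
        self.integral = 0.0

    def update(self, measured_mm: float, desired_mm: float) -> float:
        """Return a lateral correction (mm) for the robot for this cycle."""
        error = desired_mm - measured_mm
        self.integral += error  # accumulate steady-state error
        return self.kp * error + self.ki * self.integral
```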

[0083] In some embodiments of the invention several sewing stations are combined into a larger production cell. Such an embodiment is illustrated in FIG. 6. In order to avoid cluttering the drawing unnecessarily, communication connections between the various components are not included in the drawing, and certain other details are also left out, as will be explained below. The reference numbers in FIG. 6 are the same as the reference numbers used for corresponding components in FIG. 1. The reference numbers do not distinguish between components of the same type, so for example all the sewing machines 103 have the same reference number.

[0084] The production cell has a contrasting surface such as a light table 101 which is used to receive workpieces 111 and which is observed from above by a registration camera 102. There are four sewing stations, each including a sewing machine 103, a sewing robot 104, a work surface table 105 and an edge detection camera 115. The manipulation arm 106, workpiece gripper 107, motor 108 and rails 109, 110 shown in FIG. 1 are collectively represented by robot arm 120.

[0085] In the area between the sewing stations and the light table 101 is a common work surface table 105'. A service robot 104' with a motor 108' and rails 109', 110' is positioned such that it can reach the light table 101 as well as all work surface tables 105, 105'. The service robot 104' also has a manipulation arm and a workpiece gripper corresponding to, respectively, manipulation arm 106 and workpiece gripper 107 in FIG. 1, but not shown in FIG. 6.

[0086] An image processing computer 112 receives images from the registration camera and performs the processing described above. The image processing computer 112 also includes the library of workpieces already described. The image processing computer 112 is connected to a master controller computer 113'. The master controller computer 113' receives information from the image processing computer 112 and determines which of the sewing stations in the production cell should receive the next workpiece 111. The determination may be based on whether a sewing station is currently idle, on which sewing station will be the first to finish its current task and become idle, on historic workload such that workloads may be distributed evenly over time, or on a combination of these and other factors. The master controller computer 113' may then control the service robot 104' to move the workpiece 111 from the light table 101 over the common work surface table 105' and position it on the work surface table 105 of the designated sewing station. The controller computer 113 of the designated sewing station receives the sewing instructions from the master controller computer 113' and the process continues under control of the sewing station controller computer 113. The sewing station controller computer 113 receives edge tracking information from the sewing station edge tracking computer 114 based on images from the sewing station's edge detection camera 115, and the sewing station's controller computer 113 also controls the sewing station's sewing robot 104.

[0087] In this embodiment the processes described with reference to the drawings in FIG. 2 - FIG. 5 are essentially the same. Tasks are distributed between additional computers and robots, but that could also be the case for a production cell with only one sewing station. For example, a single station production cell may equally well have one service robot and one sewing robot. It is also possible to distribute the various tasks between the different computers in a number of different ways. In principle everything could be handled by one single computer, regardless of the number of sewing machines and robots. Conversely, additional computers with a different distribution of tasks would also be consistent with the principles of the invention. Distributed computing and embedded computer systems are well known in the art and these concepts could be utilized to achieve any configuration of computers and robots as long as the system would be able to perform the tasks necessary to operate in accordance with the invention.

[0088] In order to avoid obscuring the description of the features of the invention with unnecessary details of staple components of computers, production robots and sewing machines, such components have mostly been left out of the present disclosure. Those with skill in the art have ample insight into the workings of such equipment. The computers will include one or more processors configured to operate in accordance with instructions written in computer code and stored in persistent memory in the respective computers. The computers will further comprise working memory, one or more system buses for internal communication between various components of the computer, and interfaces for communication with external devices such as user interfaces, e.g. display 116, user input devices, e.g. pointing device 117 and keyboards, and other computers, at the same location or remotely connected to a computer network such as the Internet. The database holding the shape library may be stored in the image processing computer 112 itself, or in a separate server (not shown). The shape library may also be distributed, in whole or in part, over several computers.

[0089] The sewing machines 103 may be standard industrial type sewing machines capable of being controlled by the controller computer 113.