


Title:
A METHOD AT A SLAUGHTERHOUSE
Document Type and Number:
WIPO Patent Application WO/2020/104636
Kind Code:
A1
Abstract:
A method, comprising, at a slaughterhouse with a conveyor transporting trays with meat products, and a manufacturing execution system in communication with one or more sensors tracking the trays with meat products and engaged in accordance with one or more production recipes: recording at least one image of a meat product while the meat product, placed in a tray, passes an image recording station at the conveyor; performing classification of the meat product by a computer-implemented classifier receiving as its input feature descriptors comprising information about the at least one image and outputting an array of likelihood values with likelihood values associated with respective meat product types; wherein the likelihood values are associated with meat product types selected from one or more production recipes each listing multiple meat product types; acquiring, from the manufacturing execution system, identification of one or more current production recipes; and assisting the computer-implemented classifier by including the identification of one or more current production recipes in the feature descriptors input to the computer-implemented classifier and/or by restricting classification to product types comprised by the one or more current production recipes; and recording, with the manufacturing execution system, a classification product type associated with a tracking identifier associated with the tray and determined by the classification.

Inventors:
LAURIDSEN THOMAS (DK)
NIELSEN FLEMMING BONDE (DK)
Application Number:
PCT/EP2019/082187
Publication Date:
May 28, 2020
Filing Date:
November 22, 2019
Assignee:
FRONTMATEC SMOERUM AS (DK)
International Classes:
G01N33/12; A22B5/00
Domestic Patent References:
WO2018167089A1 2018-09-20
Foreign References:
GB2362710A 2001-11-28
Other References:
J. LECLERE ET AL: "3.II-P9 A BEEF CARCASS CLASSIFICATION BY ON-LINE IMAGE ANALYSIS", 1 January 2001 (2001-01-01), XP055659177, retrieved from the Internet [retrieved on 2020-01-17]
Attorney, Agent or Firm:
ZACCO DENMARK A/S (DK)
Claims:
CLAIMS

1. A method, comprising: at a slaughterhouse with a conveyor (102), transporting trays (104) with meat products (106), and a manufacturing execution system (108) in communication with one or more sensors (110) tracking the trays (104) with meat products (106) and engaged in accordance with one or more production recipes: recording at least one image of a meat product (106) while the meat product (106), placed in a tray (104), passes an image recording station (114) at the conveyor (102); performing classification of the meat product (106) by a computer-implemented classifier (116) receiving as its input feature descriptors comprising information about the at least one image and outputting an array of likelihood values with likelihood values associated with respective meat product types; wherein the likelihood values are associated with meat product types selected from one or more production recipes each listing multiple meat product types; acquiring, from the manufacturing execution system (108), identification of one or more current production recipes; and assisting the computer-implemented classifier (116) by including the identification of one or more current production recipes in the feature descriptors input to the computer-implemented classifier (116) and/or by restricting classification to product types comprised by the one or more current production recipes; recording, with the manufacturing execution system (108), a classification product type associated with a tracking identifier associated with the tray (104) and determined by the classification.

2. A method according to claim 1, wherein the conveyor is configured with a load cell, the method comprising: acquiring from the load cell a mass value representing the mass of the meat product and the tray; and including the mass value, acquired from the load cell, in the feature descriptors input to the computer-implemented classifier.

3. A method according to any of the above claims, wherein an electronic device with a user interface for interaction with a human operator is in communication with the manufacturing execution system; and wherein the computer-implemented classifier performs classification in accordance with a mapping function; the method comprising: via the user interface, at a visual inspection position, receiving an operator input selecting a product type in response to the human operator performing visual inspection of the meat product in the tray; associating the operator input with a tray identifier associated with the tray at the visual inspection position; and determining that the operator input corresponds to selection of a product type different from the classification product type; and in accordance therewith updating the mapping function of the computer-implemented classifier.

4. A method according to claim 3, wherein the user interface displays a graphical indicator representing a meat product type associated with a tray currently at the visual inspection position.

5. A method according to any of the above claims, wherein the computer-implemented classifier comprises one or more of: a neural network, such as a deep neural network, a convolutional neural network, a support vector machine, and a reinforcement learning algorithm.

6. A method according to any of the above claims, wherein the image recording station comprises multiple cameras arranged above the trays passing on the conveyor and inclined along respective multiple axes; comprising: recording respective multiple images of a meat product while the meat product, placed in a tray, passes the image recording station at the conveyor; and including information about each of the respective multiple images, acquired from the respective cameras, in the feature descriptors input to the computer-implemented classifier.

7. A method according to any of the above claims, comprising: generating a computer-readable three-dimensional model of the meat product in the tray and including information about the computer-readable three-dimensional model in the feature descriptors input to the computer-implemented classifier.

8. A method according to any of the above claims, wherein the at least one image of a product is one or more of: a colour image representing visible light, an image representing infrared light, an image representing ultra-violet light, an image representing Roentgen radiation absorption, and an image recorded while projecting structured light onto the meat product.

9. A method according to any of the above claims, wherein the image recording station comprises one or more cameras recording radiation at: one or more visible colours, infrared light, ultraviolet light, or Roentgen radiation.

10. A method according to any of the above claims, wherein a tray comprises an identification member encoded with an identifier for the tray and wherein the manufacturing execution system keeps track of the tray among other trays.

11. A method according to any of the above claims, wherein the manufacturing execution system registers and maintains unique sequence numbers for respective unique trays.

12. A method according to any of the above claims, comprising routing the tray on a conveyor system with multiple tracks and/or storage compartments in accordance with the classification product type associated with the tray.

13. A manufacturing system for a slaughterhouse with a conveyor (102) for transporting trays (104) with meat products (106), comprising: one or more sensors (110) tracking the trays (104) with meat products (106); one or more cameras, at an image recording station, for recording at least one image of a meat product (106) while the meat product (106), placed in a tray (104), passes the image recording station; and a computer configured to perform the method in accordance with any of claims 1-12.

Description:
A method at a slaughterhouse

A method at a slaughterhouse or abattoir for classifying meat products, also known as cuts.

It is observed that there is a need for more efficient and robust classification of meat products in slaughterhouses.

SUMMARY

There is provided a method comprising: at a slaughterhouse with a conveyor, transporting trays with meat products, and a manufacturing execution system in communication with one or more sensors tracking the trays with meat products and engaged in accordance with one or more production recipes: recording at least one image of a meat product while the meat product, placed in a tray, passes an image recording station at the conveyor; performing classification of the meat product by a computer-implemented classifier receiving as its input feature descriptors comprising information about the at least one image and outputting an array of likelihood values with likelihood values associated with respective meat product types; wherein the likelihood values are associated with meat product types selected from one or more production recipes each listing multiple meat product types; acquiring, from the manufacturing execution system, identification of one or more current production recipes; and assisting the computer-implemented classifier by including the identification of one or more current production recipes in the feature descriptors input to the computer-implemented classifier and/or by restricting classification to product types comprised by the one or more current production recipes; and recording, with the manufacturing execution system, a classification product type associated with a tracking identifier associated with the tray and determined by the classification.

Thereby it is possible to reduce misclassification rates for computer-implemented classification systems in slaughterhouses (a.k.a. abattoirs) and/or to free human operators occupied with classifying meat products arriving in a fast and steady stream from that wearisome manual task. In one or more aspects, the likelihood values may be represented as is known in the art, e.g. as estimated probabilities. The classifier may be implemented as one classifier or as an array of binary classifiers. The classifier may implement a so-called Sigmoid function or a so-called Softmax function.
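For illustration only, the so-called Softmax function mentioned above can be sketched in a few lines of Python; the scores and the number of candidate product types below are invented for the example and are not taken from the application:

```python
# Minimal sketch of a Softmax function turning raw classifier scores into
# estimated probabilities (likelihood values that sum to one).
import math

def softmax(scores):
    """Map one raw score per product type to estimated probabilities."""
    exps = [math.exp(s) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

# One illustrative score per candidate meat product type
probs = softmax([2.0, 1.0, 0.1])
```

The output preserves the ordering of the input scores while normalising them into a probability distribution over the product types.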

In one or more aspects, one or more current production recipes may be acquired from the manufacturing execution system. There may be one current production recipe, acquired accordingly, for the slaughterhouse or for a particular production line in the slaughterhouse. In some situations, e.g. in connection with transitioning from one production recipe to another and/or in connection with receiving different inputs to a production line, there may be multiple current production recipes. A current production recipe represents which products the manufacturing execution system is engaged to produce along the production line at a current time slot or period of time.

It should be understood that restricting classification to product types comprised by the one or more current production recipes may comprise ignoring likelihood values associated with product types not comprised by one or more current production recipes.
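A minimal Python sketch of this restriction step, assuming likelihood values keyed by product-type name (all names and values here are illustrative, not from the application):

```python
# Minimal sketch of restricting classification to product types comprised by
# the current production recipes, by ignoring (zeroing) likelihood values of
# all other product types.

def restrict_to_recipes(likelihoods, current_recipes):
    """Keep likelihood values only for product types in a current recipe."""
    allowed = set()
    for recipe in current_recipes:
        allowed.update(recipe)
    return {ptype: (value if ptype in allowed else 0.0)
            for ptype, value in likelihoods.items()}

likelihoods = {"ham": 0.05, "loin": 0.60, "belly": 0.30, "shoulder": 0.05}
restricted = restrict_to_recipes(likelihoods, [["loin", "belly"]])
```

Only the product types listed by the current recipe keep their likelihood values; the rest are effectively ignored.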

As is common practice, a person skilled in the art knows how to record a classification product type, associated with a tracking identifier associated with the tray, at or with the manufacturing execution system. This may involve database queries.
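As an illustration only, such recording could involve a database query along these lines; the table schema, column names and identifier values are invented for this sketch and are not taken from the application:

```python
# Hypothetical illustration of recording a classification product type
# against a tray tracking identifier via a database query.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE classifications (tray_id TEXT, product_type TEXT)")

def record_classification(conn, tray_id, product_type):
    """Record the classification product type for the given tray identifier."""
    conn.execute("INSERT INTO classifications VALUES (?, ?)",
                 (tray_id, product_type))
    conn.commit()

record_classification(conn, "TRAY-0042", "loin")
```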

In one or more aspects, a production recipe comprises a list or another collection of meat product types. A production recipe may be related to a particular portion of the carcass input at the slaughterhouse. In some examples, a production recipe comprises four meat product types, e.g. for “pork side”: ham, loin, belly, and shoulder. In another example, a production recipe comprises five meat product types, e.g. for “pork loin”: bone out loin, loin ribs, rind, fat, and trimmings.

In some embodiments, the conveyor is configured with a load cell and the method may comprise acquiring from the load cell a mass value representing the mass of the meat product and the tray. The method may further comprise including the mass value, acquired from the load cell, in the feature descriptors input to the computer-implemented classifier. It has been observed during experimentation that including the mass value, acquired from the load cell, in the feature descriptors input to the computer-implemented classifier improves classification by reducing misclassification.
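The recipe examples above can be represented, for illustration, as simple collections of product types; the data structure itself is an assumption made for this sketch and is not part of the application:

```python
# Illustrative representation of production recipes, each listing multiple
# meat product types, mirroring the examples given in the text.

RECIPES = {
    "pork side": ["ham", "loin", "belly", "shoulder"],
    "pork loin": ["bone out loin", "loin ribs", "rind", "fat", "trimmings"],
}

def product_types(recipe_names):
    """Collect the meat product types listed by the named recipes."""
    types = []
    for name in recipe_names:
        types.extend(RECIPES[name])
    return types

side_types = product_types(["pork side"])
```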

The mass value may be represented in kilograms or pounds or in accordance with another system.

In some embodiments, an electronic device with a user interface for interaction with a human operator may be in communication with the manufacturing execution system; and the computer-implemented classifier may perform classification in accordance with a mapping function; the method may comprise, via the user interface, at a visual inspection position, receiving an operator input selecting a product type in response to the human operator performing visual inspection of the meat product in the tray. The method may further comprise associating the operator input with a tray identifier associated with the tray at the visual inspection position. The method may further comprise determining that the operator input corresponds to selection of a product type different from the classification product type; and in accordance therewith updating the mapping function of the computer-implemented classifier.

Thereby the computer-implemented classifier can be trained or retrained by supervised learning in accordance with human visual inspection. Despite involving human supervision, such a method greatly reduces the amount of human labour and gradually improves the classification towards an acceptable level of classification performance.
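A minimal sketch of such supervised updating, assuming the mapping function exposes a partial_fit-style incremental update; the classifier class and method names below are illustrative assumptions, not taken from the application:

```python
# Sketch of supervised updating of the mapping function: the corrected label
# is used as a training example only when the operator disagrees with the
# classifier.  CountingClassifier is a stand-in for a trainable classifier.

class CountingClassifier:
    """Stand-in for a classifier whose mapping function can be updated."""
    def __init__(self):
        self.updates = 0

    def partial_fit(self, features_batch, labels_batch):
        # A real classifier would adjust its mapping function here.
        self.updates += 1

def maybe_update(classifier, features, predicted_type, operator_type):
    """Update the mapping function only when the operator disagrees."""
    if operator_type != predicted_type:
        classifier.partial_fit([features], [operator_type])
        return True
    return False

clf = CountingClassifier()
updated = maybe_update(clf, [0.1, 0.2], "belly", "loin")  # disagreement
ignored = maybe_update(clf, [0.1, 0.2], "loin", "loin")   # agreement
```

Only the disagreement case triggers a training step, which keeps the amount of human labour low while the classification gradually improves.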

The user interface may comprise a touch-sensitive display screen encapsulated in accordance with standard specifications applicable or required at slaughterhouses.

In some embodiments, the user interface may display a graphical indicator representing a meat product type associated with a tray currently at the visual inspection position.

The graphical indicator may comprise one or more of: a name, e.g. a short-name of the product, and a graphical image or icon. Displaying such a graphical indicator greatly assists the human operator in correcting or entering a classification.

In some embodiments, the computer-implemented classifier may comprise one or more of: a neural network, such as a deep neural network, a convolutional neural network, a support vector machine, and a reinforcement learning algorithm.

In some aspects images acquired at the image recording station are provided as feature descriptors input to the classifier. In some aspects the feature descriptors are supplemented by a mass value and the identification of one or more current production recipes. The manufacturing execution system sets or gets such production recipes including current production recipes.

In some embodiments, the image recording station may comprise multiple cameras arranged above the trays passing on the conveyor and inclined along respective multiple axes; the method may comprise recording respective multiple images of a meat product while the meat product, placed in a tray, passes the image recording station at the conveyor. The method may further comprise including information about each of the respective multiple images, acquired from the respective cameras, in the feature descriptors input to the computer-implemented classifier.

In some embodiments, the method may comprise generating a computer-readable three-dimensional model of the meat product in the tray and including information about the computer-readable three-dimensional model in the feature descriptors input to the computer-implemented classifier.

Information about the computer-readable three-dimensional model may comprise information about geometrical dimensions of identified portions of the three-dimensional model (3D model) and/or information about texture. The geometrical dimensions may include one or more of distances and curvatures.

In some embodiments, the at least one image of a product may be one or more of: a colour image representing visible light, an image representing infrared light, an image representing ultra-violet light, an image representing Roentgen radiation absorption, and an image recorded while projecting structured light onto the meat product. In particular, one or more colour images representing visible light may be supplemented by an image representing ultra-violet light, an image representing Roentgen radiation absorption, and an image recorded while projecting structured light onto the meat product.
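For illustration, one geometrical dimension, the bounding-box extents of a 3D point cloud of the meat product, could be extracted as follows; the point-cloud representation is an assumption made for this sketch, not part of the application:

```python
# Hypothetical sketch: extracting simple geometrical dimensions (bounding-box
# extents) from a 3D point cloud as candidate feature-descriptor entries.

def bounding_box_dimensions(points):
    """Return (dx, dy, dz) extents of a list of (x, y, z) points."""
    xs, ys, zs = zip(*points)
    return (max(xs) - min(xs), max(ys) - min(ys), max(zs) - min(zs))

# Illustrative point cloud of three points
dims = bounding_box_dimensions([(0, 0, 0), (2, 1, 0), (1, 3, 4)])
```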

In some embodiments, the image recording station may comprise one or more cameras recording radiation at: one or more visible colours, infrared light, ultraviolet light, or Roentgen radiation.

In some embodiments, a tray may comprise an identification member encoded with an identifier for the tray, and the manufacturing execution system may keep track of the tray among other trays.

The identification member may be a label with a printed code e.g. a bar code or a QR code or similar. Alternatively or additionally the identification member may comprise a radio frequency transmitter and receiver e.g. a so-called RFID tag.

In some embodiments, the manufacturing execution system may register and may maintain unique sequence numbers for respective unique trays.

In some embodiments, the method may comprise routing the tray on a conveyor system with multiple tracks and/or storage compartments in accordance with the classification product type associated with the tray.

Generally, herein, “associated with” may be understood as a relation, in a database, a list, a table, or another data structure, between data items. In case the term is used in connection with a physical item that is “associated with” a data item, some type of code attached to, connected to, or located in a predefined way relative to the physical item serves to establish the “associated with” relation.

Generally, herein, a manufacturing execution system (MES) is a computer-implemented system which may comprise programmed computers, sensors and actuators communicating via a data network. A manufacturing execution system is used in slaughterhouses to track and document the transformation from livestock or carcass bodies to meat products, e.g. as generally known to a consumer.

An MES may provide information that helps manufacturing decision makers understand how current conditions on a production line can be optimized to improve production output. An MES works in real time to enable the control of multiple elements of the production process (e.g. inputs, personnel, machines and support services). An MES may operate across multiple function areas, for example: management of product definitions, resource scheduling, order execution and dispatch, production analysis and downtime management for overall equipment effectiveness (OEE), product quality, or materials track and trace. An MES is especially important in regulated industries, such as the food industry, where documentation and proof of processes, events and actions may be required.

BRIEF DESCRIPTION OF THE FIGURES

A more detailed description follows below with reference to the drawing, in which:

fig. 1 shows a block diagram of a slaughterhouse production line and a classification module in communication with a manufacturing execution system; and

fig. 2 shows a flowchart of classification and assisting a computer-implemented classifier at a slaughterhouse.

DETAILED DESCRIPTION

Fig. 1 shows a block diagram of a slaughterhouse production line (101) and a classification module (116) in communication with a manufacturing execution system (108). The slaughterhouse production line (101) comprises a conveyor (102) for transporting trays (104) with meat products (106) along the conveyor. The slaughterhouse production line (101) may comprise an image recording station (114), configured such that at least one image (IMGx) is recorded when the trays (104) with the meat products (106) pass by. The slaughterhouse production line (101) may comprise one or more sensors (110a, 110b) along the conveyor. A manufacturing execution system (108) may be in communication with the one or more sensors (110a, 110b) for tracking the trays (104) with meat products (106).

A computer-implemented classifier (116) may be configured to perform classification of the meat product (106). The classifier (116) may receive, as input, feature descriptors comprising information about the at least one image (IMGx). The classifier (116) may provide, as output, an array of likelihood values with likelihood values associated with respective meat product types (P1, ..., Pi). The likelihood values may be associated with meat product types (P1, ..., Pi) selected from one or more production recipes (R1, ..., Ri) each listing multiple meat product types (P1, ..., Pi).

The manufacturing execution system (108) may be configured to identify one or more current production recipes (R1, ..., Ri). The identification of one or more current production recipes (R1, ..., Ri) may be included in the feature descriptors input to the computer-implemented classifier (116), thereby assisting the computer-implemented classifier (116) in classification of the meat product (106). The classification of the meat product (106) may be restricted to classification of meat product types (P1, ..., Pi) comprised by the one or more current production recipes (R1, ..., Ri), thereby assisting the computer-implemented classifier (116) in classification of the meat product (106). The manufacturing execution system (108) may record a classification product type associated with a tracking identifier associated with the tray (104) and determined by the classification.

The slaughterhouse production line (101 ) may comprise a load cell (118). The load cell (118) may be configured to provide a mass value representing the mass of the meat product (106) and the tray (104).

The mass value may be included in the feature descriptors input to the computer-implemented classifier (116), thereby assisting the computer-implemented classifier (116) in classification of the meat product (106). The image recording station (114) may comprise multiple cameras arranged above the trays passing on the conveyor (102) and inclined along respective multiple axes. The image recording station (114) may record respective multiple images of a meat product while the meat product, placed in a tray, passes the image recording station at the conveyor. Information about each of the respective multiple images, acquired from the respective cameras, may be included in the feature descriptors input to the computer-implemented classifier (116).

The slaughterhouse production line (101) may comprise an electronic device (120) with a user interface for interaction with a human operator, configured to be in communication with the manufacturing execution system (108). The computer-implemented classifier (116) may perform classification in accordance with a mapping function. A human operator at a visual inspection position (122) may perform visual inspection of the meat product (106) in the tray (104). The human operator, at the visual inspection position (122), may provide input to the electronic device (120) via the user interface. The input may select a product type in response to the visual inspection of the meat product (106) in the tray (104).

The computer-implemented classifier (116) may be part of a classification module (130). The classification module (130) may further comprise a selector (126) and an image processor (128).

The manufacturing execution system (108) may comprise a database (124). Alternatively, the manufacturing execution system (108) may be connected with or may be in communication with a database (124).

Fig. 2 shows a flowchart of a method (201) of classification and assisting a computer-implemented classifier at a slaughterhouse.

The method is related to the features of fig. 1.

The method may comprise acquiring (203) an image recorded by an image recording station. The method may comprise querying (204) a database of a manufacturing execution system for the current production recipes each listing multiple meat product types. The method may comprise reading (205) a mass value from a load cell. The method may comprise inputting (206) the image, the current production recipe, and the mass value to a computer-implemented classifier. The method may comprise performing (207) classification. The computer-implemented classifier may perform classification in accordance with a mapping function.
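The acquiring, querying, reading, inputting and classifying steps (203-207) can be sketched, for illustration, with stand-in callables for the image recording station, the manufacturing execution system database, the load cell, and the classifier; all names and return values below are hypothetical:

```python
# Illustrative sketch of flowchart steps 203-207 with stand-in callables.

def classify_tray(acquire_image, query_recipes, read_mass, classifier):
    image = acquire_image()       # step 203: acquire recorded image
    recipes = query_recipes()     # step 204: query current production recipes
    mass = read_mass()            # step 205: read mass value from load cell
    features = {"image": image, "recipes": recipes, "mass": mass}  # step 206
    return classifier(features)   # step 207: perform classification

likelihoods = classify_tray(
    acquire_image=lambda: "IMG-1",
    query_recipes=lambda: ["pork side"],
    read_mass=lambda: 9.5,
    classifier=lambda features: {"loin": 0.7, "belly": 0.3},
)
```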

The method may comprise recording (208) the classification at the manufacturing execution system together with an identifier. The identifier may be selected by a selector based on an output from the computer-implemented classifier. The output from the computer-implemented classifier may be an array of likelihood values with likelihood values associated with respective meat product types. The likelihood values may be associated with meat product types selected from one or more production recipes each listing multiple meat product types.
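For illustration, such a selector may simply choose the product type with the highest likelihood value (an argmax over the array); the type names and values below are invented for this sketch:

```python
# Minimal sketch of a selector choosing the classification product type as
# the product type with the highest likelihood value.

def select_product_type(likelihoods):
    """Return the product type with the highest likelihood value."""
    return max(likelihoods, key=likelihoods.get)

selected = select_product_type({"ham": 0.1, "loin": 0.7, "belly": 0.2})
```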

The method may comprise reading (209) the identifier at a visual inspection position. The method may comprise receiving (210) input from the human operator. The input from the human operator may be associated with a tray identifier associated with the tray at the visual inspection position. The method may comprise updating (211) the mapping function of the computer-implemented classifier, based on a determination that the operator input corresponds to selection of a product type different from the classification product type.