Title:
FAST APPROXIMATE PREDICTION OF MATERIAL PROPERTIES OF LATTICE STRUCTURES
Document Type and Number:
WIPO Patent Application WO/2020/046273
Kind Code:
A1
Abstract:
A computer-implemented method includes training a prediction model to learn a mapping from a plurality of training feature vectors to a plurality of training sets of material properties. A plurality of component feature vectors are constructed to represent a plurality of component lattices. Together, the plurality of component lattices form a new lattice, and each component feature vector of the plurality of component feature vectors represents a respective component lattice of the plurality of component lattices. Using the prediction model, a respective component set of material properties is predicted for each component feature vector of the plurality of component feature vectors.

Inventors:
MUSUVATHY SURAJ RAVI (US)
MIRABELLA LUCIA (US)
TANG TSZ LING ELAINE (US)
ARISOY ERHAN (US)
Application Number:
PCT/US2018/048254
Publication Date:
March 05, 2020
Filing Date:
August 28, 2018
Assignee:
SIEMENS AG (DE)
SIEMENS CORP (US)
International Classes:
G06N3/10
Other References:
M. SADEGH YAZDI ET AL: "Optimization of geometrical parameters in a specific composite lattice structure using neural networks and ABC algorithm", JOURNAL OF MECHANICAL SCIENCE AND TECHNOLOGY, vol. 30, no. 4, April 2016 (2016-04-01), pages 1763 - 1771, XP055531795, ISSN: 1738-494X, DOI: 10.1007/s12206-016-0332-1
A.O. AREMU ET AL: "A voxel-based method of constructing and skinning conformal and functionally graded lattice structures suitable for additive manufacturing", ADDITIVE MANUFACTURING, vol. 13, 2017, pages 1 - 13, XP055482697, ISSN: 2214-8604, DOI: 10.1016/j.addma.2016.10.006
Attorney, Agent or Firm:
RASHIDI-YAZD, Seyed Kaveh E. (US)
Claims:
CLAIMS

What is claimed is:

1. A computer-implemented method comprising: training a prediction model to learn a mapping from a plurality of training feature vectors to a plurality of training sets of material properties; constructing a plurality of component feature vectors representing a plurality of component lattices, wherein the plurality of component lattices together form a new lattice, and wherein each component feature vector of the plurality of component feature vectors represents a respective component lattice of the plurality of component lattices; and predicting, using the prediction model, a respective component set of material properties of each component feature vector of the plurality of component feature vectors.

2. The computer-implemented method of claim 1, further comprising: receiving a plurality of component voxel models representing the plurality of component lattices; wherein the training the prediction model to learn the mapping from the plurality of training feature vectors to the plurality of training sets of material properties comprises training the prediction model to learn the mapping from a plurality of training pairs, each training pair comprising a respective training voxel model of a plurality of training voxel models and a respective training feature vector of the plurality of training feature vectors, to the plurality of training sets of material properties; and wherein the predicting the respective component set of material properties of each component feature vector of the plurality of component feature vectors comprises predicting the respective component set of material properties of each component pair of a plurality of component pairs, each component pair comprising a respective component voxel model of the plurality of component voxel models and a respective component feature vector of the plurality of component feature vectors.

3. The computer-implemented method of claim 1, wherein the constructing the plurality of component feature vectors representing the plurality of component lattices comprises: receiving a plurality of component voxel models representing the plurality of component lattices, wherein each component voxel model of the plurality of component voxel models represents a respective component lattice of the plurality of component lattices; and for each component lattice of the plurality of component lattices: computing a respective set of descriptive features describing the component lattice, based on the respective component voxel model; and combining the respective set of descriptive features into the respective component feature vector.

4. The computer-implemented method of claim 1, further comprising predicting a final set of material properties of the new lattice based on the plurality of component sets of material properties of the plurality of component lattices.

5. The computer-implemented method of claim 1, wherein the training the prediction model comprises: receiving a plurality of training voxel models representing a plurality of training lattices; constructing a plurality of training feature vectors, comprising a respective training feature vector based on each training voxel model in the plurality of training voxel models; determining a plurality of training sets of material properties, comprising a respective training set of material properties of each training lattice in the plurality of training lattices; and training the prediction model with the plurality of training feature vectors as input and with the plurality of training sets of material properties as output.

6. The computer-implemented method of claim 5, further comprising training the prediction model with the plurality of training voxel models as input.

7. The computer-implemented method of claim 5, wherein the plurality of training lattices comprise a first plurality of unit lattices.

8. The computer-implemented method of claim 7, wherein the plurality of component lattices comprise a second plurality of unit lattices.

9. A system comprising: a memory having computer-readable instructions; and one or more processors for executing the computer-readable instructions, the computer-readable instructions comprising: training a prediction model to learn a mapping from a plurality of training feature vectors to a plurality of training sets of material properties; constructing a plurality of component feature vectors representing a plurality of component lattices, wherein the plurality of component lattices together form a new lattice, and wherein each component feature vector of the plurality of component feature vectors represents a respective component lattice of the plurality of component lattices; and predicting, using the prediction model, a respective component set of material properties of each component feature vector of the plurality of component feature vectors.

10. The system of claim 9, the computer-readable instructions further comprising: receiving a plurality of component voxel models representing the plurality of component lattices; wherein the training the prediction model to learn the mapping from the plurality of training feature vectors to the plurality of training sets of material properties comprises training the prediction model to learn the mapping from a plurality of training pairs, each training pair comprising a respective training voxel model of a plurality of training voxel models and a respective training feature vector of the plurality of training feature vectors, to the plurality of training sets of material properties; and wherein the predicting the respective component set of material properties of each component feature vector of the plurality of component feature vectors comprises predicting the respective component set of material properties of each component pair of a plurality of component pairs, each component pair comprising a respective component voxel model of the plurality of component voxel models and a respective component feature vector of the plurality of component feature vectors.

11. The system of claim 9, wherein the constructing the plurality of component feature vectors representing the plurality of component lattices comprises: receiving a plurality of component voxel models representing the plurality of component lattices, wherein each component voxel model of the plurality of component voxel models represents a respective component lattice of the plurality of component lattices; and for each component lattice of the plurality of component lattices: computing a respective set of descriptive features describing the component lattice, based on the respective component voxel model; and combining the respective set of descriptive features into the respective component feature vector.

12. The system of claim 9, the computer-readable instructions further comprising predicting a final set of material properties of the new lattice based on the plurality of component sets of material properties of the plurality of component lattices.

13. The system of claim 9, wherein the training the prediction model comprises: receiving a plurality of training voxel models representing a plurality of training lattices; constructing a plurality of training feature vectors, comprising a respective training feature vector based on each training voxel model in the plurality of training voxel models; determining a plurality of training sets of material properties, comprising a respective training set of material properties of each training lattice in the plurality of training lattices; and training the prediction model with the plurality of training feature vectors as input and with the plurality of training sets of material properties as output.

14. The system of claim 13, the computer-readable instructions further comprising training the prediction model with the plurality of training voxel models as input.

15. A computer-program product for predicting material properties, the computer-program product comprising a computer-readable storage medium having program instructions embodied therewith, the program instructions executable by a processor to cause the processor to perform a method comprising: training a prediction model to learn a mapping from a plurality of training feature vectors to a plurality of training sets of material properties; constructing a plurality of component feature vectors representing a plurality of component lattices, wherein the plurality of component lattices together form a new lattice, and wherein each component feature vector of the plurality of component feature vectors represents a respective component lattice of the plurality of component lattices; and predicting, using the prediction model, a respective component set of material properties of each component feature vector of the plurality of component feature vectors.

16. The computer-program product of claim 15, the method further comprising: receiving a plurality of component voxel models representing the plurality of component lattices; wherein the training the prediction model to learn the mapping from the plurality of training feature vectors to the plurality of training sets of material properties comprises training the prediction model to learn the mapping from a plurality of training pairs, each training pair comprising a respective training voxel model of a plurality of training voxel models and a respective training feature vector of the plurality of training feature vectors, to the plurality of training sets of material properties; and wherein the predicting the respective component set of material properties of each component feature vector of the plurality of component feature vectors comprises predicting the respective component set of material properties of each component pair of a plurality of component pairs, each component pair comprising a respective component voxel model of the plurality of component voxel models and a respective component feature vector of the plurality of component feature vectors.

17. The computer-program product of claim 15, wherein the constructing the plurality of component feature vectors representing the plurality of component lattices comprises: receiving a plurality of component voxel models representing the plurality of component lattices, wherein each component voxel model of the plurality of component voxel models represents a respective component lattice of the plurality of component lattices; and for each component lattice of the plurality of component lattices: computing a respective set of descriptive features describing the component lattice, based on the respective component voxel model; and combining the respective set of descriptive features into the respective component feature vector.

18. The computer-program product of claim 15, the method further comprising predicting a final set of material properties of the new lattice based on the plurality of component sets of material properties of the plurality of component lattices.

19. The computer-program product of claim 15, wherein the training the prediction model comprises: receiving a plurality of training voxel models representing a plurality of training lattices; constructing a plurality of training feature vectors, comprising a respective training feature vector based on each training voxel model in the plurality of training voxel models; determining a plurality of training sets of material properties, comprising a respective training set of material properties of each training lattice in the plurality of training lattices; and training the prediction model with the plurality of training feature vectors as input and with the plurality of training sets of material properties as output.

20. The computer-program product of claim 19, the method further comprising training the prediction model with the plurality of training voxel models as input.

Description:
FAST APPROXIMATE PREDICTION OF MATERIAL PROPERTIES OF LATTICE STRUCTURES

BACKGROUND

[0001] The present invention relates to lattice structures and, more specifically, to fast approximate prediction of material properties of lattice structures.

[0002] Some three-dimensional mechanical parts are designed to be light, while also providing adequate support for expected usage. Hollowing these objects can lighten them, making them easier to construct as well as less expensive. However, a hollow object can be less sturdy than a solid one. Thus, lattice structures are often used in the interior of such objects to provide support without the weight that comes from filling the objects.

[0003] A lattice structure, or lattice, is formed of a pattern of interconnected geometric structures. For instance, a lattice structure can be made of beams connected together by diamond or square structures to form the pattern. A lattice structure can provide improved mechanical properties, such as a higher stiffness-to-weight ratio and higher vibration damping, to the object it supports. When designing a lattice structure, it is important to predict the mechanical properties of the lattice structure, for instance, to ensure that the lattice structure properly supports the three-dimensional object for which it is being used. There are various conventional methods for designing a lattice structure to ensure the desired set of mechanical properties. For instance, after a preliminary model of the lattice structure is designed, finite element analysis (FEA) is used to analyze the model’s mechanical properties. If those properties are not as desired, then the model is fine-tuned until it meets specifications, at which point a final lattice is built according to the model.

SUMMARY

[0004] Embodiments of the present invention are directed to a computer-implemented method for predicting material properties. A non-limiting example of the computer-implemented method includes training a prediction model to learn a mapping from a plurality of training feature vectors to a plurality of training sets of material properties. A plurality of component feature vectors are constructed to represent a plurality of component lattices. Together, the plurality of component lattices form a new lattice, and each component feature vector of the plurality of component feature vectors represents a respective component lattice of the plurality of component lattices. Using the prediction model, a respective component set of material properties is predicted for each component feature vector of the plurality of component feature vectors.

[0005] Embodiments of the present invention are directed to a system for predicting material properties. A non-limiting example of the system includes a memory having computer-readable instructions and one or more processors for executing the computer-readable instructions. The computer-readable instructions include training a prediction model to learn a mapping from a plurality of training feature vectors to a plurality of training sets of material properties. Further according to the computer-readable instructions, a plurality of component feature vectors are constructed to represent a plurality of component lattices. Together, the plurality of component lattices form a new lattice, and each component feature vector of the plurality of component feature vectors represents a respective component lattice of the plurality of component lattices. Further according to the computer-readable instructions, using the prediction model, a respective component set of material properties is predicted for each component feature vector of the plurality of component feature vectors.

[0006] Embodiments of the invention are directed to a computer-program product for predicting material properties, the computer-program product including a computer-readable storage medium having program instructions embodied therewith. The program instructions are executable by a processor to cause the processor to perform a method. A non-limiting example of the method includes training a prediction model to learn a mapping from a plurality of training feature vectors to a plurality of training sets of material properties. Further according to the method, a plurality of component feature vectors are constructed to represent a plurality of component lattices. Together, the plurality of component lattices form a new lattice, and each component feature vector of the plurality of component feature vectors represents a respective component lattice of the plurality of component lattices. Further according to the method, using the prediction model, a respective component set of material properties is predicted for each component feature vector of the plurality of component feature vectors.

[0007] Additional technical features and benefits are realized through the techniques of the present invention. Embodiments and aspects of the invention are described in detail herein and are considered a part of the claimed subject matter. For a better understanding, refer to the detailed description and to the drawings.

BRIEF DESCRIPTION OF THE SEVERAL VIEWS OF THE DRAWINGS

[0008] The specifics of the exclusive rights described herein are particularly pointed out and distinctly claimed in the claims at the conclusion of the specification. The foregoing and other features and advantages of the embodiments of the invention are apparent from the following detailed description taken in conjunction with the accompanying drawings in which:

[0009] FIG. 1 is a block diagram of a prediction system, according to some embodiments of the invention;

[0010] FIG. 2 is a flow diagram of a method for training a classification model to predict material properties of a lattice structure, according to some embodiments of the invention;

[0011] FIG. 3 is a flow diagram of a method for predicting material properties of a lattice structure, according to some embodiments of the invention; and

[0012] FIG. 4 is a block diagram of a computer system for implementing some or all aspects of the prediction system, according to some embodiments of the invention.

[0013] The diagrams depicted herein are illustrative. There can be many variations to the diagram or the operations described therein without departing from the spirit of the invention. For instance, the actions may be performed in a differing order or actions may be added, deleted or modified.

DETAILED DESCRIPTION

[0014] Various embodiments of the invention are described herein with reference to the related drawings. Alternative embodiments of the invention can be devised without departing from the scope of this invention. Various connections and positional relationships (e.g., over, below, adjacent, etc.) are set forth between elements in the following description and in the drawings. These connections or positional relationships, unless otherwise specified, can be direct or indirect, and the present invention is not intended to be limited in this respect. Moreover, the various tasks and process operations described herein can be incorporated into a more comprehensive procedure or process having additional steps or functionality not described in detail herein, or one or more tasks or operations may be optional without departing from the scope of the invention.

[0015] The following definitions and abbreviations are to be used for the interpretation of the claims and the specification. As used herein, the terms “comprises,” “comprising,” “includes,” “including,” “has,” “having,” “contains,” or “containing,” or another variation thereof, are intended to cover a non-exclusive inclusion. For example, a composition, mixture, process, method, article, or apparatus that comprises a list of elements is not necessarily limited to only those elements but can include other elements not expressly listed or inherent to such composition, mixture, process, method, article, or apparatus.

[0016] Additionally, the terms “at least one” and “one or more” may be understood to include a number greater than or equal to one (e.g., one, two, three, four, etc.). The term “a plurality” may be understood to include a number greater than or equal to two (e.g., two, three, four, five, etc.). The terms “about,” “substantially,” or “approximately,” or variations thereof, are intended to include a degree of error associated with measurement of the particular quantity based upon the equipment available.

[0017] For the sake of brevity, conventional techniques related to making and using aspects of the invention may or may not be described in detail herein. In particular, various aspects of computing systems to implement the various technical features described herein may be well known. Accordingly, in the interest of brevity, some conventional implementation details are only mentioned briefly herein or are omitted entirely without providing the well-known system or process details.

[0018] Turning now to an overview of technologies that are more specifically relevant to aspects of the invention, lattice design is a time-intensive and energy-intensive process. For instance, computational tools, such as FEA, can take time to set up, run, and post-process to return mechanical properties of a model of a lattice structure. This makes interactive design difficult because designers must wait for results before fine-tuning the model. Additionally, to obtain desired mechanical properties for the model, design optimization is often performed, which can require multiple iterations of FEA. This can be both computationally expensive and time-consuming. Only after the desired specifications are obtained is a final lattice structure built based on a model.

[0019] One or more embodiments of the invention address the above-described shortcomings of the prior art by providing a fast and approximate mechanism of determining material properties of a lattice. The material properties determined may include, for instance, a combination of mechanical properties, thermal properties, and electrical properties. Specifically, the material properties of each of a set of unit lattice structures may be determined. A neural network or other prediction model may be trained with descriptive features of the unit lattice structures, along with the determined material properties of the unit lattice structures. While designing a new lattice structure, the new lattice structure may be broken down into new unit lattice structures. For each new unit lattice structure, the descriptive features of that new unit lattice structure may be submitted to the prediction model, which may thus output a set of material properties corresponding to the descriptive features, and thus corresponding to the new unit lattice structure. The material properties of the various new unit lattice structures may then be used to determine the material properties of the new lattice structure as a whole.
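The per-unit workflow described above can be sketched in a few lines of Python. Everything here is illustrative: the function names, the dictionary-of-properties representation, and especially the averaging step are assumptions, since the specification leaves open how per-unit predictions are combined into properties for the lattice as a whole.

```python
# Sketch of the workflow in [0019] (names are illustrative, not from the
# specification): decompose a new lattice into unit lattices, feed each
# unit's descriptive features to a trained prediction model, then combine
# the per-unit predictions into properties for the whole lattice.

def predict_lattice_properties(unit_lattices, feature_builder, model):
    """Predict material properties of a lattice from its unit lattices."""
    per_unit = []
    for unit in unit_lattices:
        features = feature_builder(unit)   # descriptive-feature vector
        per_unit.append(model(features))   # fast approximate prediction
    # Placeholder aggregation: average each property over the units.
    # The specification does not prescribe a combination rule.
    keys = per_unit[0].keys()
    return {k: sum(p[k] for p in per_unit) / len(per_unit) for k in keys}
```

With a trained prediction model in place of a toy stand-in, the same loop applies unchanged to any decomposition of a new lattice into unit lattices.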

[0020] FIG. 1 is a block diagram of a prediction system 100 according to some embodiments of the invention. The prediction system 100 may predict, or determine, material properties 115 of a lattice 110, or lattice structure. Various material properties 115 may be predicted, and the set of material properties 115 predicted may vary between embodiments of the invention. Generally, for instance, the material properties 115 predicted may include mechanical properties, thermal properties, or electrical properties, or a combination of these and others. More specifically, for example, the prediction system 100 may predict one or more of the following material properties 115 of a lattice 110: stiffness-to-weight ratio, stiffness-to-density ratio, strength-to-density ratio, vibration damping, shear modulus, Poisson’s ratio, thermal conductivity, energy absorption, compressive stiffness, compressive strength, shear stiffness, shear strength, and fracture toughness.

[0021] As shown, the prediction system 100 may include a prediction model 120 and a feature builder 130. The prediction model 120 may be, for example, a neural network, random forest, decision tree, support vector machine, a classifier, or another model useable for machine learning. As described in more detail below, the prediction model 120 may be trained with training data 140 and may thereby learn a mapping 125. The training data 140 may include a plurality of voxel models 150 and feature vectors 160 representing a plurality of training lattices 110, as well as sets of material properties 115 applicable to the training lattices 110. Each feature vector 160 may include descriptive features of a corresponding lattice 110, and the feature builder 130 may construct each feature vector 160 based on the lattice 110 being represented. After training, the prediction model 120 may map a new feature vector 160, representing a new lattice 110, to a set of material properties 115 of the new lattice 110. More specifically, the prediction model 120 may map a voxel model 150 and feature vector 160 pair to a set of material properties 115. Each of the prediction model 120 and the feature builder 130 may include hardware, software, or a combination of both. For instance, each may be a software module or a specialized hardware circuit, or a combination of both.

[0022] A feature vector 160 may be an ordered set of values of descriptive features, including a respective value in each field of the feature vector 160. Each field may represent a specific descriptive feature, which is a computable property of a lattice 110. Because each feature is a computable property, one of skill in the art will understand how to compute the descriptive features and thus build the feature vector 160 corresponding to a lattice 110 in question. For example, and not by way of limitation, one or more of the following descriptive features may be incorporated into a feature vector 160: surface-area-to-volume ratio, dimensions, density, weight, surface area, and volume.
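Because each descriptive feature is computable from a voxel model, a feature builder can be sketched directly. The example below assumes a boolean numpy occupancy grid and the feature list above; the fixed field ordering, the voxel size, the material-density parameter, and the use of fill fraction as the density field are illustrative assumptions, not requirements of the specification.

```python
import numpy as np

def build_feature_vector(voxels, voxel_size=1.0, material_density=1.0):
    """Compute an ordered feature vector from a boolean voxel model.

    Field order (held fixed across training and prediction):
    [surface-area-to-volume ratio, dim_x, dim_y, dim_z,
     density (fraction of cells filled), weight, surface area, volume]
    """
    filled = voxels.astype(bool)
    volume = filled.sum() * voxel_size ** 3
    # Surface area: count voxel faces whose neighboring cell is empty.
    padded = np.pad(filled, 1, constant_values=False)
    exposed = 0
    for axis in range(3):
        for shift in (1, -1):
            neighbor = np.roll(padded, shift, axis=axis)
            exposed += np.logical_and(padded, ~neighbor).sum()
    surface_area = exposed * voxel_size ** 2
    dims = [s * voxel_size for s in filled.shape]
    fill_fraction = filled.mean()          # stand-in for "density"
    weight = material_density * volume
    sa_to_vol = surface_area / volume if volume else 0.0
    return np.array([sa_to_vol, *dims, fill_fraction,
                     weight, surface_area, volume])
```

For a single filled voxel of unit size, this yields a surface area of 6, a volume of 1, and a surface-area-to-volume ratio of 6, matching the analytic values for a unit cube.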

[0023] The number of dimensions in the feature vectors 160, as well as the features incorporated, need not be as described herein. Rather, the features described above as included in the feature vectors 160 are provided for illustrative purposes only, and it will be understood that the dimensions and descriptive features of the feature vectors 160 may vary from one embodiment to another. In some embodiments of the invention, however, the features incorporated into feature vectors 160 and thus considered by the prediction model 120 may be fixed through training of the prediction model 120 and later prediction of material properties 115. A feature vector 160 may be viewed as a description of the lattice 110 it represents, and the feature vector 160 may be used to determine the material properties 115 of that lattice 110. Specifically, after training, the prediction model 120 may map a feature vector 160 for a lattice 110 to a set of material properties 115 for the lattice 110.

[0024] Some embodiments of the prediction system 100 use machine learning to map feature vectors 160 to material properties 115. The phrase “machine learning” broadly describes a function of electronic systems that learn from data. A machine learning system, engine, or module may be trained to learn functional relationships between inputs and outputs, in this case feature vectors 160 and material properties 115 respectively. In some embodiments of the invention, the prediction model 120 provides machine learning functionality and may be implemented as a neural network, or other classification model, having the capability to be trained to perform this function. In machine learning and cognitive science, neural networks are a family of statistical learning models inspired by the biological neural networks of animals. Neural networks can be embodied as neuromorphic systems of interconnected processor elements, which act as simulated neurons and exchange messages between one another in the form of electronic signals. Similar to the plasticity of synaptic neurotransmitter connections that carry messages between biological neurons, the connections in neural networks that carry electronic messages between simulated neurons may be provided with numeric weights that correspond to the strength or weakness of a given connection. The weights may be adjusted and tuned based on training or other experience, making neural networks adaptive to inputs and capable of learning.
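As a deliberately minimal illustration of such a trainable weighted network, the sketch below implements a one-hidden-layer regression network in numpy with gradient-descent weight updates. The architecture, activation, learning rate, and epoch count are assumptions for illustration only; the specification does not fix any particular network design, and a production model would more likely use an established library.

```python
import numpy as np

class TinyMLP:
    """One-hidden-layer network mapping feature vectors to property vectors.

    A minimal sketch of a weighted, trainable network as described in
    [0024]; not the patented model or a recommended architecture.
    """

    def __init__(self, n_in, n_hidden, n_out, seed=0):
        rng = np.random.default_rng(seed)
        self.W1 = rng.normal(0.0, 0.5, (n_in, n_hidden))
        self.b1 = np.zeros(n_hidden)
        self.W2 = rng.normal(0.0, 0.5, (n_hidden, n_out))
        self.b2 = np.zeros(n_out)

    def forward(self, X):
        self.h = np.tanh(X @ self.W1 + self.b1)  # hidden activations
        return self.h @ self.W2 + self.b2

    def train(self, X, Y, lr=0.1, epochs=5000):
        """Batch gradient descent on mean-squared error."""
        for _ in range(epochs):
            err = self.forward(X) - Y            # MSE gradient w.r.t. output
            gW2 = self.h.T @ err / len(X)
            gb2 = err.mean(axis=0)
            dh = (err @ self.W2.T) * (1.0 - self.h ** 2)  # tanh derivative
            gW1 = X.T @ dh / len(X)
            gb1 = dh.mean(axis=0)
            self.W2 -= lr * gW2; self.b2 -= lr * gb2
            self.W1 -= lr * gW1; self.b1 -= lr * gb1
```

Training on feature-vector inputs and property-set targets adjusts the numeric weights exactly as the paragraph describes: connections that reduce prediction error are strengthened over repeated passes through the training data.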

[0025] FIG. 2 is a flow diagram of a method 200 of training the prediction model 120, which may be a neural network, to learn a mapping 125 from voxel models 150 and feature vectors 160 to corresponding sets of material properties 115, according to some embodiments of the invention. Although a single prediction model 120 is described as being used in this disclosure, it will be understood that the prediction model 120 may be a single prediction model 120 or a combination of other classification sub-models, each of which may also be a prediction model 120 itself. For instance, for each material property in the set of material properties 115 desired to be predicted, a respective prediction model 120 may be trained and used to predict that material property. In this case, each classification sub-model may learn to map a voxel model 150 and a feature vector 160 to a respective material property. The prediction model 120 may thus combine outputs from the classification sub-models to map each pair of a voxel model 150 and a feature vector 160 to a set of material properties 115. For illustrative purposes only, however, this disclosure refers to a single prediction model 120 being used.

[0026] As shown in FIG. 2, at block 205, a plurality of voxel models 150 may be received and may be based on a plurality of training lattices 110. In some embodiments of the invention, these voxel models 150 are included in the training data 140. As mentioned above, training data 140 may be used to train the prediction model 120. The training data 140 may include the plurality of voxel models 150 representing the plurality of training lattices 110 and, for each voxel model 150 representing a respective training lattice 110, a corresponding feature vector 160 also representing the respective training lattice 110. Thus, the training data 140 may be based on the training lattices 110.
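The pairing of training inputs and outputs described here can be sketched as follows. The function names and the callable used to obtain ground-truth properties (for example, an FEA run) are illustrative assumptions, not elements of the specification.

```python
# Assemble training data as described above: each training lattice's voxel
# model contributes a (voxel model, feature vector) pair as input and a
# measured set of material properties (e.g., from FEA) as the target.

def build_training_data(voxel_models, feature_builder, measure_properties):
    """Return (input pairs, target property sets) for model training."""
    pairs, targets = [], []
    for voxels in voxel_models:
        features = feature_builder(voxels)
        pairs.append((voxels, features))            # training pair
        targets.append(measure_properties(voxels))  # ground truth, e.g. FEA
    return pairs, targets
```

The resulting pairs and targets correspond to the training data 140: inputs drawn from the voxel models 150 and feature vectors 160, and outputs drawn from the determined sets of material properties 115.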

[0027] The training lattices 110 may be unit lattices 110, for example. A unit lattice 110 is a representative structure of connected beams that can be repeated in a pattern to create a lattice 110. For example, and not by way of limitation, the training lattices 110 may include unit lattices 110 in one or more of the following shapes: a simple cube without a diagonal, a cube with a diagonal, a honeycomb, a hexagon, and a pyramid. Further, the training lattices 110 may include a variety of materials, such that not all training lattices 110 are composed of the same material. For instance, for each shape in the plurality of training lattices 110, that shape may be represented by one or more unit lattices 110, each in a different material. A benefit of using unit lattices 110 is that, for a unit lattice 110, the voxel model 150 may be relatively easy to create, and its descriptive features may be relatively easy to compute. In some embodiments of the invention, each voxel model 150 may be generated manually according to techniques known in the art, for example, with the assistance of a computer-based tool. One of skill in the art will understand how to perform this task to generate a plurality of voxel models 150 based on the plurality of training lattices 110.
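As one possible (assumed) approach, a voxel model 150 of a unit lattice 110 of connected beams can be rasterized by marking each voxel whose center lies within the beam radius of some beam axis. The grid resolution and beam radius below are arbitrary, and the "simple cube without a diagonal" shape is taken from the examples above:

```python
import numpy as np

def point_segment_distance(p, a, b):
    """Shortest distance from point p to segment ab (all 3-vectors)."""
    ab = b - a
    t = np.clip(np.dot(p - a, ab) / np.dot(ab, ab), 0.0, 1.0)
    return np.linalg.norm(p - (a + t * ab))

def voxelize_unit_lattice(beams, radius, n):
    """Rasterize beam segments (pairs of endpoints in the unit cube) onto an
    n x n x n boolean grid: a voxel is solid if its center lies within
    `radius` of any beam axis. A simplified sketch of voxel-model creation."""
    grid = np.zeros((n, n, n), dtype=bool)
    centers = (np.arange(n) + 0.5) / n
    for i, x in enumerate(centers):
        for j, y in enumerate(centers):
            for k, z in enumerate(centers):
                p = np.array([x, y, z])
                if any(point_segment_distance(p, np.array(a), np.array(b)) <= radius
                       for a, b in beams):
                    grid[i, j, k] = True
    return grid

# A "simple cube without a diagonal": the 12 edges of the unit cube.
corners = [(x, y, z) for x in (0, 1) for y in (0, 1) for z in (0, 1)]
edges = [(a, b) for a in corners for b in corners
         if sum(abs(u - v) for u, v in zip(a, b)) == 1 and a < b]
voxels = voxelize_unit_lattice(edges, radius=0.1, n=16)
print(voxels.sum() / voxels.size)  # solid volume fraction of the unit cell
```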

[0028] At block 210, a plurality of feature vectors 160 may be generated and included in the training data 140, where the feature vectors describe the plurality of training lattices 110. Specifically, this may be performed by the feature builder 130. Given each voxel model 150 representing a respective training lattice 110, the prediction system 100 may generate a corresponding feature vector 160, which may represent the respective training lattice 110 as well. To this end, the prediction system 100 may determine a set of descriptive features describing the voxel model 150. The prediction system 100 may arrange the descriptive features into a feature vector 160. In this manner, a feature vector 160 may be generated for each training lattice 110, and the plurality of feature vectors 160 representing the plurality of training lattices 110 may be included in the training data 140.
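A simplified sketch of what the feature builder 130 might do: compute a few descriptive features of a voxel model 150 and arrange them, in a fixed order, into a feature vector 160. The particular features chosen here (volume fraction, surface-voxel fraction, centroid) are illustrative assumptions, not the features prescribed by the method:

```python
import numpy as np

def build_feature_vector(voxels):
    """Arrange simple descriptive features of a voxel model into a
    fixed-order feature vector (an illustrative feature set)."""
    solid = voxels.astype(bool)
    volume_fraction = solid.mean()

    # A solid voxel is "interior" if all 6 face neighbors are also solid;
    # padding with empty voxels makes the outer boundary count as surface.
    padded = np.pad(solid, 1, constant_values=False)
    interior = padded[1:-1, 1:-1, 1:-1].copy()
    for axis in range(3):
        for shift in (1, -1):
            interior &= np.roll(padded, shift, axis=axis)[1:-1, 1:-1, 1:-1]
    surface_fraction = (solid & ~interior).sum() / max(solid.sum(), 1)

    centroid = np.argwhere(solid).mean(axis=0) / np.array(solid.shape)
    return np.concatenate([[volume_fraction, surface_fraction], centroid])

grid = np.zeros((8, 8, 8), dtype=bool)
grid[2:6, 2:6, 2:6] = True            # a 4x4x4 solid block as a toy "lattice"
fv = build_feature_vector(grid)
print(fv.shape)  # (5,)
```

Because every voxel model yields a vector of the same length and ordering, such vectors can be compared and used as uniform inputs for training.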

[0029] At block 215, a set of material properties 115 of each training lattice 110 may be determined and included in the training data 140. This may be performed in accordance with conventional methods, such as through the use of FEA. Because the training lattices 110 may be unit lattices 110, the task of determining the material properties 115 of the training lattices 110 may be less time consuming than would be the case with more complex lattices 110. Further, in some embodiments of the invention, training the prediction model 120 is an initiation task and need not be performed more than once. This training may be performed again to fine-tune or change the mapping 125 of the prediction model 120 if the training data 140 has been updated (e.g., to include an additional unit lattice 110 in the training lattices 110). However, the potential burden of determining material properties 115 for each training lattice 110 need not be undertaken repeatedly after the prediction model 120 is trained.

[0030] It will be understood that, although the above describes that the plurality of voxel models 150 are received at block 205 and the plurality of feature vectors 160 are generated at block 210, these operations need not be performed as distinct activities where the plurality of voxel models 150 are generated entirely before the plurality of feature vectors 160 are generated. Rather, for example, each training lattice 110 may be handled in turn, with the respective voxel model 150 being received and the respective feature vector 160 being generated. In this case, the various voxel models 150 may be received and feature vectors 160 generated for the various training lattices 110 in sequence or in parallel, or a combination of both. Further, it will be understood that determining the material properties 115, which is performed at block 215, may be performed before, after, or in parallel with the generation of voxel models 150 and feature vectors 160.

[0031] At block 220, the prediction system 100 may train the prediction model 120 with the training data 140, by way of machine learning. Specifically, for each training lattice 110, the corresponding feature vector 160 and set of material properties 115 may be submitted to the prediction model 120 for training. With the submission, the feature vector 160 may be labeled as input, while the set of material properties 115 may be labeled as output. In some embodiments of the invention, the plurality of voxel models 150 may be included in the training data 140 as well and may be labeled as input when submitted to the prediction model 120.
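The labeling of feature vectors 160 as input and material properties 115 as output can be illustrated with a toy regression. The linear least-squares fit below stands in for whatever trainable model is actually used, and the numeric values are illustrative toy values only:

```python
import numpy as np

# Toy training data: each row pairs a feature vector (labeled as input)
# with its set of material properties (labeled as output).
features = np.array([[0.10, 1.0],
                     [0.20, 1.0],
                     [0.30, 1.0],
                     [0.40, 1.0]])          # e.g. volume fraction + bias term
properties = np.array([[20.0, 0.30],
                       [40.0, 0.30],
                       [60.0, 0.30],
                       [80.0, 0.30]])       # e.g. modulus, Poisson ratio

# "Training": solve for weights mapping inputs to outputs.
weights, *_ = np.linalg.lstsq(features, properties, rcond=None)

def predict(feature_vector):
    return feature_vector @ weights

print(np.round(predict(np.array([0.25, 1.0])), 3))  # ~[50. 0.3]
```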

[0032] At block 225, as a result of training with the training data 140, the prediction model 120 establishes a mapping 125 from feature vectors 160 to sets of material properties 115. More specifically, when the voxel models 150 are included in the training, the mapping 125 may be from voxel model 150 and feature vector 160 pairs to sets of material properties 115. In other words, the mapping 125 may map a voxel model 150 and its corresponding feature vector 160 to a set of material properties 115. When the voxel model 150 and feature vector 160 represent a lattice 110, the set of material properties 115 output by the prediction model 120 according to the mapping 125 may be deemed a prediction of the material properties 115 of the lattice 110.

[0033] As needed, the prediction model 120 may learn from new training data 140 provided to it. Thus, at block 230, feedback may be received to be used as new or updated training data 140 for the prediction model 120. Generally, such feedback may include the addition of another triplet of a voxel model 150, feature vector 160, and set of material properties 115, where this new triplet represents a new training lattice 110, or the modification of an existing triplet included in the training data 140. Upon receipt of this feedback, the prediction model 120 may be updated at block 220, through retraining or fine-tuning.

[0034] FIG. 3 is a flow diagram of a method 300 for predicting material properties 115 of a lattice 110, according to some embodiments of this invention. This method 300 may use the prediction model 120 that was previously trained.

[0035] At block 305, component lattices within a new lattice 110 may be identified. In other words, the new lattice 110 may be viewed as a composition of various component lattices 110, which are themselves lattices 110, and these various component lattices 110 may be identified. For example, and not by way of limitation, each component lattice 110 may be a unit lattice 110, such that the new lattice is a combination of unit lattices 110. More specifically, a component lattice may be a new unit lattice 110 that was not represented in the training data 140. A prediction of material properties 115 may be desired for the new lattice 110. For instance, the new lattice 110 may be not yet finalized, such that a prediction of the material properties 115 of the new lattice 110 is desired. After the material properties 115 are predicted, the new lattice 110 may be finalized, or those material properties 115 may be used as a reference in modifying the design of the new lattice 110. Due to the relatively fast nature of the prediction system 100, the prediction of material properties 115 of the new lattice 110 can be used during interactive design of the new lattice 110.

[0036] At block 310, a plurality of component voxel models 150 may be generated, with each component voxel model 150 representing a respective component lattice 110 of the new lattice 110. Thus, taken together, the plurality of voxel models 150 may represent the new lattice 110. One of skill in the art will understand how to generate a component voxel model 150 from a component lattice 110. This may be performed manually, for example, through interaction with a computer-based tool. Alternatively, however, this may be performed automatically by the prediction system 100 or by another automated process.

[0037] At block 315, the prediction system 100 may then receive the plurality of component voxel models 150. Each component voxel model 150 and respective component lattice 110 may be considered in turn in the following iterative loop. However, it will be understood that the unit lattices 110 making up the new lattice 110 may be considered in sequence or in parallel, or a combination of both.

[0038] At block 320, at the beginning of the loop, a component voxel model 150 not yet considered may be selected from the plurality of component voxel models 150 that together represent the new lattice 110. At block 325, a feature vector 160 may be generated based on the component voxel model 150. Thus, the feature vector 160 may describe the component lattice 110 represented by the component voxel model 150. The generation of the feature vector 160 may be performed in the same manner or in a similar manner to doing so during training of the prediction model 120, as discussed with respect to block 210. At block 330, the feature vector 160 may be submitted to the prediction model 120 for a prediction. If the prediction model 120 learned to map pairs of voxel models 150 and feature vectors 160, then the voxel model 150 may be submitted to the prediction model 120 along with the feature vector 160. At block 335, the prediction model 120 may predict a set of material properties 115 of the respective component lattice 110, based on the feature vector 160 or based on the feature vector 160 combined with the voxel model 150. At decision block 340, it may be determined whether an additional component voxel model 150 of the new lattice 110 has yet to be selected for consideration. If such a component voxel model 150 remains, then the method 300 may return to block 320. Otherwise, all component lattices 110 of the new lattice 110 have been considered, and the method 300 may proceed to block 345.
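Blocks 320 through 340 can be sketched as a loop over the component voxel models 150. The two helper callables below, standing in for the feature builder 130 and the trained prediction model 120, are hypothetical:

```python
import numpy as np

def predict_component_properties(component_voxel_models, build_features,
                                 prediction_model):
    """Blocks 320-340 as a loop: select each component voxel model, build
    its feature vector, and submit the vector to the prediction model."""
    results = []
    for voxels in component_voxel_models:        # block 320: select next model
        fv = build_features(voxels)              # block 325: feature vector
        props = prediction_model(fv)             # blocks 330-335: predict
        results.append(props)
    return results                               # block 340: list exhausted

# Illustrative stand-ins: volume fraction as the only feature, and a
# linear "model" scaling a nominal modulus by that fraction.
components = [np.random.default_rng(i).random((4, 4, 4)) > 0.5
              for i in range(3)]
props = predict_component_properties(
    components,
    build_features=lambda v: np.array([v.mean()]),
    prediction_model=lambda fv: {"elastic_modulus": 200.0 * fv[0]},
)
print(len(props))  # one property set per component lattice
```

The loop body has no cross-iteration dependencies, which is why, as noted above, the components may equally be processed in parallel.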

[0039] At block 345, the prediction system 100 may determine a set of material properties 115 of the new lattice 110 as a whole, based on the material properties 115 of the various component lattices 110 making up the new lattice 110. For example, and not by way of limitation, determining the material properties 115 of the new lattice 110, based on the material properties 115 of the component lattices, may be performed through the use of a physics simulation, such as FEA.
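For illustration only, the aggregation at block 345 is sketched below using a simplified one-dimensional series-spring homogenization in place of a full physics simulation such as FEA; it shows the shape of the step (component properties in, whole-lattice property out), not the disclosed simulation:

```python
# Simplified stand-in for the physics simulation: components of equal
# length stacked along one axis behave like springs in series, so the
# effective compliance is the mean of the component compliances:
#     1 / E_eff = (1 / n) * sum(1 / E_i)

def series_effective_modulus(component_moduli):
    """Effective modulus of n equal-length components loaded in series."""
    n = len(component_moduli)
    return n / sum(1.0 / e for e in component_moduli)

print(series_effective_modulus([100.0, 100.0, 100.0]))  # 100.0
print(series_effective_modulus([50.0, 200.0]))          # 80.0
```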

[0040] FIG. 4 is a block diagram of a computer system 400 for implementing some or all aspects of the system, according to some embodiments of this invention. The prediction systems 100 and methods described herein may be implemented in hardware, software (e.g., firmware), or a combination thereof. In some embodiments, the methods described may be implemented, at least in part, in hardware and may be part of the microprocessor of a special or general-purpose computer system 400, such as a personal computer, workstation, minicomputer, or mainframe computer.

[0041] In some embodiments, as shown in FIG. 4, the computer system 400 includes a processor 405, memory 410 coupled to a memory controller 415, and one or more input devices 445 and/or output devices 440, such as peripherals, that are communicatively coupled via a local I/O controller 435. These devices 440 and 445 may include, for example, a printer, a scanner, a microphone, and the like. Input devices such as a conventional keyboard 450 and mouse 455 may be coupled to the I/O controller 435. The I/O controller 435 may be, for example, one or more buses or other wired or wireless connections, as are known in the art. The I/O controller 435 may have additional elements, which are omitted for simplicity, such as controllers, buffers (caches), drivers, repeaters, and receivers, to enable communications.

[0042] The I/O devices 440, 445 may further include devices that communicate both inputs and outputs, for instance disk and tape storage, a network interface card (NIC) or modulator/demodulator (for accessing other files, devices, systems, or a network), a radio frequency (RF) or other transceiver, a telephonic interface, a bridge, a router, and the like.

[0043] The processor 405 is a hardware device for executing hardware instructions or software, particularly those stored in memory 410. The processor 405 may be a custom-made or commercially available processor, a central processing unit (CPU), an auxiliary processor among several processors associated with the computer system 400, a semiconductor-based microprocessor (in the form of a microchip or chip set), a macroprocessor, or other device for executing instructions. The processor 405 includes a cache 470, which may include, but is not limited to, an instruction cache to speed up executable instruction fetch, a data cache to speed up data fetch and store, and a translation lookaside buffer (TLB) used to speed up virtual-to-physical address translation for both executable instructions and data. The cache 470 may be organized as a hierarchy of multiple cache levels (L1, L2, etc.).

[0044] The memory 410 may include one or a combination of volatile memory elements (e.g., random access memory, RAM, such as DRAM, SRAM, SDRAM, etc.) and nonvolatile memory elements (e.g., ROM, erasable programmable read-only memory (EPROM), electronically erasable programmable read-only memory (EEPROM), programmable read-only memory (PROM), tape, compact disc read-only memory (CD-ROM), disk, diskette, cartridge, cassette, or the like). Moreover, the memory 410 may incorporate electronic, magnetic, optical, or other types of storage media. Note that the memory 410 may have a distributed architecture, where various components are situated remote from one another but may be accessed by the processor 405.

[0045] The instructions in memory 410 may include one or more separate programs, each of which comprises an ordered listing of executable instructions for implementing logical functions. In the example of FIG. 4, the instructions in the memory 410 include a suitable operating system (OS) 411. The operating system 411 essentially may control the execution of other computer programs and provides scheduling, input-output control, file and data management, memory management, and communication control and related services.

[0046] Additional data, including, for example, instructions for the processor 405 or other retrievable information, may be stored in storage 420, which may be a storage device such as a hard disk drive or solid-state drive. The stored instructions in memory 410 or in storage 420 may include those enabling the processor to execute one or more aspects of the prediction systems 100 and methods of this disclosure.

[0047] The computer system 400 may further include a display controller 425 coupled to a display 430. In some embodiments, the computer system 400 may further include a network interface 460 for coupling to a network 465. The network 465 may be an IP-based network for communication between the computer system 400 and an external server, client, and the like via a broadband connection. The network 465 transmits and receives data between the computer system 400 and external systems. In some embodiments, the network 465 may be a managed IP network administered by a service provider. The network 465 may be implemented in a wireless fashion, e.g., using wireless protocols and technologies, such as WiFi, WiMax, etc. The network 465 may also be a packet-switched network such as a local area network, wide area network, metropolitan area network, the Internet, or other similar type of network environment. The network 465 may be a fixed wireless network, a wireless local area network (LAN), a wireless wide area network (WAN), a personal area network (PAN), a virtual private network (VPN), intranet, or other suitable network system and may include equipment for receiving and transmitting signals.

[0048] Prediction systems 100 and methods according to this disclosure may be embodied, in whole or in part, in computer program products or in computer systems 400, such as that illustrated in FIG. 4.

[0049] The description of the present invention has been presented for the purpose of illustration. This description is not intended to be exhaustive or to limit the invention to the forms disclosed. Many modifications and variations will be apparent to those of ordinary skill in the art without departing from the scope and spirit of the invention. The embodiments of the invention discussed herein were chosen and described in order to best explain the principles of the invention and the practical applications, and to enable others of ordinary skill in the art to understand the invention. While certain embodiments of the invention have been described, it will be understood that those skilled in the art, both now and in the future, may make various improvements and enhancements that fall within the scope of the claims that follow.