Title:
THERMOCHROMIC DYE TEMPERATURES
Document Type and Number:
WIPO Patent Application WO/2023/043433
Kind Code:
A1
Abstract:
Examples of methods are described. In some examples, a method may include dispensing a thermochromic dye. In some examples, the thermochromic dye may be dispensed on a fusing layer of a build volume. In some examples, the method may include measuring a temperature of the fusing layer indicated by an optical image of the thermochromic dye.

Inventors:
WRIGHT JACOB TYLER (US)
LEYVA MENDIVIL MARIA FABIOLA (MX)
KOTHARI SUNIL (US)
CHEN LEI (CN)
WYCOFF KYLE DOUGLAS (US)
CATANA SALAZAR JUAN CARLOS (MX)
ZENG JUN (US)
Application Number:
PCT/US2021/050310
Publication Date:
March 23, 2023
Filing Date:
September 14, 2021
Assignee:
HEWLETT PACKARD DEVELOPMENT CO (US)
International Classes:
B22F12/90; B33Y10/00; B33Y50/02; C09D11/03; G06T1/40; G06T15/08; G06T19/20
Domestic Patent References:
WO2020153949A12020-07-30
WO2018199955A12018-11-01
WO2016168488A12016-10-20
Foreign References:
US20200316975A12020-10-08
Attorney, Agent or Firm:
MCFARLAND, Elena K. et al. (US)
Claims:
CLAIMS

1. A method, comprising: dispensing a thermochromic dye on a fusing layer of a build volume; and measuring a temperature of the fusing layer indicated by an optical image of the thermochromic dye.

2. The method of claim 1, further comprising calibrating a thermal prediction procedure of three-dimensional (3D) additive manufacturing based on the temperature.

3. The method of claim 2, wherein calibrating the thermal prediction procedure comprises: predicting, using a first machine learning model, a thermal image based on geometrical data; and determining, from the thermal image, a thermal image region that satisfies a criterion, wherein dispensing the thermochromic dye on the fusing layer comprises dispensing the thermochromic dye in a region of the build volume corresponding to the thermal image region.

4. The method of claim 3, wherein calibrating the thermal prediction procedure further comprises training a second machine learning model based on the thermal image region and the temperature.

5. The method of claim 4, further comprising: predicting, using the first machine learning model, a second thermal image based on second geometrical data; and predicting, using the second machine learning model, a second temperature based on the second thermal image.

6. The method of claim 1, further comprising calibrating optical image processing based on color swatches associated with temperatures.

7. The method of claim 6, wherein calibrating the optical image processing is performed before printing.

8. The method of claim 6, wherein calibrating the optical image processing is performed after printing.

9. The method of claim 1, further comprising: capturing a thermal image of the fusing layer when a carriage has cleared a view; and determining a fusing layer cooling rate based on the temperature and the thermal image.

10. An apparatus, comprising: a memory; and a processor coupled to the memory, wherein the processor is to: determine a temperature of a fusing layer based on an optical image of thermochromic dye of the fusing layer; and train a calibration machine learning model based on the temperature, wherein the calibration machine learning model is to predict a calibrated thermal image based on a thermal image.

11. The apparatus of claim 10, wherein the thermal image is predicted by a prediction machine learning model.

12. The apparatus of claim 10, wherein the thermochromic dye is ejected from a first print head that is separate from a second print head to eject agent.

13. A non-transitory tangible computer-readable medium comprising instructions that, when executed, cause a processor of an electronic device to: calibrate optical image processing based on a first optical image of color swatches captured by a camera; and measure a temperature of a fusing layer indicated by a second optical image of thermochromic dye captured by the camera based on the calibrated optical image processing.

14. The non-transitory tangible computer-readable medium of claim 13, further comprising instructions that, when executed, cause the processor to calibrate a thermal prediction procedure based on the temperature.

15. The non-transitory tangible computer-readable medium of claim 14, further comprising instructions that, when executed, cause the processor to predict, using the calibrated thermal prediction procedure, a calibrated thermal image based on geometrical data.

Description:
THERMOCHROMIC DYE TEMPERATURES

BACKGROUND

[0001] Three-dimensional (3D) solid parts may be produced from a digital model using additive manufacturing. Additive manufacturing may be used in rapid prototyping, mold generation, mold master generation, and short-run manufacturing. Additive manufacturing involves the application of successive layers of build material. In some additive manufacturing techniques, the build material may be cured or fused.

BRIEF DESCRIPTION OF THE DRAWINGS

[0002] Figure 1 is a flow diagram illustrating an example of a method for measuring temperature based on thermochromic dye;

[0003] Figure 2A is a block diagram illustrating an example of engines that may be utilized in accordance with some examples of the techniques described herein;

[0004] Figure 2B is a block diagram illustrating an example of engines that may be utilized in accordance with some examples of the techniques described herein;

[0005] Figure 3 is a block diagram of an example of an apparatus that may be used in measuring thermochromic dye temperatures;

[0006] Figure 4 is a block diagram illustrating an example of a computer-readable medium 448 for temperature measurement;

[0007] Figure 5 is a diagram illustrating an example of optical image processing calibration; and

[0008] Figure 6 is a diagram illustrating an example of concurrent thermal image and optical image capture.

DETAILED DESCRIPTION

[0009] Additive manufacturing may be used to manufacture three-dimensional (3D) objects. 3D printing is an example of additive manufacturing. Some examples of 3D printing may selectively deposit an agent or agents (e.g., droplets) at a pixel level to enable control over voxel-level energy deposition. For instance, thermal energy may be projected over material in a build area, where a phase change (for example, melting and solidification) in the material may occur depending on the voxels where the agents are deposited.

[0010] A voxel is a representation of a location in a 3D space. For example, a voxel may represent a volume or component of a 3D space. For instance, a voxel may represent a volume that is a subset of the 3D space. In some examples, voxels may be arranged on a 3D grid. For instance, a voxel may be rectangular or cubic in shape. In some examples, voxels may be arranged along axes. An example of three-dimensional (3D) axes includes an x dimension, a y dimension, and a z dimension. In some examples, a quantity in the x dimension may be referred to as a width, a quantity in the y dimension may be referred to as a length, and/or a quantity in the z dimension may be referred to as a height. The x and/or y axes may be referred to as horizontal axes, and the z axis may be referred to as a vertical axis. Other orientations of the 3D axes may be utilized in some examples, and/or other definitions of 3D axes may be utilized in some examples.

[0011] Examples of a voxel size dimension may include 25.4 millimeters (mm)/150 ≈ 170 microns for 150 dots per inch (dpi), 490 microns for 50 dpi, 2 mm, etc. The term “voxel level” and variations thereof may refer to a resolution, scale, or density corresponding to voxel size. In some examples, the term “voxel” and variations thereof may refer to a “thermal voxel.” In some examples, the size of a thermal voxel may be defined as a minimum that is thermally meaningful (e.g., greater than or equal to 42 microns or 600 dpi). A set of voxels may be utilized to represent a build volume.
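As a quick check of these figures, the voxel edge length follows from dividing 25.4 mm per inch by the resolution in dpi. A minimal sketch (the function name is illustrative, not from the application):

```python
# Voxel size from print resolution: 25.4 mm per inch divided by dots per inch.
def voxel_size_microns(dpi: float) -> float:
    """Return the voxel edge length in microns for a given dpi."""
    return 25.4 / dpi * 1000.0  # 25.4 mm/inch, converted to microns

print(voxel_size_microns(150))  # ~169.3 microns, the "~170 microns" figure above
print(voxel_size_microns(50))   # 508 microns, roughly the 490-micron example
print(voxel_size_microns(600))  # ~42.3 microns, the thermally meaningful minimum
```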

[0012] A build volume is a volume in which an object or objects may be manufactured. A “build” may refer to an instance of 3D manufacturing. A layer is a portion of a build volume. For example, a layer may be a cross section (e.g., two-dimensional (2D) cross section) or 3D portion (e.g., rectangular prism) of a build volume. In some examples, a layer may refer to a horizontal portion (e.g., plane) of a build volume. In some examples, an “object” may refer to an area and/or volume in a layer and/or build volume indicated for forming a physical object.

[0013] In some examples of 3D manufacturing (e.g., Multi Jet Fusion (MJF)), each voxel in the build volume may undergo a thermal procedure (approximately 15 hours of build time (e.g., time for layer-by-layer printing) and approximately 35 hours of additional cooling). The thermal procedure of voxels that include an object may affect the manufacturing quality (e.g., functional quality) of the object.

[0014] Factors affecting production yield in 3D printing may include packing density, powder recyclability, and manufacturing accuracy (e.g., degree of an object defect(s)). Tighter packing may allow printing more objects in a single build. However, if the objects are packed too closely, the objects and/or powder may overheat, which may result in a reduction in reusable powder quality and increased object defects. Accordingly, objects may be packed to keep the build volume within a range of temperatures to preserve powder quality and/or reduce object defects.

[0015] Some thermal information (e.g., build volume wall temperatures alone, thermal sensor data alone, etc.) may provide a limited observation of temperatures occurring in a build volume. It may be difficult to directly observe peak temperature in the build volume, which may indicate which powder will undergo higher thermal stress. For example, a voxel may experience the voxel’s peak temperature when fuse lamps are over the voxel. This peak temperature may be difficult to observe due to occlusion by a fusing lamp carriage.

[0016] The thermal journey of each powder voxel may provide data to calculate the thermal stress each build exerts on the powder, and to predict potential temperature-related defects in objects. Each voxel may have an individual thermal history. It may be difficult to physically sense the temperature of every voxel during the thermal procedure. In some approaches, peak temperature may drive the cooling history of the voxels (given thermal diffusion and wall conditions, for instance). Accurate peak temperature data may enable calibrating fusing conditions for thermal prediction procedures, may increase accuracy in powder degradation prediction, and/or may enable the prediction of defects such as clogged holes, thermal bleed, and/or hot-spot thermal bleed at a top (e.g., outer) surface.

[0017] Some examples of the techniques described herein may provide approaches for in-situ sensing of peak temperature for powder voxels using thermochromic dyes. In some examples, peak temperature data may be utilized to validate and/or increase the accuracy of voxel-level thermal simulation and/or prediction for different builds and/or printers. For instance, some of the techniques described herein may help to determine a peak temperature that each voxel in a build volume will experience as a result of the geometry being printed and the thermal signature of the printer in question. Some examples of the techniques described herein may provide accurate measurement and/or prediction of the thermal behavior in the build volume during printing. Some examples of the techniques described herein may utilize thermochromic dye to evaluate peak powder temperature for powder (e.g., a portion or portions of the build volume and/or across the full build volume). Some examples of the techniques described herein may utilize thermochromic dye data to calibrate boundary conditions at a fusing layer in a 3D printing simulation and/or prediction.

[0018] It may be useful to provide thermal information at or near print resolution (e.g., 75 dpi) for guiding the placement of an agent or agents (e.g., fusing agent, detailing agent, and/or other thermally relevant fluids). An example of print resolution is 42 microns in x-y dimensions and 80 microns in a z dimension.

[0019] In some examples, thermal information or thermal behavior may be mapped as a thermal image. A thermal image is a set of data indicating temperature(s) (or thermal energy) in an area. A thermal image may be sensed, captured, simulated, and/or predicted.

[0020] While plastics (e.g., polymers) may be utilized as a way to illustrate some of the approaches described herein, some of the techniques described herein may be utilized in various examples of additive manufacturing. For instance, some examples may be utilized for plastics, polymers, semi-crystalline materials, metals, etc. Some additive manufacturing techniques may be powder-based and driven by powder fusion. Some examples of the approaches described herein may be applied to area-based powder bed fusion-based additive manufacturing, such as Stereolithography (SLA), Multi Jet Fusion (MJF), Metal Jet Fusion, Selective Laser Melting (SLM), Selective Laser Sintering (SLS), liquid resin-based printing, etc. Some examples of the approaches described herein may be applied to additive manufacturing where agents carried by droplets are utilized for voxel-level thermal modulation.

[0021] In some examples, “powder” may indicate or correspond to particles. In some examples, an object may indicate or correspond to a location (e.g., area, space, etc.) where particles are to be sintered, melted, and/or solidified. For example, an object may be formed from sintered or melted powder.

[0022] Some examples of the techniques described herein may include machine learning. Machine learning is a technique where a machine learning model is trained to perform a task or tasks based on a set of examples (e.g., data). Training a machine learning model may include determining weights corresponding to structures of the machine learning model. Artificial neural networks are a kind of machine learning model that are structured with nodes, layers, and/or connections. Deep learning is a kind of machine learning that utilizes multiple layers. A deep neural network is a neural network that utilizes deep learning.

[0023] Examples of neural networks include regression networks (e.g., isotonic regression models), convolutional neural networks (CNNs) (e.g., basic CNN, deconvolutional neural network, inception module, residual neural network, etc.) and recurrent neural networks (RNNs) (e.g., basic RNN, multilayer RNN, bi-directional RNN, fused RNN, clockwork RNN, etc.). Different depths of a neural network or neural networks may be utilized in accordance with some examples of the techniques described herein.

[0024] Throughout the drawings, similar reference numbers may designate similar or identical elements. When an element is referred to without a reference number, this may refer to the element generally, with and/or without limitation to any particular drawing or figure. In some examples, the drawings are not to scale and/or the size of some parts may be exaggerated to more clearly illustrate the example shown. Moreover, the drawings provide examples in accordance with the description. However, the description is not limited to the examples provided in the drawings.

[0025] Figure 1 is a flow diagram illustrating an example of a method 100 for measuring temperature based on thermochromic dye. The method 100 and/or an element or elements of the method 100 may be performed by an electronic device. For example, the method 100 may be performed by the apparatus 324 described in relation to Figure 3.

[0026] The apparatus may dispense 102 a thermochromic dye on a fusing layer of a build volume. Thermochromic dye is a substance that changes color based on heat exposure. For instance, the color of thermochromic dye may correspond to a peak temperature reached by the thermochromic dye. Examples of thermochromic dye may include K200-NH or Acid Blue 9. In some examples, dispensing 102 the thermochromic dye may include controlling a printhead(s) and/or sending instructions to a printer to print (e.g., eject, extrude, etc.) the thermochromic dye onto a fusing layer of a build volume. In some examples, a printhead or printheads (e.g., a printhead with a reservoir or “pen”) may be utilized to eject thermochromic dye. In some examples, the thermochromic dye may be ejected from a first printhead that is separate from a second printhead to eject agent (e.g., fusing agent, detailing agent, etc.). The thermochromic dye may be dispensed over a whole fusing layer or over a region (e.g., subset) of the fusing layer.

[0027] A fusing layer is an exposed layer, a top layer, or a layer undergoing fusing of material. For example, a fusing layer may be a top layer of material in a build volume that is exposed to a printhead and/or thermal projector for sintering. A buried layer is a covered layer. For instance, a buried layer may be a layer beneath or under the fusing layer.

[0028] In some examples, a thermochromic, water-based dye such as K200-NH may be ejected through a printhead to any region of the fusing layer (e.g., region(s) where high temperatures may occur). In some examples, a thermochromic dye may have an operating range of 60-200° Celsius (C) and may change color from white to magenta. In some examples, some regions may also include agent (e.g., fusing agent and/or detailing agent), where the thermochromic dye may indicate the approximate temperature. In some examples, a calibration procedure may be performed to compensate for a cooling and/or heating effect the thermochromic dye may have on the powder.

[0029] In some examples, a thermochromic dye such as Acid Blue 9 may be utilized to indicate build volume temperatures. For materials lacking chromophoric degradation pathways, Acid Blue 9 may be ejected to any region of the fusing layer such that the thermal degradation occurring over the course of the build may be captured.

[0030] The apparatus may measure 104 a temperature of the fusing layer indicated by an optical image of the thermochromic dye. For example, the apparatus may capture and/or receive an optical image of the fusing layer. The optical image may be captured by a camera with a view that includes the fusing layer. For instance, the camera may be mounted in a printer above the build volume. In some examples, the build volume (e.g., fusing layer) may be illuminated with a white light source (e.g., white light-emitting diode (LED)). For instance, the camera and/or printer may include a white LED that may illuminate the fusing layer when the optical image is captured. In some examples, the camera may capture a still image or images and/or video frames. In some examples, the optical image may be a still image or video frame. In some examples, the optical image may depict a whole fusing layer or a region of the fusing layer.

[0031] The optical image may indicate a color or colors of the thermochromic dye on the fusing layer. In some examples, the apparatus may measure 104 the temperature of the fusing layer by determining a temperature corresponding to a color in the optical image. For instance, the apparatus may utilize a look-up table or function to map pixel color from the optical image to temperature. For instance, different pixel colors and/or shades may correspond to different temperatures (e.g., peak temperatures) experienced by the thermochromic dye. The apparatus may measure the temperature of the fusing layer by mapping a color in the optical image to a corresponding temperature. In some examples, the apparatus may assign and/or record the temperature or temperatures corresponding to a pixel or sets of pixels (e.g., areas of the optical image with the same color or within a color range).
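To make the color-to-temperature mapping concrete, the following is a minimal sketch of a look-up-table approach. The calibration colors and temperatures are hypothetical placeholders; a real table would come from characterizing the dye.

```python
import numpy as np

# Hypothetical calibration table: RGB color of the thermochromic dye versus
# the peak temperature (degrees C) that produces it.
CALIBRATION_RGB = np.array([
    [255, 255, 255],   # white        -> low peak temperature
    [255, 200, 230],   # pale magenta
    [230, 120, 190],   # magenta
    [180,  40, 140],   # deep magenta -> high peak temperature
], dtype=float)
CALIBRATION_TEMP_C = np.array([60.0, 110.0, 160.0, 200.0])

def measure_temperatures(optical_image: np.ndarray) -> np.ndarray:
    """Map each pixel of an (H, W, 3) RGB image to the temperature of the
    nearest calibration color (nearest neighbor in RGB space)."""
    pixels = optical_image.reshape(-1, 3).astype(float)
    # Squared distance from every pixel to every calibration color.
    dists = ((pixels[:, None, :] - CALIBRATION_RGB[None, :, :]) ** 2).sum(axis=2)
    nearest = dists.argmin(axis=1)
    return CALIBRATION_TEMP_C[nearest].reshape(optical_image.shape[:2])
```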

[0032] In some examples, the apparatus may utilize a spatial mapping between the optical image and voxels of the build volume. For example, the apparatus may apply a transformation or transformations (e.g., unprojection) to the pixels in the optical image to map the color(s) and/or corresponding temperature(s) to the voxels of the fusing layer in the build volume (e.g., locations in 3D space).
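One way to realize such a spatial mapping is a planar homography between image pixels and fusing-layer coordinates. A minimal sketch, assuming a pre-calibrated 3x3 matrix H (e.g., estimated from fiducial marks on the build platform; the estimation step is not shown):

```python
import numpy as np

def apply_homography(H, pixels):
    """Map (N, 2) pixel coordinates to fusing-layer coordinates using a
    3x3 homography H."""
    pixels = np.asarray(pixels, dtype=float)
    homogeneous = np.hstack([pixels, np.ones((pixels.shape[0], 1))])
    mapped = homogeneous @ H.T
    return mapped[:, :2] / mapped[:, 2:3]   # divide out the projective scale

# Example with an identity homography (pixels map to themselves).
coords = apply_homography(np.eye(3), [[10, 20], [30, 40]])
```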

[0033] In some examples, the apparatus may perform optical image processing. Optical image processing is an operation or operations to process optical data (e.g., pixels) of the optical image. Examples of optical image processing may include color compensation, white balance compensation, and/or lens distortion correction, etc. For instance, the apparatus may adjust the optical image to increase color accuracy of the optical image.

[0034] In some examples, the method 100 may include calibrating optical image processing based on color swatches associated with temperatures. A color swatch is an item with a sample color. For instance, color swatches of the same color (and/or corresponding to the same temperature of the thermochromic dye) may be placed at different locations within a camera’s field of view. Due to environmental variations (e.g., lighting, etc.) and/or sensing variations (e.g., lens distortion, etc.), swatches with the same physical color may vary in the optical image (e.g., color may vary in different locations of the optical image) captured by the camera. In some examples, a camera may inaccurately sense the color of a swatch as a slightly different color in the optical image. The apparatus may calibrate (e.g., adjust) the optical image processing to reduce color sensing inaccuracy and/or spatial color variation between swatches of the same color. For instance, a color swatch may have a designated color and/or corresponding temperature. For instance, the apparatus may receive the designated color (e.g., red-green-blue (RGB) value) of a color swatch or swatches from an input device (e.g., keyboard, mouse, touchscreen, etc.). The apparatus may capture and/or receive an optical image that depicts the swatch or swatches. The apparatus may determine color compensation (e.g., a difference or bias) between the designated color and the captured color (and/or compensation between a temperature corresponding to the designated color and a temperature corresponding to the captured color). The apparatus may determine color compensation for spatial variation in color sensing.
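A minimal sketch of the swatch-based compensation step described above, assuming a simple additive per-location color bias (a real pipeline might fit a fuller color transform instead):

```python
import numpy as np

def color_compensation(designated_rgb, captured_rgbs):
    """Estimate a per-location color bias from swatches of one designated color.

    designated_rgb: (3,) the known swatch color.
    captured_rgbs:  (N, 3) the colors the camera sensed for the N swatch
                    locations in the calibration image.
    Returns an (N, 3) additive correction per location.
    """
    designated = np.asarray(designated_rgb, dtype=float)
    captured = np.asarray(captured_rgbs, dtype=float)
    return designated[None, :] - captured  # add this to a captured color

# Example: two swatches of the same magenta sensed slightly differently.
correction = color_compensation([230, 120, 190], [[225, 118, 195],
                                                  [234, 125, 186]])
```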

[0035] In some examples, calibrating the optical image processing may be performed before printing. For instance, the calibration (e.g., compensation) may be performed before printing a build. In some examples, the apparatus may capture and/or receive a calibration image of a color swatch or swatches and compensate the optical image processing before a build is printed, before thermochromic dye is dispensed, and/or before an optical image of a fusing layer with thermochromic dye is captured. For instance, the calibration (e.g., compensation) may be applied to an optical image processing pipeline, such that compensation is automatically applied during processing of an optical image.

[0036] In some examples, calibrating the optical image processing may be performed after printing. For instance, the calibration (e.g., compensation, postprint correction, etc.) may be performed after printing a build. In some examples, the apparatus may capture and/or receive a calibration image of a color swatch or swatches and compensate the optical image processing after a build is printed, after thermochromic dye is dispensed, and/or after an optical image of a fusing layer with thermochromic dye is captured. For instance, the calibration (e.g., compensation) may be applied as compensation to a previously captured optical image and/or temperature. In some examples, the apparatus may apply the determined compensation (e.g., color compensation) to an optical image to increase color accuracy in the image, which may increase temperature measurement accuracy. In some examples, the apparatus may apply the determined temperature compensation to a temperature determined from the optical image to increase temperature measurement accuracy.

[0037] In some examples, optical image processing calibration may be performed for multiple colors and/or color shades. In some examples, optical image processing calibration may be performed in accordance with the example described in relation to Figure 5.

[0038] In some examples, the method 100 may include capturing a thermal image of the fusing layer when a carriage has cleared a view. For instance, a carriage may move over the fusing layer to eject agent(s) and/or thermochromic dye on the fusing layer. The carriage may move between a thermal image sensor (e.g., infrared (IR) sensor) and the fusing layer (and/or between an optical camera and the fusing layer). When the carriage is not obstructing a view of the thermal image sensor, the thermal image sensor may capture the thermal image. When the carriage is not obstructing a view of the camera, the camera may capture the optical image. For example, the apparatus may capture and/or receive color images of the thermochromic dye to measure peak temperature. The color mark of peak temperature may be preserved. In some examples, the apparatus may concurrently capture the thermal image and/or may receive a concurrently captured thermal image with the optical image. As used herein, the term “concurrently” and variations thereof may mean in overlapping time frames and/or approximately at a same time. The thermal image may indicate thermal data (e.g., temperature over the thermal image) and the optical image may indicate peak temperature after the carriage clears the sensors’ views. In some examples, the optical image and/or thermal image may be captured after image processing calibration.

[0039] In some examples, the method 100 may include determining a fusing layer cooling rate based on the temperature (e.g., peak temperature from the optical image) and the thermal image (e.g., thermal image from the image sensor). For instance, the apparatus may analyze the temperature and the thermal image to determine how fast the fusing layer cools down in a period between the application time of the heat source and the time the thermal sensor captures the thermal image. In some examples, optical image capture, thermal image capture, and/or cooling rate determination may be performed in accordance with the example described in relation to Figure 6.
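As a worked example of the cooling-rate computation, assuming the elapsed time between peak heating and thermal capture is known (all numbers here are hypothetical):

```python
def cooling_rate_c_per_s(peak_temp_c: float,
                         thermal_image_temp_c: float,
                         elapsed_s: float) -> float:
    """Average cooling rate between the peak temperature indicated by the
    thermochromic dye and the temperature the thermal sensor later captured."""
    return (peak_temp_c - thermal_image_temp_c) / elapsed_s

# Example: dye indicates a 190 C peak; the IR sensor reads 165 C when the
# carriage clears its view 2.5 seconds later.
rate = cooling_rate_c_per_s(190.0, 165.0, 2.5)  # 10 C/s
```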

[0040] In some examples, the method 100 may include calibrating a thermal prediction procedure of 3D additive manufacturing based on the temperature (e.g., the temperature measured from the thermochromic dye). A thermal prediction procedure is a procedure in which thermal behavior (e.g., a thermal image or images) in a build volume is predicted. For instance, the apparatus may utilize a machine learning model or models to predict the thermal behavior (e.g., a thermal image) of 3D additive manufacturing. The measured temperature may be utilized to calibrate (e.g., compensate, adjust, etc.) the predicted thermal behavior to increase accuracy.

[0041] In some examples, calibrating the thermal prediction procedure may include predicting, using a first machine learning model, a thermal image based on geometrical data. Geometrical data is data indicating a model or models of an object or objects. For example, geometrical data may indicate the placement and/or model of an object or objects in a build volume. A model is a geometrical model of an object or objects. A model may specify shape and/or size of a 3D object or objects. In some examples, a model may be expressed using polygon meshes and/or coordinate points. For example, a model may be defined using a format or formats such as a 3D manufacturing format (3MF) file format, an object (OBJ) file format, computer aided design (CAD) file, and/or a stereolithography (STL) file format, etc. In some examples, the geometrical data indicating a model or models may be received from another device and/or generated. For instance, the apparatus may receive a file or files of geometrical data and/or may generate a file or files of geometrical data. In some examples, the apparatus may generate geometrical data with model(s) created on the apparatus from an input or inputs (e.g., scanned object input, user-specified input, etc.). Examples of geometrical data include model data, shape image(s), slice(s), contone map(s), etc.

[0042] In some examples, the first machine learning model may be trained to determine (e.g., predict, infer, etc.) a thermal image or images corresponding to a layer or layers of geometrical data. For instance, the first machine learning model may be trained using layers (e.g., slices) of geometrical data as input data and captured thermal images and/or simulated thermal images as ground truth data. After training, the first machine learning model may predict or infer the thermal behavior (e.g., thermal image(s)) in a build volume.
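The application does not specify an architecture for the first machine learning model; the following is a minimal PyTorch sketch of a fully convolutional network that maps a binary slice to a per-pixel thermal image, with a pixel-wise loss for training against captured or simulated thermal images. A production model would likely be deeper and consume several neighboring layers of context.

```python
import torch
import torch.nn as nn

class ThermalPredictor(nn.Module):
    """Minimal fully convolutional stand-in for the first machine
    learning model: a binary slice in, a thermal map out."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=5, padding=2), nn.ReLU(),
            nn.Conv2d(16, 16, kernel_size=5, padding=2), nn.ReLU(),
            nn.Conv2d(16, 1, kernel_size=1),  # per-pixel temperature
        )

    def forward(self, slice_batch):   # (B, 1, H, W) binary slices
        return self.net(slice_batch)  # (B, 1, H, W) predicted thermal image

model = ThermalPredictor()
loss_fn = nn.MSELoss()  # pixel-wise difference against ground-truth thermal images
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
```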

[0043] In some examples, the temperature measured from thermochromic dye may be utilized to calibrate (e.g., validate and/or adjust) the thermal prediction procedure to increase the accuracy of the thermal prediction procedure. In some cases of additive manufacturing, each fusing layer may have a different thermal distribution and/or different peak temperature. For instance, due to differences between peak temperature indicated by thermochromic dye and ground truth data (e.g., captured thermal images using a thermal sensor, simulated thermal images, etc.), there may be a disparity between predicted thermal behavior (e.g., a thermal image) and peak temperature measured using thermochromic dye. Some examples of the first machine learning model may utilize a buried layer as ground truth data for predicting fusing layers’ thermal distributions, differentiating between even and odd layers’ hotter temperature regions. Due to rapid cooling and camera occlusion, measurement of the buried layers may include some inaccuracy in representing peak temperature during printing. For example, a measured buried layer (measured by an IR sensor, for instance) may have a different thermal distribution curve compared to the actual peak thermal distribution. In some examples, the predicted thermal behavior may be calibrated to more accurately reflect the actual peak temperature. For instance, the peak temperature measured using thermochromic dye may be utilized to increase the accuracy of predicted thermal behavior.

[0044] In some examples, calibrating the thermal prediction procedure may include determining, from a thermal image (e.g., a predicted thermal image from the first machine learning model), a thermal image region that satisfies a criterion. For instance, the apparatus may utilize the first machine learning model to predict a thermal image of a fusing layer. The apparatus may evaluate and/or segment the thermal image into thermal image regions (e.g., image patches of the same size). The apparatus may determine (e.g., compute) a metric or metrics (e.g., average temperature, mean temperature, median temperature, maximum temperature, etc.) for each of the thermal image regions. An example of a criterion is a greatest temperature, greatest mean temperature, etc., among the thermal image regions. For instance, the thermal image region with the greatest mean temperature may be determined as a peak thermal image region. The determined thermal image region may be utilized as a suggested region for thermochromic dye ejection. In some examples, the suggested region may be adjusted to have a size similar to the size of the thermal image region.
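A minimal sketch of this region search, using non-overlapping patches and the greatest-mean-temperature criterion (the patch size is an assumption):

```python
import numpy as np

def peak_region(thermal_image: np.ndarray, patch: int = 32):
    """Return the (row, col) of the top-left corner of the patch-sized
    region with the greatest mean temperature, and that mean."""
    h, w = thermal_image.shape
    best_rc, best_mean = (0, 0), -np.inf
    for r in range(0, h - patch + 1, patch):      # non-overlapping patches
        for c in range(0, w - patch + 1, patch):
            m = thermal_image[r:r + patch, c:c + patch].mean()
            if m > best_mean:
                best_rc, best_mean = (r, c), m
    return best_rc, best_mean
```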

[0045] In some examples, dispensing 102 the thermochromic dye on the fusing layer may include dispensing the thermochromic dye in a region of the build volume corresponding to the thermal image region. For instance, the apparatus may dispense 102 the thermochromic dye on the fusing layer in a region corresponding to the determined thermal image region (e.g., peak thermal image region). The apparatus may capture and/or receive an optical image using a color camera. The apparatus may measure, based on the color of the thermochromic dye in the optical image, the temperature from a region of the optical image corresponding to the determined thermal image region. The temperature may indicate a peak temperature for the fusing layer. In some examples, the thermal image prediction, thermal image region determination, thermochromic dye dispensing, and/or temperature measurement may be repeated for multiple fusing layers.

[0046] In some examples, calibrating the thermal prediction procedure may include training a second machine learning model based on the thermal image region and the temperature. For example, the second machine learning model may be trained to produce a temperature based on a predicted thermal image. An example of the second machine learning model may include a regression network. For instance, to calibrate the first machine learning model (e.g., thermal image prediction network), a regression network may be trained to map the predicted thermal image (e.g., thermal image region with peak temperature) to the temperature (e.g., peak temperature) measured based on the optical image. During training, the input to the second machine learning model may be a thermal image region (with predicted peak temperature, for instance) from a predicted thermal image, and the ground truth may be the temperature (e.g., peak temperature) measured from a corresponding region of an optical image. In some examples, the input and the ground truth may be normalized.

[0047] The apparatus may utilize a training objective function to train the second machine learning model. In some examples, a training objective function may be utilized that reduces (e.g., minimizes) a pixel-wise difference. For instance, the training objective function may reduce (e.g., minimize) a pixel-wise temperature difference between the temperatures of the thermal image region and the temperatures measured from a corresponding region of the optical image. In some examples, the apparatus may train the second machine learning model with an isotonic regression model, where the temperature distributions of the input and ground truth may be divided into temperature range blocks with probabilities of pixels falling into a temperature range. For instance, the isotonic regression model may be utilized as a training objective function to reduce (e.g., minimize) pair-wise differences for each temperature range’s probability value.
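The following is a simplified isotonic-regression sketch using scikit-learn. It fits a monotone map from predicted pixel temperatures to measured ones rather than the temperature-range probability blocks described above, so it is a stand-in for the idea, not the exact objective; the temperature arrays are hypothetical.

```python
import numpy as np
from sklearn.isotonic import IsotonicRegression

# Flattened corresponding regions: predicted temperatures from the first
# model and temperatures measured from the dye (hypothetical values, in C).
predicted = np.array([150.0, 155.0, 160.0, 168.0, 175.0])
measured = np.array([158.0, 161.0, 170.0, 181.0, 190.0])

# Isotonic regression fits a monotone mapping from predicted to measured
# temperature, which preserves the ordering of hot and cold pixels.
calibrator = IsotonicRegression(out_of_bounds="clip")
calibrator.fit(predicted, measured)

calibrated = calibrator.predict(np.array([152.0, 172.0]))
```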

[0048] In some examples, the method 100 may include predicting, using the first machine learning model, a second thermal image based on second geometrical data. For instance, the apparatus may predict, using the first machine learning model, a second thermal image based on geometrical data after training.

[0049] In some examples, the method 100 may include predicting, using the second machine learning model, a second temperature based on the second thermal image. For instance, after training, the second machine learning model may be utilized to predict a temperature based on the second thermal image. For instance, after the second machine learning model (e.g., regression network) is trained, at an inference stage, the first machine learning model may predict a time series of thermal images of fusing layers, where each fusing layer’s thermal distribution is close to a buried layer’s thermal distribution. The time series of thermal images of fusing layers may be provided to the second machine learning model (e.g., regression network), which may adjust each fusing layer’s temperature distribution to a more accurate temperature distribution (where the peak temperature approximates a peak temperature that would be measured using an optical image depicting thermochromic dye, for instance). After training the second machine learning model, in some examples, the first machine learning model and the second machine learning model may be utilized to predict temperature information from geometrical data without measuring temperature using thermochromic dye.
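Putting the two stages together at an inference stage, reusing the hypothetical ThermalPredictor and isotonic calibrator from the sketches above:

```python
import torch

# Reuses `model` (ThermalPredictor) and `calibrator` (IsotonicRegression)
# from the earlier sketches; `slice_batch` is a hypothetical batch of slices.
slice_batch = torch.rand(1, 1, 64, 64).round()   # (B, 1, H, W) binary slices

model.eval()
with torch.no_grad():
    predicted = model(slice_batch)               # predicted thermal image(s)

# Adjust each pixel toward the dye-measured temperature scale.
flat = predicted.numpy().ravel()
calibrated = calibrator.predict(flat).reshape(predicted.shape)
```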

[0050] Figure 2A is a block diagram illustrating an example of engines 204 that may be utilized in accordance with some examples of the techniques described herein. For instance, Figure 2A illustrates examples of engines 204 that may be utilized to train a second machine learning model. In some examples, an engine or engines of the engines 204 described in relation to Figure 2A may be included in the apparatus 324 described in relation to Figure 3. In some examples, a function or functions described in relation to any of Figures 1-6 may be performed by an engine or engines described in relation to Figure 2A. An engine or engines described in relation to Figure 2A may be a device or devices, hardware (e.g., circuitry), and/or a combination of hardware and instructions (e.g., processor and instructions). The engines 204 described in relation to Figure 2A include a first machine learning model engine 206, a region determination engine 212, a second machine learning model engine 208, a temperature measurement engine 218, a dispensing control engine 213, and an optical image capture engine 215. In some examples, the engines 204 may be disposed on one device. For instance, the engines 204 may be included in a 3D printer.

[0051] In some examples, an engine(s) of the engines 204 may be disposed on different devices. For instance, the first machine learning model engine 206, the region determination engine 212, the temperature measurement engine 218, and the second machine learning model engine 208 may be included in a computer, while the dispensing control engine 213 and the optical image capture engine 215 may be included in a 3D printer. For instance, instructions for the first machine learning model engine 206, the region determination engine 212, the temperature measurement engine 218, and the second machine learning model engine 208 may be stored in memory 326 and executed by a processor 328 of the apparatus 324 described in Figure 3 in some examples. In some examples, a function or functions of the dispensing control engine 213 and/or the optical image capture engine 215 may be performed by another apparatus.

[0052] Geometrical data 202 may be obtained. For example, the geometrical data 202 may be received from another device and/or generated as described in relation to Figure 1. In some examples, the geometrical data 202 may include training data from a training dataset. The geometrical data 202 may be provided to the first machine learning model engine 206.

[0053] In some examples, slicing may be performed based on the geometrical data 202. For example, slicing may include generating a slice or slices (e.g., 2D slice(s)) corresponding to the geometrical data 202. For instance, an apparatus may slice the geometrical data 202 representing a build. In some examples, slicing may include generating a set of 2D slices corresponding to the build. A slice is a portion or cross-section. In some approaches, a build may be traversed along an axis (e.g., a vertical axis, z-axis, or other axis), where each slice represents a 2D cross section of the build. For example, slicing the model may include identifying a z-coordinate of a slice plane. The z-coordinate of the slice plane can be used to traverse the build to identify a portion or portions of the build intercepted by the slice plane. In some examples, a slice or slices may be expressed as a binary image or binary images. In some examples, the slice(s) may be provided to the first machine learning model engine 206.
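As a minimal illustration of the slicing step, using a voxelized occupancy grid rather than a mesh (the application's models may be meshes, for which slicing is more involved; the grid and object here are hypothetical):

```python
import numpy as np

# A build volume voxelized as a boolean occupancy grid indexed (z, y, x);
# each z-index is one layer, so slicing reduces to indexing out a 2D cross
# section, which can be stored as a binary image.
voxels = np.zeros((200, 512, 512), dtype=bool)
voxels[40:80, 100:200, 100:300] = True   # a hypothetical rectangular object

def slice_at(voxels: np.ndarray, z: int) -> np.ndarray:
    """Return the binary image of the layer intercepted by the plane at z."""
    return voxels[z]

layer = slice_at(voxels, 50)  # (512, 512) binary cross section
```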

[0054] The first machine learning model engine 206 may determine (e.g., predict, infer, etc.) a thermal image or thermal images based on the geometrical data 202 (e.g., slice(s)). In some examples, the first machine learning model engine 206 may include a first machine learning model that is trained to determine (e.g., predict, infer, etc.) a thermal image or images corresponding to a layer or layers of geometrical data 202 as described in relation to Figure 1. For instance, the first machine learning model may determine a thermal image of a fusing layer or layers. The thermal image(s) determined by the first machine learning model engine 206 may be provided to the region determination engine 212.

[0055] The region determination engine 212 may determine a thermal image region based on the thermal image provided by the first machine learning model engine 206. For instance, the region determination engine 212 may evaluate regions of the thermal image to determine a thermal image region with a peak temperature (e.g., greatest mean temperature). The region determination engine 212 may provide an indicator of the thermal image region to the dispensing control engine 213 and/or to the temperature measurement engine 218. For instance, the indicator may indicate a suggested region for thermochromic dye dispensing. In some examples, the region determination engine 212 may provide the thermal image region to the second machine learning model engine 208. For instance, the region determination engine 212 may crop the determined thermal image region and provide the cropped thermal image region to the second machine learning model engine 208.

[0056] The dispensing control engine 213 may control dispensing of a thermochromic dye on a fusing layer of a build volume. For example, the dispensing control engine 213 may control a printhead and/or a carrier of a 3D printer to dispense the thermochromic dye on a fusing layer of a build volume. In some examples, the dispensing control engine 213 may utilize the indicator of the thermal image region to dispense thermochromic dye. For instance, the dispensing control engine 213 may control dispensing of the thermochromic dye on a region of the fusing layer corresponding to the indicated thermal image region.

[0057] The optical image capture engine 215 may control optical image capture. For instance, the optical image capture engine 215 may control an optical image sensor and/or camera to capture video and/or a still frame of a fusing layer or layers (e.g., thermochromic dye on the layer(s)) during printing. In some examples, the optical image capture engine 215 may time optical image capture for times when an optical image sensor view and/or camera view of the fusing layer is unobstructed by a printhead carrier. In some examples, the optical image capture engine 215 may control a light source or sources (e.g., white LED(s)) to illuminate the build volume during image capture. The optical image capture engine 215 may provide an optical image or images of a fusing layer or layers to the temperature measurement engine 218.

[0058] The temperature measurement engine 218 may measure a temperature of a fusing layer or layers indicated by an optical image of the thermochromic dye. In some examples, measuring the temperature of the fusing layer(s) may be performed as described in relation to Figure 1 . For instance, the temperature measurement engine 218 may map a color appearing in an optical image to a temperature to measure the temperature. In some examples, the temperature measurement engine 218 may utilize the indicator of the thermal image region to measure the temperature. For instance, the temperature measurement engine 218 may determine a temperature based on a region of the optical image corresponding to the thermal image region indicator. The measured temperature or temperatures may be provided to the second machine learning model engine 208.

[0059] The second machine learning model engine 208 may train a second machine learning model based on the thermal image region and the measured temperature. In some examples, the second machine learning model engine 208 may train the second machine learning model as described in relation to Figure 1 . For instance, the weights of the second machine learning model may be adjusted to reduce (e.g., minimize) a difference between a peak temperature of the thermal image region and the temperature (e.g., peak temperature) measured from a corresponding region of the optical image.

[0060] Figure 2B is a block diagram illustrating an example of engines 214 that may be utilized in accordance with some examples of the techniques described herein. For instance, Figure 2B illustrates examples of engines 214 that may be utilized after training (e.g., during an inferencing or prediction stage) to predict a temperature 222. In some examples, an engine or engines of the engines 214 described in relation to Figure 2B may be included in the apparatus 324 described in relation to Figure 3. In some examples, a function or functions described in relation to any of Figures 1-6 may be performed by an engine or engines described in relation to Figure 2B. An engine or engines described in relation to Figure 2B may be a device or devices, hardware (e.g., circuitry), and/or a combination of hardware and instructions (e.g., processor and instructions). The engines 214 described in relation to Figure 2B include a first machine learning model engine 206 and a second machine learning model engine 208. For instance, the second machine learning model engine 208 described in relation to Figure 2B may be the second machine learning model engine 208 described in relation to Figure 2A after training. In some examples of the techniques described herein, training and prediction or inferencing may be performed on a same device or different devices. For instance, a machine learning model may be trained on a first device and sent to a second device, which may utilize the trained machine learning model to perform prediction or inferencing.

[0061] Geometrical data 203 may be obtained. For example, the geometrical data 203 may be received from another device and/or generated as described in relation to Figure 1. In some examples, the geometrical data 203 may include data for use in an inferencing stage. The geometrical data 203 may be provided to the first machine learning model engine 206.

[0062] In some examples, slicing may be performed based on the geometrical data 203. For example, slicing may be performed as similarly described in relation to Figure 2A. In some examples, the slice(s) may be provided to the first machine learning model engine 206.

[0063] The first machine learning model engine 206 may determine (e.g., predict, infer, etc.) a thermal image or thermal images based on the geometrical data 203 (e.g., slice(s)). In some examples, the first machine learning model engine 206 may include a first machine learning model that is trained to determine (e.g., predict, infer, etc.) a thermal image or images corresponding to a layer or layers of geometrical data 203 as described in relation to Figure 1. For instance, the first machine learning model may determine a thermal image of a fusing layer or layers. The thermal image(s) determined by the first machine learning model engine 206 may be provided to the second machine learning model engine 208.

[0064] The second machine learning model engine 208 may predict a temperature 222 based on the thermal image. In some examples, the second machine learning model engine 208 may predict the temperature 222 as described in relation to Figure 1. For instance, the second machine learning model engine 208 may predict a peak temperature or temperatures corresponding to a fusing layer or layers represented by the thermal image(s) provided by the first machine learning model engine 206.

[0065] Figure 3 is a block diagram of an example of an apparatus 324 that may be used in measuring thermochromic dye temperatures. The apparatus 324 may be a computing device, such as a personal computer, a server computer, a printer, a 3D printer, a smartphone, a tablet computer, etc. The apparatus 324 may include and/or may be coupled to a processor 328, a communication interface 330, and/or a memory 326. In some examples, the apparatus 324 may be in communication with (e.g., coupled to, have a communication link with) an additive manufacturing device (e.g., a 3D printer). In some examples, the apparatus 324 may be an example of a 3D printer. The apparatus 324 may include additional components (not shown) and/or some of the components described herein may be removed and/or modified without departing from the scope of the disclosure. For instance, the apparatus 324 may be a 3D printer that includes a thermal image sensor(s) (not shown in Figure 3) and/or an optical image sensor(s) (e.g., optical camera(s)) (not shown in Figure 3).

[0066] The processor 328 may be any of a central processing unit (CPU), a semiconductor-based microprocessor, a graphics processing unit (GPU), a field-programmable gate array (FPGA), an application-specific integrated circuit (ASIC), and/or other hardware device suitable for retrieval and execution of instructions stored in the memory 326. The processor 328 may fetch, decode, and/or execute instructions stored on the memory 326. In some examples, the processor 328 may include an electronic circuit or circuits that include electronic components for performing a functionality or functionalities of the instructions. In some examples, the processor 328 may perform one, some, or all of the aspects, elements, techniques, etc., described in relation to one, some, or all of Figures 1-6.

[0067] The memory 326 is an electronic, magnetic, optical, and/or other physical storage device that contains or stores electronic information (e.g., instructions and/or data). The memory 326 may be, for example, Random Access Memory (RAM), Electrically Erasable Programmable Read-Only Memory (EEPROM), a storage device, an optical disc, and/or the like. In some examples, the memory 326 may be volatile and/or non-volatile memory, such as Dynamic Random Access Memory (DRAM), EEPROM, magnetoresistive random-access memory (MRAM), phase change RAM (PCRAM), memristor, flash memory, and/or the like. In some examples, the memory 326 may be a non-transitory tangible machine-readable storage medium, where the term “non-transitory” does not encompass transitory propagating signals. In some examples, the memory 326 may include multiple devices (e.g., a RAM card and a solid-state drive (SSD)).

[0068] The apparatus 324 may further include a communication interface 330 through which the processor 328 may communicate with an external device or devices (not shown), for instance, to receive and store the information pertaining to a build or builds (e.g., data for training and/or object printing). The communication interface 330 may include hardware and/or machine-readable instructions to enable the processor 328 to communicate with the external device or devices. The communication interface 330 may enable a wired or wireless connection to the external device or devices. The communication interface 330 may further include a network interface card and/or may also include hardware and/or machine-readable instructions to enable the processor 328 to communicate with various input and/or output devices, such as a keyboard, a mouse, a display, another apparatus, electronic device, computing device, printer, etc. In some examples, a user may input instructions into the apparatus 324 via an input device.

[0069] In some examples, the memory 326 may store image data 336. The image data 336 may be generated (e.g., simulated, predicted, and/or inferred) and/or may be obtained (e.g., received) from an optical image sensor(s) and/or thermal image sensor(s). For example, the processor 328 may execute instructions (not shown in Figure 3) to obtain an optical image(s) of thermochromic dye of a fusing layer(s) and/or a thermal image(s) of the fusing layer(s). In some examples, the apparatus 324 may include an optical image sensor(s) and/or thermal image sensor(s), may be coupled to a remote optical image sensor(s) and/or thermal image sensor(s), and/or may receive image data 336 (e.g., optical image(s) and/or thermal image(s)) from an (integrated and/or remote) optical image sensor(s) and/or thermal image sensor(s).

[0070] In some examples, the image data 336 may include a captured optical image(s) and/or thermal image(s). For example, a captured optical image may be an optical image of thermochromic dye on a fusing layer of a build volume. For instance, a printer may eject thermochromic dye on a fusing layer using a first print head. In some examples, the first print head may be separate from a second print head to eject agent. The captured optical image may depict the color of the thermochromic dye on the fusing layer. A sensed thermal image may indicate a temperature distribution (e.g., thermal temperature distribution over a fusing layer). In some examples, the optical image sensor(s) and/or thermal image sensor(s) may undergo a procedure(s) to overcome distortion introduced by sensor(s). Different types of sensing devices may be used in different examples. In some examples, the image data 336 may include a predicted thermal image(s) and/or captured optical image(s).

[0071] In some examples, the memory 326 may store geometrical data 340. The geometrical data 340 may include and/or indicate a model or models (e.g., 3D object model(s)). The apparatus 324 may generate the geometrical data 340 and/or may receive the geometrical data 340 from another device. In some examples, the memory 326 may include slicing instructions (not shown in Figure 3). For example, the processor 328 may execute the slicing instructions to perform slicing on the 3D model data to produce a stack of 2D vector slices. In some examples, the processor 328 may execute prediction machine learning model instructions (not shown in Figure 3) to predict a thermal image based on the geometrical data 340 (e.g., slices). For instance, the thermal image may be predicted by a prediction machine learning model. The thermal image may be stored as part of image data 336. The prediction machine learning model may be an example of the first machine learning model described in relation to Figure 1, Figure 2A, and/or Figure 2B.

[0072] The memory 326 may store temperature determination instructions 334. In some examples, the processor 328 may execute the temperature determination instructions 334 to determine a temperature of a fusing layer based on an optical image of thermochromic dye of the fusing layer. In some examples, determining a temperature of a fusing layer based on an optical image of thermochromic dye may be performed as described in relation to Figure 1 and/or Figure 2A.

[0073] In some examples, the memory 326 may store training instructions 342. The processor 328 may execute the training instructions 342 to train a calibration machine learning model based on temperature (e.g., the temperature determined by executing the temperature determination instructions 334). In some examples, the calibration machine learning model may be trained as described in relation to Figure 1 and/or Figure 2A. The calibration machine learning model may be trained to predict a calibrated thermal image based on a thermal image. For instance, during an inferencing stage (e.g., after training), the processor 328 may execute the calibration machine learning model instructions 341 to predict a calibrated thermal image based on a thermal image. The thermal image may be predicted by a prediction machine learning model or captured with a thermal image sensor. The calibration machine learning model may be an example of the second machine learning model described in relation to Figure 1, Figure 2A, and/or Figure 2B. In some examples, the calibration machine learning model may be utilized to predict temperature based on geometrical data 340 and/or a predicted thermal image(s) during an inferencing stage.

[0074] The memory 326 may store operation instructions 346. In some examples, the processor 328 may execute the operation instructions 346 to perform an operation based on temperature(s) predicted by the calibration machine learning model. In some examples, the processor 328 may execute the operation instructions 346 to utilize the predicted temperature to serve another device (e.g., printer controller). For instance, the processor 328 may print (e.g., control amount and/or location of agent(s) for) a layer or layers based on the predicted temperature(s). In some examples, the processor 328 may drive model setting (e.g., the size of the stride) based on the predicted temperature(s). In some examples, the processor 328 may perform offline print model tuning based on the predicted temperature(s). In some examples, the processor 328 may send a message (e.g., alert, alarm, progress report, quality rating, etc.) based on the predicted temperature(s). For instance, the predicted temperature may indicate a probability that distortion in a printed object may occur (e.g., clogged hole(s), thermal bleed, etc.) and/or that powder will be degraded beyond a threshold quantity. The apparatus 324 may present and/or send a message indicating the potential distortion and/or powder degradation. In some examples, the processor 328 may halt printing in a case that the temperature(s) indicate or indicates an issue (e.g., more than a threshold temperature). In some examples, the processor 328 may feed the temperature(s) for an upcoming layer to a thermal feedback controller to online-compensate contone maps for the upcoming layer.

[0075] In some examples, the processor 328 may execute the operation instructions 346 to compare the predicted temperature with a sensed thermal image to detect a nozzle failure or nozzle failures (e.g., failure of a nozzle or nozzles). For instance, a print nozzle defect may be detected and/or compensated by comparing a sensed thermal image or images with a predicted temperature or temperatures. In some examples, a nozzle defect may be detected if a lower-temperature streak pattern is detected relative to neighboring pixels in the print direction. For instance, if a temperature difference (e.g., average temperature difference) in a print direction satisfies a detection threshold, a nozzle defect may be detected. Compensation may be applied by increasing a neighboring nozzle injection amount or changing a layout for print liquid (e.g., agent, ink, etc.) application.

[0076] In some examples, the processor 328 may execute the operation instructions 346 to compare the predicted temperature with a sensed thermal image to detect powder displacement. Examples of powder displacement may include powder collapse and/or object drag. For instance, powder collapse and/or object drag may be detected by comparing a sensed thermal image or images with a predicted temperature. In some examples, powder collapse and/or object drag (that occurred during printing, for instance) may be detected if a transient colder region is detected. For instance, if a temperature difference (e.g., average temperature difference) in a region satisfies a detection threshold, powder displacement may be detected.
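As a non-limiting sketch of both detections, the following compares a sensed thermal image against a predicted one: a persistently colder column along the print direction suggests a nozzle defect, and a colder region suggests powder displacement. The thresholds and the assumption that the print direction runs along image rows are illustrative.

```python
# Illustrative defect detection by comparing sensed vs. predicted thermal images.
import numpy as np

STREAK_C = 5.0  # hypothetical per-column streak threshold, degrees C
REGION_C = 8.0  # hypothetical regional displacement threshold, degrees C

def detect_nozzle_defects(sensed: np.ndarray, predicted: np.ndarray):
    # A failed nozzle appears as a persistently colder column (print direction
    # assumed along rows); average the difference over rows per column.
    col_delta = (predicted - sensed).mean(axis=0)
    return np.where(col_delta > STREAK_C)[0]  # suspect column indices

def detect_powder_displacement(sensed, predicted, region):
    # A transient colder region suggests powder collapse or object drag.
    rows, cols = region  # e.g., (slice(10, 40), slice(50, 90))
    delta = (predicted[rows, cols] - sensed[rows, cols]).mean()
    return delta > REGION_C
```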

[0077] In some examples, the operation instructions 346 may include 3D printing instructions. For instance, the processor 328 may execute the 3D printing instructions to print a 3D object or objects. In some examples, the 3D printing instructions may include instructions for controlling a device or devices (e.g., rollers, print heads, thermal projectors, and/or fuse lamps, etc.). For example, the 3D printing instructions may use a contone map or contone maps (stored as contone map data, for instance) to control a print head or heads to print an agent or agents in a location or locations specified by the contone map or maps. In some examples, the processor 328 may execute the 3D printing instructions to print a layer or layers. The printing (e.g., thermal projector control) may be based on predicted temperature. For instance, a thermal projector (e.g., lamp) intensity may be adjusted to avoid heating the fusing layer beyond a threshold temperature (in a case that the predicted temperature indicates a temperature beyond a threshold, for instance) or may be adjusted to increase heat (in a case that the predicted temperature indicates a temperature below a sintering threshold, for instance). In some examples, the processor 328 may execute the operation instructions to present a visualization or visualizations of the temperature(s) on a display and/or send the temperature(s) to another device (e.g., computing device, monitor, etc.).

[0078] Figure 4 is a block diagram illustrating an example of a computer-readable medium 448 for temperature measurement. The computer-readable medium 448 is a non-transitory, tangible computer-readable medium. The computer-readable medium 448 may be, for example, RAM, EEPROM, a storage device, an optical disc, and the like. In some examples, the computer-readable medium 448 may be volatile and/or non-volatile memory, such as DRAM, EEPROM, MRAM, PCRAM, memristor, flash memory, and the like. In some examples, the memory 326 described in relation to Figure 3 may be an example of the computer-readable medium 448 described in relation to Figure 4. In some examples, the computer-readable medium 448 may include code, instructions and/or data to cause a processor to perform one, some, or all of the operations, aspects, elements, etc., described in relation to one, some, or all of Figure 1, Figure 2, Figure 3, Figure 5, and/or Figure 6.

[0079] The computer-readable medium 448 may include data (e.g., information and/or instructions). For example, the computer-readable medium 448 may include optical image processing calibration instructions 450, temperature measurement instructions 452, and/or thermal prediction procedure calibration instructions 454.

[0080] The optical image processing calibration instructions 450 may be instructions that, when executed, cause a processor of an electronic device to calibrate optical image processing based on a first optical image of color swatches captured by a camera. In some examples, calibrating optical image processing may be performed as described in relation to Figure 1 and/or Figure 5.
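As one non-limiting way to realize this calibration, the sketch below fits an affine color correction (a 3x3 matrix plus offset) by least squares from captured swatch colors to their known reference colors; the swatch arrays and the affine model choice are illustrative assumptions.

```python
# Illustrative color calibration from swatch images: least-squares fit of an
# affine map from captured RGB to reference RGB.
import numpy as np

def fit_color_correction(captured_rgb: np.ndarray, reference_rgb: np.ndarray):
    """captured_rgb, reference_rgb: (N, 3) arrays of swatch colors."""
    ones = np.ones((captured_rgb.shape[0], 1))
    A = np.hstack([captured_rgb, ones])  # affine model: RGB plus constant term
    M, *_ = np.linalg.lstsq(A, reference_rgb, rcond=None)
    return M  # (4, 3) correction matrix

def correct_colors(pixels: np.ndarray, M: np.ndarray) -> np.ndarray:
    ones = np.ones((pixels.shape[0], 1))
    return np.hstack([pixels, ones]) @ M
```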

[0081] The temperature measurement instructions 452 may be instructions that, when executed, cause a processor of an electronic device to measure a temperature of a fusing layer indicated by a second optical image of thermochromic dye captured by the camera based on the calibrated optical image processing. In some examples, measuring a temperature of a fusing layer indicated by a second optical image of thermochromic dye based on the calibrated optical image processing may be performed as described in relation to Figure 1.

[0082] The thermal prediction procedure calibration instructions 454 may be instructions that, when executed, cause a processor of an electronic device to calibrate a thermal prediction procedure based on the temperature. In some examples, calibrating a thermal prediction procedure based on the temperature may be performed as described in relation to Figure 1 and/or Figure 2A.

[0083] In some examples, the computer-readable medium 448 may include calibrated thermal image prediction instructions (not shown in Figure 4). The calibrated thermal image prediction instructions may be instructions that, when executed, cause a processor of an electronic device to predict, using the calibrated prediction procedure, a calibrated thermal image based on geometrical data. In some examples, predicting a calibrated thermal image based on geometrical data may be performed as described in relation to Figure 1 and/or Figure 2B. For instance, a processor may execute a first machine learning model and may predict a thermal image based on the geometrical data. The processor may execute a second machine learning model to predict the calibrated thermal image based on the predicted thermal image.
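As a non-limiting sketch of this two-stage inference path, the composition below chains the two illustrative stand-ins introduced earlier (predict_thermal_image for the first model and calibrate_thermal_image for the second); both are placeholders, not the disclosed models.

```python
# Illustrative two-stage pipeline: first model maps geometry to a thermal
# image, second model applies the learned calibration correction.
def predict_calibrated_thermal_image(slice_mask, gain, offset):
    raw = predict_thermal_image(slice_mask)             # first model (sketch above)
    return calibrate_thermal_image(raw, gain, offset)   # second model (sketch above)
```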

[0084] Figure 5 is a diagram illustrating an example of optical image processing calibration 556. As illustrated in Figure 5, an image sensor 558 (e.g., color camera) with a white LED light source may be utilized to capture an optical image of color swatches 560a-h. The color swatches 560a-h may be swatches of different colors and/or color shades (e.g., shades of magenta). Approximately identical color swatches may be placed in different locations. For instance, color swatches (e.g., color swatches 560a-h and other identical color swatches) may be placed in different locations in a build volume (e.g., build bed). An optical image of the color swatches may be captured from the build volume. The optical image of the color swatches may be utilized to calibrate optical image processing to increase color accuracy. For instance, optical image processing may be calibrated as described in relation to Figure 1. The increased color accuracy may be utilized to increase temperature measurement accuracy based on optical images.

[0085] Figure 6 is a diagram illustrating an example of concurrent thermal image and optical image capture 684. As illustrated in Figure 6, an image sensor 686 (e.g., color camera) with a white LED light source may be utilized to capture an optical image of thermochromic dye 688 on a fusing layer 690. A thermal sensor 692 (e.g., IR camera) may capture a thermal image of the fusing layer 690 concurrently with the optical image capture. The optical image and thermal image may be utilized to determine a fusing layer cooling rate. For instance, a peak temperature (at a fusing layer location and/or optical image pixel, for instance) may be measured using the optical image. A temperature at a time after the peak temperature has occurred (at a fusing layer location and/or thermal image pixel, for instance) may be indicated by the thermal image. In some examples, an apparatus may determine a cooling rate by determining a difference between the peak temperature indicated by the thermochromic dye 688 and a temperature indicated by the thermal image over a period of time. In some examples, the period of time may be a difference between a peak temperature time (e.g., a time at which a heating lamp is over the location) and a time at which the thermal image (at the location) is captured. In some examples, determining the cooling rate may be performed as described in relation to Figure 1.
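As a non-limiting sketch, the cooling-rate computation described above reduces to a temperature difference divided by the elapsed time; the parameter names are illustrative.

```python
# Illustrative cooling-rate computation: dye-indicated peak temperature minus
# the later thermal-camera reading, divided by the elapsed time.
def cooling_rate(peak_temp_c, thermal_temp_c, t_peak_s, t_thermal_s):
    return (peak_temp_c - thermal_temp_c) / (t_thermal_s - t_peak_s)  # deg C/s
```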

[0086] Some examples of the techniques described herein provide in-situ sensing of powder voxels. For example, peak temperature of powder voxels at any location of a build volume may be measured using a thermochromic dye. In some examples, a thermal footprint of printed objects may be captured. In some examples, a thermochromic dye may be ejected in a build bed (e.g., full build bed or region(s) of the build bed) to record peak powder temperature to determine thermal stress (e.g., maximum thermal stress of the powder). In some examples, thermochromic dye degradation data may be used to calibrate boundary conditions of a fusing layer in a 3D printing temperature prediction and/or simulation. In some examples, a video camera may be calibrated based on color swatches before printing and/or by applying a correction to the recorded temperature after the printing has finished.

[0087] As used herein, the term “and/or” may mean an item or items. For example, the phrase “A, B, and/or C” may mean any of: A (without B and C), B (without A and C), C (without A and B), A and B (without C), B and C (without A), A and C (without B), or all of A, B, and C.

[0088] While various examples are described herein, the disclosure is not limited to the examples. Variations of the examples described herein may be within the scope of the disclosure. For example, aspects or elements of the examples described herein may be omitted or combined.