## Channeled Convolution
We will consider the 2D channeled convolution example extracted from Halide
// Declarations of "mathematical functions" for convolution and relu.
// Core convolution with the result initialized to the bias value.
// Convolution proper. While Linalg has named operations for 2D convolutions,
Practically, this corresponds to fusing the convolution initialization and
* first the main convolution update is fused into ReLU that uses it and has
* then the bias initialization is fused into the convolution+relu loop nest.
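As a rough illustration of the fused result (a plain-Python sketch with hypothetical names and shapes, not the Linalg operations from the example), each output element is initialized with the bias, accumulated by the convolution reduction, and clamped by ReLU in a single loop nest:

```python
def conv2d_bias_relu(inp, weights, bias):
    """Illustrative fused conv + bias + relu.
    inp: [C_in][H][W], weights: [C_out][C_in][KH][KW], bias: [C_out].
    """
    c_out = len(weights)
    c_in = len(weights[0])
    kh = len(weights[0][0])
    kw = len(weights[0][0][0])
    h_out = len(inp[0]) - kh + 1
    w_out = len(inp[0][0]) - kw + 1
    out = [[[0.0] * w_out for _ in range(h_out)] for _ in range(c_out)]
    for co in range(c_out):
        for y in range(h_out):
            for x in range(w_out):
                acc = bias[co]  # fused bias initialization
                for ci in range(c_in):  # fused convolution update (reduction)
                    for dy in range(kh):
                        for dx in range(kw):
                            acc += inp[ci][y + dy][x + dx] * weights[co][ci][dy][dx]
                out[co][y][x] = max(acc, 0.0)  # fused ReLU
    return out
```

The point of the fusion is visible in the structure: no intermediate buffer is written between the initialization, the reduction, and the ReLU.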
loops from the convolution operation. However, these are reduction loops and it
This transformation materializes the desired loops around the convolution
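Materializing loops around an operation can be pictured as tiling its iteration space: outer loops step over tiles and inner loops cover one tile. The sketch below is a hedged, plain-Python illustration of that restructuring (hypothetical tile size, not the transform dialect syntax used in the tutorial):

```python
def tiled_iteration(n, m, tile=8):
    """Yield (i, j) pairs of an n x m iteration space in tiled order:
    the two outer loops are the 'materialized' tile loops, the two inner
    loops enumerate the points of a single tile (with boundary clamping)."""
    for ti in range(0, n, tile):
        for tj in range(0, m, tile):
            for i in range(ti, min(ti + tile, n)):
                for j in range(tj, min(tj + tile, m)):
                    yield (i, j)
```

Every point of the original space is still visited exactly once; only the order, and hence the loop structure available for further transformation, changes.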
happen around floating point multiplications and additions in the convolution.
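The effect of vectorizing those multiplications and additions can be sketched as replacing the innermost scalar multiply-add loop with fixed-width chunks of lanes plus a scalar epilogue (a plain-Python model with an assumed lane count; the real transformation emits hardware vector operations):

```python
def dot_fma(a, b, acc):
    """Scalar reduction the vectorizer targets: acc += sum(a[i] * b[i])."""
    for x, y in zip(a, b):
        acc += x * y
    return acc

def dot_fma_vectorized(a, b, acc, lanes=4):
    """Same computation restructured into 'vector' chunks of multiply-adds,
    mirroring what vectorization does to the innermost convolution loop."""
    n = len(a)
    main = n - n % lanes
    for i in range(0, main, lanes):
        # One vector multiply, then a horizontal add of the lanes.
        prods = [a[i + l] * b[i + l] for l in range(lanes)]
        acc += sum(prods)
    for i in range(main, n):  # scalar epilogue for the remainder
        acc += a[i] * b[i]
    return acc
```

Both forms compute the same reduction; the vectorized one exposes `lanes` independent multiplications per step, which is what maps onto SIMD floating point units.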