GILO Notes
Notes on the paper Gaussian Integral Linear Operators for Precomputed Graphics.
What is precomputed graphics?
As we all know, in graphics, it is common to use an integral linear operator $\mathcal{K}$
\[u(x) = \mathcal{K}[g(y)] = \int K(x, y) g(y) \, dy\]The most widely known method to evaluate $u(x)$ is Monte Carlo integration, and some previous works have explored precomputing the integral linear operator for a given kernel function.
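As a quick aside, a minimal Monte Carlo estimator for $u(x)$ could look like the sketch below; the kernel and input function here are hypothetical placeholders, not anything from the paper.

```python
import numpy as np

def mc_estimate_u(x, K, g, n_samples=10_000, y_min=-1.0, y_max=1.0):
    """Monte Carlo estimate of u(x) = int K(x, y) g(y) dy over [y_min, y_max].

    K and g are arbitrary callables; uniform sampling is used for simplicity.
    """
    y = np.random.uniform(y_min, y_max, n_samples)
    return (y_max - y_min) * np.mean(K(x, y) * g(y))

# Toy kernel and input function, purely for illustration.
K = lambda x, y: np.exp(-(x - y) ** 2)
g = lambda y: np.cos(y)
print(mc_estimate_u(0.5, K, g))
```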
A typical example is precomputed radiance transfer (PRT).
- The light transport kernel $K(x, y)$ is precomputed.
- The environment map $g(y)$ is allowed to change at runtime.
The functions $K(x, y)$ and $g(y)$ are represented by basis expansions; hence $K(x, y)$ becomes a matrix of coefficients and $g(y)$ becomes a coefficient vector.
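To make the matrix/vector view concrete, here is a minimal sketch of the PRT-style split between offline and runtime work; the matrix contents, the basis sizes, and the spherical-harmonics assumption are all illustrative, not taken from the paper.

```python
import numpy as np

n_in, n_out = 9, 9  # e.g., 3rd-order spherical harmonics on both sides (assumed)

# Offline: project the light transport kernel K(x, y) onto the chosen bases.
# A random matrix stands in for the precomputed result here.
transport_matrix = np.random.rand(n_out, n_in)  # k_ij, fixed after precomputation

# Runtime: the environment map g(y) changes per frame; only its coefficient
# vector G needs to be recomputed, and applying the operator is a mat-vec.
env_coeffs = np.random.rand(n_in)           # G_i for the current frame
out_coeffs = transport_matrix @ env_coeffs  # U_j, coefficients of u(x)
```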
Issue with the previous method: the expansion introduces errors when the basis cannot represent the function well.
Solution:
- Apply a basis expansion to the kernel $K(x, y)$ itself, treated as a higher-dimensional function.
- Use Gaussians as the basis, which allows the integration to be performed analytically.
Traditional Integral Linear Operator
Let us review the traditional integral linear operator and its application to precomputed graphics.
- Input function: expanded in the basis $\phi_i(y)$, i.e., $g(y) \approx \sum_i G_i \phi_i(y)$
- Output function: expanded in the basis $\psi_j(x)$, i.e., $u(x) \approx \sum_j U_j \psi_j(x)$
Now we investigate how to compute the $U_j$'s via the integration formula
\[u(x) \approx \int K(x, y) \sum_i G_i \phi_i(y) \, dy = \sum_i G_i \int K(x, y) \phi_i(y) \, dy\]We approximate the inner integral via the output basis functions $\psi_j(x)$:
\[\int K(x, y) \phi_i(y) \, dy \approx \sum_{j} k_{ij} \psi_j(x)\]The coefficients $k_{ij}$ can be precomputed, since $K(x, y)$, $\phi_i(y)$, and $\psi_j(x)$ are all known. Thus,
\[u(x) \approx \sum_j \left(\sum_i G_i k_{ij} \right) \psi_j(x), \quad \text{i.e.,}\quad U_j = \sum_i G_i k_{ij}.\]Operator Composition: consider a sequence of operators
\(u = \mathcal{K}_N[\mathcal{K}_{N-1}[\cdots \mathcal{K}_1 [g]]],\) where the sequence of kernels is determined only at runtime.
- Re-projection may be needed if the input and output bases belong to different families (a sketch of this follows below)
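As a hedged illustration of composition in this traditional setting (my own sketch, not the paper's): each precomputed kernel is a matrix, chaining operators is just repeated matrix-vector products, and a change-of-basis matrix is inserted whenever the output basis of one operator differs from the input basis of the next. All sizes and values below are made up.

```python
import numpy as np

n_a, n_b = 9, 16  # sizes of two hypothetical basis families A and B

# Precomputed operator matrices (stand-ins for the k_ij projections).
K1 = np.random.rand(n_b, n_a)   # maps coefficients in basis A to basis B
K2 = np.random.rand(n_b, n_b)   # maps basis B to basis B
K3 = np.random.rand(n_b, n_a)   # expects basis-A input again

# Change-of-basis (re-projection) matrix, needed because K3 expects basis A
# but the output of K2 lives in basis B.
P_b_to_a = np.random.rand(n_a, n_b)

g = np.random.rand(n_a)                     # input coefficients G_i in basis A
u = K3 @ (P_b_to_a @ (K2 @ (K1 @ g)))       # u = K3[K2[K1[g]]] with one re-projection
```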
Method
Kernel Basis
The kernel $K(x, y)$ is expanded jointly in $x$ and $y$:
\[K(x, y) = \sum_i K_i \cdot \sigma_i(x, y)\]Now we derive $u(x)$:
\(u(x) \approx \int \sum_i K_i \, \sigma_i(x, y) \sum_{j} G_j \phi_j(y) \, dy = \sum_i \sum_j K_i G_j \xi_{ij}(x),\) where \(\xi_{ij}(x) = \int \sigma_i(x, y) \phi_j(y) \, dy\)
- We use $N_y$-dimensional Gaussians as input bases and $N_K$-dimensional Gaussians as kernel bases. Therefore $\xi_{ij}(x)$ can be evaluated analytically (see the identity below).
- When each kernel is represented using our high-dimensional Gaussians, the output of each operator remains a Gaussian mixture. This handles operator composition: the output of one operator is already in the same Gaussian family as the next operator's input, so no re-projection is needed (see, for example, the numerical sketch after this list).
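For reference, the analytic evaluation of $\xi_{ij}(x)$ rests on the standard fact that Gaussians integrate against Gaussians in closed form (this identity is textbook material, not specific to the paper). In the one-dimensional, unnormalized case,
\[\int \exp\left(-\frac{(y-a)^2}{2A}\right) \exp\left(-\frac{(y-b)^2}{2B}\right) dy = \sqrt{\frac{2\pi A B}{A+B}} \, \exp\left(-\frac{(a-b)^2}{2(A+B)}\right),\]and since a high-dimensional Gaussian kernel basis $\sigma_i(x, y)$, viewed as a function of $y$ with $x$ fixed, is still a Gaussian in $y$ carrying an $x$-dependent Gaussian weight, integrating it against a Gaussian $\phi_j(y)$ leaves a Gaussian-shaped function of $x$.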

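Below is a small, self-contained numerical sketch of this idea in one dimension. It is my own illustration under simplifying assumptions (axis-aligned, unnormalized Gaussians, made-up parameters), not the paper's implementation: the kernel is a sum of 2D Gaussians over $(x, y)$, the input is a 1D Gaussian mixture, and $u(x) = \sum_i \sum_j K_i G_j \xi_{ij}(x)$ evaluated with the closed-form overlap above matches brute-force quadrature.

```python
import numpy as np

def gauss(t, m, s):
    """Unnormalized 1D Gaussian exp(-(t - m)^2 / (2 s^2))."""
    return np.exp(-0.5 * ((t - m) / s) ** 2)

# Hypothetical toy setup (all numbers made up for illustration):
# kernel K(x, y) = sum_i K_i * sigma_i(x, y), axis-aligned 2D Gaussians,
# input  g(y)    = sum_j G_j * phi_j(y),      1D Gaussians.
kernel = [  # (K_i, mean_x, sigma_x, mean_y, sigma_y)
    (1.0, 0.0, 0.5, 0.2, 0.3),
    (0.6, 1.0, 0.4, -0.5, 0.6),
]
inputs = [  # (G_j, mean_y, sigma_y)
    (0.8, 0.0, 0.4),
    (0.3, 0.7, 0.2),
]

def u_analytic(x):
    """u(x) = sum_ij K_i G_j xi_ij(x), with xi_ij in closed form."""
    total = np.zeros_like(x, dtype=float)
    for K_i, mx, sx, my, sy in kernel:
        for G_j, mj, sj in inputs:
            A, B = sy**2, sj**2
            # Closed-form overlap integral of two 1D Gaussians over y.
            overlap = np.sqrt(2.0 * np.pi * A * B / (A + B)) \
                      * np.exp(-0.5 * (my - mj) ** 2 / (A + B))
            total += K_i * G_j * gauss(x, mx, sx) * overlap
    return total

def u_numeric(x, n=20001):
    """Brute-force quadrature of int K(x, y) g(y) dy for comparison."""
    y, dy = np.linspace(-8.0, 8.0, n, retstep=True)
    K = sum(K_i * gauss(x[:, None], mx, sx) * gauss(y[None, :], my, sy)
            for K_i, mx, sx, my, sy in kernel)
    g = sum(G_j * gauss(y, mj, sj) for G_j, mj, sj in inputs)
    return np.sum(K * g, axis=1) * dy

x = np.linspace(-2.0, 3.0, 7)
print(np.max(np.abs(u_analytic(x) - u_numeric(x))))  # difference should be negligible
```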