Practical Elliptical Texture Filtering on the GPU

Pavlos Mavridis and Georgios Papaioannou

1.1 Introduction

Hardware texture filtering, even on state-of-the-art graphics hardware, suffers from several aliasing artifacts, in both the spatial and the temporal domain. These artifacts are most evident in extreme conditions, such as grazing viewing angles, highly warped texture coordinates, or extreme perspective, and become especially annoying when animation is involved. Poor texture filtering manifests as excessive blurring or moiré patterns in the spatial domain and as pixel flickering in the temporal domain, as can be seen in Figure 1.1 and the accompanying demo application.

Figure 1.1. A benchmark scene consisting of two infinite planes, demonstrating the improvement of elliptical filtering over the native hardware texture filtering.


In this chapter, we present a series of simple and effective methods to perform high-quality texture filtering on modern GPUs. We base our methods on the theory behind the elliptical weighted average (EWA) filter [Greene and Heckbert 86]. EWA is regarded as one of the highest-quality texture filtering algorithms and is used as a benchmark to test the quality of other algorithms. It is often used in offline rendering to eliminate texture aliasing in the extreme conditions mentioned above, but due to its high computational cost it is not widely adopted in real-time graphics.

We first present an exact implementation of the EWA filter that smartly uses the underlying bilinear filtering hardware to gain a significant speedup. We then proceed with an approximation of the EWA filter that uses the underlying anisotropic filtering hardware of the GPU to construct a filter that closely matches the shape and the properties of the EWA filter, offering vast improvements in the quality of the texture mapping. To further accelerate the method, we also introduce a spatial and temporal sample-distribution scheme that reduces the number of required texture fetches and the memory bandwidth consumption, without reducing the perceived image quality. We believe that those characteristics make our method practical for use in games and other interactive applications, as well as applications that require increased fidelity in texture mapping, like GPU renderers and image manipulation programs. We first described these methods at the 2011 Symposium on Interactive 3D Graphics and Games [Mavridis and Papaioannou 11]. This chapter reviews the main ideas of that paper with an emphasis on small, yet important, implementation details.

1.2 Elliptical Filtering

This section provides an overview of the theory behind texture filtering and the elliptical weighted average (EWA) filter.

In computer graphics, pixels are point samples. The pixels do not have an actual shape, since they are points, but we often assign an area to them. This area is the footprint (the nonzero area) of the filter that is used to reconstruct the final continuous image from these point samples, according to the sampling theorem. As discussed in [Smith 95], high-quality reconstruction filters, like a truncated sinc or a Gaussian, have a circular footprint, so a high-quality texture filtering method should assume circular, overlapping pixels.
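As an illustration of such a filter, a truncated Gaussian assigns each sample a weight that falls off with its squared distance from the pixel center and is zero outside the circular footprint. The following small Python sketch is ours, not the chapter's code; the falloff constant and the unit cutoff radius are illustrative assumptions:

```python
import math

def gaussian_weight(d2, alpha=2.0):
    """Weight of a sample at squared distance d2 from the pixel center,
    using a truncated Gaussian with a circular footprint of radius 1.
    alpha controls the falloff and is an illustrative choice."""
    if d2 > 1.0:
        return 0.0  # outside the circular footprint: no contribution
    return math.exp(-alpha * d2)
```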

Figure 1.2. The projection of a pixel with a circular footprint on a surface covers an elliptical region.

The projection of a pixel with a circular footprint to texture space is an ellipse with arbitrary orientation, as illustrated in Figure 1.2. In degenerate cases, like extreme grazing viewing angles, the projection is actually an arbitrary conic section, but for our purposes an elliptical approximation suffices, since, for these cases, any visible surface detail is lost anyway. A texture filtering algorithm should return a convolution of the texels (texture point samples) inside the projected area S of the pixel with the projection of the reconstruction filter H in texture space. In particular, it should compute the following equation:

C_f(s, t) = Σ_{(u,v) ∈ S} H(u, v) C(u, v),

where C(u, v) is the color of the texel at the (u, v) texture coordinates and C_f(s, t) is the filtered texture color. In the above equation, H is normalized.
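This weighted sum is straightforward to express in code. Below is a minimal NumPy sketch, assuming the texel colors inside S and their filter weights H(u, v) have already been gathered into arrays; the function name and array layout are our own illustrative choices:

```python
import numpy as np

def filter_texture(texel_colors, weights):
    """Compute C_f = sum(H * C) over the texels inside the projected
    area S, with the weights H normalized to sum to 1.

    texel_colors: (N, 3) array of texel colors inside S.
    weights:      (N,) array of filter weights H(u, v) for those texels.
    """
    h = np.asarray(weights, dtype=float)
    h = h / h.sum()  # normalize H so the weights sum to 1
    c = np.asarray(texel_colors, dtype=float)
    return (h[:, None] * c).sum(axis=0)
```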

The EWA algorithm approximates the projected pixel footprint with an elliptical region, defined by the following equation [Heckbert 89]:

d²(u, v) = Au² + Buv + Cv²,

where the center of the pixel is assumed to be at (0, 0) in texture space and

A = A_nn/F,
B = B_nn/F,
C = C_nn/F,
F = A_nn C_nn − B_nn²/4,
A_nn = (∂v/∂x)² + (∂v/∂y)²,
B_nn = −2 (∂u/∂x · ∂v/∂x + ∂u/∂y · ∂v/∂y),
C_nn = (∂u/∂x)² + (∂u/∂y)².
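The coefficient equations above translate directly into code. The following Python sketch is ours for illustration (the chapter's implementations target GPU shaders); it computes the normalized ellipse coefficients from the screen-space derivatives and tests whether a texel lies inside the footprint:

```python
def ellipse_coefficients(dudx, dudy, dvdx, dvdy):
    """Compute the EWA ellipse coefficients A, B, C from the partial
    derivatives of the texture coordinates with respect to screen space.
    Dividing by F normalizes the ellipse so that d^2 <= 1 inside it."""
    Ann = dvdx * dvdx + dvdy * dvdy
    Bnn = -2.0 * (dudx * dvdx + dudy * dvdy)
    Cnn = dudx * dudx + dudy * dudy
    F = Ann * Cnn - Bnn * Bnn / 4.0
    return Ann / F, Bnn / F, Cnn / F

def inside_ellipse(A, B, C, u, v):
    """Test whether texel (u, v), relative to the pixel center at (0, 0),
    falls inside the projected pixel footprint (d^2 <= 1)."""
    return A * u * u + B * u * v + C * v * v <= 1.0
```

For an isotropic, unwarped projection (unit derivatives along each axis), the coefficients reduce to A = C = 1, B = 0, i.e., a unit circle in texture space.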

The partial derivatives (∂u/∂x, ∂u/∂y, ∂v/∂x, ∂v/∂y) represent the rate of change of the texture coordinates relative to changes in screen space. The quantity d² denotes the squared distance of the texel (u, v) from the pixel center when projected back into screen space. The algorithm scans the bounding box of the elliptical region in texture space and determines which texels reside inside the ellipse (d² ≤ 1). These samples contribute to the convolution sum, with weights
