Mastering Windows: Improving Reconstruction

Thomas Theußl    Helwig Hauser    Eduard Gröller

Institute of Computer Graphics
Vienna University of Technology

Figure 1: A CT scan of a head reconstructed with (a) linear interpolation and central differences with linear interpolation, (b) the Catmull-Rom spline and its derivative, and (c) Kaiser windowed sinc and cosc of width three with numerically optimal parameters.

Abstract

Ideal reconstruction filters, for function or arbitrary derivative reconstruction, have to be bounded in order to be practicable since they are infinite in their spatial extent. This can be accomplished by multiplying them with windowing functions. In this paper we discuss and assess the quality of commonly used windows and show that most of them are unsatisfactory in terms of numerical accuracy. The best performing windows are Blackman, Kaiser and Gaussian windows. The latter two are particularly useful since both have a parameter to control their shape, which, on the other hand, requires finding appropriate values for these parameters. We show how to derive optimal parameter values for Kaiser and Gaussian windows using a Taylor series expansion of the convolution sum. Optimal values for function and first derivative reconstruction for window widths of two, three, four and five are presented explicitly.

Keywords: ideal reconstruction, windowing, frequency response, Taylor series expansion



{theussl,helwig,groeller}@cg.tuwien.ac.at

1 Introduction

Reconstruction is a fundamental process in volume visualization. Modalities like CT or MRI scanners provide discrete data sets of continuous real objects, for instance patients in medical visualization. The function in between sample points has to be reconstructed.

It is well known that a band-limited and properly sampled function can be perfectly reconstructed by convolving the samples with the ideal function reconstruction filter. Similarly, derivatives of the function can, due to the linearity of convolution and derivation, directly be reconstructed by convolving the samples with the corresponding derivative of the ideal reconstruction filter. The first derivative of a three-dimensional function, the gradient, can be interpreted as a normal to an iso-surface passing through the point of interest. This gradient is usually used for shading or classification, and therefore the quality of gradient reconstruction strongly affects the visual appearance of the final image.

From signal processing theory we know that the ideal function reconstruction filter is the sinc filter,

    \mathrm{sinc}(x) = \begin{cases} \sin(\pi x) / (\pi x) & \text{if } x \neq 0 \\ 1 & \text{if } x = 0 \end{cases}    (1)

The first derivative of the sinc filter, called the cosc filter,

    \mathrm{cosc}(x) = \begin{cases} (\cos(\pi x) - \mathrm{sinc}(x)) / x & \text{if } x \neq 0 \\ 0 & \text{if } x = 0 \end{cases}    (2)

consequently is the ideal first derivative reconstruction filter [1]. In general, the ideal reconstruction filter of the n-th derivative of a function is the n-th derivative of the sinc function.
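For reference, a direct transcription of Eqs. 1 and 2 into code reads as follows. This is a minimal Python/NumPy sketch of ours, not part of the original implementation; it expects array-like arguments.

```python
import numpy as np

def sinc(x):
    """Ideal function reconstruction filter, Eq. 1: sin(pi x) / (pi x), with sinc(0) = 1."""
    x = np.asarray(x, dtype=float)
    out = np.ones_like(x)
    nz = x != 0
    out[nz] = np.sin(np.pi * x[nz]) / (np.pi * x[nz])
    return out

def cosc(x):
    """Ideal first derivative reconstruction filter, Eq. 2: (cos(pi x) - sinc(x)) / x, with cosc(0) = 0."""
    x = np.asarray(x, dtype=float)
    out = np.zeros_like(x)
    nz = x != 0
    out[nz] = (np.cos(np.pi * x[nz]) - sinc(x[nz])) / x[nz]
    return out
```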



However, since these filters are infinite in their spatial extent, they are impracticable. Simple truncation causes severe artifacts, as can be seen in Fig. 6, which shows the Marschner-Lobb data set [6] reconstructed with linear interpolation and central differences and linear interpolation (Fig. 6a), the Catmull-Rom spline and derivative (Fig. 6b) and the truncated cosc of width three (Fig. 6c); for high resolution images please refer to the web page of this project [15]. We will investigate and explain the cause of this bad result in frequency domain in Chapter 3 and in spatial domain in Chapter 4.

To reduce artifacts originating from truncation, the ideal reconstruction filters can be multiplied with appropriate functions which drop off more smoothly at the edges. We will discuss some of the more commonly used of these functions, which are called windows. We show that most of them are unsuitable for reconstruction, especially for higher order derivative reconstruction, although some, for example Hamming [5] and Lanczos [16] windows, are reported to be commonly used.

2 Previous work

Turkowski [16] used windowed sinc functions for image resampling. He found the Lanczos window superior in terms of reduction of aliasing, sharpness, and minimal ringing as conclusion of an empirical experiment.

Marschner and Lobb [6] used a cosine-bell windowed sinc (often called Hann window) and found it superior to the entire family of cubic reconstruction filters. According to their metric there is always a windowed sinc with better smoothing and post-aliasing properties. Machiraju and Yagel [5] use a Hamming windowed sinc without further explanation of their choice. However, we found the Hamming window to be not optimal. Recently, Meijering et al. [7] performed a purely quantitative comparison of eleven windowed sinc functions. They found the Welch, Cosine, Lanczos and Kaiser windows yielding the best results and noted that the truncated sinc was one of the worst performing reconstruction filters.

Goss [2] extended the idea of windowing the ideal reconstruction filter to derivative reconstruction. The parameter of the Kaiser window controls the smoothing characteristics of the reconstructed function. Goss uses the windowed cosc only on sampling points; in between, some interpolation has to be performed which is not stated explicitly.

Most researchers assess the quality of reconstruction filters in frequency domain, whereas Möller et al. [10] propose a purely numerical method operating in spatial domain. We will review this method in Chapter 4 as it will allow us to derive optimal parameters, in terms of numerical accuracy, for the Kaiser and Gaussian windows.

Figure 2: Rectangular, Bartlett, Welch, Parzen, Hann, Hamming, Blackman and Lanczos windows of width two on top, below the frequency responses of correspondingly windowed sinc and cosc functions.

3 Windowing ideal reconstruction filters

The windows investigated in this work are:

- The rectangular window or box function

      \Pi_\tau(x) = \begin{cases} 1 & \text{if } |x| < \tau \\ 0 & \text{else} \end{cases}    (3)

  which performs pure truncation.

- The Bartlett window, which is actually just a tent function:

      \mathrm{Bartlett}_\tau(x) = \begin{cases} 1 - |x| / \tau & \text{if } |x| < \tau \\ 0 & \text{else} \end{cases}    (4)

- The Welch window:

      \mathrm{Welch}_\tau(x) = \begin{cases} 1 - (x / \tau)^2 & \text{if } |x| < \tau \\ 0 & \text{else} \end{cases}    (5)

- The Parzen window

      \mathrm{Parzen}(x) = \begin{cases} \frac{1}{4}(4 - 6|x|^2 + 3|x|^3) & \text{if } 0 \le |x| < 1 \\ \frac{1}{4}(2 - |x|)^3 & \text{if } 1 \le |x| < 2 \\ 0 & \text{else} \end{cases}    (6)

  is a piece-wise cubic approximation of the Gaussian window with extent two. Although its width is not directly adjustable, it can, of course, be scaled to every desired extent.

- The Hann window (due to Julius von Hann, often wrongly referred to as Hanning window [17], sometimes just cosine bell window) and the Hamming window, which are quite similar; they only differ in the choice of one parameter α:

      H_{\alpha,\tau}(x) = \begin{cases} \alpha + (1 - \alpha)\cos(\pi x / \tau) & \text{if } |x| < \tau \\ 0 & \text{else} \end{cases}    (7)

  with α = 1/2 being the Hann window and α = 0.54 the Hamming window.

- The Blackman window, which has one additional cosine term as compared to the Hann and Hamming window:

      \mathrm{Blackman}_\tau(x) = \begin{cases} 0.42 + \frac{1}{2}\cos(\pi x / \tau) + 0.08\cos(2\pi x / \tau) & \text{if } |x| < \tau \\ 0 & \text{else} \end{cases}    (8)

- The Lanczos window, which is the central lobe of a sinc function scaled to a certain extent:

      \mathrm{Lanczos}_\tau(x) = \begin{cases} \sin(\pi x / \tau) / (\pi x / \tau) & \text{if } |x| < \tau \\ 0 & \text{else} \end{cases}    (9)

- The Kaiser window [3], which has an adjustable parameter α that controls how steeply it approaches zero at the edges. It is defined by

      \mathrm{Kaiser}_{\alpha,\tau}(x) = \begin{cases} I_0\bigl(\alpha\sqrt{1 - (x / \tau)^2}\bigr) / I_0(\alpha) & \text{if } |x| < \tau \\ 0 & \text{else} \end{cases}    (10)

  where I_0(x) is the zeroth order modified Bessel function [13]. The higher α gets, the narrower the Kaiser window becomes.

- And we can also use a truncated Gaussian function as window. It is in its general form defined by

      \mathrm{Gauss}_{\sigma,\tau}(x) = \begin{cases} 2^{-(x / \sigma)^2} & \text{if } |x| < \tau \\ 0 & \text{else} \end{cases}    (11)

  with σ being the standard deviation. The higher σ gets, the wider the Gaussian window becomes and, on the other hand, the more severe the truncation gets.

In all of the above, τ denotes the window width, i.e., the windowed filters have support [-τ, τ].
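The two adjustable windows are easy to evaluate directly. The sketch below is our own illustration, following the definitions as reconstructed above, with SciPy's modified Bessel function assumed available; it implements the Kaiser window of Eq. 10 and the Gaussian window of Eq. 11, plus the Blackman window of Eq. 8 for comparison. A windowed reconstruction filter is then just the pointwise product of a window with the sinc or cosc filter.

```python
import numpy as np
from scipy.special import i0  # zeroth order modified Bessel function I_0

def kaiser_window(x, alpha, tau):
    """Kaiser window, Eq. 10: I_0(alpha * sqrt(1 - (x/tau)^2)) / I_0(alpha) for |x| < tau."""
    x = np.asarray(x, dtype=float)
    w = np.zeros_like(x)
    inside = np.abs(x) < tau
    w[inside] = i0(alpha * np.sqrt(1.0 - (x[inside] / tau) ** 2)) / i0(alpha)
    return w

def gauss_window(x, sigma, tau):
    """Truncated Gaussian window, Eq. 11: 2^(-(x/sigma)^2) for |x| < tau."""
    x = np.asarray(x, dtype=float)
    w = np.zeros_like(x)
    inside = np.abs(x) < tau
    w[inside] = 2.0 ** (-(x[inside] / sigma) ** 2)
    return w

def blackman_window(x, tau):
    """Blackman window, Eq. 8."""
    x = np.asarray(x, dtype=float)
    w = np.zeros_like(x)
    inside = np.abs(x) < tau
    w[inside] = (0.42 + 0.5 * np.cos(np.pi * x[inside] / tau)
                 + 0.08 * np.cos(2.0 * np.pi * x[inside] / tau))
    return w

# Example: a Kaiser windowed sinc of width three with the optimal parameter of Table 1.
# h = lambda x: kaiser_window(x, alpha=8.93, tau=3.0) * sinc(x)
```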

Figure 3: Kaiser and Gaussian windows of width two with varying parameters on top, below again the frequency responses of correspondingly windowed sinc and cosc functions.

All these windows, except Kaiser and Gaussian windows, are depicted in Fig. 2 on top, the frequency responses of correspondingly windowed sinc (with window width two) in the middle row and windowed cosc in the bottom row. Since function reconstruction filters are even functions and first derivative filters are odd functions, the power spectra, as depicted in Figs. 2 and 3, actually are the absolute values of the real respectively the imaginary part of the Fourier transform of the filters. Some frequency responses, e.g., the rectangular or Lanczos windowed sinc and cosc, actually assume negative values, which means that the phases of certain frequencies are reverted. This, of course, causes severe artifacts. Consequently, these windows are not suitable for reconstruction purposes.

Kaiser and Gaussian windows, with varying parameters, are depicted in Fig. 3 on top, the frequency responses of correspondingly windowed sinc (with window width two) in the middle row and windowed cosc in the bottom row. These plots show that the parameters α and σ of the Kaiser and Gaussian windows directly affect the shape of their frequency responses. The frequency responses for some values of α and σ again become negative, which implies that these values should be chosen carefully.
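The frequency responses plotted in Figs. 2 and 3 can be approximated numerically by integrating the Fourier transform of a windowed filter over its finite support. The following sketch is our own illustration (the grid resolution is an arbitrary assumption); it reuses the filter and window functions from the sketches above.

```python
import numpy as np

def frequency_response(h, tau, omegas, samples=4096):
    """Approximate H(omega) = integral over [-tau, tau] of h(x) * exp(-i * omega * x) dx."""
    xs = np.linspace(-tau, tau, samples)
    dx = xs[1] - xs[0]
    hx = h(xs)
    return np.array([np.sum(hx * np.exp(-1j * w * xs)) * dx for w in omegas])

# Power spectra as in Figs. 2 and 3 for window width two: the windowed sinc is even,
# so its spectrum is the absolute value of the real part; the windowed cosc is odd,
# so its spectrum is the absolute value of the imaginary part.
omegas = np.linspace(0.0, 2.0 * np.pi, 200)
sinc_spectrum = np.abs(frequency_response(lambda x: kaiser_window(x, 8.0, 2.0) * sinc(x), 2.0, omegas).real)
cosc_spectrum = np.abs(frequency_response(lambda x: kaiser_window(x, 8.0, 2.0) * cosc(x), 2.0, omegas).imag)
```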

4 Numerical analysis

Möller et al. [10] introduced an elegant method to assess the quality of a reconstruction filter, based solely on numerical accuracy. We will briefly review this method as it is crucial for our purpose. It will allow us to quantitatively assess filter quality in spatial domain and to derive numerically optimal parameters for Kaiser and Gaussian windows.

Figure 4: Coefficient plot of the Taylor series expansion for windowed cosc filters with width three.

Reconstructing a function f(x) or derivatives thereof from its discrete samples f[k] is done by convolving it with a continuous filter:

    f_r(x) = \sum_{k=-\infty}^{\infty} f[k] \cdot h(x - k)    (12)

f_r denotes the reconstructed function (which can be the function itself or an arbitrary derivative, depending on the reconstruction filter h). Such a convolution is a weighted average of the samples which is finite when a finite reconstruction filter is used.

f[k] can now be expanded as a Taylor series in f about x, assuming that the first N + 1 derivatives of f(x) exist:

    f[k] = \sum_{n=0}^{N} \frac{f^{(n)}(x)}{n!} (k - x)^n + \frac{f^{(N+1)}(\xi_k)}{(N+1)!} (k - x)^{N+1}    (13)

where ξ_k ∈ [x, k]. Substituting this in Eq. 12 and reordering the terms in order of the derivatives yields

    f_r(x) = \sum_{n=0}^{N} a_n(x) f^{(n)}(x) + r_N(x)    (14)

with the coefficients

    a_n(x) = \sum_{k=-\infty}^{\infty} \frac{1}{n!} (k - x)^n \cdot h(x - k)    (15)

and the remainder term

    r_N(x) = \sum_{k=-\infty}^{\infty} \frac{1}{(N+1)!} f^{(N+1)}(\xi_k) (k - x)^{N+1} h(x - k)    (16)

Eq. 14 shows that each derivative of the function has an associated coefficient. If a particular derivative of the function f(x) has to be reconstructed, the corresponding coefficient has to be one and all others have to be zero, in the ideal case. For practicable filters, only the first few coefficients will obey this scheme. The more coefficients agree with this scheme, the more numerically accurate the filter is. If the n-th coefficient of an n-th derivative filter is different from one, the filter has to be normalized by dividing by a_n [9]. In the following discussion this normalization step is assumed to be performed when necessary and not stated explicitly.

The coefficients a_n only depend on the filter. This allows a classification: all filters whose N-th coefficient, beyond the coefficient corresponding to the kind of derivative to be reconstructed (which has to be one), is different from zero belong to one class [9]. Filters in class N - 1 are called N-th degree error filters (N-EF). They can exactly reconstruct a polynomial of degree N - 1 or lower. Further assessment of filters in one class could be accomplished with the remainder term r_N. Note that the coefficients a_n actually are functions defined between zero and one, i.e., depending on the distance to the next sampling point [9].
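Because every window vanishes outside its support, the infinite sum in Eq. 15 collapses to the few sample positions inside the window, so the coefficients a_n(x) can be evaluated directly. The sketch below is our own illustration and reuses the filter and window functions defined earlier; the parameter value is the width-three optimum from Table 1.

```python
import numpy as np
from math import factorial

def taylor_coefficient(h, n, x, tau):
    """a_n(x) of Eq. 15 for a filter h with support [-tau, tau]: only sample
    positions k with |x - k| < tau contribute, since h vanishes elsewhere."""
    ks = np.arange(np.floor(x - tau), np.ceil(x + tau) + 1.0)
    return float(np.sum((ks - x) ** n * h(x - ks))) / factorial(n)

# a_0(x) of the Kaiser windowed cosc of width three (cf. Fig. 4, bottom left):
# for a good derivative filter this should stay close to zero on [0, 1].
h = lambda x: kaiser_window(x, alpha=9.28, tau=3.0) * cosc(x)
xs = np.linspace(0.0, 1.0, 101)
a0 = np.array([taylor_coefficient(h, 0, x, 3.0) for x in xs])
```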

We use this concept to analyze the effects of windowing the ideal reconstruction filters. In Fig. 4 the coefficient a_0 is plotted for the ideal derivative reconstruction filter (cosc) windowed with window width three. In order to appropriately reconstruct the derivative f'(x), the coefficient a_0 should be close to zero in the interval [0, 1]. On top left, the coefficient plot for the truncated cosc explains the bad result of Fig. 6. Bartlett, Welch, and Parzen window improve the situation but are still far from zero. On top right, Hann, Hamming, and Lanczos window are not much better, but the Blackman window shows quite a good result. On bottom left, the coefficient plots for the Kaiser windowed cosc and on bottom right for the Gaussian windowed cosc show, as one could expect, a direct dependence on the choice of the corresponding parameters, α and σ.

Let us take a closer look at the Kaiser window (a similar argument holds, of course, for the Gaussian window). Choosing α = 2 obviously is not appropriate for a window width of three as the coefficient plot is quite bad. With α = 4 the situation gets better, and for α = 8 a_0 is almost zero. With α = 16 the situation becomes worse again. One can expect a value of α between 8 and 16 to be optimal for gradient reconstruction. We will show that such an optimal value exists and how it can be computed.

5 Optimal parameters for Kaiser and Gaussian windows

As Möller et al. [10] already mentioned, a first derivative reconstruction filter is useless if the first coefficient a_0 is significantly different from zero. This is the case for the truncated cosc, and the results shown in Figs. 6c and 6i illustrate this situation. Our goal is to derive optimal parameters, in a purely numerical sense, for Kaiser and Gaussian windowed ideal reconstruction filters.

In order to do this, we compute the L_1 norm of the coefficient function a_n, given by

    \|a_n(x)\|_1 = \int_0^1 |a_n(x)| \, dx    (17)

and plot it against the varying parameter of the Kaiser or Gaussian windows. The L_1 norm was chosen for the sake of simplicity; it would, of course, also be possible to use the L_2 or L_∞ norm. The integral in Eq. 17 can be evaluated numerically. For a specific parameter to be optimal, the resulting coefficient error function will have the lowest value. The coefficient error function for the Kaiser windowed cosc (i.e., evaluating Eq. 17 for a_0) with width two for α between zero and twelve can be seen in Fig. 5 on the left. Higher values of α would not be reasonable since the window then already gets too narrow. This coefficient error function indeed has a global minimum, which proves that there is an optimal parameter. The right image in Fig. 5 shows an enlarged part of the neighborhood of this minimum.

Figure 5: Coefficient plot of the Taylor series expansion for Kaiser windowed sinc and cosc with varying parameters and window width two. The right image is a close-up of the minima of these functions.

We want to apply this concept to find optimal parameters also for the Kaiser and Gaussian windowed sinc filter for function reconstruction. To reconstruct the function itself, the coefficient a_0 must be one and all others zero. Due to the properties of the Taylor series expansion, the coefficient with most influence will be a_1. The higher n gets in Eq. 15, the lower the contribution of the coefficient. So we plot the L_1 norm of this coefficient against the varying parameters of Kaiser and Gaussian windowed sinc. The result for the Kaiser windowed sinc of width two can also be seen in Fig. 5 on the left. Surprisingly, we observe that as the parameter α of the Kaiser window approaches zero, the coefficient error function approaches zero also. A Kaiser window with α = 0 is a box function, which would result in a truncation of the sinc.

However, truncating the sinc filter is commonly known to cause unwanted artifacts, for example ringing [5]. Taking again a look at the coefficient error function of the Kaiser windowed sinc in Fig. 5, we further observe that it takes on another local minimum. The right image in Fig. 5 again shows an enlarged portion of the neighborhood of this minimum. Investigation of a_0 at this point shows that it is almost constant one, which it is not for α = 0, so that there is no normalization step necessary. Furthermore, a_2 at this local minimum is much closer to constant zero than a_2 for α = 0, so that we conclude that this local minimum is more appropriate than α = 0, although a_1 is not exactly zero. The same holds for the Gaussian window. The only difference is that the Gaussian window approaches constant one (which would result in a rectangular window) as σ approaches infinity.

With this method we are now able to compute optimal parameters for Kaiser and Gaussian windowed ideal reconstruction filters, in a strictly numerical sense. We evaluated values for Kaiser and Gaussian windowed sinc and cosc of window width two, three, four and five, which are presented in Table 1. Not reflected in this table is the fact that the absolute numerical error is one order of magnitude smaller for the Kaiser windowed ideal reconstruction filters as compared to the Gaussian or Blackman windowed ideal reconstruction filters.

                                                Window width
                                              2       3       4       5
  function reconstruction (sinc)
      α (Kaiser window)                     5.36    8.93   12.15    15.4
      σ (Gaussian window)                   1.11    1.33    1.46    1.63
  first derivative reconstruction (cosc)
      α (Kaiser window)                     6.05    9.28    12.5    15.5
      σ (Gaussian window)                   1.045   1.238   1.42    1.56

Table 1: Optimal values for the parameters of Kaiser and Gaussian windows for function and first derivative reconstruction with window width two, three, four, and five.
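In code, the optimization amounts to evaluating the coefficient error function of Eq. 17 on a grid of parameter values and picking its minimizer. This is our own minimal sketch (the grid resolutions are arbitrary assumptions), reusing taylor_coefficient, kaiser_window and cosc from above; for the sinc filter one would scan ||a_1||_1 instead and, as discussed, take the interior local minimum rather than the global one at α = 0.

```python
import numpy as np

def coefficient_error(h, n, tau, samples=200):
    """L1 norm of a_n(x) over [0, 1], Eq. 17, approximated by a simple Riemann sum."""
    xs = np.linspace(0.0, 1.0, samples)
    an = np.array([taylor_coefficient(h, n, x, tau) for x in xs])
    return float(np.mean(np.abs(an)))  # interval length is one

# Scan alpha for the Kaiser windowed cosc of width two (cf. Fig. 5, left);
# Table 1 reports the optimum as alpha = 6.05.
alphas = np.linspace(0.5, 12.0, 116)
errors = [coefficient_error(lambda x, a=a: kaiser_window(x, a, 2.0) * cosc(x), 0, 2.0)
          for a in alphas]
best_alpha = alphas[int(np.argmin(errors))]
```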

6 Results

To evaluate the results of the last chapters we first tested the reconstruction filters on a standard artificial data set proposed by Marschner and Lobb [6]. We used our framework "Smurf: a smart surface model for advanced visualization techniques" [4], which easily allows to replace reconstruction filters. We used a ray-casting iso-surface extraction algorithm and always used the first derivative of the function reconstruction filter also for first derivative reconstruction. This is according to the scheme proposed by Bentum et al. [1] (with the exception of linear interpolation, where central differences with linear interpolation were used).

Fig. 6a shows the result for central differences and linear interpolation. We clearly observe the smoothing, which is an intrinsic property of this method [2], as the image is too bright. In Fig. 6b, the Catmull-Rom spline and its derivative (i.e., a BC-spline with B = 0 and C = 0.5 [8] or, equivalently, a cardinal spline with a = -0.5 [1]) were used. We see that the smoothing is reduced but the structure of the artifacts remains. In Fig. 6c truncated sinc and cosc were used. This image shows practically what was discussed theoretically in Chapters 3 and 4. Truncating ideal reconstruction filters is more or less useless.

Figs. 6d-6f show images reconstructed with the sinc and cosc filter windowed with the more promising windows from our numerical analysis in Chapter 4. Fig. 6d shows the Blackman window, which is fixed for a certain width (we used a window width of three for this image). The Blackman window already gives quite a good behavior. In Fig. 6e a Kaiser window of width three was used with α = 8.93 for function reconstruction and α = 9.28 for first derivative reconstruction, according to Table 1, which further reduces the annoying artifacts present in the previous reconstructions. A Gaussian window of width three with numerically optimal parameters (σ = 1.33 for function reconstruction and σ = 1.238 for first derivative reconstruction) again shows some artifacts, especially visible in the highlights. This is due to the fact that a window width of three is too narrow for the Gaussian window [14].

The filters were also tested on real world data sets, such as a CT scan of a human head and an MR scan of a human kidney. Figs. 6g-6l show the results for a close-up of the kidney data set. The same reconstruction filters were used in the same order as for the Marschner-Lobb data set above. Figs. 6g and 6i show again the deficiencies of linear interpolation and central differences with linear interpolation and truncated sinc and cosc. Cubic splines (Fig. 6h) show some artifacts, especially in the highlights, which are removed by the Blackman and Kaiser window (Figs. 6j and 6k). The Gaussian window again introduces artifacts, also quite notable in the function reconstruction, due to the window width of three.

Fig. 1 shows the results for the head data set. Linear interpolation for function reconstruction and central differences with linear interpolation for gradient reconstruction were used in the left image. We see that this method introduces quite some smoothing, so that the skull appears quite smooth and visually appealing. However, the close-up below shows that it is not suitable for reconstructing small features. In the middle the Catmull-Rom spline was used both for function reconstruction and gradient reconstruction. We see that the skull exhibits staircase artifacts, as they are not smoothed out any more, but the close-up clearly exhibits the better reconstruction quality of this filter. On the right we used Kaiser windowed sinc and cosc with window width three and numerically optimal parameters. Here we can see the better reconstruction, especially as the highlights in the close-up appear more sharply. Again, the skull exhibits staircase artifacts, which actually are a visualization of data accuracy. Although the result with central differences is visually more appealing, the Kaiser windowed sinc and cosc are physically more correct.

For further images (also of the other, not so well performing windows) please refer to earlier work of the authors [14] and to the web page of this project [15].

7 Conclusions and Future Work

We showed that windowing ideal reconstruction filters allows to improve reconstruction quality as compared to the use of traditional reconstruction methods like linear interpolation or cubic splines. Windowing ideal reconstruction filters also has the advantage that better reconstruction can be achieved by using wider windows, which, however, also increases the computational cost.

From the comparison of some of the more commonly used windows we conclude that the choice of the windowing function is crucial. We showed that some popular windows, like the Lanczos or Hamming window, are not optimal in the sense of numerical accuracy. From our experiments as well as from numerical analysis we found Blackman, Kaiser and Gaussian windows performing best.

Further, we derived numerically optimal values for Kaiser and Gaussian windows, which have a parameter to control their shape. Our analysis of ideal reconstruction filters windowed with these windows, using a Taylor series expansion of the convolution sum, allowed us to determine optimal values for the parameters of these windows, in a strictly numerical sense. The Kaiser window performs best in this regard as the numerical error is smallest. As the Gaussian window shows some deficiencies for a window width of three, we conclude that the Blackman window is a first good choice which, however, can be improved by using a Kaiser window with numerically optimal parameters.

Continuity conditions, which are very important for the visual appearance of the resulting image [11], have not been addressed and are left for future work. Our experiments showed also that a coupling of function reconstruction and derivative reconstruction improves over-all reconstruction quality. This area deserves further investigation, probably in a way similar to Neumann et al. [12], who use a tight coupling of function and gradient reconstruction based on linear regression.

8 Acknowledgments

The work presented in this paper has been funded by the VisMed project (http://www.vismed.at/). VisMed is supported by Tiani Medgraph, Vienna (http://www.tiani.com/) and the Forschungsförderungsfonds für die gewerbliche Wirtschaft, Austria (http://www.telecom.at/fff/). The medical data sets are courtesy of Tiani Medgraph GesmbH, Vienna. Special thanks go to Jiří Hladůvka, Jan Přikryl and Robert F. Tobler for proof reading and valuable hints and discussions.

References

[1] M. J. Bentum, B. B. A. Lichtenbelt, and T. Malzbender. Frequency Analysis of Gradient Estimators in Volume Rendering. IEEE Transactions on Visualization and Computer Graphics, 2(3):242-254, September 1996.

[2] M. E. Goss. An adjustable gradient filter for volume visualization image enhancement. In Proceedings of Graphics Interface '94, pages 67-74, Banff, Alberta, Canada, May 1994. Canadian Information Processing Society.

[3] J. F. Kaiser and R. W. Schafer. On the Use of the I0-Sinh Window for Spectrum Analysis. IEEE Transactions on Acoustics, Speech and Signal Processing, ASSP-28(1):105, 1980.

[4] H. Löffelmann, T. Theußl, A. König, and M. E. Gröller. Smurf: a smart surface model for advanced visualization techniques. In Winter School of Computer Graphics '99, pages 156-164, Plzen, Czech Republic, 1999. To appear revised as "Smart surface interrogation for advanced visualization techniques" in the journal of High Performance Computer Graphics, April/May 2000.

[5] R. Machiraju and R. Yagel. Reconstruction error characterization and control: A sampling theory approach. IEEE Transactions on Visualization and Computer Graphics, 2(4):364-378, December 1996.

[6] S. R. Marschner and R. J. Lobb. An evaluation of reconstruction filters for volume rendering. In R. Daniel Bergeron and Arie E. Kaufman, editors, Proceedings of the Conference on Visualization, pages 100-107, Los Alamitos, CA, USA, October 1994. IEEE Computer Society Press.

[7] E. H. W. Meijering, W. J. Niessen, J. P. W. Pluim, and M. A. Viergever. Quantitative comparison of sinc-approximating kernels for medical image interpolation. In C. Taylor and A. Colchester, editors, Medical Image Computing and Computer Assisted Intervention - MICCAI'99, pages 210-217, September 1999.

[8] D. P. Mitchell and A. N. Netravali. Reconstruction filters in computer graphics. Computer Graphics, 22(4):221-228, August 1988.

[9] T. Möller, R. Machiraju, K. Müller, and R. Yagel. Classification and local error estimation of interpolation and derivative filters for volume rendering. In Proceedings 1996 Symposium on Volume Visualization, pages 71-78, September 1996.

[10] T. Möller, R. Machiraju, K. Müller, and R. Yagel. Evaluation and Design of Filters Using a Taylor Series Expansion. IEEE Transactions on Visualization and Computer Graphics, 3(2):184-199, 1997.

[11] T. Möller, K. Müller, Y. Kurzion, R. Machiraju, and R. Yagel. Design of accurate and smooth filters for function and derivative reconstruction. In IEEE Symposium on Volume Visualization, pages 143-151. IEEE, ACM SIGGRAPH, 1998.

[12] L. Neumann, B. Csébfalvi, A. König, and M. E. Gröller. Gradient estimation in volume data using 4D linear regression. Technical Report TR-186-2-00-03, Institute of Computer Graphics, Vienna University of Technology, 2000. Accepted for EUROGRAPHICS 2000.

[13] W. H. Press, B. P. Flannery, S. A. Teukolsky, and W. T. Vetterling. Numerical Recipes in C: The Art of Scientific Computing. Cambridge University Press, 1988.

[14] T. Theußl. Sampling and reconstruction in volume visualization. Master's thesis, Vienna University of Technology, 2000. http://www.cg.tuwien.ac.at/~theussl/DA/.

[15] T. Theußl, H. Hauser, and M. E. Gröller. Mastering windows: Improving reconstruction - web page. http://www.cg.tuwien.ac.at/research/vis/vismed/Windows/.

[16] K. Turkowski. Filters for common resampling tasks. In Andrew S. Glassner, editor, Graphics Gems I, pages 147-165. Academic Press, 1990.

[17] G. Wolberg. Digital Image Warping. IEEE Computer Society Press, 10662 Los Vaqueros Circle, Los Alamitos, CA, 1990. IEEE Computer Society Press Monograph.

Figure 6: Marschner-Lobb data set (a-f) and data set of a human kidney (g-l) reconstructed with (a,g) linear interpolation and central differences and linear interpolation, (b,h) Catmull-Rom spline and derivative, (c,i) truncated (rectangular window) sinc and cosc of width three, (d,j) Blackman windowed sinc and cosc with window width three, (e,k) Kaiser windowed sinc and cosc with window width three and numerically optimal parameters and (f,l) Gaussian windowed sinc and cosc with window width three and numerically optimal parameters.