The list update and paging problems are fundamental problems
in computer science. Various researchers (e.g., [1])
developed and analyzed algorithms for these problems under the
statistical assumption that the distribution of input requests
(i.e., the “average case”) is known. Sleator and Tarjan [2] proposed worst-case
competitive analysis as a new way to evaluate the performance
of an algorithm. In competitive analysis, the online algorithm
is assumed to have no knowledge of the input. We propose a “semi-random”
adversary as a way to bridge the gap between the worst case and
the average case. In this model, the adversary is allowed to “corrupt”
a fraction of the input coming from a known distribution. Thus
the online algorithm has limited knowledge of the input distribution.
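One way to picture the semi-random model is as a request generator: each request is drawn from the known distribution, but the adversary may overwrite it. The sketch below is an illustrative interpretation only; the per-request corruption probability `epsilon`, the function names, and the corruption rule are assumptions, not the formal model from the project.

```python
import random

def semi_random_requests(n, items, weights, epsilon, adversary, seed=None):
    """Generate a request sequence in a sketch of the semi-random model.

    Each request is drawn from the known distribution given by `weights`;
    independently, with probability `epsilon` the adversary overwrites it.
    `adversary` maps the sequence produced so far to a corrupted request.
    (Interpreting "a fraction of the input" as a per-request corruption
    probability is an assumption made for this illustration.)
    """
    rng = random.Random(seed)
    seq = []
    for _ in range(n):
        r = rng.choices(items, weights=weights)[0]  # known distribution
        if rng.random() < epsilon:                  # adversarial corruption
            r = adversary(seq)
        seq.append(r)
    return seq
```

Setting `epsilon = 0` recovers the pure average case, while `epsilon = 1` hands every request to the adversary, recovering the worst case.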

The goal of this project is to evaluate a known algorithm, such as
Frequency Counting, both theoretically and experimentally, against
a semi-random adversary. The ideas developed here could be used to
propose an algorithm whose bounds interpolate smoothly between
the average case and the worst case.
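For concreteness, here is a minimal sketch of the Frequency Counting heuristic under the standard list-update cost model, where accessing the item at position i costs i. The function name and the tie-breaking rule (move an item forward only past items with strictly smaller counts) are implementation assumptions.

```python
def frequency_count_cost(requests, items):
    """Serve a request sequence with the Frequency Counting heuristic.

    The list is kept ordered by non-increasing access count: after each
    access, the requested item's count is incremented and the item is
    moved forward past every item with a strictly smaller count.
    The cost of an access is the 1-based position of the requested item.
    Returns the total access cost over the whole sequence.
    """
    lst = list(items)
    counts = {x: 0 for x in lst}
    total = 0
    for r in requests:
        pos = lst.index(r) + 1   # access cost = current position
        total += pos
        counts[r] += 1
        # restore the sorted-by-count invariant with adjacent swaps
        i = pos - 1
        while i > 0 and counts[lst[i - 1]] < counts[r]:
            lst[i - 1], lst[i] = lst[i], lst[i - 1]
            i -= 1
    return total
```

An experiment against a semi-random adversary could then compare this total cost with the optimal offline cost over generated request sequences.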

[1] R. Rivest, On self-organizing
sequential search heuristics, Communications of the ACM,
19:63–67, 1976.

[2] Daniel D. Sleator and Robert Endre Tarjan, Amortized
efficiency of list update and paging rules, Communications
of the ACM, 28:202–208, 1985.

Preliminary Presentation (ppt)

Final Presentation (ppt)

Other REUs for Summer 2004

Algorithms for Dynamic Point Location with Good Practical Performance

A Bezier-Based Approach to Unstructured Moving Meshes

Evaluation of Algorithms for the List Update Problem

Exploring PLS-Completeness of Simple Stochastic Game (or Stable Circuit Problem)

Fast and Compact Data Structures

The Game of NimG (Nim on Graphs)

Random Graph Models of Large Real-World Networks

Solving Partial Differential Equations Numerically

Traceable Anonymity
