To use matched filtering, we need to analyze the data by presupposing that there is a pulsar at each given point in the sky, and then searching for the corresponding signal in the data set. In practice we set up a very fine grid on the sky, in frequency, and in the rate of change of frequency (the frequency derivative), and search at each grid point.
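The grid search described above can be sketched schematically. This is only an illustration: the detection statistic below is a toy stand-in (the real Einstein@Home search uses a matched-filter statistic computed from the detector data), and the grid points are arbitrary example values.

```python
# Schematic sketch of a grid search over sky position, frequency, and
# frequency derivative. toy_statistic is a stand-in for the real
# matched-filter detection statistic; it peaks at an arbitrary
# "injected" signal purely for illustration.
import itertools

def toy_statistic(sky_point, f, fdot):
    # Toy stand-in: largest (least negative) at ra=1.0, dec=0.5,
    # f=300.0 Hz, fdot=-1e-9 Hz/s.
    ra, dec = sky_point
    return -((ra - 1.0)**2 + (dec - 0.5)**2
             + (f - 300.0)**2 + ((fdot + 1e-9) * 1e9)**2)

def grid_search(sky_grid, freq_grid, fdot_grid):
    """Evaluate the statistic at every grid point; keep the loudest."""
    best = None
    for sky, f, fdot in itertools.product(sky_grid, freq_grid, fdot_grid):
        s = toy_statistic(sky, f, fdot)
        if best is None or s > best[0]:
            best = (s, sky, f, fdot)
    return best

sky_grid = [(ra, dec) for ra in (0.9, 1.0, 1.1) for dec in (0.4, 0.5, 0.6)]
freq_grid = [299.9, 300.0, 300.1]
fdot_grid = [-2e-9, -1e-9, 0.0]
score, sky, f, fdot = grid_search(sky_grid, freq_grid, fdot_grid)
print(sky, f, fdot)  # → (1.0, 0.5) 300.0 -1e-09
```

The number of grid points in each dimension grows with the integration time T, which is the source of the steep scaling discussed next.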
The computer power required for this search grows very rapidly as a function of the length of integration time T, primarily because the grid used on the sky, in frequency, and in frequency derivative gets finer and finer as T increases. For a fixed band of search frequencies, the computation time is proportional to the sixth power of T. So increasing the integration time T from 10 hours to 20 hours increases the amount of computer power needed by a factor of (20/10)^6 = 2^6 = 64.
In practice, if we are restricted to computers and computer clusters controlled by the LIGO Scientific Collaboration (LSC) [7], we can't search for very long integration times T. This is too bad, since the larger T is, the more likely it is that we can 'dig down' through the instrument noise to find a pulsar signal buried in it.
This is where Einstein@Home and your computer can help us. With many more computers available, we can dig deeper down into the noise to search for signals.
When you run Einstein@Home on your computer, you get credits for your work, once the results have been validated. (The results are validated by comparing them to results obtained by other users for the same work.) The credits you accumulate over time reflect the amount of valid work that your computer (or computers, if you have more than one) has accomplished. Roughly speaking, each 100 credits corresponds to the work that could be done in a 24-hour period by a computer doing 1 billion floating point operations (add, subtract, multiply or divide) per second.
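The credit arithmetic above works out as follows (a rough back-of-the-envelope calculation, not the exact formula the BOINC credit system uses):

```python
# Rough credit arithmetic from the text: 100 credits correspond to one
# day of work by a machine doing 1e9 floating point operations per
# second, i.e. about 8.64e13 operations per 100 credits.
GFLOPS = 1e9             # operations per second of the reference machine
SECONDS_PER_DAY = 86400

def ops_per_credit():
    """Floating point operations represented by a single credit."""
    # 100 credits correspond to one reference-machine day of work.
    return GFLOPS * SECONDS_PER_DAY / 100

print(ops_per_credit())  # → 864000000000.0, i.e. about 8.64e11 ops per credit
```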
Einstein@Home S3 Analysis Summary 
Last Revised: 2005.09.11 16:22:17 UTC 
Copyright © 2005 Bruce Allen for the LIGO Scientific Collaboration

Document version: 1.97 