04-14-2005, 05:33 PM   #1
RAGEAngel9
Insane
 
Location: Vermont
Algorithm Timing

Hey guys, quick question that I can't seem to figure out.
I'm trying to show how a program I wrote improves (or worsens) as the number of processors increases. However, I can't for the life of me work out a way to represent the time it takes for one of the functions to run.
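If an empirical number is enough, wall-clocking the call works; here's a rough sketch of that (partialSum and its dummy body are just placeholders standing in for the real routine below, the arguments are made up, and gettimeofday assumes a POSIX system):

Code:
    #include <sys/time.h>   // POSIX wall-clock timer
    #include <stdio.h>

    // Placeholder standing in for the real series routine (posted below).
    double partialSum(double start, double x, int p, int n)
    {
        double sum = start;
        for (int i = 0; i < 1000000; i++)        // dummy work, just something to time
            sum += x / (p + n + i + 1.0);
        return sum;
    }

    int main()
    {
        struct timeval t0, t1;

        gettimeofday(&t0, 0);
        double result = partialSum(1.0, 2.0, 0, 4);   // example arguments only
        gettimeofday(&t1, 0);

        double seconds = (t1.tv_sec - t0.tv_sec)
                       + (t1.tv_usec - t0.tv_usec) / 1e6;
        printf("result = %g, elapsed = %g s\n", result, seconds);
        return 0;
    }
If the parallel version uses threads, wall-clock time like this is the number to compare as the processor count changes; clock()-style CPU time adds up across threads and can hide any speedup. What I'm stuck on is the analytic side, which comes down to how many times the loops in this routine run: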


Code:
    // Assumes start, x, p (presumably this processor's starting offset) and
    // n (the number of processors) are set by the caller.
    double term = start;
    double precision = 1e-6;
    double sum = start;
    int i;

    // Add one term per pass through the main loop;
    // stop once the last term added falls below the precision.
    for (i = p; term > precision; )
    {
        double newFac = 1;
        // Build the factor that jumps from term i to term i+n,
        // given the (n)umber of processors
        for (int j = 1; j <= n; j++)
        {
            newFac = newFac * (x / (i + j));
        }
        // Multiply in the factor to get the new term, then add it to the sum
        term = term * newFac;
        sum += term;
        i = i + n;
    }
I know the inner loop runs n times each time the main loop runs. But I can't figure out how to determine the number of times the main loop runs in any sort of formulaic way.
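(Working it through: with start = 1 and p = 0 the term after k passes of the main loop is x^(nk)/(nk)!, so the loop stops at the smallest k where that drops below the precision, and as far as I can tell that k has no tidy closed form, only bounds.) Failing a formula, counting the passes directly is easy; here's a rough sketch of the same loop with a counter bolted on (the counter and the printf are my additions, and printf needs stdio.h):

Code:
    int passes = 0;    // counts trips through the main loop
    for (i = p; term > precision; )
    {
        double newFac = 1;
        for (int j = 1; j <= n; j++)
            newFac = newFac * (x / (i + j));

        term = term * newFac;
        sum += term;
        i = i + n;
        passes++;
    }
    printf("main loop ran %d times with n = %d\n", passes, n);
Comparing that count for a few values of n should show the outer loop scaling roughly like 1/n while the total work (passes times n inner iterations) stays about the same.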

Yes, it's homework, but 90% of the assignment was developing and writing the algorithm.
Thanks