Hey guys, quick question that I can't seem to crack.
I'm trying to show how a program I wrote improves (or worsens) as the number of processors increases. However, I can't for the life of me work out a way to represent the time it takes for one of the functions to run.
Code:
double term = start;
double newTerm = 0;
double precision = 1e-6;
double sum = start;
int i = 0;

// Calculate each new term and add it to the total.
// Stop when the previous term is less than the precision.
for (i = p; term > precision; )
{
    double newFac = 1;
    // Calculate the necessary jump given i and the (n)umber of processors
    for (int j = 1; j <= n; j++)
    {
        newFac = newFac * (x / (i + j));
    }
    // Get the new term
    newTerm = newFac;
    term = term * newTerm;
    // Increment the sum by the new term
    sum += term;
    i = i + n;
}
I know the inner loop runs n times on each iteration of the main loop, but I can't figure out how to determine the number of times the main loop itself runs in any sort of formulaic way.
Yes, it's homework, but 90% of the assignment was developing and writing the algorithm.
Thanks