I'm having some trouble writing a program to measure cache size in C. I understand the basic approach, but I'm still having trouble figuring out exactly what I'm doing wrong.
Basically, I create arrays of varying length (growing by powers of 2) and read each element into a dummy variable. I sweep the whole array around 1000 times so the measurement isn't dominated by the noise of a single pass, then look for the array size that causes a big jump in access time. Unfortunately, this is where my problem lies: I don't see that jump with my code, so clearly I'm doing something wrong.
Another thing: I used /proc/cpuinfo to check the cache, and it reported a size of 6114, which is not a power of 2. I was told to go by powers of 2 to find the cache size, so can anyone explain why the reported size isn't one?
Here is the gist of my code. I will post the rest if need be:
#include <stdio.h>
#include <stdlib.h>
#include <sys/time.h>

int main(void)
{
    struct timeval start;
    struct timeval end;
    int n = 1;                     /* change this to test different sizes */
    int array_size = 1048576 * n;  /* I'm checking the time "manually" first, hence the
                                      separate n, before making the program loop over sizes itself */
    char x = 0;
    int i = 0, j = 0;
    char *a;

    a = malloc(sizeof(char) * array_size);

    gettimeofday(&start, NULL);
    for (i = 0; i < 1000; i++)
    {
        for (j = 0; j < array_size; j += 1)
        {
            x = a[j];  /* dummy read of each element */
        }
    }
    gettimeofday(&end, NULL);

    /* accumulate in long long: tv_sec * 1000000 overflows a 32-bit int */
    long long timeTaken = (end.tv_sec * 1000000LL + end.tv_usec)
                        - (start.tv_sec * 1000000LL + start.tv_usec);
    printf("Time Taken: %lld \n", timeTaken);
    /* divide by the total number of accesses (array_size elements * 1000 passes) */
    printf("Average: %f \n", (double)timeTaken / ((double)array_size * 1000.0));

    free(a);
    return 0;
}
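For reference, this is the direction I was planning to take for the automatic loop over sizes. It's only a minimal sketch: the volatile dummy variable, the memset, and the warm-up pass are my own additions (my assumption is that without volatile the compiler is free to throw the unused reads away entirely, which might even be my original problem), so treat it as untested:

#include <stdio.h>
#include <stdlib.h>
#include <string.h>
#include <sys/time.h>

/* Time `passes` full sweeps over `size` bytes and return the average
   cost per access in microseconds. The volatile sink is an assumption
   on my part: without it the reads have no observable effect and the
   compiler may delete the loop. */
static double time_per_access(char *a, int size, int passes)
{
    volatile char sink = 0;
    struct timeval start, end;
    int i, j;

    gettimeofday(&start, NULL);
    for (i = 0; i < passes; i++)
        for (j = 0; j < size; j++)
            sink = a[j];
    gettimeofday(&end, NULL);

    long long us = (end.tv_sec * 1000000LL + end.tv_usec)
                 - (start.tv_sec * 1000000LL + start.tv_usec);
    return (double)us / ((double)size * passes);
}

int main(void)
{
    int size;

    /* Sweep power-of-2 sizes from 1 KiB to 64 MiB; the hope is that the
       jump shows up between sizes that fit in a cache level and sizes
       that don't. */
    for (size = 1024; size <= 64 * 1048576; size *= 2)
    {
        char *a = malloc(size);
        if (a == NULL)
        {
            perror("malloc");
            return 1;
        }
        memset(a, 0, size);              /* touch every page up front so page
                                            faults don't land in the timed loop */
        time_per_access(a, size, 4);     /* untimed warm-up sweep */
        printf("%10d bytes: %f us/access\n", size, time_per_access(a, size, 16));
        free(a);
    }
    return 0;
}

My understanding (which may be wrong) is that even with this, hardware prefetching can smooth the jump out for purely sequential reads, so stepping by the cache line size (e.g. j += 64) might make it more visible.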