Joris van Lier said:
Given two implementations of an algorithm, how do I determine the relative
computational complexity of each?
Are there tools that can determine the relative performance of two
algorithms or implementations?
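To make the question concrete, this is roughly the kind of comparison I have in mind: a minimal timing harness in Python, where impl_a and impl_b are only placeholders for the two implementations being compared.

```python
import timeit

def impl_a(data):
    """Placeholder for the first implementation."""
    return sorted(data)

def impl_b(data):
    """Placeholder for the second implementation."""
    return sorted(data, reverse=True)[::-1]

if __name__ == "__main__":
    data = list(range(100_000, 0, -1))
    for name, fn in (("impl_a", impl_a), ("impl_b", impl_b)):
        # Average wall-clock time over 10 runs on identical input.
        elapsed = timeit.timeit(lambda: fn(data), number=10)
        print(f"{name}: {elapsed / 10:.4f} s per run")
```

This only tells me which is faster on the inputs I happened to try, which is why I'm also asking whether there are tools that go further than simple timing.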
Let me explain this a bit more:
I'm working on a process where multiple copies of data may exist.
New data is supplied to the process together with a location to store it in;
the process stores the data in the designated location and
replaces all other copies with links to the new location.
For the purpose of this example I'll assume that the storage is the
filesystem, the data is contained in files, all files are of equal
length, a location is a path, and all locations where data can be stored are
known beforehand.
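Here is a rough sketch of what one step of the process looks like under those assumptions (store_and_relink, the directory layout, and the use of symlinks are my own illustration, not a fixed design):

```python
import filecmp
import os

def store_and_relink(data: bytes, target: str, locations: list[str]) -> None:
    """Store data at target, then replace copies found elsewhere with links.

    locations: directories that may hold copies (all known beforehand).
    """
    with open(target, "wb") as f:
        f.write(data)

    for location in locations:
        for name in os.listdir(location):
            path = os.path.join(location, name)
            if path == target or not os.path.isfile(path) or os.path.islink(path):
                continue
            # Byte-for-byte comparison of each candidate file against the
            # newly stored data; this is the part I expect to dominate the cost.
            if filecmp.cmp(path, target, shallow=False):
                os.remove(path)
                os.symlink(target, path)
```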
With my limited math skills, and assuming the program runs as a single thread,
I tried to work this out as follows:
D = data unit (file)
L = number of locations
F = average number of files in a location
B = average number of bytes in a file

O(f(D)) = L * F * B

Is this correct?
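As a sanity check of that estimate, here it is with some purely made-up numbers:

```python
# Back-of-the-envelope check of the L * F * B estimate
# (all figures below are made up for illustration).
L = 20          # locations
F = 500         # average files per location
B = 1_000_000   # average bytes per file

bytes_scanned = L * F * B
print(f"Worst case, storing one new file means scanning ~{bytes_scanned:,} bytes")
# -> Worst case, storing one new file means scanning ~10,000,000,000 bytes
```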
How do I account for multiple concurrent threads of execution?
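For example, if I split the locations across a pool of worker threads as sketched below (bytes_in_location is only a stand-in for the real per-location scan), does the L * F * B estimate itself change, or only the wall-clock time?

```python
import os
from concurrent.futures import ThreadPoolExecutor

def bytes_in_location(location: str) -> int:
    """Stand-in for the per-location work: here it just sums file sizes."""
    total = 0
    for name in os.listdir(location):
        path = os.path.join(location, name)
        if os.path.isfile(path):
            total += os.path.getsize(path)
    return total

def scan_all(locations: list[str], workers: int = 4) -> int:
    # The total work stays proportional to L * F * B regardless of the
    # thread count; with W workers the wall-clock time is roughly divided
    # by W, until disk (or network) I/O bandwidth becomes the bottleneck.
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return sum(pool.map(bytes_in_location, locations))
```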