While studying Purely Functional Data Structures by Chris Okasaki, I came across a recursive function that returns all possible suffixes of a list, ordered from longest to shortest:
suffixes :: [a] -> [[a]]
suffixes (x:xs) = (x:xs) : suffixes xs
suffixes _ = []
It should have a time complexity of O(n) and memory usage of O(n).
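As a quick sanity check (my own sketch, not from the book), the definition above behaves as follows on a small list. Note that with this definition the empty suffix is not included, and that each suffix shares its tail with the previous one, which is why the total memory is O(n) rather than O(n²):

```haskell
-- The definition from the question, with an explicit type signature.
suffixes :: [a] -> [[a]]
suffixes (x:xs) = (x:xs) : suffixes xs
suffixes _      = []

main :: IO ()
main = print (suffixes [1, 2, 3 :: Int])  -- prints [[1,2,3],[2,3],[3]]
```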
There is a table on the Haskell wiki, GHC/Memory Footprint, but it only applies to data that is "simple" and does not share values internally.
How can I determine the in-memory size of suffixes [1..10]?
There is a package on Hackage called ghc-datasize, which returns the size in bytes of recursive data structures. To use it, install the package and import it in a file along with the definition of suffixes; it can then be used from GHCi.
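As a sketch of what such a session might look like (I am assuming the package exposes a GHC.DataSize module with a recursiveSizeNF function; check the package documentation on Hackage for the exact API, and note that the reported number depends on your GHC version and platform):

```
$ cabal install ghc-datasize
$ ghci Suffixes.hs
ghci> import GHC.DataSize
ghci> recursiveSizeNF (suffixes [1..10 :: Int])
```

The last line forces the structure to normal form and then prints its total size in bytes, following shared substructures only once.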
According to the question and the comments on your self-answer, what you are interested in is not the current size in memory, but rather the asymptotic memory complexity of algorithms in Haskell. When we focus the question on that topic, the most important fact to emphasize is that Haskell is lazily evaluated: an expression is only evaluated as far as its result is actually demanded, so both the time and the memory cost of an expression depend on how it is consumed.
Let us now work on some examples.
A rather dramatic example involves the function const, which has this definition:

const x _ = x

Under a lazy evaluation strategy, when we evaluate an application of const we discard its second argument without evaluating it. This means that the memory complexity of the second argument contributes nothing to that of the whole expression. If we evaluate an application like const 42 (suffixes [1..]) by hand, we never touch suffixes [1..]. So in this example the time and memory cost of the second argument are zero, which makes the whole expression O(1) in both respects. Let's look at another example:
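The original definition of ejemplo2 was not reproduced above; as a hypothetical reconstruction consistent with the complexities discussed below (O(n) time, O(1) memory), here is a length-like function with a strict accumulator:

```haskell
-- Hypothetical reconstruction of ejemplo2: a length-like function.
-- The accumulator is forced with `seq` at each step, so no chain of
-- suspended (+1) thunks builds up: O(n) time, O(1) extra memory.
ejemplo2 :: [a] -> Int
ejemplo2 = go 0
  where
    go n []     = n
    go n (_:xs) = let n' = n + 1 in n' `seq` go n' xs

main :: IO ()
main = print (ejemplo2 [1..100 :: Int])  -- prints 100
```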
Now let's evaluate:
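Assuming, for concreteness, a length-like definition ejemplo2 = go 0, where go n [] = n and go n (_:xs) = go (n+1) xs with a strict accumulator (my reconstruction; the original example was not reproduced), the hand evaluation takes one step per list element, and the expression at every step has constant size:

```
ejemplo2 [1,2,3]
= go 0 [1,2,3]
= go 1 [2,3]
= go 2 [3]
= go 3 []
= 3
```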
A simple and useful technique for estimating the complexity of an algorithm in Haskell is to evaluate examples by hand, as I did here, and observe how two factors change as we vary the size or value of the arguments: the number of reduction steps (time) and the size of the expressions we have to keep around at each step (memory).
In this example what we can notice is that:

- The number of evaluation steps for ejemplo2 is proportional to the length of the list we pass to it as an argument, so ejemplo2 has a time complexity of O(n), where n is the length of the list.
- ejemplo2 has memory usage of O(1), since each step keeps only a constant amount of data around.

This manual evaluation technique has its difficulties, however:
For example, Haskell implementations share subexpressions: in the definition square x = x * x, the subexpression x appears twice but is stored in memory only once, and if we are not careful about sharing we could calculate the memory complexity wrongly. The examples could be multiplied, but these are enough to illustrate the difficulty of the question.
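The effect of sharing can be observed directly. In the sketch below (my own illustration, using Debug.Trace), the thunk bound to x is forced only once even though x occurs twice in the body of square:

```haskell
import Debug.Trace (trace)

square :: Int -> Int
square x = x * x

main :: IO ()
main = print (square (trace "forcing x" 5))
-- "forcing x" appears only once on stderr: the thunk bound to x is
-- shared, so the second use of x in x * x sees the already-evaluated
-- value instead of forcing the trace again. The program prints 25.
```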