What notation do you use for the average time complexity of an algorithm? It occurs to me that the proper way would be to use big-theta to refer to a set of results (even though a specific run may differ). For example, average array search would be Θ((n+1)/2).
Is this right?
Explanation of the average search time being Θ((n+1)/2): Given that the sum of access times for the first n positions is 1 + 2 + ... + n = n(n+1)/2, the average time over a large number of random accesses to the array will be (n(n+1)/2)/n = (n+1)/2. In this case I would say that Θ((n+1)/2) is the average access time. It makes sense (to me), but since I'm new to asymptotic notation (and to programming in general), I'd like to know whether this is common practice.
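To spell out that arithmetic, under the assumption that each of the n positions is the target with equal probability 1/n and that reaching position i costs i steps:

$$\sum_{i=1}^{n} i = \frac{n(n+1)}{2}, \qquad \text{average cost} = \frac{1}{n}\sum_{i=1}^{n} i = \frac{n+1}{2}.$$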
I'm asking because I'm confused to see big-O used everywhere on pages like http://bigocheatsheet.com, e.g. the average case of array search given as O(n).
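To make the comparison with the cheat sheet concrete, here is a minimal Python sketch (my own, not from that page; the function names are made up) that counts comparisons in a linear search with a uniformly random target. The measured average tracks (n+1)/2, i.e. it grows linearly in n:

```python
import random

def linear_search_comparisons(arr, target):
    """Return the number of comparisons a linear search makes to find target."""
    for i, value in enumerate(arr, start=1):
        if value == target:
            return i
    return len(arr)  # target absent: every element was compared

def average_comparisons(n, trials=100_000):
    """Estimate the average comparison count for a uniformly random target."""
    arr = list(range(n))
    total = 0
    for _ in range(trials):
        target = random.randrange(n)  # target is always present, chosen uniformly
        total += linear_search_comparisons(arr, target)
    return total / trials

if __name__ == "__main__":
    for n in (10, 100, 1000):
        # empirical average vs. the analytical value (n+1)/2
        print(n, average_comparisons(n), (n + 1) / 2)
```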