I'm sorry for having been unnecessarily confusing about the hashmap issue in my previous posts.
The way you are wording this shows a bit of confusion on your part too, which I guess has compounded the communication problem. It is not misleading to say that growing a hashmap is O(n). There is no such thing as "can in the worst case approach O(n)" or "will in the worst case be O(n)". O notation is an upper bound on a function's growth rate (f(n) = O(g(n)) means f(n) <= c*g(n) for some constant c and all sufficiently large n), so it describes exactly the worst case and nothing else. The distinction between it and average behavior is not merely a point of academic interest if you are working on a realtime system (games, for one), need absolute reliability, etc.
Initializing a hashmap to empty is O(1), but as I've tried to point out, using a hashmap in this fashion cannot lead to a valid solution, since set operations will then grow the hashmap and be O(n).
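
To make that concrete, here is a minimal sketch of a growing open-addressing table (the class name, initial size, and 2/3 load factor are illustrative choices of mine, not any particular library's internals). Most calls to set are O(1), but the single call that crosses the load-factor threshold has to rehash every stored entry:

```python
# Minimal sketch of a growing open-addressing hash table.
# Illustrative only: names, initial size, and load factor are mine.

class GrowingTable:
    def __init__(self):
        self._slots = [None] * 8          # (key, value) pairs or None
        self._count = 0

    def set(self, key, value):
        # The one call that pushes the load factor past 2/3 rehashes
        # every stored entry, so *that* call is O(n), even though the
        # typical call is O(1).
        if (self._count + 1) * 3 > len(self._slots) * 2:
            self._grow(len(self._slots) * 2)
        i = hash(key) % len(self._slots)
        while self._slots[i] is not None and self._slots[i][0] != key:
            i = (i + 1) % len(self._slots)   # linear probing
        if self._slots[i] is None:
            self._count += 1
        self._slots[i] = (key, value)

    def _grow(self, new_size):
        old = self._slots
        self._slots = [None] * new_size
        self._count = 0
        for entry in old:                    # touches all n entries: O(n)
            if entry is not None:
                self.set(*entry)
```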
What I meant by "arbitrary size" was initializing a hashmap to its final size immediately, so that set operations could be made O(1), but then the initialization would not be O(1), and neither would the reset operation of our data structure.
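
For contrast, a sketch of that "final size" variant (again with illustrative names, and assuming the maximum number of keys is known up front): set never triggers a rehash, but initialization and reset each have to touch every slot:

```python
# Sketch of the preallocated variant: capacity is fixed at construction,
# so set never rehashes -- but __init__ and reset both walk all the
# slots, so they are O(n), not O(1). (set is O(1) expected; a bad key
# distribution can still lengthen probe chains, but there is no growth
# step.) Illustrative only.

class PreallocatedTable:
    def __init__(self, capacity):
        # Allocating and clearing the slot array is O(capacity).
        self._slots = [None] * (capacity * 2)    # keep load factor <= 1/2

    def set(self, key, value):
        i = hash(key) % len(self._slots)
        while self._slots[i] is not None and self._slots[i][0] != key:
            i = (i + 1) % len(self._slots)       # probes, but never grows
        self._slots[i] = (key, value)

    def reset(self):
        # Clearing also touches every slot: O(capacity), not O(1).
        for i in range(len(self._slots)):
            self._slots[i] = None
```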