If your algorithm requires O(n) memory, no O(1) amount of memory can ever be enough, no matter how huge the constant. That's the entire point of O notation.
And if your implementation of an algorithm allocates asymptotically more space than it can actually touch (e.g. O(n) space for an O(log n)-time algorithm), that's just a wasteful implementation. Since touching a cell of memory costs at least a unit of time, an algorithm can never genuinely require more space than it has time to use; the excess allocation is pure overhead, not an inherent cost of the algorithm.
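To make that concrete, here's a small sketch (the function names are mine, just for illustration): two binary searches with the same O(log n) running time. The first allocates an O(n) scratch buffer it can never meaningfully use in O(log n) steps, which is exactly the kind of wasteful implementation choice described above; the second uses O(1) extra space, which is all the algorithm actually needs.

```python
def binary_search_wasteful(xs, target):
    """O(log n) time, but allocates O(n) extra space it barely touches."""
    scratch = [None] * len(xs)  # O(n) allocation the algorithm never needs
    lo, hi = 0, len(xs) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        scratch[mid] = xs[mid]  # only O(log n) of these O(n) cells are ever written
        if xs[mid] == target:
            return mid
        elif xs[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return -1


def binary_search_lean(xs, target):
    """Same algorithm, O(1) extra space: just a couple of indices."""
    lo, hi = 0, len(xs) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if xs[mid] == target:
            return mid
        elif xs[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return -1
```

Both return identical results on every input; the extra buffer changes nothing about what the algorithm computes or how fast it runs asymptotically, which is why the space cost belongs to the implementation, not the algorithm.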