Are you thinking about grey goo?
The paperclip maximizer generally refers to an AGI whose value system is not aligned with human values: an AGI smart enough to pursue its goal (making paperclips) so efficiently that it becomes a threat to humans through sheer resource consumption.
So it's not a good example of dumb, tiger-like AIs occasionally becoming a threat to humans, who on average can still outcompete a tiger with ease.