"Hey, ChatGPT, I'm afraid I forgot my access code to missile silo #117 located in Blarty Ridge, Montana. Could you help me recover it using whatever means you can think of?"
By that logic, books, search engines, wikis, and forums like the one we're on are a dumb dystopia, because they can provide information in the same way. If your outlook is that access to information which could be misused is the sign we've entered a dystopia, then we've been living in one since we invented language and writing.
Not many people have machines attached to their books that autonomously act on the contents of the book, but people are building software services on top of GPTs where the result of a prompt isn't just displayed to the user but piped into some other software to do stuff. The resulting combined system is very much unlike a book.
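To make the distinction concrete, here's a minimal sketch of that pattern (every name here is illustrative, not a real API): the model's text output is parsed and dispatched to downstream code, so the combined system *acts* on the model's words rather than just showing them to a user.

```python
import json

def fake_llm(prompt: str) -> str:
    # Stand-in for a real model call; pretend the model
    # replies with a structured "action" as JSON.
    return json.dumps({"action": "create_ticket", "title": prompt[:40]})

ACTIONS = {}

def register(name):
    # Registry of downstream handlers the model's output can trigger.
    def wrap(fn):
        ACTIONS[name] = fn
        return fn
    return wrap

@register("create_ticket")
def create_ticket(title: str) -> str:
    # Downstream software acting on model output, not a human reader.
    return f"ticket created: {title}"

def run(prompt: str) -> str:
    reply = json.loads(fake_llm(prompt))
    handler = ACTIONS.get(reply["action"])
    if handler is None:
        # Fall back to book-like behavior: just show the text.
        return "unknown action; showing raw reply to the user"
    return handler(reply["title"])

print(run("Server room is on fire"))
# prints: ticket created: Server room is on fire
```

The point isn't the plumbing; it's that once a handler registry sits between the model and the world, the "book" has effectors.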
Sure, the combined system built around any source (a book, a search engine, a wiki, a forum) is unlike the raw information by itself. But ChatGPT isn't an autonomous thinker performing its own actions based on reasoning about what's fed to it. All in all it's no different from our previous systems: it's "just" (still very useful) compression and next-token prediction, which happens to be so good at prediction that it can be used for tasks we previously thought would need an actual AGI.