It's like banning the Iliad because it describes the Trojan Horse.
What I'm saying is that this is a dual-use technology, dangerous beyond a certain point, so it might need to be regulated or banned at the end of the day.
In an ideal world we wouldn't need this, but we don't live in that ideal world.
For example, I can't independently develop an amateur rocket that actively steers itself to land in an area of my choosing beyond a certain accuracy and precision, because that would be a homing missile. In the same vein, this technology can be used to harm other people.
Nor can I get enriched uranium to build myself a small, teapot-sized reactor to power my house during outages.
Can we say that we're censoring research in these areas too, because they're security-sensitive?
It's the same with the latest A.I. developments. However, I'm too busy to open these cans of worms today.
> Untrusted data is a problem of stupidity in comparison.
In the past, false information gave itself away through its lack of coherence. With today's advanced misinformation operations, it has almost become an alternate reality game. A.I. now lets anyone generate convincing lies at the push of a button. I can only imagine what kinds of misinformation bubbles can be built with technology like that.
These technologies attack humans' lowest-level instincts, the ones we have deemed utterly reliable for thousands of years. In my mind they are on the same level as manipulative recommendation algorithms. I put them in the dangerous and harmful category.
This is not a case of stupidity. This is plain old, and very dangerous, manipulation.
Downplaying this is not wise.