Sure, but I don't think it will be with fire, viruses, bullets, and bombs.
If technology doesn't stop advancing, I think it's more likely we incorporate more and more technology into ourselves until it's all machine.
There's no reason for an AI to harm us. IMO that's thinking too human. We want to kill and dominate... but that's not necessarily true of an AI. If our ancestors hadn't had this drive, then we wouldn't exist in the first place — but a machine doesn't share that evolutionary history.
you haven't watched enough sci-fi movies. our desire to kill and dominate is exactly why the Cylons will feel compelled to destroy us.
Can't tell if this is a joke, sorry :)
Yes, the plot where the AI is told "protecting humans is your priority," so it decides to lock us up and kill many of us to protect us from ourselves. Seems rather contrived to me though.
More likely it's something like: we ask it to optimize a model for society. And of course humans with cybernetic implants are the smartest and most capable, so they end up running a lot of things. Those humans might use technology to f*** s*** up, but a pure machine wouldn't, IMO.
Anyway, I think people are too fundamentally curious for any disaster like that to color all pursuit of knowledge as bad.
Although we already do consider some knowledge bad. For example, anything that might suggest one gender or race is superior.
So ok, I can see where you're coming from. If technology (or research) is used to oppress people for a hundred (or a few hundred) years, then it will be seen as taboo for a while.
forgive me for chiming in, but you're saying that people will not allow disaster to come from curiosity?
No, I mean I don't think people will allow their curiosity to be subjugated by disaster.
e.g. a technology kills 100 million people. It's horrific. But we don't abandon it — I think we tell ourselves we'll learn from those mistakes and try again.
I agree that knowledge is good. Currently some research is taboo, but I expect it to be cyclical... in the future it will be acceptable, then possibly taboo again, then acceptable again, etc.