Anyway, I think people are too fundamentally curious for any disaster like that to color all pursuit of knowledge as bad.
Although we already do consider some knowledge bad. For example, anything that might suggest one gender or race is superior.
So ok, I can see where you're coming from. If technology (or research) is used to oppress people for 100 (or a few hundred) years, then it will be seen as taboo for a while.
Forgive me for chiming in, but are you saying that people will not allow disaster to come from curiosity?
No, I mean, I don't think people will allow their curiosity to be subjugated by disaster.
e.g. technology kills 100 million people. It's horrific. But we don't abandon it; I think we tell ourselves we'll learn from those mistakes and try again.
I agree that knowledge is good. Currently some research is taboo, but I expect it to be cyclical... in the future it will be acceptable, then possibly taboo again, then acceptable again, etc.

Sure, but I don't think it will be with fire, viruses, bullets, and bombs.
If technology doesn't stop advancing, I think it's more likely we incorporate more and more technology into ourselves until it's all machine.
There's no reason for an AI to harm us. IMO that's thinking too human. We want to kill and dominate... but then again, that's necessarily true, because if our ancestors hadn't had this drive, we wouldn't exist in the first place.