And I'm saying that virtually every country on Earth has some military knowledge. You really think that France/the German principalities/England/the Ottoman Empire/the PL Commonwealth/the Kievan Rus'/... just forgot everything after a war, and that Spain, miraculously, was the only country to understand that experience is a thing?
And do you believe that the South American populations religiously waited for the White Man to bring them the concept of armed conflict, and that they lived in perfect harmony until then, miraculously oblivious to the idea of conquest and imperialism?