And the US has a squeaky clean history around the world?
I am not only talking about trade.
Just going back to post-WW2, without even getting into pre-WW2.
All the countries where the US has started wars to better itself. Millions have died.
The countries all over the world that the US has ripped off for their natural resources.
Not that the UK and some other European countries have a better record.
A lot of sheer hypocrisy.
The world doesn't seem to learn from history.
I have a question that I have been asking for years.
Why was mankind put onto this beautiful planet?
When all man does is take everything, destroy everything, and put nothing back into it.
Anyone got an answer? I haven't.