The West's colonization of the world after the Cold War helped spread liberal democracy. Is this the end of the story? Has the West won forever?
There was no colonization after the Cold War. The Cold War ended in 1990, and most colonies had been freed long before that, in the 1960s.
After the Cold War came neocolonialism. Russia has become a colony of the United States and the West.
To start a colony, you need colonists. Where are the American and EU colonists in Russia? There aren't any. That's nothing but hyperbole.
Honestly? With every new presidency, people say Russia and America will have a grand rapprochement. Here's a clue: it won't last. It never does. Putin and Trump will try to be friends, just as every Russian and American leader has tried in the past. But they are rivals first and foremost, not as leaders but as nations.
"Russia must pay for the financial stability of the United States." Dvorkovich is a Russian economist. Deputy Chairman of the Government of the Russian Federation