Why do you think America is the best country in the world?

The United States is one of the worst countries to live in. Its rape rate is high, and so is the cost of medical care. The education sector is burdened with debt and plagued by mass shootings. Politics is polluted by money and supremacy.

The police are stained with the blood of the thousands of people who die every year in police shootings. Police brutality has reached the point where citizens are treated worse than animals; in the United States, you and your children are one second away from being shot to death by the police.

When it comes to diplomacy, it barely exists in the USA. The country doesn't honor its deals, and it doesn't respect freedom of speech, which matters most in times of crisis.

In the past it bombed other nations to dust, and it is bombing several nations today. As long as it exists, we can only assume it will keep starting one war after another until it collapses, whether from an outside force or under its own weight.

So in what sector it's the best country, I don't understand. Some presidents say it in front of the cameras, and I think any president who says such a thing doesn't possess the qualities necessary to lead any nation.