In light of the increasingly hostile political, social, and economic climate in the United States, growing numbers of Americans have been leaving this nation for other lands, including Canada. Yet Americans are now finding more and more foreign borders closed to them. Could it be that those nations don't want to take immigrants from a country like the U.S.? Is that what we have become in the eyes of the world?
We need to take a hard look at our nation and our leaders and stop the hemorrhaging of our way of life. Everything we have worked to accomplish over the last 60 years is at risk of being destroyed. I hope it is not already too late.