In my opinion, a lot of Americans seem to believe that the entire world loves us and wants to live here. That is false. Most people have not liked America since we invaded Iraq, and even before that. Believe it or not, I was on a trip in Italy when 9/11 happened, and many of the people there said that we deserved it. Shocking and horrifying, but true. I have traveled to many other parts of the world, and they do NOT like the U.S. or what we are doing.
Answer to: What is something that many Americans believe that is wrong, in your opinion?
The problem we have is that we think we have to get others to like us. I agree we need to have an image in the world, but that image shouldn't depend on "liking us." It should be based more on respect for us.