This is a thread about America. Or, rather, this is a thread about how America (the country) and Americans (the people) are perceived around the world, and how foreign media influences these perceptions.
Before I went to Japan, my Japanese history teacher, a German, warned me that if I got any flak while I was over there, it would probably be from Europeans who were also studying abroad. I was prepared for this, so I kept my guard up around them, but I tried to be my natural self with my Japanese friends. The funny thing was, while I was not discriminated against intentionally, I was constantly having to deal with the most ridiculous stigmas about American men and Japanese women, and just general stupidity. "How you know how to use cell phone?" "Don't you drive big SUV?" "Don't you know American men rape Japanese girls?" (As if Japanese men don't already have that covered.)
It was a weird year, to say the least, and I constantly found myself having to explain to both Japanese and other international students that, no, not all Americans think alike, we didn't all vote for Bush, we're not all Protestants, I've never even touched a gun, and not all black people are in the Army (I have no fucking clue where that one came from). This is not to say that your average American actually knows diddly fuck about life outside his own tri-county area, but these were students at the international school of one of the four most prestigious universities in Japan.
The common thread among all these misconceptions seemed to be that my foreign friends were convinced that the American press was at least partially owned and operated by the government, just like in their country. My roommate this past semester was from Hong Kong, and he refused to believe that it was even legal to criticize George Bush until he saw the Daily Show ("Oh my God, are they allowed to say that? Won't he go to jail now?"). Watching TV news in Japan and reading their daily papers (English translation...), I was always floored at how America was framed as this colossal, monolithic bully full of stupid, inbred white rednecks and black drug dealers (apparently there are no Hispanics in the US, by the way).
Anyway, I know there are quite a few people from around the world who post on these forums, and I'm really curious how you think the media in your country influences popular perception of the US, and how that jibes with your own personal experience. Also, for Americans who have lived abroad, what were your experiences with negative stereotypes? How did you handle them?
EDIT: Upon further consideration, feel free to share experiences where otherwise intelligent Americans had some kind of ridiculous misconception about you because you were from another country. (Legit complaints only, though. No one cares if Bucky-Jim from Kentucky didn't know your traditional Slavic clog-hat dance of friendship and misogyny.)