I've begun reading a book by a British woman who spent a year teaching English in a very isolated Mongolian town. This is not an unusual sort of book. I read a lot of "traveler's tales" books, and the majority of them are by Brits; almost none are by American women. I know some American nuns have done work in Central America and that some Americans still do Peace Corps work. But something about British culture inspires intrepid women in a way American culture doesn't. Of course, Britain has a Queen, and they had a woman Prime Minister years ago, while we seem all agog because a woman is running for Vice President but haven't had the gumption to run a woman for President. This bugs me. Not the British part -- I love that I have these great stories to read, and they go back into the 19th century as well.
Maybe the traveling has a little to do with a sense of geography. I don't know what British schoolchildren are taught, but since Britain had a far-flung empire, that is surely taught as history. They must have a better sense of the size and wonder of the world. Our schoolkids often don't know where most American states are and would probably have a very, very hard time simply listing all 50.
I was surfing the web a couple of days ago and saw a map of Africa, totally unlike the one above except in shape. It was labeled "American perceptions" [I searched for it a few minutes ago and couldn't find it]. There was a colored patch at the bottom that said "Mandela," a small colored patch at the upper right that said "pyramids," and a squarish patch near the middle that said "Sudan." All the rest said "tigers." No, it didn't say elephants or lions. It said "tigers." I guess it's not stating the obvious, although I would think it should be, to say that tigers don't live in Africa. Do Americans really think there are tigers all over Africa? My mind is so boggled I can't write more at the moment.