Naio wrote:
jfkmk wrote:
no matter how a wild animal is portrayed, if anyone doesn't think a wild animal can be dangerous, then I guess it comes down to the Darwin Awards.
I think a lot of people who grow up in cities don't know what animals ARE.
I mean, the only animals they know are pets -- who have been bred for millennia to be very human-oriented, to think like humans (a bit, at least more than wild animals do), to interact with humans, and to care about humans.
My big ol' tomcat, no matter how angry he was at me for dragging him out from his hidey hole by his fur to take him to the vet, did not rip the skin and muscle off my arms, as he was perfectly capable of doing. All he did was make biting _gestures_, with no real intent behind them other than to communicate. Didn't come near breaking the skin, even.
Wild animals are not like that. They are strong, and they don't give a cr*p about keeping humans safe. But city humans, how would they even know that? How would they conceive of the idea?
I'm sorry, it has NOTHING to do with where a person lives or grew up. There are just as many country folk doing these stupid things as there are city folk doing them. Just as there are just as many city folk who know better than to approach a wild animal as there are country folk who know better.
The plain and simple fact is that people simply believe "it won't happen to me". It's why people do stupid things every day. When people do start to feel vulnerable, they prop themselves up with justifications like "I know how to approach them", "oh, people do this all the time, it's not dangerous", etc.