jfkmk wrote:
no matter how a wild animal is portrayed, if anyone doesn't think a wild animal can be dangerous, then I guess it goes to The Darwin effect.
I think a lot of people who grow up in cities don't know what animals ARE.
I mean, the only animals they know are pets -- animals that have been bred for millennia to be very human-oriented, to think a bit like humans (more than wild animals do, anyway), to interact with humans, and to care about humans.
My big ol' tomcat, no matter how angry he was at me for dragging him out of his hidey-hole by his fur to take him to the vet, did not rip the skin and muscle off my arms, as he was perfectly capable of doing. All he did was make biting _gestures_, with no real intent behind them other than to communicate. He didn't even come near breaking the skin.
Wild animals are not like that. They are strong, and they don't give a cr*p about keeping humans safe. But how would city humans even know that? How would they ever conceive of the idea?