I really hate most end-of-the-world movies. They show what jaded Hollywood people think of the rest of the world, and they clearly think we're a bunch of sadistic idiots (or at least that watching sadistic idiots react to things is somehow entertaining). I've been to L.A. many times, I have family who work in show business, and I just want to say that these are the *last* people we should be looking to for a reality check. Some disaster movies at least paint a clever picture, like *Children of Men* or *12 Monkeys*, but usually the message is just "people will do anything to survive, all is dark and sad and purposeless, we should all be ashamed of ourselves".

Don't get me wrong, I like the idea of a story that explores throwing off the system of social order and testing people's mettle in the face of horror, and I do believe some people would act the way this film portrays. But sadistic idiots aside, I seriously doubt society would just dissolve into every man for himself; that's just insulting.