I have never heard of anyone interpreting zombie fiction as right-wing. Like, just look at Night of the Living Dead. Actually, is any zombie movie even marginally right-wing? Zombieland?
Walking Dead certainly satisfies those gun nut fantasies.
I started on the comic before the show, then realised how much there was and gave up.
But isn’t The Walking Dead more about the bad things that people do to each other? Power corrupts, lack of accountability makes for a crueler society, that kind of thing? When circumstances make people desperate, maybe you shouldn’t exploit that? And it would actually be better if we could change circumstances so people weren’t desperate?
Edit: but Rick is also a dickhead, right? So I guess if you’re a dickhead and see a dickhead protagonist, you might feel validated.