Where exactly are they going with Rick's character on the show?
Rick started off as a really good man. Even when others had sided with a man who had done him wrong, he was willing to offer them sanctuary.
Now he will attack anyone he doesn't know, without hesitation.
He was reluctant to kill anyone, and when he did, he usually regretted it. Now he will kill without hesitation.
In the early days, he had no interest in taking from others; for example, he had no interest in controlling Woodbury, even though the Governor wanted the Prison. In Season 5, however, he was more than willing to take Alexandria, and possibly to kill anyone who dared to stop him.
So where exactly are they going with him? Are they turning him into a villain? They already did that with another AMC protagonist: Walter White in Breaking Bad.
I've been wanting to ask this question for quite some time, because Rick is probably one of my favorite characters on the entire show, but I've hesitated because I was concerned about the answers I might get.
On an unrelated side note, I really think it was a great idea that they didn't remove Rick's hand, as happened to his comic book counterpart.
Yes, it could have been interesting to see his character go through the apocalypse with just one hand, but keeping both hands makes the show more action friendly. With both hands, we get to see more brutal fight scenes and the like from the character.
Comments
It's just a phase; Negan will put him in his place.
I love Rick this way. He's so entertaining to watch.
I think Rick might get even darker but a straight up villain seems like a bit much.
As a side note: I want dark Carl back.