Some shows, like "Seinfeld" and "Friends," can cut across deep cultural divisions and unite every kind of audience, becoming all things to everyone. "Sex and the City" was one of those shows, and now, with all its seasons and two successful feature film spin-offs behind it, its influence is easy to see. But what did the show mean for women all over the world? How did it affect the way they dress, what they look for in a man, and the expectations placed on their lives and careers? Read on:
What did the show mean for women?
Women could tune in and watch "Sex and the City" every Sunday night on HBO; they had a cultural phenomenon all to themselves. They could connect with characters who said and did things they hadn't previously thought a woman was capable of. With each new episode, they saw that a woman could be strong and vulnerable, and the star of her own show, all at the same time. It was a morale booster, a reason to laugh, and the perfect escape from a world that constantly pushed traditional male expectations on them while demanding they "act like women." Thanks to the "Sex and the City" TV show, women saw what it could mean to be independent, and they embraced it.