Liberals Losing the Culture Wars

September 3, 2010 06:31


For decades, liberals won the culture wars and, consequently, most of the elections that reshaped America. More than politics is changing in this crossroads year of 2010: Americans are reasserting traditional culture. You won’t read or see it in the mainstream media, but this fact is driving the political story.

Roger Hedgecock at Human Events

EXCERPTS:

‘Beginning in the 1960s, God was driven out of American public life because liberals said the Constitution demanded a separation of church and state.

Planned Parenthood was part of a campaign that convinced many Americans that killing unborn babies was really a defense of a woman’s constitutional right to choose.

The ACLU sued to define “free speech” to include vandalism, sacrilegious art, and spitting on returning veterans of the Vietnam War.

Even more depressing, the drive for equal rights for liberated slaves, begun by Republicans during and after the Civil War, morphed into a liberal affirmative action program which reintroduced privilege based on skin color.

In 2010, the tide has turned.’

FULL STORY
