Much of the recent conversation about America’s ideal of religious freedom pays lip service to a comforting tableau of tolerance. But the real story of religion in America’s past is an often awkward, frequently embarrassing and occasionally bloody tale that most civics books and high-school texts either paper over or shunt to the side.
From the earliest arrival of Europeans on America’s shores, religion has often been a cudgel, used to discriminate, suppress and even kill the foreign, the “heretic” and the “unbeliever”—including the “heathen” natives already here. Moreover, while it is true that the vast majority of early-generation Americans were Christian, the pitched battles between various Protestant sects and, more explosively, between Protestants and Catholics, present an unavoidable contradiction to the widely held notion that America is a “Christian nation.”
Even after accounting for unrelated variables, the researchers found that a 10 percent nominal increase in the percentage of female managers (at the level of the then-prevailing glass ceiling) was associated with a 1 percent nominal increase in ROA.
"The results are pretty strong that even when you control for anything that's fixed about a company, it appears that increasing your female managers leads to higher profitability over time," says Siegel.
For some reason he could never comprehend, people were inclined to believe the very worst about anything and everything; they were immune to contrary evidence just as if they'd been medically vaccinated against the force of fact. Furthermore, there seemed to be a bizarre reverse-Cassandra effect operating in the universe: whereas the mythical Cassandra spoke the awful truth and was not believed, these days "experts" spoke awful falsehoods, and they were believed.
Unselfish colleagues come to be resented because they "raise the bar" for what is expected of everyone: workers feel the new standard will make everyone else look bad by comparison.
The real form of the question, the one that generates the correct answer simply in its asking, is, "why doesn't having kids — or getting married or getting a better job or getting laid or anything else I try to do — make me happy? Oh. I get it. I'll shut up now."
I was sure that color coordinating the baby and the bathroom would make me happier but it didn't... should I have gone with lavender?
This intriguing media criticism suggests that not just the article's subjects but the journalists themselves are self-absorbed. The Last Psychiatrist calls such articles "cognitive parasites" because, even when one disagrees with their conclusions, they can change the way one thinks.
Since Google's CEO has proclaimed the future of the web is no anonymity, does that make it a fact? If we keep hearing that privacy is dead and long buried, how long before we accept that anonymity is an anti-social behavior and a crime?
Security expert Bruce Schneier suggests that we protect our privacy if we are thinking about it, but we give up our privacy when we are not thinking about it.
Schneier wrote, "Here's the problem: The very companies whose CEOs eulogize privacy make their money by controlling vast amounts of their users' information. Whether through targeted advertising, cross-selling or simply convincing their users to spend more time on their site and sign up their friends, more information shared in more ways, more publicly means more profits. This means these companies are motivated to continually ratchet down the privacy of their services, while at the same time pronouncing privacy erosions as inevitable and giving users the illusion of control."
The loss of anonymity will endanger privacy. It's unsettling to think "governments will demand" an end to anonymous identities. Even coming from Google's CEO, Schmidt's message that anonymity is a dangerous thing is highly controversial. Google is in the business of mining and monetizing data, so isn't that a conflict of interest? Look how much Google knows about you now.
Bruce Schneier put it eloquently, "If we believe privacy is a social good, something necessary for democracy, liberty and human dignity, then we can't rely on market forces to maintain it."
The world is more addictive than it was 40 years ago. And unless the forms of technological progress that produced these things are subject to different laws than technological progress in general, the world will get more addictive in the next 40 years than it did in the last 40.
In the United States alone, Clay Shirky maintains, we collectively watch about 200 billion hours of TV every year. For a vast majority of us, watching TV is essentially a part-time job.
What would the world be like if many of us quit our TV-watching gigs? Critics of television have long lamented its opportunity costs, but Shirky’s inquiry into what we might join together to do instead if we weren’t watching TV isn’t as fantastical as previous efforts. That’s because for the first time since the advent of television, something strange is happening — we’re turning it off. Young people are increasingly substituting computers, mobile phones and other Internet-enabled devices for TV.
The time we might free up by ditching TV is Shirky’s “cognitive surplus” — an ocean of hours that society could contribute to endeavors far more useful and fun than television. With the help of a researcher at I.B.M., Shirky calculated the total amount of time that people have spent creating one such project, Wikipedia. The collectively edited online encyclopedia is the product of about 100 million hours of human thought, Shirky found. In other words, in the time we spend watching TV, we could create 2,000 Wikipedia-size projects — and that’s just in America, and in just one year.
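Shirky's "2,000 Wikipedia-size projects" figure follows directly from the two numbers quoted above; a back-of-envelope check (using only those quoted figures, not independent measurements):

```python
# Shirky's cognitive-surplus arithmetic, using the figures quoted above.
us_tv_hours_per_year = 200e9   # ~200 billion hours of TV watched annually in the US
wikipedia_hours = 100e6        # ~100 million hours of human thought behind Wikipedia

projects_per_year = us_tv_hours_per_year / wikipedia_hours
print(int(projects_per_year))  # 2000 Wikipedia-size projects per year
```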
Higher education may be heading for a reckoning. For a long time, despite the occasional charge of liberal dogma on campus or of a watered-down curriculum, people tended to think the best of the college and university they attended. Perhaps they attributed their career success or that of their friends to a diploma. Or they felt moved by a particular professor or class. Or they received treatment at a university hospital or otherwise profited from university-based scientific research. Or they just loved March Madness.
Recently, though, a new public skepticism has surfaced, with galling facts to back it up. Over the past 30 years, the average cost of college tuition and fees has risen 250% for private schools and nearly 300% for public schools (in constant dollars). The salaries of professors have also risen much faster than those of other occupations. At Stanford, to take but one example, the salaries of full professors have leapt 58% in constant dollars since the mid-1980s. College presidents do even better. From 1992 to 2008, NYU's presidential salary climbed to $1.27 million from $443,000. By 2008, a dozen presidents had passed the million-dollar mark.
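The NYU salary figures above imply roughly a tripling over sixteen years; a quick check of the percentage increase, using only the dollar amounts quoted:

```python
# Growth in NYU's presidential salary, from the figures quoted above.
salary_1992 = 443_000
salary_2008 = 1_270_000

pct_increase = (salary_2008 - salary_1992) / salary_1992 * 100
print(round(pct_increase))  # ~187% increase, i.e. nearly a tripling
```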
Meanwhile, tenured and tenure-track professors spend ever less time with students. In 1975, 43% of college teachers were classified as "contingent"—that is, they were temporary instructors and graduate students; today that rate is 70%. Colleges boast of high faculty-to-student ratios, but in practice most courses have a part-timer at the podium.