by Ryan Streeter on April 8, 2014. Follow Ryan on Twitter.
Spending on the major safety-net programs nearly quadrupled between 1970 and 2010, and that’s after adjusting for inflation and population growth…[T]he biggest increases in spending have gone to those who were middle class or hovering around the poverty line. Meanwhile, Americans in deep poverty — that is, with household earnings of less than 50 percent of the official poverty line — saw no change in their benefits in the decade leading up to the housing bubble. In fact, if you strip out Medicare and Medicaid, federal social spending on those in extreme poverty fell between 1993 and 2004.
Arthur Brooks has been drawing attention to this quite a bit over the past year – namely, that failing to reform entitlements ultimately hurts low-income people the most.
Where is America’s brain power concentrated? There are two ways to look at this: cities that have the most highly educated people per capita, and cities experiencing the highest growth levels among highly educated people.
The latter is “an indicator of momentum that is likely to carry over into the future,” according to Joel Kotkin, in his new ranking of “America’s Brainpower Cities.”
Here are the top 30 cities with the highest growth rate among their college-educated population between 2007 and 2012:
by Ryan Streeter on April 6, 2014.
[T]here’s a reason that golden ages can diminish into twilight — because the demands of the present can crowd out the needs of the future, and because what’s required to preserve and sustain is often different, in the end, from what’s required to grow.
That’s an apt summary of several thousand years of history. You might not expect to find it at the end of an opinion piece on how we’ll be debating Obamacare for decades to come. In his latest column Ross Douthat explains why the debate over Obamacare is only just beginning and what it means for the future. Immediately preceding the quote above, he writes:
[T]he political salience of this debate will rise for the same reason that the costs of Medicare will be rising: because the country will be older over all, and health policy inevitably matters more to the old than to the young.
Which means that the future almost certainly holds more cries of “death panels,” more ads featuring Paul Ryan clones pushing seniors over a cliff, and no doubt as-yet-undreamt-of forms of demagogy. And it means, as well, that if it’s hard to get Washington to focus on other issues now — tax reform, education, family policy, you name it — just wait awhile: It will get much worse.
It’s important to note, of course, that this “worse” will be the result of betterment: our political debates will be consumed by health care because of all that medicine can do for us, and we’ll be arguing about how to sustain what earlier generations would have regarded as a golden age.
Much of the future of this debate will turn on how actively and successfully the millennial generation, which is presently skeptical of institutions, decides to take it on and change the course of future policy options.
by Ryan Streeter on March 26, 2014.
There’s no real consensus on why our labor-force participation rate is dropping the way it is. Brennan sums it up this way:
With the American labor-force-participation level dropping so dramatically, a lot of economists have been asking and debating how much of it can be attributed to demographics and how much to economic weakness: America is getting older, and the rate at which women work isn’t skyrocketing anymore. But prime-age labor-force participation removes the former of those concerns, and we’re still seeing a noticeable drop. The consensus estimates are that something like one-half to two-thirds of the labor-force drop is due to demographics (which still, of course, presents an economic and fiscal problem, even if it’s inevitable).
Another, more depressing conclusion: The drop in labor-force-participation among prime-age men isn’t so much about continued economic weakness as it is about permanent shifts that have depressed wages (making work less appealing than non-work) and suppressed jobs growth.
by Ryan Streeter on March 23, 2014.
This Atlantic essay by Hanna Rosin is worth reading in its entirety. She chronicles the rise in overly protective parenting and employs some data to question whether it’s making our children better.
Children are born with the instinct to take risks in play, because historically, learning to negotiate risk has been crucial to survival; in another era, they would have had to learn to run from some danger, defend themselves from others, be independent. Even today, growing up is a process of managing fears and learning to arrive at sound decisions. By engaging in risky play, children are effectively subjecting themselves to a form of exposure therapy, in which they force themselves to do the thing they’re afraid of in order to overcome their fear. But if they never go through that process, the fear can turn into a phobia…We might accept a few more phobias in our children in exchange for fewer injuries. But the final irony is that our close attention to safety has not in fact made a tremendous difference in the number of accidents children have.
She cites quite a few statistics and studies to make her point. This piece should prove to be a conversation-starter, much like Rosin’s book on men a couple of years ago.
by Ryan Streeter on March 23, 2014.
Upward mobility isn’t what we would hope it to be in America. You have a greater chance of moving from the lower class to the middle class in many other developed countries than you do here, which is a difficult data point to swallow for those of us who have experienced America first-hand as the land of opportunity.
There have been a number of insightful studies on this topic in the past few years, and Aparna Mathur and Abby McCloskey at AEI have done us a nice service by reviewing a number of them in this new report (PDF).
Among the factors they summarize from the studies, these are the most powerful in helping or inhibiting a young person’s chances of successfully moving up the ladder of opportunity in America:
- Parents’ income
- Where you live
- Family structure
- Educational options
The report also includes policy ideas from AEI scholars on how to address the lack of opportunity in America for lower-income people. As the report points out, all of the money we spend on social welfare and the safety net does not help people get ahead, however much it spares them hardship.
Millennials: Poorer than previous generations, less likely to marry, more distrustful – and yet hopeful
by Ryan Streeter on March 23, 2014.
It took me a while to get around to reading the Pew report on Millennials (PDF) given interruptions like 70-hour work weeks, taxes, and hours of watching cable TV coverage of Flight 370 in which no new fact is introduced. So I won’t offer anything new on this already-discussed report, but I did want to record some of its findings.
The main takeaway is that adult millennials (18-33 years old) are less attached to fundamental institutions (political, religious, familial) than any generation on record. They are libertarian in the lifestyle rather than the political sense of the term, and despite being highly networked through social media (or maybe because of it), they are much more distrustful of others.
Here are some data points of note:
Millennials are also the first in the modern era to have higher levels of student loan debt, poverty and unemployment, and lower levels of wealth and personal income than their two immediate predecessor generations (Gen Xers and Boomers) had at the same stage of their life cycles.
Even though millennials are much more likely than older generations to support gay marriage, they are far less likely to actually get married. The graph on the right shows them compared to previous generations at the same age.
During the “hope and change” years of the Obama administration, millennials have grown more politically independent at quite a clip. In a way, they are tracking with a larger trend in which Gen Xers, Baby Boomers, and even the Silent generation have all grown more politically independent. But millennials’ upward trend lines in this direction have been steadier and larger. Whereas independents only slightly outnumbered Democrats among millennials when Obama was elected, one in two millennials now identifies as an independent:
Edward Lazear sees troubling data in the recent jobs numbers:
The average workweek in the U.S. has fallen to 34.2 hours in February from 34.5 hours in September 2013, according to the Bureau of Labor Statistics. That decline, coupled with mediocre job creation, implies that the total hours of employment have decreased over the period.
After ruling out math errors and bad weather, he speculates:
Another possibility for the declining average workweek is the Affordable Care Act. That law induces businesses with fewer than 50 full-time employees—full-time defined as 30 hours per week—to keep the number of hours low to avoid having to provide health insurance. The jury is still out on this explanation, but research by Luis Garicano, Claire Lelarge and John Van Reenen (National Bureau of Economic Research, February 2013) has shown that laws that can be evaded by keeping firms small or hours low can have significant effects on employment.
by Ryan Streeter on March 17, 2014.
Ross Douthat has a characteristically good column on the individualism that surveys show is at the heart of what defines the millennial generation:
[M]illennials’ skepticism of parties, programs and people runs deeper than their allegiance to a particular ideology. Their left-wing commitments are ardent on a few issues but blur into libertarianism and indifferentism on others. The common denominator is individualism, not left-wing politics: it explains both the personal optimism and the social mistrust, the passion about causes like gay marriage and the declining interest in collective-action crusades like environmentalism, even the fact that religious affiliation has declined but personal belief is still widespread. (emphasis added)
So the really interesting question about the millennials isn’t whether they’ll all be voting Democratic when Chelsea Clinton runs for president. It’s whether this level of individualism — postpatriotic, postfamilial, disaffiliated — is actually sustainable across the life cycle, and whether it can become a culture’s dominant way of life.
To get some perspective on individualism’s place in modernity, Ross turns to Robert Nisbet’s The Quest for Community:
Trying to explain modern totalitarianism’s dark allure, Nisbet argued that it was precisely the emancipation of the individual in modernity — from clan, church and guild — that had enabled the rise of fascism and Communism.
In the increasing absence of local, personal forms of fellowship and solidarity, he suggested, people were naturally drawn to mass movements, cults of personality, nationalistic fantasias. The advance of individualism thus eventually produced its own antithesis — conformism, submission and control.
You don’t have to see a fascist or Communist revival on the horizon (I certainly don’t) to see this argument’s potential relevance for our apparently individualistic future. You only have to look at the place where millennials — and indeed, most of us — are clearly seeking new forms of community today.
That place is the online realm, which offers a fascinating variation on Nisbet’s theme. Like modernity writ large, it promises emancipation and offers new forms of community that transcend the particular and local. But it requires a price, in terms of privacy surrendered, that past tyrannies could have only dreamed of exacting from their subjects.
That’s a hypothesis Michael Barone is testing in his latest column. When I first read his headline, I thought, no, tougher sentencing and better policing have done the trick. But as I read his column, I became more curious. Maybe he’s onto something. This should be tested some more. He writes:
The welfare reform work requirements may also be contributing — this is my hypothesis — to the remarkable decline in violent crime in America over the last 20 years.
A second factor, starting more than 30 years ago, was tougher sentencing, which kept many violent criminals off the street.
Still another factor may be a change in the mindset of those most likely to commit crimes, males age 15 to 25, particularly (unpleasant to say, but true) black and Hispanic males in that age cohort. This at-risk population seems to be committing many fewer crimes than their counterparts did 25 years ago.
A disproportionate number in both cases were sons of single mothers on welfare. But the 1989 15-to-25s had mothers who stayed at home and collected welfare checks.
Today’s 15-to-25s were more likely to have mothers who, if they collected welfare, had to hold a job. Mothers with jobs are away from home during work hours.
But they are also likely to have more moral authority. They bring home the bacon and are entitled to demand good behavior in return.
And, especially if they move ahead at work, they set a better example for their children, male and female. They show that there is a connection between honest effort and legitimate reward. A mother who earns success shows her children they can, too.