Will Robots Take Our Children’s Jobs?

From today’s New York Times:

Like a lot of children, my sons, Toby, 7, and Anton, 4, are obsessed with robots. In the children’s books they devour at bedtime, happy, helpful robots pop up more often than even dragons or dinosaurs. The other day I asked Toby why children like robots so much.

“Because they work for you,” he said.

What I didn’t have the heart to tell him is, someday he might work for them — or, I fear, might not work at all, because of them.

It is not just Elon Musk, Bill Gates and Stephen Hawking who are freaking out about the rise of invincible machines. Yes, robots have the potential to outsmart us and destroy the human race. But first, artificial intelligence could make countless professions obsolete by the time my sons reach their 20s.

You do not exactly need to be Marty McFly to see the obvious threats to our children’s future careers.

Say you dream of sending your daughter off to Yale School of Medicine to become a radiologist. And why not? Radiologists in New York typically earn about $470,000, according to Salary.com.

But that job is suddenly looking iffy as A.I. gets better at reading scans. A start-up called Arterys, to cite just one example, already has a program that can perform a magnetic-resonance imaging analysis of blood flow through a heart in just 15 seconds, compared with the 45 minutes required by humans.

Maybe she wants to be a surgeon, but that job may not be safe, either. Robots already assist surgeons in removing damaged organs and cancerous tissue, according to Scientific American. Last year, a prototype robotic surgeon called STAR (Smart Tissue Autonomous Robot) outperformed human surgeons in a test in which both had to repair the severed intestine of a live pig.

So perhaps your daughter detours to law school to become a rainmaking corporate lawyer. Skies are cloudy in that profession, too. Any legal job that involves lots of mundane document review (and that’s a lot of what lawyers do) is vulnerable.

Software programs are already being used by companies including JPMorgan Chase & Company to scan legal papers and predict what documents are relevant, saving lots of billable hours. Kira Systems, for example, has reportedly cut the time that some lawyers need to review contracts by 20 to 60 percent.

As a matter of professional survival, I would like to assure my children that journalism is immune, but that is clearly a delusion. The Associated Press already has used a software program from a company called Automated Insights to churn out passable copy covering Wall Street earnings and some college sports, and last year awarded the bots the minor league baseball beat.

What about other glamour jobs, like airline pilot? Well, last spring, a robotic co-pilot developed by the Defense Advanced Research Projects Agency, known as Darpa, flew and landed a simulated 737. I hardly count that as surprising, given that pilots of commercial Boeing 777s, according to one 2015 survey, only spend seven minutes during an average flight actually flying the thing. As we move into the era of driverless cars, can pilotless planes be far behind?

Then there is Wall Street, where robots are already doing their best to shove Gordon Gekko out of his corner office. Big banks are using software programs that can suggest bets, construct hedges and act as robo-economists, using natural language processing to parse central bank commentary to predict monetary policy, according to Bloomberg. BlackRock, the biggest fund company in the world, made waves earlier this year when it announced it was replacing some highly paid human stock pickers with computer algorithms.
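The article does not say how these commentary-parsing programs actually work, but the shape of the task is easy to sketch. The snippet below is a toy illustration of my own, with made-up phrase lists and a deliberately simple scoring rule; it is not a description of any bank's system.

```python
# Toy sketch (illustrative assumptions only, not any bank's system):
# score a central bank statement as hawkish or dovish by counting
# stance-laden phrases. The phrase lists and scoring rule are invented
# purely to show the mechanics of turning text into a policy signal.

HAWKISH = ("inflation pressures", "tightening", "raise rates", "overheating", "strong growth")
DOVISH = ("accommodative", "downside risks", "economic slack", "subdued inflation", "remain patient")

def stance_score(statement: str) -> float:
    """Return a value in [-1, 1]: positive leans hawkish, negative leans dovish."""
    text = statement.lower()
    hawk_hits = sum(text.count(phrase) for phrase in HAWKISH)
    dove_hits = sum(text.count(phrase) for phrase in DOVISH)
    total = hawk_hits + dove_hits
    return 0.0 if total == 0 else (hawk_hits - dove_hits) / total

example = ("The Governing Council expects policy to remain accommodative, "
           "though inflation pressures are building amid strong growth.")
print(stance_score(example))  # roughly 0.33: mildly hawkish under these toy word lists
```

Real systems presumably go far beyond keyword counts, using trained language models and market data, but the input and output have the same shape: text in, a policy-leaning signal out.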

So am I paranoid? Or not paranoid enough? A much-quoted 2013 study by the University of Oxford Department of Engineering Science — surely the most sober of institutions — estimated that 47 percent of current jobs, including insurance underwriter, sports referee and loan officer, are at risk of falling victim to automation, perhaps within a decade or two.

Read the complete article here.

Work Productivity: Laptops Are Great. But Not During a Lecture or a Meeting.

From today’s New York Times by Susan Dynarski:

Step into any college lecture hall and you are likely to find a sea of students typing away at open, glowing laptops as the professor speaks. But you won’t see that when I’m teaching.

Though I make a few exceptions, I generally ban electronics, including laptops, in my classes and research seminars.

That may seem extreme. After all, with laptops, students can, in some ways, absorb more from lectures than they can with just paper and pen. They can download course readings, look up unfamiliar concepts on the fly and create an accurate, well-organized record of the lecture material. All of that is good.

But a growing body of evidence shows that over all, college students learn less when they use computers or tablets during lectures. They also tend to earn worse grades. The research is unequivocal: Laptops distract from learning, both for users and for those around them. It’s not much of a leap to expect that electronics also undermine learning in high school classrooms or that they hurt productivity in meetings in all kinds of workplaces.

Measuring the effect of laptops on learning is tough. One problem is that students don’t all use laptops the same way. It might be that dedicated students, who tend to earn high grades, use them more frequently in classes. It might be that the most distracted students turn to their laptops whenever they are bored. In any case, a simple comparison of performance may confuse the effect of laptops with the characteristics of the students who choose to use them. Researchers call this “selection bias.”
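To make the selection-bias point concrete, here is a minimal simulation sketch of my own; it is not drawn from the article or the studies it cites. It assumes, for concreteness, that more diligent students are both more likely to open a laptop and more likely to score well, so the naive comparison of users versus non-users comes out positive even though the laptop effect built into the simulation is negative.

```python
# A minimal selection-bias sketch (illustrative assumptions only). A hidden
# "diligence" trait drives both laptop use and exam scores, so the naive
# users-vs-non-users gap does not recover the true laptop effect.
import random

random.seed(0)

TRUE_LAPTOP_EFFECT = -2.0   # assumed causal effect of laptop use on exam score

students = []
for _ in range(10_000):
    diligence = random.gauss(0, 1)                           # unobserved trait
    uses_laptop = random.random() < 0.5 + 0.2 * diligence    # diligent students open laptops more often
    score = 75 + 5 * diligence + (TRUE_LAPTOP_EFFECT if uses_laptop else 0.0) + random.gauss(0, 3)
    students.append((uses_laptop, score))

user_scores = [s for used, s in students if used]
nonuser_scores = [s for used, s in students if not used]
naive_gap = sum(user_scores) / len(user_scores) - sum(nonuser_scores) / len(nonuser_scores)

print(f"true laptop effect: {TRUE_LAPTOP_EFFECT:+.1f}")
print(f"naive observed gap: {naive_gap:+.1f}")  # comes out around +2: selection masks the harm
```

Flipping the assumption, so that it is the distracted students who reach for laptops, would bias the naive gap in the other direction; either way, the raw comparison confounds the device with the student using it.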

Read the entire article here.

We’re With Stupid: On Fake News and the Literacy of America’s Electorate

From the New York Times by Timothy Egan (Nov. 17, 2017):

It would be much easier to sleep at night if you could believe that we’re in such a mess of misinformation simply because Russian agents disseminated inflammatory posts that reached 126 million people on Facebook.

The Russians also uploaded a thousand videos to YouTube and published more than 130,000 messages on Twitter about last year’s election. As recent congressional hearings showed, the arteries of our democracy were clogged with toxins from a hostile foreign power.

But the problem is not the Russians — it’s us. We’re getting played because too many Americans are ill equipped to perform the basic functions of citizenship. If the point of the Russian campaign, aided domestically by right-wing media, was to get people to think there is no such thing as knowable truth, the bad guys have won.

As we crossed the 300-day mark of Donald Trump’s presidency on Thursday, fact-checkers noted that he has made more than 1,600 false or misleading claims. Good God. At least five times a day, on average, this president says something that isn’t true.

We have a White House of lies because a huge percentage of the population can’t tell fact from fiction. But a huge percentage is also clueless about the basic laws of the land. In a democracy, we the people are supposed to understand our role in this power-sharing thing.

Nearly one in three Americans cannot name a single branch of government. When NPR tweeted out sections of the Declaration of Independence last year, many people were outraged. They mistook Thomas Jefferson’s fighting words for anti-Trump propaganda.

Fake news is a real thing produced by active disseminators of falsehoods. Trump uses the term to describe anything he doesn’t like, a habit now picked up by political liars everywhere.

But Trump is a symptom; the breakdown in this democracy goes beyond the liar in chief. For that you have to blame all of us: we have allowed the educational system to become negligent in teaching the owner’s manual of citizenship.

Read the entire article here.

Employees do want their job to matter, but meaning at work can be hard to find

From today’s Chicago Tribune by Alexia Elejalde-Ruiz:

Jennifer Ruiz holds her patient’s trembling hand as she presses a stethoscope to the frail woman’s chest and belly. She compliments the woman on her recently painted fingernails. She cheerfully asks how she’s feeling, knowing she’ll get no answer from the little curled body in the big hospital bed but for a penetrating stare.

Ruiz, a hospice nurse, finds her work deeply meaningful, in part for reasons that are obvious: “We get to be there for people during some of the most tragic and tough times in their lives,” she said.

But even those who shepherd the dying and their families through the fear, heartbreak and mystery of the end of life can lose sight of a job’s meaning in the stress of the day-to-day, if their employer doesn’t foster it.

“You have to fan that flame,” said Brenda McGarvey, corporate director of program development at Skokie-based Unity Hospice, where Ruiz works. “It’s your responsibility.”

A job’s meaningfulness — a sense that the work has a broader purpose — is consistently and overwhelmingly ranked by employees as one of the most important factors driving job satisfaction. It’s the linchpin of qualities that make for a valuable employee: motivation, job performance and a desire to show up and stay.

Meaningful work needn’t be lofty. People find meaning picking up garbage, installing windows and selling electronics — if they connect with why it matters.

But many Chicago-area employers seem to be missing an opportunity to tap this critical vein.

In a survey conducted by Energage for the Chicago Tribune’s 2017 Top Workplaces magazine, local employees regarded their employers more positively than the national average on nearly all measures, but companies fell significantly short in response to this statement: “My job makes me feel like I am part of something meaningful.” Meaningfulness also was the only measure that did not see any improvement among Chicago-area respondents this year, compared with last.

Read the article here.

WeWork and the Death of Leisure

From today’s New York Times “Opinion” by Ginia Bellafante:

This past week, Hudson’s Bay, whose story begins 347 years ago in the fur trade, making it the oldest company in North America, announced that it was selling Lord & Taylor’s flagship store, on Fifth Avenue, several years after it had acquired the department store chain through a deal with a private-equity firm.

The buyer would be WeWork, the office rental outfit very much rooted in the virtue-and-shell-game ethos of 21st-century capitalism. The founders, Adam Neumann and Miguel McKelvey, got together in a building on the Brooklyn waterfront where they both worked — Mr. Neumann as the proprietor of a company called Krawlers that produced padded clothes for babies — and quickly realized that they could make money from all the vacant space they saw around them by simulating the atmosphere of the Silicon Valley workplace, fueling the dreams of young entrepreneurs who always wanted to appear as if they were having fun. Over the summer, seven years into its existence, WeWork reached a $20 billion valuation.

On the face of it, the transformation of a department store — the first in the country to install an elevator — into the headquarters of a start-up is simply a story of the new economy cannibalizing the old. Traditional retail businesses have been in decline for a long time; the cult of shared goods and services enabled by technology is ever ascendant.

The first iteration of Lord & Taylor was a dry goods store on Catherine Street in Lower Manhattan that opened in 1826. The 676,000-square-foot Italianate building in Midtown it eventually occupied in 1914 (a building for which WeWork is now paying $850 million) stood not merely as a monument to turn-of-the-century commerce but also as the grand testament to what the sociologist Thorstein Veblen called the rising culture of “conspicuous leisure.”

Leisure, Veblen wrote, “does not connote indolence or quiescence.” What it conveys is the “nonproductive consumption of time,” by which he was not anticipating the 10,000 hours people would fritter away playing Minecraft, but any time spent away from the activity of labor. In their infancy and well into the first 80 years or so of the 20th century, department stores were largely places to pass the hours. When Lord & Taylor opened on Fifth Avenue and 38th Street, it featured three dining rooms, a manicure parlor for men and a mechanical horse that could walk, trot or canter. Harry Gordon Selfridge, founder of Selfridges in London, dictated that “a store should be a social center.” To that end, he installed an ice rink and shooting range on the roof of his store and exhibited the first plane to fly over the English Channel.

Read the entire article here.

In the Fight Against Poverty, Work Is Our Most Powerful Weapon

From today’s Harvard Business Review by Leila Janah:

Fourteen years ago, I left suburban Los Angeles to teach English in rural Ghana. I’d expected, like so many young people with bleeding hearts and big dreams, to make a difference by donating my time as a schoolteacher for six months. Upon arrival in the village, I was shocked to discover that my students, avid listeners of Voice of America and BBC radio, already spoke English quite well, and some could speak to me about President Clinton’s state visit to Africa. These were blind or partially sighted kids from families earning less than $3 a day.

How was this possible? I’d learned from countless TV specials on war and poverty in the continent that Africans needed aid. They needed us to send food and clothes and to build wells and schools. But on the ground, almost every poor person I spoke to told me the same thing: “We don’t want aid, we want work.” I spent the next four years studying development economics at Harvard, designing a special major to focus on African development, and later working at the World Bank to further understand the problem of poverty and how to fix it.

My conclusion after all this time isn’t so novel. But it bears repeating because we’ve lost our way: Work is the most powerful weapon we have to fight poverty and all its downstream effects, from child malnutrition to maternal mortality, both domestically and abroad. We need to modernize workforce training, incentivize companies to hire low-income people, and encourage consumers to support those organizations that #GiveWork, not aid.

Last year, the 2,000 largest companies spent an estimated $12 trillion on goods and services, a lot of it directed to suppliers that mine or harvest raw materials or make and grow things in poor countries. The fair trade movement was a strong first step in working to access these reserves of capital to fund poverty reduction directly. Started in the 1950s, it pushed purveyors of commodity goods like coffee, chocolate, sugar, and cotton to adhere to a rigorous set of core principles, including deliberately working in marginalized communities and paying living wages. And the results have been good. For example, Starbucks sources all its European espresso beans from fair trade certified producers, and Dutch company Fairphone sells the world’s first entirely fair trade Android phone, with batteries made from ethically mined minerals.

But I believe we now need something broader and simpler to mobilize companies and consumers to think differently about aid: a model called “impact sourcing,” which pushes for workforces (whether directly employed or employed through suppliers) to be economically diverse enough to include some of the world’s most disadvantaged people. This shift could, by our estimation, lift millions out of poverty in a single year.

Read the entire article here.

The Maddeningly Simple Way Tech Companies Can Employ More Women

From the New York Times, August 15, 2017 by Katherine Zaleski:

I am the co-founder of a company that helps clients find ways to diversify their work force. We recently set up an interview at a major company for a senior African-American woman software engineer. After meeting with the hiring panel, she withdrew her application, telling us she felt demeaned by the all-white male group that failed to ask her any questions about her coding skills. She described how one of the men had made it clear to her that she wasn’t a cultural fit and that therefore they didn’t need to proceed with technical questions.

I hear stories like this regularly, as I work with companies in Silicon Valley and beyond who want to bring more women onto their tech teams. Higher-ups declare their intention to hire more women. But the actual hiring is still all too rare.

There’s a continuing debate about the reasons for the lack of diversity in the tech sector, including candidate pools that are mostly male, and stubborn, superficial notions of what it means to be a “cultural fit” for an organization — the template for which is often based on young white men. But at least one small component of this problem is immediately solvable: Many companies are alienating the qualified women who want to work for them, and who they want to hire, during the interview process itself.

While Silicon Valley companies are enthusiastically putting money into STEM programs in schools and nonprofits focused on diversity, with the goal of creating a richer pipeline of talent in 10 years, they’re missing opportunities to make simple, immediate improvements by changing how they communicate with women who are sitting across the table from them now.

Read the entire article here.

Robocalypse Now? Central Bankers Argue Whether Automation Will Kill Jobs

From today’s New York Times by Jack Ewing:

SINTRA, Portugal — The rise of robots has long been a topic for sci-fi best sellers and video games and, as of this week, a threat officially taken seriously by central bankers.

The bankers are not yet ready to buy into dystopian visions in which robots render humans superfluous. But, at an exclusive gathering at a golf resort near Lisbon, the big minds of monetary policy were seriously discussing the risk that artificial intelligence could eliminate jobs on a scale that would dwarf previous waves of technological change.

“There is no question we are in an era of people asking, ‘Is the Robocalypse upon us?’” David Autor, a professor of economics at the Massachusetts Institute of Technology, told an audience on Tuesday that included Mario Draghi, the president of the European Central Bank, James Bullard, president of the Federal Reserve Bank of St. Louis, and dozens of other top central bankers and economists.

The discussion occurred as economists were more optimistic about growth than they had been in a decade. Mr. Draghi used the occasion to signal that the European Central Bank is edging closer to the day when it will begin paring measures intended to keep interest rates very low and bolster the economy.

“All the signs now point to a strengthening and broadening recovery in the euro area,” Mr. Draghi said. His comments pushed the euro to almost its highest level in a year, though it later gave up some of the gains.

But along with the optimism is a fear that the economic expansion might bypass large swaths of the population, in part because a growing number of jobs could be replaced by computers capable of learning — artificial intelligence.

Policy makers and economists conceded that they have not paid enough attention to how much technology has hurt the earning power of some segments of society, or planned to address the concerns of those who have lost out. That has, in part, nourished the political populism that contributed to Britain’s vote a year ago to leave the European Union, and the election of President Trump.

“Generally speaking, economic growth is a good thing,” Ben S. Bernanke, former chairman of the Federal Reserve, said at the forum. “But, as recent political developments have brought home, growth is not always enough.”

In the past, technical advances caused temporary disruptions but ultimately improved living standards, creating new categories of employment along the way. Farm machinery displaced farmworkers, but eventually they found better-paying jobs, and today their great-grandchildren may design video games.

But artificial intelligence threatens broad categories of jobs previously seen as safe from automation, such as legal assistants, corporate auditors and investment managers. Large groups of people could become obsolete, suffering the same fate as plow horses after the invention of the tractor.

Read the entire article here.

A New Kind of Tech Job Emphasizes Skills, Not a College Degree

From today’s New York Times by Steve Lohr:

ROCKET CENTER, W.Va. — A few years ago, Sean Bridges lived with his mother, Linda, in Wiley Ford, W.Va. Their only income was her monthly Social Security disability check. He applied for work at Walmart and Burger King, but they were not hiring.

Yet while Mr. Bridges had no work history, he had certain skills. He had built and sold some stripped-down personal computers, and he had studied information technology at a community college. When Mr. Bridges heard IBM was hiring at a nearby operations center in 2013, he applied and demonstrated those skills.

Now Mr. Bridges, 25, is a computer security analyst, making $45,000 a year. In a struggling Appalachian economy, that is enough to provide him with his own apartment, a car, spending money — and career ambitions.

“I got one big break,” he said. “That’s what I needed.”

Mr. Bridges represents a new but promising category in the American labor market: people working in so-called new-collar or middle-skill jobs. As the United States struggles with how to match good jobs to the two-thirds of adults who do not have a four-year college degree, his experience shows how a worker’s skills can be emphasized over traditional hiring filters like college degrees, work history and personal references. And elevating skills over pedigree creates new pathways to employment, tailored training and a gateway to the middle class.

Read the complete article here.

Is the Gig Economy Working?—For Some, But Not For Most Workers

From this month’s New Yorker magazine by Nathan Heller:

The American workplace is both a seat of national identity and a site of chronic upheaval and shame. The industry that drove America’s rise in the nineteenth century was often inhumane. The twentieth-century corrective—a corporate workplace of rules, hierarchies, collective bargaining, triplicate forms—brought its own unfairnesses. Gigging reflects the endlessly personalizable values of our own era, but its social effects, untried by time, remain uncertain.

Support for the new work model has come together swiftly, though, in surprising quarters. On the second day of the most recent Democratic National Convention, in July, members of a four-person panel suggested that gigging life was not only sustainable but the embodiment of today’s progressive values. “It’s all about democratizing capitalism,” Chris Lehane, a strategist in the Clinton Administration and now Airbnb’s head of global policy and public affairs, said during the proceedings, in Philadelphia. David Plouffe, who had managed Barack Obama’s 2008 campaign before he joined Uber, explained, “Politically, you’re seeing a large contingent of the Obama coalition demanding the sharing economy.” Instead of being pawns in the games of industry, the panelists thought, working Americans could thrive by hiring out skills as they wanted, and putting money in the pockets of peers who had done the same. The power to control one’s working life would return, grassroots style, to the people.

The basis for such confidence was largely demographic. Though statistics about gigging work are few, and general at best, a Pew study last year found that seventy-two per cent of American adults had used one of eleven sharing or on-demand services, and that a third of people under forty-five had used four or more. “To ‘speak millennial,’ you ought to be talking about the sharing economy, because it is core and central to their economic future,” Lehane declared, and many of his political kin have agreed. No other commercial field has lately drawn as deeply from the Democratic brain trust. Yet what does democratized capitalism actually promise a politically unsettled generation? Who are its beneficiaries? At a moment when the nation’s electoral future seems tied to the fate of its jobs, much more than next month’s paycheck depends on the answers.

Read the entire article here.