In an era of tech innovation, whispers of declining research productivity – Irving Wladawsky-Berger

MIT Sloan Visiting Lecturer Irving Wladawsky-Berger

From The Wall Street Journal

Given the pace of technological change, we tend to think of our age as the most innovative ever. But over the past several years, a number of economists have argued that increasing R&D efforts are yielding decreasing returns.

"Are Ideas Getting Harder to Find?", a recent paper by economists Nicholas Bloom, Charles Jones, and Michael Webb of Stanford and John Van Reenen of MIT, shows that, across a wide range of industries, research efforts are rising substantially while research productivity is declining sharply.

Moore’s Law, the empirical observation that the number of transistors on a computer chip doubles approximately every two years, illustrates these trends. The paper points out that the number of researchers required to double chip density today is 18 times the number required in the early 1970s. In the case of Moore’s Law, research productivity has been declining at a rate of about 6.8% per year.

The authors conducted similar in-depth analyses of the agricultural and pharmaceutical industries. For agricultural yields, research effort went up by a factor of 2 between 1970 and 2007, while research productivity declined by a factor of 4 over the same period, an annual decline of 3.7%. For pharmaceuticals, research effort went up by a factor of 9 between 1970 and 2014, while research productivity declined by a factor of 5, an annual decline of 3.5%.
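
As a quick sanity check on how these cumulative factors relate to the quoted annual rates, here is a minimal back-of-envelope sketch in Python. It is my own illustration, not from the paper; the paper's exact time windows and methods differ slightly, so the results only approximate the quoted figures.

```python
import math

def implied_annual_decline(total_factor: float, years: float) -> float:
    """Annual decline rate r such that (1 - r) ** years == 1 / total_factor."""
    return 1 - total_factor ** (-1 / years)

# Agriculture: productivity fell ~4x between 1970 and 2007 (37 years).
print(f"agriculture: {implied_annual_decline(4, 2007 - 1970):.1%} per year")

# Pharmaceuticals: productivity fell ~5x between 1970 and 2014 (44 years).
print(f"pharma: {implied_annual_decline(5, 2014 - 1970):.1%} per year")

# Moore's Law: at a 6.8% annual decline, an 18x cumulative drop takes
# roughly ln(18) / -ln(1 - 0.068) years -- the early 1970s to the 2010s.
years = math.log(18) / -math.log(1 - 0.068)
print(f"Moore's Law: 18x decline over ~{years:.0f} years")
```

Running this gives roughly 3.7% per year for agriculture, about 3.6% for pharmaceuticals (close to the paper's 3.5%), and around 41 years for the 18x Moore's Law decline, all broadly consistent with the figures above.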


Read More »

Seeing past the hype around cognitive computing – Jeanne Ross

Jeanne Ross, Director & Principal Research Scientist at the MIT Sloan School’s CISR

From Information Management

Given the hype around artificial intelligence, you might be worried that you’re missing the boat if you haven’t yet invested in cognitive computing applications in your business. Don’t panic! Consumer products, vehicles, and equipment with embedded intelligence are generating lots of excitement. However, business applications of AI are still in the early stages.

Research at MIT Sloan’s Center for Information Systems Research (CISR) suggests that small experiments in cognitive computing may help you tap the significant opportunities AI offers. But it’s easy to invest huge amounts of cash and time in failed experiments, so you will want to target your investments carefully.

The biggest impact from cognitive computing applications is expected to come from automation of many existing jobs. We expect computers to do—faster and cheaper—many tasks now performed by humans. Progress thus far, however, suggests that we have significant obstacles to overcome in our efforts to replace human intelligence with computer intelligence. Despite some notable exceptions, we expect the displacement of human labor to proceed incrementally.

The business challenge is to determine which applications your company is ready to cash in on while resisting the lure of tackling processes that you can’t cost-effectively teach machines to do well. We have studied the opportunities and risks of business applications of cognitive computing and identified several lessons. These lessons offer suggestions for positioning your firm to capitalize on the potential benefits of cognitive computing and avoid the pitfalls.

Read More »

How companies can create a cybersafe culture at work – Stuart Madnick

Stuart Madnick, MIT Sloan Prof. of Information Technology

From The Wall Street Journal

As technical defenses against cyberattacks have improved, attackers have adapted by zeroing in on the weakest link: people. And too many companies are making it easy for the attackers to succeed.

An analogy that I often use is this: You can get a stronger lock for your door, but if you are still leaving the key under your mat, are you really any more secure?

It isn’t as if people aren’t aware of the weapons hackers are using. For instance, most people have heard of, and probably experienced, phishing—emails or messages asking you to take some action. (“We are your IT dept. and want to help you protect your computer. Click on this link for more information.”) Although crude, these tactics still achieve a 1% to 3% success rate.

Then there are the deadlier, personalized “spearphishing” attacks. One example is an email, apparently sent from a CEO to the CFO, that starts by mentioning things they discussed at dinner last week and requests that money be transferred immediately for a new high-priority project. These attacks are increasingly popular because they have a high success rate.

The common element of all these kinds of attacks: They rely on people falling for them. Read More »

Retailers need to get real about security – Lou Shipley

MIT Sloan Lecturer Lou Shipley

From Xconomy

It seems a distant memory now. In December 2013 – light years ago in technology time – the retail giant Target disclosed a massive software security breach of its point-of-sale systems. The bad guys fled the virtual premises with the credit card information of 40 million customers; that astounding number would later rise to 70 million.

Target’s embarrassment, its loss of market share, its brand erosion, and its legal costs to settle claims collectively should have served as a nerve-jangling wake-up call for retailers large and small nationwide.

It would be comforting to believe that retailers learned from Target’s data breach, but in fact the opposite has happened. In 2016, retail software security breaches were up 40 percent over the prior year, and in 2017 the following familiar brand names suffered breaches – Sonic, Whole Foods Market, Arby’s, Saks Fifth Avenue, Chipotle, Brooks Brothers, Kmart, and Verizon. Retail software security is getting worse, not better, and the dismal trend seems likely to continue in the near term. Why? Read More »

Is “murder by machine learning” the new “death by PowerPoint”? – Michael Schrage

MIT Center for Digital Business Research Fellow Michael Schrage

From Harvard Business Review 

Software doesn’t always end up being the productivity panacea that it promises to be. As its victims know all too well, “death by PowerPoint,” the poor use of the presentation software, sucks the life and energy out of far too many meetings. And audit after enterprise audit reveals spreadsheets rife with errors and macro miscalculations. Email and chat facilitate similar dysfunction; inbox overload demonstrably hurts managerial performance and morale. No surprises here — this is sadly a global reality that we’re all too familiar with.

So what makes artificial intelligence/machine learning (AI/ML) champions confident that their technologies will be immune to comparably counterproductive outcomes? They shouldn’t be so sure. Digital empowerment all too frequently leads to organizational mismanagement and abuse. The enterprise history of personal productivity tools offers plenty of unhappy litanies of unintended consequences. For too many managers, the technology’s costs often rival its benefits.

It’s precisely because machine learning and artificial intelligence platforms are supposed to be “smart” that they pose uniquely challenging organizational risks. They are likelier to inspire false and/or misplaced confidence in their findings; to amplify or further entrench data-based biases; and to reinforce — or even exacerbate — the very human flaws of the people who deploy them.

The problem is not that these innovative technologies don’t work; it’s that users will inadvertently make choices and take chances that undermine colleagues and customers. Ostensibly smarter software could perversely convert yesterday’s “death by PowerPoint” into tomorrow’s “murder by machine learning.” Nobody wants to produce boring presentations that waste everybody’s time, but they do; nobody wants to train machine-learning algorithms that produce misleading predictions, but they will. The intelligent networks to counterproductivity hell are wired with good intentions. Read More »