Watch Now: “The Truth About Fake News” with Sinan Aral and Tim O’Reilly

Sinan Aral, MIT Sloan David Austin Professor of Management

Our latest installment of the MIT Sloan Experts Series features a conversation about fake news with Sinan Aral, David Austin Professor of Management and author of the forthcoming book The Hype Machine. We discuss insights from the latest research by Aral and his co-researchers Soroush Vosoughi and Deb Roy of the MIT Media Lab, which overturns conventional wisdom about how misinformation spreads, what causes it to spread so fast, and who (or what) is spreading it.

It is the largest study of its kind on fake news and is featured in the latest issue of Science: "The Spread of True and False News Online," March 9, 2018.

Tim O’Reilly, the founder, CEO, and Chairman of O’Reilly Media and the author of many books, including WTF: What’s the Future and Why It’s Up to Us, also appears on the program to discuss possible technological and algorithmic solutions.

Watch the show here:


“The Future of American Innovation”– a podcast with David Schmittlein

MIT Sloan Dean David Schmittlein

MIT Sloan’s David Schmittlein appeared on CEO Global Foresight to discuss how the United States is leading world innovation in life sciences, information technology, and energy.

The segment was recently made available as an 8-minute episode of the Innovation Gamechangers podcast on iTunes.

Dean Schmittlein also discusses innovation clusters and how the MIT community encourages a culture of collaboration and action learning. The program also includes interviews with DARPA director Arati Prabhakar and Carl Dietrich, an MIT alumnus and CEO of the flying car company Terrafugia.

Listen to the Innovation Gamechangers podcast, available on iTunes.

David Schmittlein is the John C Head III Dean and Professor of Marketing at the MIT Sloan School of Management.

The Case for Evidence in Government – Doug Criscitello

Doug Criscitello, Executive Director of MIT’s Center for Finance and Policy

From Government Executive

Although the U.S. government presides over what collectively must be one of the world’s largest data repositories, its capacity to use that data to build citizen trust and make informed, evidence-based decisions is severely constrained. As explained in an enlightening report recently issued by the bipartisan Commission on Evidence-Based Policymaking (CEP), the mere existence of data is a necessary but not sufficient condition for creating empirical evidence to inform decisions throughout the full lifecycle of public programs: enactment, funding, operation, reform, and termination.

The digitization of many facets of various activities the government funds through its $4 trillion annual budget has resulted in a data explosion at federal agencies. But that data needs to be synthesized into actionable information to satisfy taxpayers’ demands for better results and greater transparency. The CEP report makes clear that much remains to be done to achieve that goal and provides a comprehensive plan to improve access to federal data, strengthen privacy protections and expand the public, private and academic research communities’ capacity to analyze data.

CEP provides an insightful list of recommendations such as establishing a National Secure Data Service to enable and leverage capabilities across government, addressing statutory impediments that obstruct smart data use, and streamlining processes used to grant researchers access to data. The report appropriately emphasizes strong privacy protections and advocates for comprehensive risk assessments for publicly released data and for the use of better technology and greater coordination across government. To prioritize efficient evidence building, CEP points out the need to coordinate statistical activities, evaluation and policy research within and between departments and across levels of government.

Read More »

New study offers hope for commuters caught in traffic – Ioannis Ch. Paschalidis

Ioannis Ch. Paschalidis, a former Visiting Professor at MIT Sloan

If you live in Boston, Los Angeles, or any other major U.S. city, you know this fact: traffic is a nightmare. Sometimes it seems that traffic is all anyone talks about, and each delayed meeting or event begins with a story about how bad it was.

The average commuter in the U.S. spends 42 hours in traffic per year. The cost of commuter delays has risen by 260 percent over the past 25 years, and 28 percent of U.S. primary energy is now used in transportation. Road congestion is responsible for about 20 percent of fuel consumption in urban areas. According to one estimate, the cumulative cost of traffic congestion in the U.S. will reach $2.8 trillion by 2030. At the individual level, traffic congestion cost $1,740 per driver during 2014. If unchecked, this number is expected to grow by more than 60 percent, to $2,900 annually, by 2030.
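As a quick arithmetic check on those per-driver figures, the projected jump from $1,740 to $2,900 works out to roughly 67 percent, consistent with the "more than 60 percent" claim:

```python
# Sanity check of the per-driver congestion-cost figures cited above.
cost_2014 = 1740   # per-driver congestion cost in 2014, USD
cost_2030 = 2900   # projected per-driver cost in 2030, USD

growth = (cost_2030 - cost_2014) / cost_2014
print(f"Projected growth: {growth:.0%}")  # roughly 67%, i.e. "more than 60 percent"
```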

It’s a problem with a classic root: the tragedy of the commons. No individual driver has an incentive to make changes that would improve the system as a whole. Each driver seeks the fastest time or the most convenient route, but no one is responsible for making the system work better overall. As a result, traffic just keeps getting worse.
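The commons dynamic can be made concrete with a standard textbook illustration (not from the article itself): Pigou's two-road example, in which selfish route choice leaves every driver worse off than a coordinated split would.

```python
# Illustrative sketch: one unit of traffic chooses between a wide road
# with fixed travel time 1 and a narrow road whose travel time equals
# the fraction x of drivers using it.

def avg_time(x):
    """Average travel time when a fraction x takes the narrow road."""
    return x * x + (1 - x) * 1  # narrow users each face time x; the rest face 1

# Selfish equilibrium: the narrow road is never slower (x <= 1), so every
# driver takes it and everyone experiences travel time 1.
equilibrium = avg_time(1.0)

# Social optimum: minimize average time; d/dx (x^2 + 1 - x) = 0 at x = 1/2.
optimum = avg_time(0.5)

print(equilibrium)  # 1.0
print(optimum)      # 0.75 -- coordination cuts average travel time by 25%
```

No single driver can capture that 25 percent saving by switching alone, which is exactly the incentive gap described above.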

But technology, which in the form of the automobile gave us this problem, may now offer the faintest hope of a solution: the global positioning system, the pervasive use of cell phones, and the advent of the self-driving vehicle could bring new answers to this seemingly intractable problem. Read More »

The business of artificial intelligence – Erik Brynjolfsson and Andrew McAfee

Director of the MIT Initiative on the Digital Economy, Erik Brynjolfsson

Co-Director of the MIT Initiative on the Digital Economy, Andrew McAfee

From Harvard Business Review

For more than 250 years the fundamental drivers of economic growth have been technological innovations. The most important of these are what economists call general-purpose technologies — a category that includes the steam engine, electricity, and the internal combustion engine. Each one catalyzed waves of complementary innovations and opportunities. The internal combustion engine, for example, gave rise to cars, trucks, airplanes, chain saws, and lawnmowers, along with big-box retailers, shopping centers, cross-docking warehouses, new supply chains, and, when you think about it, suburbs. Companies as diverse as Walmart, UPS, and Uber found ways to leverage the technology to create profitable new business models.

The most important general-purpose technology of our era is artificial intelligence, particularly machine learning (ML) — that is, the machine’s ability to keep improving its performance without humans having to explain exactly how to accomplish all the tasks it’s given. Within just the past few years machine learning has become far more effective and widely available. We can now build systems that learn how to perform tasks on their own.

Why is this such a big deal? Two reasons. First, we humans know more than we can tell: We can’t explain exactly how we’re able to do a lot of things — from recognizing a face to making a smart move in the ancient Asian strategy game of Go. Prior to ML, this inability to articulate our own knowledge meant that we couldn’t automate many tasks. Now we can.

Read More »