MIT Sloan’s David Schmittlein appeared on CEO Global Foresight to discuss how the United States is leading world innovation in life sciences, information technology, and energy.
The segment was recently released as an 8-minute episode of the Innovation Gamechangers podcast, available on iTunes.
Dean Schmittlein also discusses innovation clusters and how the MIT community encourages a culture of collaboration and action learning. The program also includes interviews with DARPA director Arati Prabhakar and Carl Dietrich, an MIT alumnus and CEO of the flying car company Terrafugia.
Much has been made of the fact that growth in coal use around the world is stalling, but coal will not disappear anytime soon. While a wave of firms is exiting the coal-fired electricity sector across the globe, coal is still poised to contribute to the fuel mix for a long time to come. This means that careful management of its remaining uses is more important than ever. Coal will remain important for two reasons.
First, it is still in high demand. The International Energy Agency projects that coal's share of power generation may drop to 36 percent by 2021, down from 41 percent in 2014, largely due to renewables and energy efficiency in China and the United States. However, because total electricity generation continues to grow, this shrinking share amounts to at best a flatlining, not a reduction, in demand. Second, coal has a role outside the power sector, in industrial and household demand. In developing countries, power and heat account for only 50 percent of coal use. The other 50 percent is burned directly by households or industries.
Statistics from the World Coal Association paint a clear picture of coal’s role in industrial activities, especially in construction industries that support infrastructure development. According to that group, the cement industry uses 200 kilograms of coal to produce one ton of cement, with 300 kilograms to 400 kilograms of cement needed to produce one cubic meter of concrete. Coal is also used as an input to the production of 70 percent of the world’s steel, as a raw material to make chemicals, and to make liquid fuel for transportation.
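Chaining the two World Coal Association figures above gives a rough sense of the coal embedded in concrete itself. A minimal back-of-the-envelope calculation:

```python
# Illustrative arithmetic only, using the World Coal Association figures
# quoted above: 200 kg of coal per ton (1,000 kg) of cement, and
# 300-400 kg of cement per cubic meter of concrete.

COAL_PER_TON_CEMENT_KG = 200            # kg of coal per 1,000 kg of cement
CEMENT_PER_M3_CONCRETE_KG = (300, 400)  # kg of cement per cubic meter

def coal_per_m3_concrete(cement_kg: float) -> float:
    """Coal (kg) embedded in one cubic meter of concrete."""
    return cement_kg * COAL_PER_TON_CEMENT_KG / 1000

low, high = (coal_per_m3_concrete(c) for c in CEMENT_PER_M3_CONCRETE_KG)
print(f"{low:.0f}-{high:.0f} kg of coal per cubic meter of concrete")
# -> 60-80 kg of coal per cubic meter of concrete
```

So every cubic meter of concrete poured implicitly consumes roughly 60 to 80 kilograms of coal, which is why infrastructure booms translate so directly into coal demand.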
Doug Criscitello, Executive Director of MIT’s Center for Finance and Policy
From Government Executive
Although the U.S. government presides over what collectively must be one of the world’s largest data repositories, its capacity to use that data to build citizen trust and make informed, evidence-based decisions is severely constrained. As explained in an enlightening report recently issued by the bipartisan Commission on Evidence-Based Policymaking (CEP), the mere existence of data is a necessary but not sufficient condition for creating empirical evidence to inform decisions throughout the full lifecycle of public programs—enactment, funding, operation, reform, termination.
The digitization of many facets of various activities the government funds through its $4 trillion annual budget has resulted in a data explosion at federal agencies. But that data needs to be synthesized into actionable information to satisfy taxpayers’ demands for better results and greater transparency. The CEP report makes clear that much remains to be done to achieve that goal and provides a comprehensive plan to improve access to federal data, strengthen privacy protections and expand the public, private and academic research communities’ capacity to analyze data.
CEP provides an insightful list of recommendations such as establishing a National Secure Data Service to enable and leverage capabilities across government, addressing statutory impediments that obstruct smart data use, and streamlining processes used to grant researchers access to data. The report appropriately emphasizes strong privacy protections and advocates for comprehensive risk assessments for publicly released data and for the use of better technology and greater coordination across government. To prioritize efficient evidence building, CEP points out the need to coordinate statistical activities, evaluation and policy research within and between departments and across levels of government.
The passage of Sarbanes-Oxley (SOX) was big news for public companies, but there was little discussion or analysis about what it meant for private firms, nonprofits and governmental entities. Yet those nonpublic entities needed to purchase accounting services from the same pool of independent auditors. It turns out that shocks to public companies from SOX significantly affected supply for the entire audit services market.
In a recent study, my colleagues and I looked at these developments and found that SOX had several negative spillover effects for nonpublic entities. Overall, audit fee increases for nonpublic entities more than doubled, and many were forced to switch to a different auditor.
Why is this a big deal if those groups aren’t legally required to hire independent auditors? It’s important because nonpublic entities still have substantial financial reporting needs. For example, organizations use audits to establish payment plans with vendors and suppliers or to demonstrate creditworthiness to banks. Charities use audits to show they are responsibly spending donors’ money.
Here is a breakdown of the spillover effects:
As of Saturday, January 13, all EU member states were to fully implement the revised Payment Services Directive, known as PSD2.1) Among other things, PSD2 allows third-party payment service providers to gain access to customers’ bank accounts (with the customers’ consent, of course), and customers’ banks are required to provide an API connection for identity verification. Its potential impact should not be underestimated. For example, under PSD2, customers and merchants can, in principle, cut credit cards and debit cards out of their transactions, saving significant costs along the way. In addition, banks can no longer “own” their customers’ account data or prevent competitors from accessing them.
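The consent-gated access rule described above can be sketched in a few lines. This is a minimal, hypothetical model, not any real banking API: the class and method names are invented for illustration, but the logic mirrors PSD2's requirement that a bank must serve account data to a third-party provider (TPP) if, and only if, the customer has consented.

```python
# Hypothetical sketch of PSD2-style consent-gated account access.
# All names are illustrative; no real bank API is being modeled.

class Bank:
    def __init__(self):
        self._accounts = {}   # customer_id -> balance
        self._consents = {}   # customer_id -> set of TPP ids with consent

    def open_account(self, customer_id: str, balance: int) -> None:
        self._accounts[customer_id] = balance
        self._consents[customer_id] = set()

    def grant_consent(self, customer_id: str, tpp_id: str) -> None:
        """Customer authorizes a third-party provider (PSD2 requires consent)."""
        self._consents[customer_id].add(tpp_id)

    def account_info(self, customer_id: str, tpp_id: str) -> dict:
        """The API the bank is obliged to offer; it must refuse without consent."""
        if tpp_id not in self._consents[customer_id]:
            raise PermissionError("no customer consent on file for this TPP")
        return {"customer": customer_id, "balance": self._accounts[customer_id]}

bank = Bank()
bank.open_account("alice", 1_000)
bank.grant_consent("alice", "tpp-42")
info = bank.account_info("alice", "tpp-42")  # consent given -> data returned
```

The key design point is that the gatekeeping decision moves from the bank's discretion to the customer's: the bank can no longer refuse a consented request, which is precisely what erodes its "ownership" of account data.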
The EU’s PSD2 is a major development in payments and financial market infrastructure, a once-sleepy “back-office” function that is now alive and kicking. The essence of PSD2 is to encourage competition and reduce the information advantages of incumbent banks. Likewise, the Bank of England announced in July 2017 that non-bank payment service providers can become direct settlement participants in the UK’s payment system, as long as certain requirements are met.
Access to financial market infrastructure such as payment systems has important implications for market competition. The study of industrial organization shows that vertical integration can reduce competition. A vertically integrated incumbent that produces both “upstream” and “downstream” goods can effectively reduce competition in the downstream market if its stand-alone competitors rely on the incumbent to provide the upstream good.2) Financial market infrastructure is the ultimate upstream good for almost all economic activities. Privileged access to market infrastructure makes banks “special” and, in some situations, may encourage anticompetitive behavior. Good examples to keep in mind are two antitrust class-action lawsuits in over-the-counter derivatives markets, in which investors accused dealer banks of, among other things, using their unique positions as clearing members in OTC derivatives to shut out new entrants that aimed to compete with dealer banks in trading these derivatives.3) One of these lawsuits has been settled, with the dealer banks paying $1.86 billion.
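The foreclosure logic above can be made concrete with a toy margin calculation. All numbers here are invented for illustration; the point is only the mechanism: an integrated incumbent obtains the upstream input (say, clearing access) at cost, while it can price the same input to stand-alone downstream rivals high enough to squeeze their margins to zero.

```python
# Toy illustration (all numbers invented) of vertical foreclosure:
# the incumbent supplies itself with the upstream input at cost,
# but charges downstream rivals a higher price for the same input.

UPSTREAM_COST = 2.0       # incumbent's own cost of providing the input
DOWNSTREAM_PRICE = 10.0   # market price of the downstream service
OTHER_COSTS = 5.0         # downstream costs other than the input

def downstream_margin(input_price: float) -> float:
    """Per-unit profit of a downstream firm paying `input_price` for the input."""
    return DOWNSTREAM_PRICE - OTHER_COSTS - input_price

# The incumbent pays only its own cost for the input:
incumbent_margin = downstream_margin(UPSTREAM_COST)  # 10 - 5 - 2 = 3.0

# By charging rivals 5.0 for the same input, the incumbent drives
# the rival's margin to zero, deterring entry into the downstream market:
rival_margin = downstream_margin(5.0)                # 10 - 5 - 5 = 0.0
```

In the OTC derivatives lawsuits, the alleged upstream good was clearing-member access, and the alleged squeeze took the form of denying or degrading that access rather than an explicit price, but the competitive logic is the same.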