Preparing for the cyberattack that will knock out U.S. power grids – Stuart Madnick

MIT Sloan Professor Stuart Madnick

From Harvard Business Review

Cyberattacks are unavoidable, but we’re not going to stop using computerized systems. Instead, we should be preparing for the inevitable, including a major cyberattack on power grids and other essential systems. This requires the ability to anticipate not only an unprecedented event but also the ripple effects that it could cause.

Here’s an example of second-order effects (though not caused by a cyberattack, it’s a good way to think through what could happen in an attack). In February 2017, an area of Wyoming was hit by a strong windstorm that knocked down many power lines. Due to heavy snow and frozen ground, it took about a week to restore power. Initially, water and sewage treatment continued on backup generators. But the pumps that moved sewage from low-lying areas to the treatment plants on higher ground were not equipped with generators, since they could hold several days’ worth of waste. After three days without power, they started backing up. The water supply then had to be cut off to prevent backed-up wastewater from getting into homes. The area had never lost power for so long, and no one had anticipated such a scenario.

Now think about what would happen if a cyberattack brought down the power grid in New York, for example. New Yorkers could manage for a few hours, maybe a few days, but what would happen if the outage lasted a week or more? For an example of the kind of disruption such an attack could cause, consider the 2011 Japanese tsunami. It knocked out both the power lines and the backup generators simultaneously. Either failure alone could have been managed, but the two together were a disaster. Without power, the cooling systems in three nuclear reactors failed, resulting in massive radiation exposure and concerns about the safety of food and water. The lesson: We need to prepare not only for an unexpected event but also for its possible secondary effects.

Read More »

The board’s role in share repurchases – Robert Pozen

MIT Sloan Senior Lecturer Robert Pozen

From MIT Sloan Management Review

Capital allocation is a core responsibility of company directors: deciding how much of the company’s profits to reinvest in the business rather than distribute to shareholders through cash dividends or share repurchases. Boards of directors typically approve a dividend policy and the precise amount for each quarter, because everyone knows that cutting the dividend will trigger a sharp decline in the share price.

Yet in many companies, decisions about the level and timing of share repurchases are left to management. That stems partly from differences in legal requirements: The board must formally approve the amount of the company’s quarterly dividend but not its repurchases. Moreover, the implementation of the repurchase program is heavily influenced by the company’s actual cash flows.

Nevertheless, directors should pay more attention to share repurchases. Specifically, they should carefully weigh the capital allocated to repurchases against the company’s realistic opportunities for value creation through internal development or external acquisitions. And they should be highly skeptical of large repurchase programs financed by issuing debt rather than paid for out of company profits.

Read More »

Finding new actionable insights in old data research – Hazhir Rahmandad

MIT Sloan Professor Hazhir Rahmandad

From Information Management

A common problem bedevils data management across scientific disciplines: as the stock of information rapidly grows through scientific discovery, data professionals face a major challenge in tapping prior research findings.

Current methods for aggregating quantitative findings (meta-analysis) have a key limitation: they assume that prior studies share similar designs and substantive factors, which they rarely do.

Take, for example, studies estimating basal metabolic rate (BMR), a measure of human energy expenditure. Their results have important implications for understanding human metabolism and for developing obesity and malnutrition interventions.

More than 47 studies have estimated BMR, but their calculations are based on different body measures, such as fat mass, weight, age, and height. How do we combine those studies into a single equation to get usable insights?

To address this issue, my colleagues and I designed a new method for aggregating prior work into a meta-model, called “generalized model aggregation” (GMA). Building on advances in data analytics and computational power, GMA enables researchers to combine previous studies even when those studies have heterogeneous designs and measures.
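
To make the aggregation idea concrete, here is a minimal Python sketch of one way to pool heterogeneous models: treat each published equation as a data-generating model, sample its predictions over plausible covariate ranges, and fit a single meta-model spanning every covariate any study used. The equations, coefficients, and covariate ranges below are hypothetical placeholders, and the sketch illustrates the general pooling idea rather than the actual GMA algorithm.

    # Hedged sketch: pooling predictions from heterogeneous prior models
    # into one spanning meta-model. All equations and numbers are made up.
    import numpy as np
    from scipy.optimize import least_squares

    rng = np.random.default_rng(0)
    n = 500

    # Plausible (hypothetical) covariate ranges.
    weight = rng.uniform(45, 110, n)                 # body weight, kg
    age = rng.uniform(18, 80, n)                     # age, years
    fat_mass = weight * rng.uniform(0.15, 0.40, n)   # fat mass, kg

    # Three "prior studies", each built on a different subset of measures
    # (coefficients are placeholders, not published BMR equations).
    study_preds = [
        370 + 21.6 * (weight - fat_mass),        # study 1: fat-free mass only
        66.5 + 13.8 * weight - 0.68 * age,       # study 2: weight and age
        500 + 10.0 * weight,                     # study 3: weight only
    ]

    # Pool pseudo-observations generated by all prior models.
    X = np.tile(np.column_stack([weight, age, fat_mass]), (len(study_preds), 1))
    y = np.concatenate(study_preds)

    # Fit one meta-model over every covariate used by any prior study.
    def residuals(beta):
        b0, b1, b2, b3 = beta
        pred = b0 + b1 * X[:, 0] + b2 * X[:, 1] + b3 * X[:, 2]
        return pred - y

    fit = least_squares(residuals, x0=np.zeros(4))
    print("meta-model coefficients:", fit.x.round(2))

A real application would also weight studies by sample size and reliability and validate the result on independent data; the sketch omits such refinements.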

We used the BMR problem as an empirical case for GMA. Using only the models available in the literature, we estimated a new model that accounts for all the different body measures prior studies considered when estimating BMR. Then, on a separate dataset, we compared our equation’s predictive power against older equations as well as the state-of-the-art equations used by the World Health Organization and the Institute of Medicine.

Our equation outperformed all the others, including the more recent ones.

Read More »

Why home care costs too much – Paul Osterman

MIT Sloan Prof. Paul Osterman

From the Wall Street Journal

As baby boomers age into long-term care facilities, Medicaid costs will go through the roof. Americans already spend – counting both public and private money – more than $310 billion a year on long-term support services, excluding medical care, for the elderly and the disabled. Medicaid accounts for about 50% of that, according to a 2015 report from the Kaiser Commission on Medicaid and the Uninsured. Other public programs cover an additional 20%.

Yet in another decade or so these figures may look small. In 2015 around 14 million Americans needed long-term care. That number is expected to hit 22 million by 2030. There’s an urgent need to find ways of providing good long-term care at a lower cost. One fix would be to deregulate important aspects of home care.

There are two million home health aides in the U.S. They spend more time with the elderly and disabled than anyone else, and their skills are essential to their clients’ quality of life. Yet these aides are poorly trained, and their national median wage is only a smidgen more than $10 an hour.

The reason? State regulations – in particular, Nurse Practice Acts – require registered nurses to perform even routine home-care tasks like administering eyedrops. That duty might not require a nursing degree, but defenders of the current system say aides lack the proper training. “What if they put in the cat’s eyedrops instead?” a healthcare consultant asked me. In another conversation, the CEO of a managed-care insurance company wrote off home-care aides as “minimum wage people.”

Read More »

Has China’s coal use peaked? Here’s how to read the tea leaves – Valerie J. Karplus

Assistant Professor Valerie Karplus

From The Conversation

Because China is the world’s largest emitter of carbon dioxide, how much coal it is burning is of global interest.

In March, the country’s National Bureau of Statistics reported that coal consumption, measured by tonnage, had fallen for the second year in a row. Indeed, there are reports that China will halt construction of new coal plants as the country grapples with overcapacity, and efforts to phase out inefficient and outdated coal plants are expected to continue.

A sustained reduction in coal, the main fuel used to generate electricity in China, would be good news for the local environment and the global climate. But it also raises questions: What is driving the drop? And can we expect this nascent trend to continue?

Read More »