What’s an acceptable percentage to tip? The customary rate has been climbing without any clear economic force driving it, and with unclear benefits for all parties involved. In the 19th century and during the first half of the 20th century, a 10% tip was common. By the 1980s, 15% tips had become the standard. Now we observe 18%, 20%, and even 25% tipping rates.
Perhaps as a result, tipping is a constant source of tension and debate, and a favorite topic for social and economic critique. And, like any controversial subject, it has its own little-understood rules and oddities.
Here’s one that stands out: Employees who qualify for tips are not paid the current mandated federal minimum wage of $7.25 per hour. Instead they are paid a minimum wage of only $2.13 per hour—presumably because their tips make up for their lower wages. If the tips do not make up the difference, then the employer is required to do so. But with limited reporting and enforcement, it is hard to know how this rule plays out. (Some states require higher minimum wages than the federally required $2.13 per hour, and a few states do not allow lower minimum wages for tipped employees.)
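The tip-credit rule above is easiest to see as arithmetic. Here is a minimal sketch in Python using the federal figures cited in the post (the function name and the sample hours and tip amounts are illustrative, not from the post; state rules vary):

```python
# Illustrative sketch of the federal "tip credit" arithmetic.
# Rates are the federal figures cited above; many states set higher ones.

FEDERAL_MINIMUM_WAGE = 7.25   # dollars per hour
TIPPED_CASH_WAGE = 2.13       # dollars per hour employers may pay tipped workers

def employer_top_up(hours_worked: float, tips_earned: float) -> float:
    """Extra cash wage the employer owes so that cash wage plus tips
    reaches the full federal minimum wage for the pay period."""
    required_total = FEDERAL_MINIMUM_WAGE * hours_worked
    cash_paid = TIPPED_CASH_WAGE * hours_worked
    shortfall = required_total - (cash_paid + tips_earned)
    return round(max(0.0, shortfall), 2)

# A server working 40 hours who earns only $100 in tips:
# required total = 7.25 * 40 = 290.00; cash wage = 2.13 * 40 = 85.20
print(employer_top_up(40, 100.00))  # 104.8 owed by the employer
print(employer_top_up(40, 400.00))  # 0.0 -- tips cover the difference
```

As the post notes, whether that top-up is actually paid depends on reporting and enforcement, which the arithmetic alone cannot capture.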
For this and other reasons, we have to ask: do tips really make workers better off? Alternatively, how would service markets perform if tipping were replaced by a practice where prices included all service charges and customers were asked not to leave tips? Restaurateur Danny Meyer announced such a policy in 2015, for example, covering 13 of his New York restaurants.
Read the full post at Quartz.
Oz Shy is a Senior Lecturer teaching economics at MIT Sloan School of Management. He has published three books: How to Price; The Economics of Network Industries; and Industrial Organization: Theory and Applications.