Efforts to fight drug use in America date back just as far as the drug use itself—which is to say, to the earliest days of European settlement in the country. Every single drug ever to achieve widespread use in America—from caffeine to crack—has been subjected, at one time or another, to attempts at serious government restriction. Various forms of restriction have been applied to different drugs at different times, but by taking the long view, we can see that all American anti-drug efforts have fallen into one of two broad approaches to dealing with drug problems in society: regulation and prohibition.
Regulation accepts that citizens will use a given drug, but imposes extra taxes and certain restrictions—like age limits, health warnings, or special permits for sellers—on its use. The idea is to reduce use of the drug by increasing its cost and restricting its availability, without criminalizing its use.
Prohibition, in contrast, simply makes it illegal to sell or use a particular drug. The threat of criminal punishment is intended to deter citizens from using the drug at all.
Both approaches have their advantages and disadvantages as public policy. Regulation allows for more nuanced policy and generates revenue for the government through collection of "sin taxes," but sends the message that the government officially condones citizens' use of the drug being regulated.
Prohibition, on the other hand, sends a clear message that use of the prohibited drug isn't acceptable, but creates expensive and socially destructive problems of enforcement when some people inevitably use the drug anyway. Through 400 years of drug use in America, neither regulation nor prohibition has solved the drug problem, but both approaches have, at times, been effective in mitigating the negative impacts of drugs on American societies. For all their faults, the two approaches remain the best we've got.
So, which drugs should be regulated, and which prohibited? Americans today are familiar with a certain hierarchy of drug regulation and enforcement:
It's easy today to imagine that these hierarchies, which have been enshrined in federal law since the passage of the Controlled Substances Act of 1970, are natural or timeless, rooted in the innate qualities of the various drugs. In other words, heroin—to take one example—simply is, by its very nature, an illegal drug, while caffeine isn't.
But our history suggests otherwise. A century ago, heroin was sold, over the counter and without a prescription, as a cough suppressant. In 1911, the federal government tried, but failed, to shut down Coca-Cola for illegally adding caffeine to its famous secret formula. In the 1920s, marijuana was perfectly legal in most states, but alcohol was banned everywhere in America. After the Civil War, thousands of Americans—including at least two presidents, William McKinley and Ulysses S. Grant—regularly drank Vin Mariani, a popular and entirely legal brand of red wine infused with pure cocaine. Even before the founding of the United States as an independent nation, the Massachusetts General Court attempted to prohibit tobacco in 1632 and England's King Charles II tried to outlaw coffee houses in 1675.
In short, most of today's banned drugs have been, in the past, perfectly legal, and most of today's legal drugs have been, in the past, banned. Whether any particular drug should be accepted, regulated, or prohibited is a political, cultural, and historical choice for society to make, not an inherent aspect of the drug's chemistry.
The first drug to be regulated in American history was tobacco. Neither tobacco nor the habit of smoking any drug into the lungs had ever been encountered in Europe before the voyages of Columbus. Both the drug and the method of taking it were native to the Western Hemisphere. Europeans had to learn how to smoke tobacco from Native Americans, and they soon exported the custom, and the drug, back to Europe.
And by the late-16th century, Europe had become a continent of nicotine addicts. The rapid spread of smoking through European society appalled many spiritual and political leaders, who regarded the habit as the evil custom of uncivilized heathens. Smoking, in other words, was the Devil's work.
European governments at the time had no power to ban tobacco, but in 1604—three years before the first English colony in America took root at Jamestown—England's King James I wrote a pamphlet, memorably named A Counterblaste to Tobacco, urging his subjects not to smoke the American weed.
"And now Good Countrymen," wrote the king, "let us (I pray you) consider, what honour or policy can move us to imitate the barbarous and beastly manners of the wild, Godless and slavish Indians, especially in so vile and stinking a custom?"
Centuries later, in 1971, President Richard Nixon announced he would launch a "War on Drugs" to fight the menace of illegal drug use. Subsequent presidents—especially Ronald Reagan, who joined his wife Nancy in heavily promoting the "Just Say No" movement—have carried on Nixon's fight.
Half a century after the War on Drugs officially began, we don't appear to be close to winning.
Like Prohibition, the War on Drugs has achieved a real measure of success in reducing overall drug use. Proportionally fewer Americans smoke crack today than they did in the 1970s, for example. And there's no ambiguity in the federal government's strong message to its citizens: "Just say no" to drugs, or face the consequences.
But, like Prohibition, the War on Drugs has had many socially undesirable side-effects. The most obvious costs are, well, costs. Total state and federal government spending on the War on Drugs—encompassing education, treatment, interdiction, enforcement, prosecution, and incarceration—likely exceeds $50 billion a year. That's money that could be spent on other programs or returned to taxpayers through tax cuts. Less obvious costs include the heavy social consequences of crime and punishment. Like Prohibition, the War on Drugs created a huge black market in drugs, fueling runaway growth in both violent criminal drug-trafficking activity and in drug-focused law enforcement operations. As a consequence, many American cities have been menaced by violent gangs that are sustained by drug profits.
Meanwhile, millions of people have been ensnared by the criminal justice system for drug-related crimes. Today, more than 2.3 million people are imprisoned in America. That's seven times as many as in 1970, just before the War on Drugs officially began. In fact, the number of nonviolent drug offenders imprisoned today is higher than the total prison population—for all crimes—in 1970. Largely as a result of the War on Drugs, the United States now has a higher proportion of its population locked away behind bars than any other society in the history of the world. And the social costs for heavily impacted American communities have been immense.
"War is hell."
That was the simple, sad verdict of one of America's most celebrated war heroes, General William Tecumseh Sherman, whose brutal tactics in subduing the South helped the Union win the Civil War. Sherman and his men—like their fellow soldiers in every American war before or since—understood from bitter first-hand experience that the realities of military life during wartime were horrific. The rhetoric of war emphasizes a set of abstractions—glory, honor, patriotism—but the lived experience of war, for its participants, offers a set of very real miseries: hunger, thirst, heat, cold, sickness, injury, stress, boredom, fear, death.
And so, the full version of General Sherman's famous quote (delivered to the graduates of the Michigan Military Academy in 1879): "There is many a boy here today who looks on war as all glory, but, boys, it is all hell."
While Congressman Robert Steele's rhetoric may have been somewhat overblown, subsequent investigations suggested that at least 40,000 Vietnam veterans came back from the conflict as heroin addicts.
Today, most of us are familiar with the concept of the "drug-free workplace."
Drinking and drug use are banned on most job sites, many companies drug test their employees, and an entire office of the federal government (the Department of Labor's Drug Free Workplace Alliance) exists for no purpose other than to encourage companies to institute anti-drug programs for their employees.
The message is clear: drugs and work do not mix.
But the history of drugs and labor in America has actually been much more ambiguous. While sentiments similar to those motivating the drug-free workplace movement can be traced back to the early years of the American republic, so too can instances of employers condoning or even encouraging certain drug use as a means of increasing worker productivity.
Before the rise of modern industrial capitalism in America, artisans manufactured goods in small workshops organized along the lines of Europe's medieval guilds. A master craftsman worked and usually lived right alongside his team of young apprentices, with the workshop functioning as a kind of all-male family. Among these artisans, it was common for master and apprentices to take a break from work every so often to share a drink of rum or hard cider, typically enjoying a dram or tankard right there on the shop floor.
This tradition of communal workplace drinking surely slowed the workshops' economic output, but it also made the work more tolerable while strengthening the family-like social bond between master and apprentices.
By the 1820s, however, the Industrial Revolution had taken hold in much of America and the old-fashioned artisan workshop gave way to the modern capitalist factory.
Men who would've been master craftsmen became factory bosses, while boys who would've been apprentices became wage laborers. The fraternal bonds of community cultivated by the old workshops were lost as industrial workforces divided into often-antagonistic classes of employers and employees. The new factories, organized to maximize production, had no use for workers taking a break to drink intoxicating liquor, and drunkenness on the job was banned as an impediment to industrial safety and efficiency.
It's worth noting here that the initial response of the workers, freed from the round-the-clock supervision of artisan masters by their transformation into wage laborers, was to embark upon an orgy of drunkenness outside the workplace. Americans in the late 1820s drank more than ever before or since. The social and economic costs of America's seeming transformation into a "nation of drunkards" led to the rise of a powerful temperance movement, and soon to a general expectation of sobriety for all workers.
In short, America's capitalist transformation meant that employers came to demand maximum productivity from their workers. And maximum productivity from workers meant that the traditional shop floor drink—to say nothing of the shop floor drunkard—had to go. So, the drug-free workplace was born.
But just as alcohol was being eased out of the workplace, other drugs were being eased in. These new drugs, not coincidentally, tended to increase rather than decrease the productivity of the workers. The new workplace drugs—caffeine and nicotine foremost among them—helped workers endure the long hours and brutal conditions of wage labor in the early years of the Industrial Revolution.
Workers in 19th-century factories often worked 10, 12, or 14-hour shifts. They needed some way to sustain energy, suppress hunger, and maintain focus just to survive.
The Industrial Revolution began in Great Britain, and the British developed the first—and perhaps most famous—new tradition of industrial workplace drug use: teatime. By mid-afternoon, a worker might have already been on the job for six or eight hours, with another four or six still to go. A short rest and a cup of hot tea provided just the lift needed to make it to the end of the shift. Even the now-traditional British style of taking tea—with plenty of milk and sugar—served an important industrial purpose: the tea delivered a dose of stimulating caffeine, while the sugar provided a short-term energy boost and the protein in the milk helped to suppress the appetite.
It seems strange to say it today, but teatime—something most of us now associate with old ladies and tea cozies—played an integral role in enabling the Industrial Revolution.
At some point, most Americans lost the mother country's deep affection for tea, but on this side of the Atlantic, coffee has long served much the same purpose. In this country, workers' coffee rituals never became quite as standardized as the Brits' 4:00PM teatime, but the coffee break—which was invented in 1952 as a marketing ploy by the Pan American Coffee Bureau—has become a ubiquitous feature of the workday in every American office and jobsite.
Caffeine is a legal drug that boosts worker productivity, and so its use by workers has been not only condoned, but actively encouraged by employers for two centuries.
But caffeine isn't the only drug used to help workers work harder, faster, or smarter. Other drugs—including some illegal drugs—serve the same purpose in many job-specific circumstances. Some long-haul truckers, who drive the lonely highways in ten-hour shifts, take amphetamines to keep from dozing off and wrecking their rigs. Some exhausted stockbrokers take Ritalin or cocaine to keep focused during hundred-hour workweeks. Some aging baseball players take steroids to keep hitting the home runs that keep fans coming to the ballpark.
In all of these cases, governments have declared the productivity-boosting drugs to be illegal and employers have officially banned their use. Yet workers are rewarded for the artificially strong performance that these drugs enable, and society in general is at least partly complicit in demanding levels of productivity only made possible with chemical assistance.
Can our society's goal of a drug-free workplace truly be reconciled with a simultaneous expectation that its truckers should be able to drive all-night shifts and its leftfielders should be able to hit 72 home runs in a season?
Since the early-19th century, American women's movements have often been closely linked to American temperance movements—that is, efforts to limit or prohibit alcohol consumption.
A strong argument can be made that temperance—not suffrage, and not feminism—has been the most powerful women's social movement in American history. But why?
The story begins in the early-19th century, when powerful forces unleashed by the Industrial Revolution upended traditional social structures and cultural norms in American communities.
In the 1820s, efficient new technologies of industrial production made alcohol cheaper than ever before, just at a time when millions of Americans were struggling to adapt to the unprecedented demands of a new market economy. The result was a smorgasbord of bacchanalian excess, as the citizens of the young republic—especially young men striving to secure a place in the emerging middle class—drowned their anxieties in an ocean of liquor.
Horrified religious leaders lamented that the United States was "fast becoming a nation of drunkards," and the statistics suggest that they were right. By 1830, annual alcohol consumption had skyrocketed to 9.5 gallons of hard liquor, plus 30.3 gallons of beer or hard cider, for every American man and woman over age 14. National Prohibition ultimately came to an end in 1933, when Americans greeted the depths of the Great Depression with beer.
Alcohol has been legal for American adults ever since.
Yet faint echoes of the original temperance movement as a powerful female moral crusade can still be seen in our own time. In 1980, Candy Lightner, the mother of a 13-year-old child tragically killed by a drunken driver, founded Mothers Against Drunk Driving (MADD) as a grassroots organization dedicated to fighting drunk driving. MADD's appeals to the female moral authority of bereaved mothers proved to be politically potent, and the group's agenda expanded from opposition to drunk driving to opposition to drinking in general. MADD's advocacy was instrumental in changing many state and federal policies related to alcohol, most significantly the 1984 federal law that increased the drinking age to 21 nationwide.
Addicts are good customers, often willing to pay almost any price to obtain the drugs upon which they've become dependent. As a result, drugs have long been some of the most valuable products on earth.
Today, coffee is the world's second most valuable legally traded international commodity, trailing only petroleum. A single company, Starbucks, sells more than $8 billion worth of coffee a year. Americans spend more than $50 billion every year on cigarettes, and more than $100 billion on alcohol.
Illegal drugs have a smaller market than the "Big Three" legal drugs, but ounce by ounce, they're even more precious. Cocaine is literally more valuable than gold, and by a wide margin: the drug has a street value of more than $100 a gram while the precious metal trades at less than $25. Marijuana is currently the United States' most valuable cash crop, with cultivators growing nearly $36 billion worth of the federally illegal weed every year. On the flip side, with over half of the states legalizing marijuana in some form, the legal weed market is reaching $7 billion and will likely increase. Corn, the nation's second most valuable agricultural commodity, is worth only $23 billion.
If he'd prospered in any other line of work, "Freeway" Ricky Ross might be seen as a hero of the American Dream, a self-made business success and entrepreneurial genius. But unlike Starbucks' Howard Schultz, Ricky Ross went into business on the wrong side of the law, and eventually, Ross had to face the consequences.
In the end, former Starbucks CEO Howard Schultz was worth close to $3 billion, while "Freeway" Ricky Ross was serving out a 20-year sentence in federal prison. The starkly divergent fortunes of two of the country's most successful recent purveyors of popular drugs suggest that the biggest action in the American drug market remains in the sale of legal drugs.