Friday, May 30, 2025

SNAP: Nutrition Aid Can Provide Long-Term Benefits by Hilary Hoynes

The Issue:

The Supplemental Nutrition Assistance Program (SNAP), also known as "food stamps", is one of the largest anti-poverty programs in the United States, reaching over 44 million Americans in 2016, at a cost of $73 billion to the Federal Government. Major policy changes up for consideration in Congress as well as the Trump administration's federal budget proposal could lead to dramatic funding cuts to SNAP.

 While the costs are easy to tally, the program also has measurable short- and long-term benefits, especially for children.

 

The Facts:

  • The Supplemental Nutrition Assistance Program (SNAP) provides low-income individuals and families with additional monthly resources to buy food. It is by far the nation's largest food and nutrition program, and it disproportionately reaches families with children. In fiscal year 2017, the average SNAP recipient received US$125 each month through vouchers or Electronic Benefit Transfer cards that work like debit cards but can only be used for food purchases. About two-thirds of SNAP benefits go to families with children, and 44 percent of SNAP participants in 2015 were children under 18. The wide reach of SNAP is partly due to the fact that its coverage is universal — unlike other social programs, which restrict eligibility to particular groups such as female-headed households, children, the disabled or the elderly. To be eligible, participants must be below certain income and asset thresholds (for instance, gross monthly income must be at or below 130 percent of the poverty line; a simple illustration of this income test appears after this list). There are some restrictions: strikers, most college students, some legal immigrants and all undocumented immigrants are ineligible, regardless of their income. Additionally, SNAP is time-limited for able-bodied adults without dependents in many states.
  • In the short term, those receiving food stamps experience greater food security and are better able to weather tough economic times. There is fairly consistent evidence that SNAP leads to a decrease in food insecurity — a condition of limited or uncertain access to adequate food, as defined by the U.S. Department of Agriculture. By one estimate, SNAP receipt reduces food insecurity by roughly 30 percent and the likelihood of being very food insecure by 20 percent. SNAP also plays an important role in reducing poverty: About 3.6 million Americans, including 1.5 million children, were lifted out of poverty in 2016 as a result of the program.
  • SNAP improves birth outcomes and infant health. Douglas Almond, Diane Schanzenbach and I have found that when an expecting mother has access to SNAP during her pregnancy, particularly her third trimester, it decreases the likelihood that her baby will be born with low birth weight. We found that African American and white babies are 6 percent and 2.4 percent less likely, respectively, to be born at very low birth weight, defined as below 1,500 grams (or 3 pounds 5 ounces). These improvements are largest in more vulnerable populations, such as babies born in high-poverty counties and babies with the lowest birth weights. (See this policy brief for a summary of research findings on SNAP's impact.)
  • But there is also evidence that the benefits of nutrition support can persist well into adulthood for those who have access to the program before birth and during early childhood. Today's SNAP program originated with a food stamp pilot in 1961 and expanded gradually on a county-by-county basis until 1975, when all U.S. counties had a food stamp program in place. (The food stamp program was renamed SNAP in the 2008 Farm Bill; see here for a history of the program.) Because the program was introduced on a county-by-county basis over 50 years ago, it is possible to observe adult health and other outcomes of comparable individuals who differ in the age at which food stamps became available, depending on their county of birth. Making this comparison, our research finds that having access to nutrition support before birth and in early childhood leads to a significant reduction in the incidence of obesity, high blood pressure, heart disease and diabetes in adulthood (see chart). The largest effect was for those children who were born after the Food Stamp Program had been implemented in their county of birth. Their mothers had access to the program during pregnancy, and it continued to be available throughout their entire childhood. In contrast, there was no reduction in the incidence of these metabolic diseases for children who were in the later stages of childhood when the Food Stamp Program was implemented in their county. In addition to the health effects, women with access to the program as children were also more likely to graduate from high school, earn more, and rely less on the social safety net as adults than those who did not.
  • The findings of benefits from SNAP that persist into adulthood are consistent with a growing consensus that the environment before birth and in early childhood can have a long-term impact on a person's earnings, health, and mortality. The periods before birth and during early childhood are critical to a person's development. Negative or positive events during this stage of life can have lasting consequences. Children exposed to extreme deprivation during episodes of war, disease, or famine have been shown to be more likely to suffer from chronic health conditions as adults (for example, see the effects of the influenza pandemic of 1918). Conversely, children who are given the opportunity to participate in high-quality early education programs have been shown to have higher earnings as adults, to have better health, and to be less likely to be engaged in criminal activity.
  • How might SNAP provide benefits that persist into adulthood? One way this might happen is through the impact that the program has on preventing low birth weight. There is evidence linking low birth weight with school achievement, adult health and adult economic outcomes later in life. In addition, there may be benefits from reducing family stress or being able to pay more attention in school because of reduced hunger. The reduction in metabolic disorders from improved nutrition in utero and during early childhood is also consistent with what is known as the Barker hypothesis. The idea is that there are critical periods in development during which a body takes cues from the current environment to predict and adapt to what it might face in the future. For example, a child’s metabolic system may adapt in a manner that improves chances of survival in an environment with chronic food shortages. However, problems arise if there is no long-run food shortage. In that case, early life metabolic adaptations are a bad match to the actual environment, and will increase the likelihood of an individual developing high blood pressure, diabetes, obesity, and cardiovascular disease. Evidence from the Dutch Hunger Winter, a seven-month period at the end of World War II when people in the Netherlands experienced severe food restrictions, found that the children who were exposed to malnutrition in utero were more likely to be obese and have a higher incidence of heart disease in middle age.
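As a rough illustration of the income test described in the first bullet above, the sketch below applies the 130-percent-of-poverty gross income screen to a hypothetical household. The poverty-line figure is an assumption chosen only for illustration (the 2017 federal poverty guideline for a family of three, about $20,420 a year); actual SNAP eligibility also involves net-income, asset and categorical rules that are omitted here.

```python
# Illustrative sketch of SNAP's gross income screen (130% of the poverty line).
# Assumption: the 2017 poverty guideline for a family of three in the 48
# contiguous states (~$20,420/year). Net-income, asset, and categorical tests
# are omitted, so this is not a full eligibility determination.

ANNUAL_POVERTY_GUIDELINE = 20_420            # assumed 2017 figure, family of three
MONTHLY_POVERTY_LINE = ANNUAL_POVERTY_GUIDELINE / 12

def passes_gross_income_test(gross_monthly_income: float) -> bool:
    """Gross monthly income must be at or below 130 percent of the poverty line."""
    return gross_monthly_income <= 1.30 * MONTHLY_POVERTY_LINE

print(round(1.30 * MONTHLY_POVERTY_LINE))    # threshold: roughly $2,212 per month
print(passes_gross_income_test(2_000))       # True  -> clears the gross income screen
print(passes_gross_income_test(2_500))       # False -> fails the gross income screen
```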

 

What this Means:

The benefits of SNAP touch on many areas that go beyond improving food security in the short run. Access to the program also helps prevent the negative, long-term effects of deprivation during childhood. This is especially important given that children, and families with children, compose a large share of SNAP beneficiaries. And the benefits of these broader effects accrue to more than just the program recipients. The long-term improvements in health due to the program imply a decrease in future taxpayer costs for health care. The fact that the children who had access to the program at early ages benefited from long-term health impacts highlights the importance of intervening in early childhood. A full accounting of the benefits of the program, not just the costs, should be taken into consideration when evaluating potential cuts to SNAP.

 

 

Thursday, May 29, 2025

Like many populist leaders, Trump accuses judges of being illegitimate obstacles to safety and democracy by Michael Gregory

 

Federal judges and at times Supreme Court justices have repeatedly challenged – and blocked – President Donald Trump’s attempts to reshape fundamental aspects of American government.

Many of Trump’s more than 150 executive orders, including one aimed at eliminating the Department of Education, have been challenged in lawsuits and blocked by court injunctions.

When a majority of Supreme Court justices ruled on May 16, 2025, that the Trump administration could not deport a group of Venezuelan immigrants without first giving them the right to due process in court, Trump attacked the court.

“The Supreme Court of the United States is not allowing me to do what I was elected to do,” Trump wrote on Truth Social. “This is a bad and dangerous day for America!” he continued in the post.

As the Trump administration faces other orders blocking its plans, the president and his team are framing judges not just as political opponents but as enemies of democracy.

Trump, for example, has called for the impeachment of James Boasberg, a federal judge who also issued orders blocking the deportation of immigrants in the U.S. to El Salvador. Attorney General Pam Bondi has said that Boasberg was “trying to protect terrorists who invaded our country over American citizens,” and Trump has also called Boasberg and other judges who ruled against him or his administration “left-wing activists.” 

 

“We cannot allow a handful of communist, radical-left judges to obstruct the enforcement of our laws and assume the duties that belong solely to the president of the United States,” Trump said at a rally in April 2025. “Judges are trying to take away the power given to the president to keep our country safe.”

As a scholar of legal and political theory, I believe this kind of talk about judges and the judicial system is not just misleading, it’s dangerous. It mirrors a pattern seen across many populist movements worldwide, where leaders cast independent courts and judges as illegitimate obstacles to what they see as the will of the people.

By confusing the idea that the people’s will must prevail with what the law actually says, these leaders justify intimidating judges and their sound legal rulings, a move that ultimately undermines democracy.

 

Thwarting ‘the will of the American people’?

In the face of judicial rulings against them, Trump and other administration officials have suggested on multiple occasions that judges are antagonistic to what the American people voted for.

Yet these rulings are merely a reflection of the rule of law.

Trump and supporters such as Elon Musk have characterized the rulings as a sign that a group of elite judges are abusing their power and acting against the will of the American people. The rulings that enforce the law, according to this argument, stand in opposition to the popular mandate American voters give to elected officials like the president.

“If ANY judge ANYWHERE can stop EVERY Presidential action EVERYWHERE, we do NOT live in a democracy,” Elon Musk posted on X in February 2025. “When judges egregiously undermine the democratic will of the people, they must be fired,” Musk added.

And U.S. Rep. Mike Johnson, the Republican speaker of the House of Representatives, said in March 2025, “We do have the authority over the federal courts, as you know. We can eliminate an entire district court.”

Framing judges as enemies of democracy or as obstacles to the people’s will departs sharply from the traditional view – held across political lines – that the judiciary is an essential, nonpartisan part of the American constitutional system.

While previous presidents have expressed frustration with specific court decisions or judges’ political leanings, their critiques mostly focused on specific legal reasoning.

Supreme Court Justice Ketanji Brown Jackson warned against the Trump administration’s charge that judges were actively undermining democracy. In late April 2025, she said during a conference for judges that “relentless attacks on judges are an attack on democracy.”

So, are judges obstructing democracy – or protecting it?

Are unelected judges a sign of democracy?

The U.S. Constitution established an independent judiciary as a coequal branch of government, alongside the legislative and executive branches. Federal judges are appointed for life and cannot be removed for political reasons. The country’s founders thought this protection could insulate judges from political pressures and ensure that courts uphold the Constitution, not the popularity of a given policy.

Yet as the federal judiciary has expanded in size and power, the arguments about the relationship between democracy and judicial independence have become louder among some political scientists and legal philosophers.

Some critics take issue with the fact that federal judges are appointed by politicians, not elected to their positions – a fact that others argue contributes to their independence.

Federal judges often serve longer on the bench than many elected officials.

Why, some critics argue, should a small group of unelected experts be allowed to overturn decisions made by elected officials?

Other democratic theorists, however, say that federal judges can act as a check on elected leaders who may misuse or abuse their power, or pass laws that violate people’s legal rights. This indirectly strengthens democracy by giving people a meaningful way to have recourse against laws that go against their rights and what they actually voted for.

A common story across countries

The argument that judges are an enemy to democracy is not unique to the U.S.

Authoritarian leaders from across the world have used similar language to justify undermining the courts.

In the Philippines, then-President Rodrigo Duterte in 2018 told Maria Lourdes Sereno, a top judge who was an outspoken critic of Duterte’s war on drugs, “I am now your enemy.” Shortly after, the Philippines Supreme Court voted to oust Sereno from the court. These judges cited Sereno’s failure to disclose personal financial information when she was first appointed to the court as the reason for her removal.

Filipino protesters and outside critics alike viewed Sereno’s removal as politically motivated and said it undermined the country’s judicial independence.

El Salvador President Nayib Bukele’s allies in the legislative assembly similarly voted in May 2021 to remove the government’s attorney general as well as all five top judges for obstructing Bukele’s plans to imprison, without proper due process, large numbers of people. Bukele replaced the attorney general and judges with political loyalists, violating constitutional procedure.

Kamala Harris, then vice president of the U.S., was among the international observers who said the removal of judges in El Salvador made her concerned about El Salvador’s democracy. Bukele justified the judges’ removal by saying he was right and that he refused to “listen to the enemies of the people” who wanted him to do otherwise.

And in April 2024, a minister in Israeli Prime Minister Benjamin Netanyahu’s Cabinet called Attorney General Gali Baharav-Miara an “enemy of the people,” blaming her for protests outside Netanyahu’s home. This disparagement was part of Netanyahu’s broader efforts to weaken judges’ role and independence and to remove judicial constraints on executive power. 

 

Pushing against democracy

In the name of weakening what they call undemocratic institutions, these and other leaders try to discredit independent judges. This attempt helps these leaders gain power and silence dissent.

Their attempts to disparage and discredit judges misrepresent judges’ work by asserting that it is political in nature – and thus subject to political criticism and even intimidation. But in the U.S., judges’ constitutionally mandated work takes place in the realm of law, not politics.

By confusing the idea that the people’s will must prevail with what the law actually says, these leaders justify intimidating judges and their rulings, a move that ultimately undermines democracy.

Independent judges may not always make perfect decisions, and concerns about their interpretations or potential biases are legitimate. Judges sometimes make decisions that are objectionable from a moral and legal standpoint.

But when political leaders portray judges as the problem, I believe it’s crucial to ask: Who truly benefits from silencing judges?

There’s no evidence work requirements for Medicaid recipients will boost employment, but they are a key piece of the Republican spending bill

 by 

 

Republicans in the U.S. Senate are sparring over their version of the multitrillion-dollar budget and immigration bill the House of Representatives passed on May 22, 2025.

Some GOP senators are insisting on shrinking the budget deficit, which the House version would increase by about US$3.8 trillion over a decade.

Others are saying they oppose the House’s cost-cutting provisions for Medicaid, the government’s health insurance program for people who are low income or have disabilities.

Despite the calls from U.S. Sen. Josh Hawley of Missouri and a few other Republican senators to protect Medicaid, as a scholar of American social policy I’m expecting to see the Senate embrace the introduction of work requirements for many adults under 65 who get health insurance through the program.

The House version calls for the states, which administer Medicaid within their borders and help pay for the program, to adopt work requirements by the end of 2026. The effect of this policy, animated by the conviction that coverage is too generous and too easy to obtain, will be to deny Medicaid eligibility to millions of those currently covered – leaving them without access to basic health services, including preventive care and the management of ongoing conditions such as asthma or diabetes.

Ending welfare

The notion that people who get government benefits should prove that they deserve them, ideally through paid labor, is now centuries old. This conviction underlay the Victorian workhouses in 19th-century England that Charles Dickens critiqued through his novels.

 U.S. Rep. Brett Guthrie, R-Ky., put it bluntly earlier this month: Medicaid is “subsidizing capable adults who choose not to work,” he said. 

 

This idea also animated the development of the American welfare state, which from its origins in the 1930s was organized around the goals of maintaining civil order and compelling paid labor. Enforcing work obligations ensured the ready availability of low-wage labor and supported the growing assumption that only paid labor could redeem the lives and aspirations of the poor.

“We started offering hope and opportunity along with the welfare check,” Wisconsin Gov. Tommy Thompson argued in the early 1990s, “and expecting certain responsibilities in return.”

This concept also was at the heart of the U.S. government’s bid to end “welfare as we know it.”

In 1996, the Democratic Clinton administration replaced Aid to Families with Dependent Children, or AFDC, a long-standing entitlement to cash assistance for low-income families, with Temporary Assistance for Needy Families, commonly known as TANF. The TANF program, as its name indicates, was limited to short-term support, with the expectation that most people getting these benefits would soon gain long-term employment.

Since 1996, Republicans serving at the state and federal levels of government have pressed to extend this principle to other programs that help low-income people. They’ve insisted, as President Donald Trump put it halfway through his first term, that unconditional benefits have “delayed economic independence, perpetuated poverty, and weakened family bonds.”

Such claims are unsupported. There is no evidence to suggest that work requirements have ever galvanized independence or lifted low-income people out of poverty. Instead, they have punished low-income people by denying them the benefits or assistance they require.

Work requirements haven’t worked

Work requirements have consistently failed as a spur to employment. The transition from AFDC to TANF subjected low-income families to work requirements, new administrative burdens and punitive sanctions.

The new work expectations, rolled out in 1997, were not accompanied by supporting policies, especially the child care subsidies that many low-income parents with young children require to hold a job. They were also at odds with the very low-paying and unstable jobs available to those transitioning from welfare.

Scholars found that TANF did less to lift families out of poverty than it did to shuffle its burden, helping the nearly poor at the expense of the very poor.

The program took an especially large toll on low-income Black women, as work requirements exposed recipients to long-standing patterns of racial and gender discrimination in private labor markets.

Restricting access to SNAP

Work requirements tied to other government programs have similar track records.

The Supplemental Nutrition Assistance Program, which helps millions of Americans buy groceries, adopted work requirements for able-bodied adults in 1996.

Researchers have found that SNAP’s work requirements have pared back eligibility without any measurable increase in labor force participation.

As with TANF, most people receiving SNAP benefits who must comply with the work requirements are already working to the degree their personal circumstances and local labor markets allow.

The requirements don’t encourage SNAP recipients to work more hours; they simply lead people to be overwhelmed by red tape and stop renewing their SNAP benefits.

Failing in Arkansas

The logic of work requirements collapses entirely when extended to Medicaid.

Red states have been pressing for years for waivers that would allow them to experiment with work requirements – especially for the able-bodied, working-age adults who gained coverage under the Affordable Care Act’s Medicaid expansion.

The first Trump administration granted 13 such waivers for what it saw as “meritorious innovations,” building “on the human dignity that comes with training, employment and independence.”

 

Arkansas got the furthest with adding work requirements to Medicaid at that time. The results were disappointing.

“We found no evidence that the policy succeeded in its stated goal of promoting work,” as one research team concluded, “and instead found substantial evidence of harm to health care coverage and access.”

The Biden administration slowed down the implementation of these waivers by directing the Centers for Medicare and Medicaid Services to suspend or withdraw any state programs that eroded coverage. Meanwhile, federal courts consistently ruled against the use of Medicaid work requirements.

In Trump’s second term, Iowa, Arizona and at least a dozen other states have proposed “work requirement” waivers for federal approval.

Trying it again

The waiver process is meant to allow state experiments that further the statutory objective of the Medicaid program, which is to furnish “medical assistance on behalf of families with dependent children and of aged, blind, or disabled individuals, whose income and resources are insufficient to meet the costs of necessary medical services.”

On these grounds, the courts have consistently held that state waivers imposing work requirements not only fail to promote Medicaid’s objectives but amount to an arbitrary and capricious effort to undermine those objectives.

“The text of the statute includes one primary purpose,” the D.C. Circuit ruled in 2020, “which is providing health care coverage without any restriction geared to healthy outcomes, financial independence or transition to commercial coverage.”

Changing Medicaid in all states

The House spending bill includes a work requirement under which all able-bodied, childless adults under 65 would have to demonstrate that they had worked, volunteered or participated in job training for 80 hours in the month before enrollment.

It would also allow states to extend such work requirements to six months and apply the new requirements not just to Medicaid recipients but to people who get subsidized health insurance through an Affordable Care Act exchange.

If passed in some form by the Senate, the House spending bill would transform the landscape of Medicaid work requirements, pushing an estimated 4.8 million Americans into the ranks of the uninsured.

 

Wednesday, May 21, 2025

The 10 richest Americans got $365 billion richer in the past year. Now they’re on the verge of a huge tax cut

 

Tuesday, May 20, 2025

For once, economists agree: Extending Section 199A is a bad idea by William G. Gale and Samuel I. Thorpe

 

The Princeton economist Alan Blinder once observed that “economists have the least influence on policy where they know the most and are most agreed.” We hope he will be proven wrong about one policy enacted in 2017: the special deduction for qualified business income. This provision, which economists are nearly united against, is a major part of the mammoth tax and spending bill currently winding its way through the House.

The “Section 199A deduction” allows owners of certain pass-through businesses—such as partnerships, sole proprietorships, and limited liability companies—to shield 20 percent of their net business income from tax. Lawmakers pitched it as a tax break for small business, but many large firms and high-income professionals benefit from the deduction. Republicans in Congress not only want to permanently extend the rule – which expires at the end of this year under current law – but also to double down, raising the deduction to 23 percent and expanding the number and types of businesses covered.
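To make that arithmetic concrete, here is a minimal sketch of how the deduction scales with pass-through income, using hypothetical income figures and ignoring the phase-outs, W-2 wage limits and industry-specific rules that apply in practice.

```python
# Hypothetical illustration of the Section 199A deduction's basic arithmetic.
# Phase-outs, W-2 wage limits, and specified-service restrictions are omitted.

def qbi_deduction(qualified_business_income: float, rate: float = 0.20) -> float:
    """Shield `rate` of net pass-through income from tax (20% under current law)."""
    return qualified_business_income * rate

for income in (50_000, 500_000, 5_000_000):        # hypothetical pass-through incomes
    current = qbi_deduction(income, rate=0.20)      # current-law 20 percent
    proposed = qbi_deduction(income, rate=0.23)     # proposed 23 percent
    print(f"income ${income:>9,}: deduction ${current:>11,.0f} now, "
          f"${proposed:>11,.0f} at the proposed 23 percent")
```

Because the deduction is proportional to income, raising the rate delivers the largest dollar gains to the highest-income owners, which is consistent with the distributional estimates discussed below.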

In 2024, the Peterson Foundation asked economists in seven think tanks across the political spectrum to come up with tax reform plans. All seven proposed eliminating the deduction. Why? Put simply, the rule has proven to be expensive, regressive, complicated, ineffective in promoting investment, and unfair to wage earners. Let’s take those in turn.

Expensive: According to the nonpartisan Joint Committee on Taxation (JCT), the deduction was estimated to cost $414.5 billion between 2018 and 2027. In practice it has been even more expensive, as taxpayers have shifted income into pass-through form to take advantage of the deduction.  Extending it would cost over $700 billion over the next 10 years, and the other proposed changes would cost another $100 billion or more.

Regressive: The benefits are highly skewed to the affluent. The JCT found that 44 percent of the tax benefits would go to taxpayers with annual incomes above $1 million. A Tax Policy Center study found that 55 percent of the tax benefits in 2019 went to households in the top 1 percent of the income distribution, and more than 26 percent went to just the top 0.1 percent.

Complicated: The deduction is notoriously complex and encourages tax-driven income shifting rather than economic growth. Its value, phase-ins, and eligibility criteria vary depending on industry, income level, and even taxpayers’ professions. Small business owners must overcome onerous administrative burdens to claim it.

Ineffective: The stated rationale for enacting the deduction was to create jobs and raise investment. But research shows the deduction did neither. One paper found “little evidence of changes in real economic activity,” including investment, employee wages, or job growth, while another found that 199A led to zero change in employment.

Unfair: A fundamental principle of an income tax is that two different people with the same income and same economic situation (both married, both have children, etc.) should pay the same tax. But the 199A deduction arbitrarily favors business income over wages, encouraging taxpayers who have the means to relabel their income to avoid paying taxes.

The changes considered by the House would make these problems worse. Along with a higher rate, the deduction would be offered to more owners of “specified service trades or businesses” like doctors, lawyers, and consultants. It would also provide a $10 billion tax break to Business Development Companies, a type of investment vehicle typically managed by private equity firms. These changes would increase the regressivity of the deduction and expand this complicated and inequitable provision to more parts of the economy.

If Congress wants to support small businesses in an effective and evidence-backed way, it has plenty of good options. They range from supporting sector-based training and workforce development to expanding access to small business credit and seed funding.

There are even ways to structure support as a tax break while improving its fairness. To reduce its cost and make it less regressive, Congress could cap the amount owners can claim (to prevent the largest passthrough business owners from receiving massive tax cuts) or change the structure of 199A to cover a fixed amount of qualifying business income (rather than 20 percent). A version of the structural change even wins support from a majority of small business owners.
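As a rough sketch of those two alternatives, the snippet below compares a dollar cap on the deduction with a fixed covered amount of qualifying income. The $25,000 cap and $100,000 covered amount are hypothetical figures chosen only for illustration, not amounts proposed in the text.

```python
# Hypothetical illustration of two ways to restructure the 199A deduction.
# Both the dollar cap and the fixed covered amount are illustrative assumptions.

def capped_deduction(income: float, rate: float = 0.20, cap: float = 25_000) -> float:
    """Proportional 20% deduction, but no owner may claim more than `cap` dollars."""
    return min(income * rate, cap)

def fixed_amount_deduction(income: float, covered: float = 100_000, rate: float = 0.20) -> float:
    """Only the first `covered` dollars of qualifying income earn the 20% deduction."""
    return min(income, covered) * rate

for income in (50_000, 500_000, 5_000_000):
    print(f"income ${income:>9,}: capped ${capped_deduction(income):>7,.0f}, "
          f"fixed-amount ${fixed_amount_deduction(income):>7,.0f}")
```

Under either structure, the smallest owners keep their full deduction while the benefit flowing to the largest pass-through owners is limited.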

As it currently stands, however, economists agree that 199A is expensive, complicated, and fails to accomplish its goals. Congress should not extend the deduction, and it certainly should not expand it.


Believe it or not, there was a time when the US government built beautiful homes for working-class Americans to deal with a housing crisis

 

 In 1918, as World War I intensified overseas, the U.S. government embarked on a radical experiment: It quietly became the nation’s largest housing developer, designing and constructing more than 80 new communities across 26 states in just two years.

These weren’t hastily erected barracks or rows of identical homes. They were thoughtfully designed neighborhoods, complete with parks, schools, shops and sewer systems.

In just two years, this federal initiative provided housing for almost 100,000 people.

Few Americans are aware that such an ambitious and comprehensive public housing effort ever took place. Many of the homes are still standing today.

But as an urban planning scholar, I believe that this brief historic moment – spearheaded by a shuttered agency called the United States Housing Corporation – offers a revealing lesson on what government-led planning can achieve during a time of national need.

Government mobilization

When the U.S. declared war against Germany in April 1917, federal authorities immediately realized that ship, vehicle and arms manufacturing would be at the heart of the war effort. To meet demand, there needed to be sufficient worker housing near shipyards, munitions plants and steel factories.

 

So on May 16, 1918, Congress authorized President Woodrow Wilson to provide housing and infrastructure for industrial workers vital to national defense. By July, it had appropriated US$100 million – approximately $2.3 billion today – for the effort, with Secretary of Labor William B. Wilson tasked with overseeing it via the U.S. Housing Corporation.

Over the course of two years, the agency designed and planned over 80 housing projects. Some developments were small, consisting of a few dozen dwellings. Others approached the size of entire new towns.

For example, Cradock, near Norfolk, Virginia, was planned on a 310-acre site, with more than 800 detached homes developed on just 100 of those acres. In Dayton, Ohio, the agency created a 107-acre community that included 175 detached homes and a mix of over 600 semidetached homes and row houses, along with schools, shops, a community center and a park.

Designing ideal communities

Notably, the Housing Corporation was not simply committed to offering shelter.

Its architects, planners and engineers aimed to create communities that were not only functional but also livable and beautiful. They drew heavily from Britain’s late-19th century Garden City movement, a planning philosophy that emphasized low-density housing, the integration of open spaces and a balance between built and natural environments.

Milton Hill, a neighborhood designed and developed by the United States Housing Corporation in Alton, Ill. National Archives

Importantly, instead of simply creating complexes of apartment units, akin to the public housing projects that most Americans associate with government-funded housing, the agency focused on the construction of single-family and small multifamily residential buildings that workers and their families could eventually own.

This approach reflected a belief by the policymakers that property ownership could strengthen community responsibility and social stability. During the war, the federal government rented these homes to workers at regulated rates designed to be fair, while covering maintenance costs. After the war, the government began selling the homes – often to the tenants living in them – through affordable installment plans that provided a practical path to ownership.

A single-family home in Davenport, Iowa, built by the U.S. Housing Corporation. National Archives

Though the scope of the Housing Corporation’s work was national, each planned community took into account regional growth and local architectural styles. Engineers often built streets that adapted to the natural landscape. They spaced houses apart to maximize light, air and privacy, with landscaped yards. No resident lived far from greenery.

In Quincy, Massachusetts, for example, the agency built a 22-acre neighborhood with 236 homes designed mostly in a Colonial Revival style to serve the nearby Fore River Shipyard. The development was laid out to maximize views, green space and access to the waterfront, while maintaining density through compact street and lot design.

At Mare Island, California, developers located the housing site on a steep hillside near a naval base. Rather than flatten the land, designers worked with the slope, creating winding roads and terraced lots that preserved views and minimized erosion. The result was a 52-acre community with over 200 homes, many of which were designed in the Craftsman style. There was also a school, stores, parks and community centers.

Infrastructure and innovation

Alongside housing construction, the Housing Corporation invested in critical infrastructure. Engineers installed over 649,000 feet of modern sewer and water systems, ensuring that these new communities set a high standard for sanitation and public health.

Attention to detail extended inside the homes. Architects experimented with efficient interior layouts and space-saving furnishings, including foldaway beds and built-in kitchenettes. Some of these innovations came from private companies that saw the program as a platform to demonstrate new housing technologies.

One company, for example, designed fully furnished studio apartments with furniture that could be rotated or hidden, transforming a space from living room to bedroom to dining room throughout the day.

To manage the large scale of this effort, the agency developed and published a set of planning and design standards – the first of their kind in the United States. These manuals covered everything from block configurations and road widths to lighting fixtures and tree-planting guidelines.

A single-family home in Bremerton, Wash., built by the U.S. Housing Corporation. National Archives

The standards emphasized functionality, aesthetics and long-term livability.

Architects and planners who worked for the Housing Corporation carried these ideas into private practice, academia and housing initiatives. Many of the planning norms still used today, such as street hierarchies, lot setbacks and mixed-use zoning, were first tested in these wartime communities.

And many of the planners involved in experimental New Deal community projects, such as Greenbelt, Maryland, had worked for or alongside Housing Corporation designers and planners. Their influence is apparent in the layout and design of these communities.

A brief but lasting legacy

With the end of World War I, the political support for federal housing initiatives quickly waned. The Housing Corporation was dissolved by Congress, and many planned projects were never completed. Others were incorporated into existing towns and cities.

Yet, many of the neighborhoods built during this period still exist today, integrated in the fabric of the country’s cities and suburbs. Residents in places such as Aberdeen, Maryland; Bremerton, Washington; Bethlehem, Pennsylvania; Watertown, New York; and New Orleans may not even realize that many of the homes in their communities originated from a bold federal housing experiment.

Homes on Lawn Avenue in Quincy, Mass., that were built by the U.S. Housing Corporation. Google Street View

The Housing Corporation’s efforts, though brief, showed that large-scale public housing could be thoughtfully designed, community oriented and quickly executed. For a short time, in response to extraordinary circumstances, the U.S. government succeeded in building more than just houses. It constructed entire communities, demonstrating that government can play a major role, and even take the lead, in finding appropriate, innovative solutions to complex challenges.

At a moment when the U.S. once again faces a housing crisis, the legacy of the U.S. Housing Corporation serves as a reminder that bold public action can meet urgent needs.

Wednesday, May 14, 2025

Cancer research in the US is world class because of its broad base of funding – with the government pulling out, its future is uncertain

 By

Cancer research in the U.S. doesn’t rely on a single institution or funding stream – it’s a complex ecosystem made up of interdependent parts: academia, pharmaceutical companies, biotechnology startups, federal agencies and private foundations. As a cancer biologist who has worked in each of these sectors over the past three decades, I’ve seen firsthand how each piece supports the others.

When one falters, the whole system becomes vulnerable.

The United States has long led the world in cancer research. It has spent more on cancer research than any other country, including more than US$7.2 billion annually through the National Cancer Institute alone. Since the 1971 National Cancer Act, this sustained public investment has helped drive dramatic declines in cancer mortality, with death rates falling by 34% since 1991. In the past five years, the Food and Drug Administration has approved over 100 new cancer drugs, and the U.S. has brought more cancer drugs to the global market than any other nation.

But that legacy is under threat. Funding delays, political shifts and instability across sectors have created an environment where basic research into the fundamentals of cancer biology is struggling to maintain traction and the drug development pipeline is showing signs of stress.

These disruptions go far beyond uncertainty and have real consequences. Early-career scientists faced with unstable funding and limited job prospects may leave academia altogether. Mid-career researchers often spend more time chasing scarce funding than conducting research. Interrupted research budgets and shifting policy priorities can unravel multiyear collaborations. I, along with many other researchers, believe these setbacks will slow progress, break training pipelines and drain expertise from critical areas of cancer research – delays that ultimately hurt patients waiting for new treatments.

A 50-year foundation of federal investment

The modern era of U.S. cancer research began with the signing of the National Cancer Act in 1971. That law dramatically expanded the National Cancer Institute, an agency within the National Institutes of Health focusing on cancer research and education. The NCI laid the groundwork for a robust national infrastructure for cancer science, funding everything from early research in the lab to large-scale clinical trials and supporting the training of a generation of cancer researchers.

This federal support has driven advances leading to higher survival rates and the transformation of some cancers into manageable chronic or even curable conditions. Progress in screening, diagnostics and targeted therapies – and the patients who have benefited from them – owes much to decades of NIH support.

 

But federal funding has always been vulnerable to political headwinds. During the first Trump administration, proposed deep cuts to biomedical science budgets threatened to stall the progress made under initiatives such as the 2016 Cancer Moonshot. The rationale given for these cuts was to slash overall spending, but they faced strong bipartisan opposition in Congress. Lawmakers ultimately rejected the administration’s proposal and instead increased NIH funding. In 2022, the Biden administration worked to relaunch the Cancer Moonshot.

This uncertainty has worsened in 2025 as the second Trump administration has cut or canceled many NIH grants. Labs that relied on these awards are suddenly facing funding cliffs, forcing them to lay off staff, pause experiments or shutter entirely. Deliberate delays in communication from the Department of Health and Human Services have stalled new NIH grant reviews and funding decisions, putting many promising research proposals already in the pipeline at risk.

Philanthropy’s support is powerful – but limited

While federal agencies remain the backbone of cancer research funding, philanthropic organizations provide the critical support for breakthroughs – especially for new ideas and riskier projects.

Groups such as the American Cancer Society, Stand Up To Cancer and major hospital foundations have filled important gaps in support, often funding pilot studies or supporting early-career investigators before they secure federal grants. By supporting bold ideas and providing seed funding, they help launch innovative research that may later attract large-scale support from the NIH.

Without the bureaucratic constraints of federal agencies, philanthropy is more nimble and flexible. It can move faster to support work in emerging areas, such as immunotherapy and precision oncology. For example, the American Cancer Society grant review process typically takes about four months from submission, while the NIH grant review process takes an average of eight months.

 

But philanthropic funds are smaller in scale and often disease-specific. Many foundations are created around a specific cause, such as advancing cures for pancreatic, breast or pediatric cancers. Their urgency to make an impact allows them to fund bold approaches that federal funders may see as too preliminary or speculative. Their giving also fluctuates. For instance, the American Cancer Society awarded nearly $60 million less in research grants in 2020 compared with 2019.

While private foundations are vital partners for cancer research, they cannot replace the scale and consistency of federal funding. Total U.S. philanthropic funding for cancer research is estimated at a few billion dollars per year, spread across hundreds of organizations. In comparison, the federal government has typically contributed roughly five to eight times more than philanthropy to cancer research each year.

Industry innovation – and its priorities

Private-sector innovation is essential for translating discoveries into treatments. In 2021, nearly 80% of the roughly $57 billion the U.S. spent on cancer drugs came from pharmaceutical and biotech companies. Many of the treatments used in oncology today, including immunotherapies and targeted therapies, emerged from collaborations between academic labs and industry partners.

But commercial priorities don’t always align with public health needs. Companies naturally focus on areas with strong financial returns: common cancers, projects that qualify for fast-track regulatory approval, and high-priced drugs. Rare cancers, pediatric cancers and basic science often receive less attention.

Industry is also saddled with uncertainty. Rising R&D costs, tough regulatory requirements and investor wariness have created a challenging environment to bring new drugs to market. Several biotech startups have folded or downsized in the past year, leaving promising new drugs stranded in limbo in the lab before they can reach clinical trials.

Without federal or philanthropic entities to pick up the slack, these discoveries may never reach the patients who need them.

A system under strain

Cancer is not going away. As the U.S. population ages, the burden of cancer on society will only grow. Disparities in treatment access and outcomes persist across race, income and geography. And factors such as environmental exposures and infectious diseases continue to intersect with cancer risk in new and complex ways.

Addressing these challenges requires a strong, stable and well-coordinated research system. But that system is under strain. National Cancer Institute grant paylines, or funding cutoffs, remain highly competitive. Early-career researchers face precarious job prospects. Labs are losing technicians and postdoctoral researchers to higher-paying roles in industry or to burnout. And patients, especially those hoping to enroll in clinical trials, face delays, disruptions and dwindling options.

 

This is not just a funding issue. It’s a coordination issue between the federal government, academia and industry. There are currently no long-term policy solutions that ensure sustained federal investment, foster collaboration between academia and industry, or make room for philanthropy to drive innovation instead of just filling gaps.

I believe that for the U.S. to remain a global leader in cancer research, it will need to recommit to the model that made success possible: a balanced ecosystem of public funding, private investment and nonprofit support. Up until recently, that meant fully funding the NIH and NCI with predictable, long-term budgets that allow labs to plan for the future; incentivizing partnerships that move discoveries from bench to bedside without compromising academic freedom; supporting career pathways for young scientists so talent doesn’t leave the field; and creating mechanisms for equity to ensure that research includes and benefits all communities.

Cancer research and science have come a long way, saving about 4.5 million lives from cancer in the U.S. from 1991 to 2022. Today, patients are living longer and better because of decades of hard-won discoveries made by thousands of researchers. But science doesn’t run on good intentions alone. It needs universities. It needs philanthropy. It needs industry. It needs vision. And it requires continued support from the federal government.