Thursday, December 29, 2022

Time of triumph for GOP turns into ‘distraction’ with Santos By FARNOUSH AMIRI and MICHAEL BALSAMO

WASHINGTON (AP) — It should be a time of triumph for Republicans ready to take back control of the House in the new Congress next week, but their leaders are struggling with an embarrassing distraction about one of their own: What to do about George Santos?

Weeks after winning a district that helped Republicans secure their razor-thin House majority, the congressman-elect is under investigation in New York after acknowledging he lied about his heritage, education and professional pedigree as he campaigned for office. 

The top House Republican, Rep. Kevin McCarthy of California, and his leadership team have kept silent about Santos, who is set to take the oath of office Tuesday, even after he publicly admitted to fabricating swaths of his biography. The now-embattled Republican has shown no signs of stepping aside, punting the decision to hold him accountable to his party and to Congress, where he could quickly face a House ethics committee investigation once sworn into office.

Representatives for McCarthy, who is running to become the next House speaker, did not respond when asked what action he might take regarding Santos. On Tuesday, Santos was asked on Fox News about the “blatant lies” and responded that he had “made a mistake.”

Democrats, who will be in the minority during the upcoming session, are expected to pursue several avenues against the 34-year-old Santos, including a potential complaint with the Federal Election Commission and introducing a resolution to expel him once he’s a sitting member of Congress, according to a senior Democratic aide who spoke on the condition of anonymity to discuss internal deliberations.

 “We need answers from George Santos. He appears to be a complete and utter fraud. His whole life story is made up,” the incoming House Democratic leader, Rep. Hakeem Jeffries of New York, told reporters last week. “He’s gonna have to answer that question: Did you perpetrate a fraud on the voters of the 3rd Congressional District of New York?”

Questions were first raised about Santos earlier this month when The New York Times published an investigation into his resume and found a number of major discrepancies. Since then, Santos has admitted lying about having Jewish ancestry, lying about working for Wall Street banks and lying about obtaining a college degree.

 Santos has yet to address other lingering questions, including the source of a personal fortune he appears to have amassed quickly despite recent financial problems, including evictions and owing thousands in back rent.

Santos in November won the Long Island-area seat previously held by Democratic Rep. Tom Suozzi, making headlines as the first non-incumbent, openly gay Republican to be elected to Congress.

If Santos assumes office, he could still face an investigation by the House ethics committee, which is responsible for reviewing allegations of misconduct by lawmakers. The committee, evenly divided between the parties, has the authority under the chamber’s rules to subpoena members for testimony or documents, and lawmakers are required to comply.

Tom Rust, a committee spokesman, declined to comment this week on whether the committee can or would investigate Santos. Though Santos’ conduct predates his arrival in the House, the committee has held over the years that it may investigate matters that violated laws, regulations or standards of conduct during an initial campaign for the House.

But an ethics complaint may end up being the least of Santos’ problems.

 

Federal prosecutors in New York have started to examine Santos’ background and his financial dealings, a person familiar with the matter said Thursday. The person, who cautioned the review was in its early stages, said prosecutors were specifically interested in earnings that Santos accrued and are reviewing campaign finance filings.

The person was not authorized to publicly discuss details of the ongoing review and spoke on condition of anonymity.

 The district attorney in Nassau County, New York, announced on Wednesday an investigation into the fabrications Santos made while campaigning to represent the district, which includes some Long Island suburbs and a small part of the New York City borough of Queens.

 Nassau County District Attorney Anne T. Donnelly, a Republican, said the fabrications and inconsistencies were “nothing short of stunning.”

“The residents of Nassau County and other parts of the third district must have an honest and accountable representative in Congress,” she said. “If a crime was committed in this county, we will prosecute it.”

 

Rep.-elect Nick LaLota, a Republican whose district borders Santos’, issued a statement Wednesday calling for an ethics investigation into the allegations. “New Yorkers deserve the truth and House Republicans deserve an opportunity to govern without this distraction.”

___

Associated Press writers Bobby Caina Calvan in New York and Kevin Freking contributed to this report.

 

 

Wednesday, December 21, 2022

When fishing boats go dark at sea, they’re often committing crimes – we mapped where it happens

Researcher in Ecosystem Dynamics, University of California, Santa Cruz

 

In January 2019, the Korean-flagged fishing vessel Oyang 77 sailed south toward international waters off Argentina. The vessel had a known history of nefarious activities, including underreporting its catch and illegally dumping low-value fish to make room in its hold for more lucrative catch.

At 2 a.m. on Jan. 10, the Oyang 77 turned off its location transponder at the edge of Argentina’s exclusive economic zone – a political boundary that divides Argentina’s national waters from international waters, or the high seas. At 9 p.m. on Jan. 11, the Oyang 77 turned its transponder back on and reappeared on the high seas. For the 43 hours when the ship was dark, no information was available about where it had gone or what it did.

In a recent study, I worked with colleagues at Global Fishing Watch, a nonprofit that works to advance ocean governance by increasing transparency of human activity at sea, to show that these periods of missing transponder data actually contain useful information on where ships go and what they do. And authorities like the International Maritime Organization can use this missing data to help combat illegal activities at sea, such as overfishing and exploiting workers on fishing boats.

Illegal fishing causes economic losses estimated at US$10 billion to $25 billion annually. It also has been linked to human rights violations, such as forced labor and human trafficking. Better information about how often boats go dark at sea can help governments figure out where and when these activities may be taking place.

 

Going dark at sea

The high seas are the modern world’s Wild West – a vast expanse of water far from oversight and authority, where outlaws engage in illegal activities like unauthorized fishing and human trafficking. Surveillance there is aided by location transponders, called the Automatic Identification System, or AIS, which works like the Find My iPhone app.

Just as thieves can turn off phone location tracking, ships can disable their AIS transponders, effectively hiding their activities from oversight. Often it’s unclear whether going dark in this way is legal. AIS requirements are based on many factors, including vessel size, what country the vessel is flagged to, its location in the ocean and what species its crew is trying to catch.

A ship that disables its AIS transponder disappears from the view of whoever may be watching, including authorities, scientists and other vessels. For our study, we reviewed data from two private companies that combine AIS data with other signals to track assets at sea. Spire operates a constellation of nanosatellites that pick up AIS signals to increase visibility of vessels in remote areas of the world. Orbcomm tracks ships, trucks and other heavy equipment using internet-enabled devices. Then, we used machine learning models to understand what drove vessels to disable their AIS devices.

Examining where and how often such episodes occurred between 2017 and 2019, we found that ships disabled their transponders for around 1.6 million hours each year. This represented roughly 6% of global fishing vessel activity, which as a result is not reflected in global tallies of what types of fish are being caught where. 
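The study’s actual detection rules are more involved, but the core idea, flagging a long gap between consecutive AIS pings as a suspected disabling event and adding up the resulting dark hours, can be sketched in a few lines of Python. The ping times, the 12-hour cutoff and the output format below are illustrative assumptions, not the study’s real data or thresholds.

```python
from datetime import datetime, timedelta

# Hypothetical AIS ping timestamps for one vessel (real records also carry position, speed, etc.).
pings = [
    datetime(2019, 1, 9, 22, 0),
    datetime(2019, 1, 10, 2, 0),    # last ping before the vessel goes dark
    datetime(2019, 1, 11, 21, 0),   # the vessel reappears on the high seas
    datetime(2019, 1, 12, 3, 0),
]

GAP_THRESHOLD = timedelta(hours=12)   # assumed cutoff; the study's criteria are more involved

dark_hours = 0.0
for earlier, later in zip(pings, pings[1:]):
    gap = later - earlier
    if gap > GAP_THRESHOLD:
        dark_hours += gap.total_seconds() / 3600
        print(f"suspected disabling event: {earlier} -> {later} ({gap})")

print(f"total dark hours: {dark_hours:.0f}")   # 43 for this toy track
```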

 

Vessels frequently went dark on the high-seas edge of exclusive economic zone boundaries, which can obscure illegal fishing in unauthorized locations. That’s what the Oyang 77 was doing in January 2019.

Laundering illegal catch

The AIS data we reviewed showed that the Oyang 77 disabled its AIS transponder a total of nine times during January and February 2019. Each time, it went dark at the edge of Argentinean national waters and reappeared several days later back on the high seas.

During the ninth disabling event, the vessel was spotted fishing without permission in Argentina’s waters, where the Argentinean coast guard intercepted it and escorted it to the port of Comodoro Rivadavia. The vessel’s owners were later fined for illegally fishing in Argentina’s national waters, and their fishing gear was confiscated.

AIS disabling is also strongly correlated with transshipment events – exchanging catch, personnel and supplies between fishing vessels and refrigerated cargo vessels, or “reefers,” at sea. Reefers also have AIS transponders, and researchers can use their data to identify loitering events, when reefers are in one place long enough to receive cargo from a fishing vessel.

It’s not unusual to see fishing vessels disable their AIS transponders near loitering reefers, which suggests that they want to hide these transfers from oversight. While transferring people or cargo can be legal, when it is poorly monitored it can become a means of laundering illegal catch. It has been linked to forced labor and human trafficking.

 

Valid reasons to turn off transponders

Making it illegal for vessels to disable AIS transponders might seem like an obvious solution to this problem. But just as people may have legitimate reasons for not wanting the government to monitor their phones, fishing vessels may have legitimate reasons not to want their movements monitored.

Many vessels disable their transponders in high-quality fishing grounds to hide their activities from competitors. Although the ocean is huge, certain species and fishing methods are highly concentrated. For example, bottom trawlers fish by dragging nets along the seafloor and can operate only over continental shelves where the bottom is shallow enough for their gear to reach.

Modern-day pirates also use AIS data to intercept and attack vessels. In response, ships frequently disable their transponders in historically dangerous waters of the Indian Ocean and the Gulf of Guinea. Making AIS disabling illegal would leave fishing vessels more vulnerable to piracy. 

Instead, in my view, researchers and maritime authorities can use these AIS disabling events to make inferences about which vessels are behaving illegally.

Our study reveals that AIS disabling near exclusive economic zones and loitering reefers is a risk factor for unauthorized fishing and transshipments. At sea, real-time data on where vessels disable their AIS transponders or change their apparent position using fake GPS coordinates could be used to focus patrols on illegal activities near political boundaries or in transshipment hot spots. Port authorities could also use this information onshore to target the most suspect vessels for inspection.

President Joe Biden signed a national security memorandum in 2022 pledging U.S. support for combating illegal, unreported and unregulated fishing and associated labor abuses. Our study points toward a strategy for using phases when ships go dark to fight illegal activities at sea.

 

 

The World-Changing Race to Develop the Quantum Computer By Stephen Witt December 12, 2022

Such a device could help address climate change and food scarcity, or break the Internet. Will the U.S. or China get there first?

 

On the outskirts of Santa Barbara, California, between the orchards and the ocean, sits an inconspicuous warehouse, its windows tinted brown and its exterior painted a dull gray. The facility has almost no signage, and its name doesn’t appear on Google Maps. A small label on the door reads “Google AI Quantum.” Inside, the computer is being reinvented from scratch.

In September, Hartmut Neven, the founder of the lab, gave me a tour. Neven, originally from Germany, is a bald fifty-seven-year-old who belongs to the modern cast of hybridized executive-mystics. He talked of our quantum future with a blend of scientific precision and psychedelic glee. He wore a leather jacket, a loose-fitting linen shirt festooned with buttons, a pair of jeans with zippered pockets on the legs, and Velcro sneakers that looked like moon boots. “As my team knows, I never miss a single Burning Man,” he told me.

In the middle of the warehouse floor, an apparatus the size and shape of a ballroom chandelier dangled from metal scaffolding. Bundles of cable snaked down from the top through a series of gold-plated disks to a processor below. The processor, named Sycamore, is a small, rectangular tile, studded with several dozen ports. Sycamore harnesses some of the weirdest properties of physics in order to perform mathematical operations that contravene all human intuition. Once it is connected, the entire unit is placed inside a cylindrical freezer and cooled for more than a day. The processor relies on superconductivity, meaning that, at ultracold temperatures, its resistance to electricity all but disappears. When the temperature surrounding the processor is colder than the deepest void of outer space, the computations can begin.

Classical computers speak in the language of bits, which take values of zero and one. Quantum computers, like the ones Google is building, use qubits, which can take a value of zero or one, and also a complex combination of zero and one at the same time. Qubits are thus exponentially more powerful than bits, able to perform calculations that normal bits can’t. But, because of this elemental change, everything must be redeveloped: the hardware, the software, the programming languages, and even programmers’ approach to problems.
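As a rough illustration of where the “exponential” claim comes from, here is a minimal classical simulation, in Python with NumPy, of a three-qubit register: fully describing it already takes 2^3 = 8 complex amplitudes, and every added qubit doubles that count. This is a sketch for intuition, not Google’s software stack.

```python
import numpy as np

# Describing n qubits classically takes 2**n complex amplitudes.
n = 3
state = np.zeros(2 ** n, dtype=complex)
state[0] = 1.0                                 # start in |000>

hadamard = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
for qubit in range(n):
    # Build the operator that applies a Hadamard gate to one qubit and leaves the others alone.
    op = np.array([[1.0]])
    for k in range(n):
        op = np.kron(op, hadamard if k == qubit else np.eye(2))
    state = op @ state

print(np.round(np.abs(state) ** 2, 3))         # all eight outcomes equally likely (0.125 each)
```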

On the day I visited, a technician—whom Google calls a “quantum mechanic”—was working on the computer with an array of small machine tools. Each qubit is controlled by a dedicated wire, which the technician, seated on a stool, attached by hand.

The quantum computer before us was the culmination of years of research and hundreds of millions of dollars in investment. It also barely functioned. Today’s quantum computers are “noisy,” meaning that they fail at almost everything they attempt. Nevertheless, the race to build them has attracted as dense a concentration of genius as any scientific problem on the planet. Intel, I.B.M., Microsoft, and Amazon are also building quantum computers. So is the Chinese government. The winner of the race will produce the successor to the silicon microchip, the device that enabled the information revolution.

A full-scale quantum computer could crack our current encryption protocols, essentially breaking the Internet. Most online communications, including financial transactions and popular text-messaging platforms, are protected by cryptographic keys that would take a conventional computer millions of years to decipher. A working quantum computer could presumably crack one in less than a day. That is only the beginning. A quantum computer could open new frontiers in mathematics, revolutionizing our idea of what it means to “compute.” Its processing power could spur the development of new industrial chemicals, addressing the problems of climate change and food scarcity. And it could reconcile the elegant theories of Albert Einstein with the unruly microverse of particle physics, enabling discoveries about space and time. “The impact of quantum computing is going to be more profound than any technology to date,” Jeremy O’Brien, the C.E.O. of the startup PsiQuantum, said recently. First, though, the engineers have to get it to work.

Imagine two pebbles thrown into a placid lake. As the stones hit the surface, they create concentric ripples, which collide to produce complicated patterns of interference. In the early twentieth century, physicists studying the behavior of electrons found similar patterns of wavelike interference in the subatomic world. This discovery led to a moment of crisis, since, under other conditions, those same electrons behaved more like individual points in space, called particles. Soon, in what many consider the most bizarre scientific result of all time, the physicists realized that whether an electron behaved more like a particle or more like a wave depended on whether or not someone was observing it. The field of quantum mechanics was born.

In the following decades, inventors used findings from quantum mechanics to build all sorts of technology, including lasers and transistors. In the early nineteen-eighties, the physicist Richard Feynman proposed building a “quantum computer” to obtain results that could not be calculated by conventional means. The reaction from the computer-science community was muted; early researchers had trouble getting slots at conferences. The practical utility of such a device was not demonstrated until 1994, when the mathematician Peter Shor, working at Bell Labs in New Jersey, showed that a quantum computer could help crack some of the most widely used encryption standards. Even before Shor published his results, he was approached by a concerned representative of the National Security Agency. “Such a decryption ability could render the military capabilities of the loser almost irrelevant and its economy overturned,” one N.S.A. official later wrote.

Shor is now the chair of the applied-mathematics committee at the Massachusetts Institute of Technology. I visited him there in August. His narrow office was dominated by a large chalkboard spanning one wall, and his desk and his table were overflowing with scratch paper. Cardboard boxes sat in the corner, filled to capacity with Shor’s scribbled handiwork. One of the boxes was from the bookseller Borders, which went out of business eleven years ago.

Shor wears oval glasses, his belly is rotund, his hair is woolly and white, and his beard is unkempt. On the day I met him, he was drawing hexagons on the chalkboard, and one of his shoes was untied. “He looks exactly like the man who would invent algorithms,” a comment on a video of one of his lectures reads.

An algorithm is a set of instructions for calculation. A child doing long division is following an algorithm; so is a supercomputer simulating the evolution of the cosmos. The formal study of algorithms as mathematical objects only began in the twentieth century, and Shor’s research suggests that there is much we don’t understand. “We are probably, when it comes to algorithms, at the level the Romans were vis-à-vis numbers,” the experimental physicist Michel Devoret told me. He compared Shor’s work to the breakthroughs made with imaginary numbers in the eighteenth century.

Shor can be obsessive about algorithms. “I think about them late at night, in the shower, everywhere,” he said. “Interspersed with that, I scribble funny symbols on a piece of paper.” Sometimes, when a problem is especially engrossing, Shor will not notice that other people are talking to him. “It’s probably very annoying for them,” he said. “Except for my wife. She’s used to it.” Neven, of Google, recalled strolling with Shor through Cambridge as he expounded on his latest research. “He walked right through four lanes of traffic,” Neven said. (Shor told me that both of his daughters have been diagnosed with autism. “Of course, I have some of those traits myself,” he said.)

 

Shor’s most famous algorithm proposes using qubits to “factor” very large numbers into smaller components. I asked him to explain how it works, and he erased the hexagons from the chalkboard. The key to factoring, Shor said, is identifying prime numbers, which are whole numbers divisible only by one and by themselves. (Five is prime. Six, which is divisible by two and by three, is not.) There are twenty-five prime numbers between one and a hundred, but as you count higher they become increasingly rare. Shor, drawing a series of compact formulas on the chalkboard, explained that certain sequences of numbers repeat periodically along the number line. The distances between these repetitions grow exponentially, however, making them difficult to calculate with a conventional computer.

Shor then turned to me. “O.K., here is the heart of my discovery,” he said. “Do you know what a diffraction grating is?” I confessed that I did not, and Shor’s eyes grew wide with concern. He began drawing a simple sketch of a light beam hitting a filter and then diffracting into the colors of the rainbow, which he illustrated with colored chalk. “Each color of light has a wavelength,” Shor said. “We’re doing something similar. This thing is really a computational diffraction grating, so we’re sorting out the different periods.” Each color on the chalkboard represented a different grouping of numbers. A classical computer, looking at these groupings, would have to analyze them one at a time. A quantum computer could process the whole rainbow at once.
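The quantum part of Shor’s algorithm is the period finding; the surrounding number theory is classical and small enough to sketch. The toy Python below factors a number by finding the period of the sequence a^x mod N by brute force, the one step a quantum computer would perform exponentially faster. It illustrates the reduction, not the quantum circuit itself.

```python
from math import gcd
from random import randrange

def period(a, N):
    """Smallest r with a**r % N == 1 -- the step Shor's algorithm speeds up."""
    r, value = 1, a % N
    while value != 1:
        value = (value * a) % N
        r += 1
    return r

def factor(N):
    """Reduce factoring to period finding, as in Shor's algorithm (done classically here)."""
    while True:
        a = randrange(2, N)
        if gcd(a, N) > 1:                            # lucky guess already shares a factor with N
            return gcd(a, N), N // gcd(a, N)
        r = period(a, N)
        if r % 2 == 0 and pow(a, r // 2, N) != N - 1:
            p = gcd(pow(a, r // 2, N) - 1, N)
            if 1 < p < N:
                return p, N // p

print(factor(15))   # (3, 5)
```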

The challenge is to realize Shor’s theoretical work with physical hardware. In 2001, experimental physicists at I.B.M. tried to implement the algorithm by firing electromagnetic pulses at molecules suspended in liquid. “I think that machine cost about half a million dollars,” Shor said, “and it informed us that fifteen equals five times three.” Classical computing’s bits are relatively easy to build—think of a light switch, which can be turned either “on” or “off.” Quantum computing’s qubits require something like a dial, or, more accurately, several dials, each of which must be tuned to a specific amplitude. Implementing such precise controls at the subatomic scale remains a fiendish problem.

Still, in anticipation of the day that security experts call Y2Q, the protocols that safeguard text messaging, e-mail, medical records, and financial transactions must be torn out and replaced. Earlier this year, the Biden Administration announced that it was moving toward new, quantum-proof encryption standards that offer protection from Shor’s algorithm. Implementing them is expected to take more than a decade and cost tens of billions of dollars, creating a bonanza for cybersecurity experts. “The difference between this and Y2K is we knew the actual date when Y2K would occur,” the cryptographer Bruce Schneier told me.

In anticipation of Y2Q, spy agencies are warehousing encrypted Internet traffic, hoping to read it in the near future. “We are seeing our adversaries do this—copying down our encrypted data and just holding on to it,” Dustin Moody, the mathematician in charge of U.S. post-quantum encryption standards, said. “It’s definitely a real threat.” (When I asked him if the U.S. government was doing the same, Moody said that he didn’t know.) Within a decade or two, most communications from this era will likely be exposed. The Biden Administration’s deadline for the cryptography upgrade is 2035. A quantum computer capable of running a simple version of Shor’s algorithm could appear as early as 2029.

At the root of quantum-computing research is a scientific concept known as “quantum entanglement.” Entanglement is to computing what nuclear fission was to explosives: a strange property of the subatomic world that could be harnessed to create technology of unprecedented power. If entanglement could be enacted at the scale of everyday objects, it would seem like a magic trick. Imagine that you and a friend flip two entangled quarters, without looking at the results. The outcome of the coin flips will be determined only when you peek at the coins. If you inspect your quarter, and see that it came up heads, your friend’s quarter will automatically come up tails. If your friend looks and sees that her quarter shows heads, your quarter will now show tails. This property holds true no matter how far you and your friend travel from each other. If you were to travel to Germany—or to Jupiter—and look at your quarter, your friend’s quarter would instantaneously reveal the opposite result.
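The coin analogy can be mimicked by sampling from the joint state of an entangled pair. The sketch below assumes a singlet-type two-qubit state, in which the only possible joint outcomes are “heads, tails” and “tails, heads”; it reproduces the perfect anti-correlation described above, though not the deeper point that no signal passes between the coins.

```python
import numpy as np

rng = np.random.default_rng(0)

# Two-qubit "coin" register in the singlet state (|01> - |10>) / sqrt(2).
# Basis order is |00>, |01>, |10>, |11>.
state = np.array([0, 1, -1, 0]) / np.sqrt(2)
probs = np.abs(state) ** 2                    # probability of each joint outcome

for _ in range(5):
    outcome = rng.choice(4, p=probs)          # "look at the coins"
    you, friend = divmod(outcome, 2)          # first bit is yours, second is your friend's
    print(f"you: {'heads' if you == 0 else 'tails'}, "
          f"friend: {'heads' if friend == 0 else 'tails'}")
# Every run prints opposite results: the joint state only allows the outcomes 01 and 10.
```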

If you find entanglement confusing, you are not alone: it took the scientific community the better part of a century to begin to understand its effects. Like so many concepts in physics, entanglement was first described in one of Einstein’s Gedankenexperiments. Quantum mechanics dictated that the properties of particles assumed fixed values only once they were measured. Before that, a particle existed in a “superposition” of many states at once, which were described using probabilities. (A famous thought experiment, proposed by the physicist Erwin Schrödinger, imagined a cat trapped in a box with a quantum-activated vial of poison, the cat superpositioned in a state between life and death.) This disturbed Einstein, who spent his later years formulating objections to the “new physics” of the generation that had succeeded him. In 1935, working with the physicists Boris Podolsky and Nathan Rosen, he revealed an apparent paradox in quantum mechanics: if one took the implications of the discipline seriously, it should be possible to create two entangled particles, separated by any distance, that could somehow interact faster than the speed of light. “No reasonable definition of reality could be expected to permit this,” Einstein and his colleagues wrote. In subsequent decades, however, the other predictions of quantum mechanics were repeatedly verified in experiments, and Einstein’s paradox was ignored. “Because his views went against the prevailing wisdom of his time, most physicists took Einstein’s hostility to quantum mechanics to be a sign of senility,” the historian of science Thomas Ryckman wrote.

Mid-century physicists focussed on particle accelerators and nuclear warheads; entanglement received little attention. In the early sixties, the Northern Irish physicist John Stewart Bell, working alone, reformulated Einstein’s thought experiment into a five-page mathematical argument. He published his results in the obscure journal Physics Physique Fizika in 1964. During the next four years, his paper was not cited a single time.

In 1967, John Clauser, a graduate student at Columbia University, came across Bell’s paper while paging through a bound volume of the journal at the library. Clauser had struggled with quantum mechanics, taking the course three times before receiving an acceptable grade. “I was convinced that quantum mechanics had to be wrong,” he later said. Bell’s paper provided Clauser with a way to put his objections to the test. Against the advice of his professors—including Richard Feynman—he decided to run an experiment that would vindicate Einstein, by proving that the theory of quantum mechanics was incomplete. In 1969, Clauser wrote a letter to Bell, informing him of his intentions. Bell responded with delight; no one had ever written to him about his theorem before.

Clauser moved to the Lawrence Berkeley National Laboratory, in California, where, working with almost no budget, he created the world’s first deliberately entangled pair of photons. When the photons were about ten feet apart, he measured them. Observing an attribute of one photon instantly produced opposite results in the other. Clauser and Stuart Freedman, his co-author, published their findings in 1972. From Clauser’s perspective, the experiment was a disappointment: he had definitively proved Einstein wrong. Eventually, and with great reluctance, Clauser accepted that the baffling rules of quantum mechanics were, in fact, valid, and what Einstein considered a grotesque affront to human intuition was merely the way the universe works. “I confess even to this day that I still don’t understand quantum mechanics,” Clauser said, in 2002.

But Clauser had also demonstrated that entangled particles were more than just a thought experiment. They were real, and they were even stranger than Einstein had thought. Their weirdness attracted the attention of the physicist Nick Herbert, a Stanford Ph.D. and LSD enthusiast whose research interests included mental telepathy and communication with the afterlife. Clauser showed Herbert his experiment, and Herbert proposed a machine that would use entanglement to communicate faster than the speed of light, enabling the user to send messages backward through time. Herbert’s blueprint for a time machine was ultimately deemed unfeasible, but it forced physicists to start taking entanglement seriously. “Herbert’s erroneous paper was a spark that generated immense progress,” the physicist Asher Peres recalled, in 2003.

 

Ultimately, the resolution to Einstein’s paradox was not that the particles could signal faster than light; instead, once entangled, they ceased to be distinct objects, and functioned as one system that existed in two parts of the universe at the same time. (This phenomenon is called nonlocality.) Since the eighties, research into entanglement has led to continuing breakthroughs in both theoretical and experimental physics. In October, Clauser shared the Nobel Prize in Physics for his work. In a press release, the Nobel committee described entanglement as “the most powerful property of quantum mechanics.” Bell did not live to see the revolution completed; he died in 1990. Today, his 1964 paper has been cited seventeen thousand times.

At Google’s lab in Santa Barbara, the objective is to entangle many qubits at once. Imagine hundreds of coins, arranged into a network. Manipulating these coins in choreographed sequences can produce astonishing mathematical effects. One example is Grover’s algorithm, developed by Lov Grover, Shor’s colleague at Bell Labs in the nineties. “Grover’s algorithm is about unstructured search, which is a nice example for Google,” Neven, the founder of the lab, said. “I like to think about it as a huge closet with a million drawers.” One of the drawers contains a tennis ball. A human rooting around in the closet will, on average, find the ball after opening half a million drawers. “As amazing as this may sound, Grover’s algorithm could do it in just one thousand steps,” Neven said. “I think the whole magic of quantum mechanics can essentially be seen here.”
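Neven’s closet arithmetic can be checked with a small classical simulation of the algorithm’s amplitudes: for a million drawers, roughly (π/4)·√1,000,000, about 785 iterations (his “one thousand steps”), push nearly all of the probability onto the target drawer. The Python below simulates the amplitudes directly; it is a sketch of the mathematics, not of how a real quantum processor runs it.

```python
import numpy as np

N = 1_000_000                  # drawers in the closet
target = 123_456               # the drawer with the tennis ball (arbitrary)

state = np.full(N, 1 / np.sqrt(N))            # equal superposition over all drawers
iterations = int(np.pi / 4 * np.sqrt(N))      # about 785 -- "a thousand steps"

for _ in range(iterations):
    state[target] *= -1                       # oracle: mark the target by flipping its sign
    state = 2 * state.mean() - state          # diffusion: reflect every amplitude about the mean

print(iterations, state[target] ** 2)         # probability of finding the ball: ~1.0
```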

Neven has had a peripatetic career. He originally majored in economics, but switched to physics after attending a lecture on string theory. He earned a Ph.D. focussing on computational neuroscience, and was hired as a professor at the University of Southern California. While he was at U.S.C., his research team won a facial-recognition competition sponsored by the U.S. Department of Defense. He started a company, Neven Vision, which developed the technology used in social-media face filters; in 2006, he sold the company to Google, for forty million dollars. At Google, he worked on image search and Google Glass, switching to quantum computing after hearing a story about it on public radio. His ultimate objective, he told me, is to explore the origins of consciousness by connecting a quantum computer to someone’s brain.

Neven’s contributions to facial-analysis technology are widely admired, and if you have ever pretended to be a dog on Snapchat you have him to thank. (You may thank him for the more dystopian applications of this technology as well.) But, in the past few years, in research papers published in the world’s leading scientific journals, he and his team have also unveiled a series of small, peculiar wonders: photons that bunch together in clumps; identical particles whose properties change depending on the order in which they are arranged; an exotic state of perpetually mutating matter known as a “time crystal.” “There’s literally a list of a dozen things like this, and each one is about as science fictiony as the next,” Neven said. He told me that a team led by the physicist Maria Spiropulu had used Google’s quantum computer to simulate a “holographic wormhole,” a conceptual shortcut through space-time—an achievement that recently made the cover of Nature.

 

Google’s published scientific results in quantum computing have at times drawn scrutiny from other researchers. (One of the Nature paper’s authors called their wormhole the “smallest, crummiest wormhole you can imagine.” Spiropulu, who owns a dog named Qubit, concurred. “It’s really very crummy, for real,” she told me.) “With all these experiments, there’s still a huge debate as to what extent are we actually doing what we claim,” Scott Aaronson, a professor at the University of Texas at Austin who specializes in quantum computing, said. “You kind of have to squint.” Nor will quantum computing replace the classical approach anytime soon. “Quantum computers are terrible at counting,” Marissa Giustina, a research scientist at Google, said. “We got ours to count to four.”

Giustina is one of the world’s leading experts on entanglement. In 2015, while working in the laboratory of the Austrian professor Anton Zeilinger, she ran an updated version of Clauser’s 1972 experiment. In October, Zeilinger was named a Nobel laureate, too. “After that, I got a bunch of pings saying, ‘Congratulations on winning your boss the Nobel Prize,’ ” Giustina said. She talked with some frustration about a machine that may soon model complex molecules but for now can’t do basic arithmetic. “It’s antithetical to what we experience in our everyday lives,” she said. “That’s what’s so annoying about it, and so beautiful.”

The main problem with Google’s entangled qubits is that they are not “fault-tolerant.” The Sycamore processor will, on average, make an error every thousand steps. But a typical experiment requires far more than a thousand steps, so, to obtain meaningful results, researchers must run the same program tens of thousands of times, then use signal-processing techniques to refine a small amount of valuable information from a mountain of data. The situation might be improved if programmers could inspect the state of the qubits while the processor is running, but measuring a superpositioned qubit forces it to assume a specific value, causing the calculation to deteriorate. Such “measurements” need not be made by a conscious observer; any number of interactions with the environment will result in the same collapse. “Getting quiet, cold, dark places for qubits to live is a fundamental part of getting quantum computing to scale,” Giustina said. Google’s processors sometimes fail when they encounter radiation from outside our solar system.
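The need for tens of thousands of repetitions is, at bottom, a statistics problem: any single noisy run is unreliable, but the average over many runs converges on the underlying signal. The Python toy below makes only that statistical point; it is not Google’s actual error-mitigation pipeline, and the 60 percent success rate and shot counts are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy statistics only: suppose a noisy circuit "succeeds" 60% of the time.
# A single run tells you little; averaging many runs recovers the underlying rate.
true_success_rate = 0.6
for shots in (1, 100, 10_000):
    runs = rng.random(shots) < true_success_rate   # True where a run succeeded
    print(f"{shots:>6} shots -> estimated success rate {runs.mean():.3f}")
# The 10,000-shot estimate lands near 0.600; the single-shot "estimate" is just 0 or 1.
```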

In the early days of quantum computing, researchers worried that the measurement problem was intractable, but in 1995 Peter Shor showed that entanglement could be used to correct errors, too, ameliorating the high fault rate of the hardware. Shor’s research attracted the attention of Alexei Kitaev, a theoretical physicist then working in Moscow. In 1997, Kitaev improved on Shor’s codes with a “topological” quantum-error-correction scheme. John Preskill, a theoretical physicist at Caltech, spoke of Kitaev, who is now a professor at the school, with something approaching awe. “He’s very creative, and he’s technically very deep,” Preskill said. “He’s one of the few people I know that I can call, without any hesitation, a genius.”

I met Kitaev in his spacious office at Caltech, which was almost completely empty. He was wearing running shoes. After spending the day thinking about particles, Kitaev told me, he walks for about an hour to clear his mind. On hard days, he might walk for longer. A few miles north of Caltech sits Mt. Wilson, where, in the nineteen-twenties, Edwin Hubble used what was then the world’s largest telescope to deduce that the universe was expanding. “I’ve been on Mt. Wilson maybe a hundred times,” Kitaev said. When a problem is really tough, Kitaev skips Mt. Wilson, and instead hikes nearby Mt. Baldy, a ten-thousand-foot peak that is often covered in snow.

Quantum computing is a Mt. Baldy problem. “I made a prediction, in 1998, that the computers would be realized in thirty years,” Kitaev said. “I’m not sure we’ll make it.” Kitaev’s error-correction scheme is one of the most promising approaches to building a functional quantum computer, and, in 2012, he was awarded the Breakthrough Prize, the world’s most lucrative science award, for his work. Later, Google hired him as a consultant. So far, no one has managed to implement his idea.

Preskill and Kitaev teach Caltech’s introductory quantum-computing course together, and their classroom is overflowing with students. But, in 2021, Amazon announced that it was opening a large quantum-computing laboratory on Caltech’s campus. Preskill is now an Amazon Scholar; Kitaev remained with Google. The two physicists, who used to have adjacent offices, today work in separate buildings. They remain collegial, but I sensed that there were certain research topics on which they could no longer confer.

In early 2020, scientists at Pfizer began producing hundreds of experimental pharmaceuticals intended to treat Covid-19. That July, they synthesized seven milligrams of a research chemical labelled PF-07321332, one of twenty formulations the company produced that week. PF-07321332 remained an anonymous vial in a laboratory refrigerator until September, when experiments showed that it was effective at suppressing Covid-19 in rats. The chemical was subsequently combined with another substance and rebranded as Paxlovid, a drug cocktail that reduces Covid-19-related hospitalizations by some ninety per cent. Paxlovid is a lifesaver, but, with the assistance of a quantum computer, the laborious process of trial and error that led to its development might have been shortened. “We are just guessing at things that can be directly designed,” the venture capitalist Peter Barrett, who is on the board of the startup PsiQuantum, told me. “We’re guessing at things which our civilization entirely depends on—but that is by no means optimal.”

Fault-tolerant quantum computers should be able to simulate the molecular behavior of industrial chemicals with unprecedented precision, guiding scientists to faster results. In 2019, researchers predicted that, with just a thousand fault-tolerant qubits, a method for producing ammonia for agricultural use, called the Haber-Bosch process, could be accurately modelled for the first time. An improvement to this process would lead to a substantial decrease in carbon-dioxide emissions. Lithium, the primary component of batteries for electric cars, is a simple element with an atomic number of three. A fault-tolerant quantum computer, even a primitive one, might show how to expand its capacity to store energy, increasing vehicle range. Quantum computers could be used to develop biodegradable plastics, or carbon-free aviation fuel. Another use, suggested by the consulting company McKinsey, was “simulating surfactants to develop a better carpet cleaner.” “We have good reason to believe that a quantum computer would be able to efficiently simulate any process that occurs in nature,” Preskill wrote, a few years ago.

The world we live in is the macroscopic scale. It is the world of ordinary kinetics: billiard balls and rocket ships. The world of subatomic particles is the quantum scale. It is the world of strange effects: interference and uncertainty and entanglement. At the boundary of these two worlds is what scientists call the “nanoscopic” scale, the world of molecules. For the most part, molecules behave like billiard balls, but if you zoom in close enough you begin to notice quantum effects. It is at the nanoscopic scale that researchers expect quantum computing to solve its first meaningful problems, in pharmaceuticals and materials design, perhaps with just a few hundred fault-tolerant qubits. And it is in this discipline—quantum molecular chemistry—that analysts expect the first real money in quantum computing to be made. Quantum physics wins the Nobel. Quantum chemistry will write the checks.

The potential windfall from licensing royalties has excited investors. In addition to the tech giants, a raft of startups are trying to build quantum computers. The Quantum Insider, an industry trade publication, has tallied more than six hundred companies in the sector, and another estimate suggests that thirty billion dollars has been invested in developing quantum technology worldwide. Many of these businesses are speculative. IonQ, based in College Park, Maryland, went public last year, despite having almost no sales. Researchers there compute with qubits obtained using the “trapped ion” approach, arranging atoms of the rare-earth element ytterbium into a tidy row, then manipulating them with a laser. Jungsang Kim, IonQ’s C.T.O., told me that his ion traps maintain entanglement better than Google’s processors, but he admitted that, as more qubits are added, the laser system gets more complicated. “Improving the controller, that’s kind of our sticking point,” he said.

 

At PsiQuantum, in Palo Alto, engineers are making qubits from photons, the weightless particles of light. “The advantage of this approach is that we use preëxisting silicon-fabrication technology,” Pete Shadbolt, the company’s chief scientific officer, said. “Also, we can operate at somewhat higher temperatures.” PsiQuantum has raised half a billion dollars. There are other, weirder approaches. Microsoft, building on Kitaev’s work, is attempting to construct a “topological” qubit, which requires synthesizing an elusive particle in order to work. Intel is trying the “silicon spin” approach, which embeds qubits in semiconductors. The competition has led to bidding wars for talent. “If you have an advanced degree in quantum physics, you can go out into the job market and get five offers in three weeks,” Kim said.

Even the most optimistic analysts believe that quantum computing will not earn meaningful profits in the next five years, and pessimists caution that it could take more than a decade. It seems likely that a lot of expensive equipment will be developed with little durable purpose. “You walk down the hall at the Computer History Museum, in Mountain View, and you see a mercury delay line,” Shadbolt said, referring to an obsolete contraption from the nineteen-forties that stored information using sound waves. “I love thinking about the guys who built that.”

It is difficult, even for insiders, to determine which approach is currently in the lead. “ ‘Pivot’ is the Silicon Valley word for a near-death experience,” Neven said. “But if one day we see that superconducting qubits are outcompeted by some other technology, like photonics, I would pivot in a heartbeat.” Neven actually seemed relieved by the competition. His laboratory is expensive, and quantum computing is the kind of moon-shot project that thrived during the era of low interest rates. “Because of the present financial situation, startups in our field have more difficulties finding investors,” Devoret, the experimental physicist, told me. But, as long as Amazon is investing in quantum computing, it’s a good bet that Google will keep funding it, too. There is also the tacit support of the state—the U.S. intelligence apparatus has made quantum decryption a priority, regardless of market fluctuations. In fact, Neven’s stiffest competition comes not from the private sector but from the Chinese Communist Party. John Martinis, a former head of quantum computing at Google, said, “In terms of making high-quality qubits, one could say the Chinese are in the lead.”

At the campuses of the University of Science and Technology of China, four competing quantum-computing technologies are being developed in parallel. In a paper published in Science, in 2020, a team led by the scientists Lu Chao-Yang and Pan Jian-Wei announced that their processor had solved a computational task millions of times faster than the best supercomputer. Pan is one of the most daring researchers in quantum entanglement. In 2017, his team ran an experiment that entangled two photons at an observatory in Tibet, and transmitted one of them to an orbiting satellite. The scientists then transferred attributes from a third photon on Earth to the one in space, using the technique of “quantum teleportation.”

Lu and I spoke by video earlier this year. He joined the call late and was covered in sweat, having sprinted home from a mandatory Covid test. Lu immediately began debunking claims made by his competitors, and even claims made about his own effort. One widely reported figure stated that China has invested fifteen billion dollars in developing a quantum computer. “I have no idea how that was started,” Lu said. “The actual money is maybe twenty-five per cent of that.”

Jiuzhang, Lu’s photonic quantum computer, is undoubtedly one of the world’s fastest, but Lu has repeatedly chided his colleagues for overhyping the technology. On our call, he pulled up a video clip of a woman attempting to arrange ten kittens in a line. “Here is the problem we face,” he said. A kitten scurried to the back and the woman raced to grab it. “You want to control multiple qubits with high precision,” Lu said, “but they should be very well isolated from the environment.” As the woman replaced the first kitten, several others fled.

Lu cautioned that quantum computers faced stiff competition from ordinary silicon chips. The earliest electronic computers, from the forties, had to beat only humans. Quantum computers must prove their superiority to supercomputers that can run a quintillion calculations per second. “We see fairly few quantum algorithms where there is proof of exponential speedup,” he said. “In many cases, it’s not clear that it wouldn’t be better to use a regular computer.” Lu also disputed Martinis’s contention that China was making the best qubits. “Actually, I think Google’s in the lead,” he said.

Neven agreed. “Sometime in the next year, I think we will make the first fully fault-tolerant qubit,” he said. From there, Google plans to scale up its computing effort by chaining processors together. Adjacent to the warehouse I visited was a second, bigger space, where sunshine streamed into a dusty construction site. There, Google plans to build a computer that will require a freezer as large as a one-car garage. A thousand fault-tolerant qubits should be enough to run accurate simulations of molecular chemistry. Ten thousand fault-tolerant qubits could begin to unlock new findings in particle physics. From there, researchers could start to run Shor’s algorithm at full power, exposing the secrets of our era. “It’s quite possible that I will die before it happens,” Shor, who is sixty-three, told me. “But I would really like to see it happen, and I think it’s also quite possible that I will live long enough to see it.” ♦

Published in the print edition of the December 19, 2022, issue, with the headline “The Future of Everything.”

 

 

 

 

Trump repeatedly paid little or nothing in federal income taxes between 2015 and 2020

Former President Donald Trump repeatedly paid little or nothing in federal income taxes between 2015 and 2020 despite reporting millions in earnings.

Documents released Tuesday night by House Democrats said Trump frequently made tens of millions of dollars annually during that period. But he was able to whittle away his tax bill by claiming steep business losses that offset that income.

In 2016, he paid $750. The following year he again paid just $750. In 2020, he paid nothing.

And though the IRS has a longstanding policy of automatically auditing every president, Democrats say the agency did not begin vetting Trump’s filings until they began asking about them in 2019.

 Trump frequently claimed he was under continuous audits by the IRS to justify his refusal to voluntarily disclose his taxes, which snapped a longstanding tradition of presidents and White House contenders volunteering their returns.

The revelations, which came after House Democrats voted Tuesday to make Trump’s returns public, mark the culmination of the long-running mystery of what’s in his filings, something he’s fought for years in court to conceal. They promise to create yet another controversy for the scandal-plagued Trump, one that is sure to shadow his bid to return to the White House and raise uncomfortable questions for his fellow Republicans.

The release also puts a spotlight on the IRS, including its recently departed commissioner, Chuck Rettig, and its promise to impartially administer the tax code.

At the same time, it is a last-minute victory for Democrats, particularly Ways and Means Committee Chairman Richard Neal, who waged a three-and-a-half-year court battle for the returns but whose enthusiasm for the fight was often questioned by his party’s liberal wing.

Neal obtained the returns just last month, after the Supreme Court declined to block their release to him, and Democrats raced to get the information out before they slip into the minority in the House on Jan. 3.

Democrats said it would take some time to scrub the filings of Trump’s personal information, such as his Social Security number and addresses, but that the returns would be released in the coming days.

But Democrats asked the Joint Committee on Taxation, a nonpartisan office of tax lawyers and economists that advises lawmakers on tax issues, to examine Trump’s filings, and lawmakers released the agency’s findings along with their own report on the IRS audit system.

Though Trump has cultivated an image of a wildly successful businessman, he has bragged about paying little in taxes, and he appears to do so mostly by reporting big losses.

In 2015, JCT said, Trump reported making more than $50 million through a combination of capital gains, interest, dividends and other earnings. That was offset, though, by more than $85 million in reported losses.

He was hit by the alternative minimum tax that year, a levy designed to make it harder for the rich to zero out their tax bills. It generated a $641,931 tax bill.

The following year he reported making $30 million but also claimed $60 million in losses. He was dinged again by the AMT that year, though he still ended up owing just $750.

 In other years, Trump paid more. In 2018, he had far fewer losses to report and ended up paying $999,466.

JCT did not itself audit Trump’s returns and did not attempt to verify the numbers he reported. It criticized the IRS, though, for not examining the filings more vigorously.

The IRS has a policy dating to the Nixon administration of automatically vetting every president’s returns — something designed to both assure the public that the tax system is being administered equitably and also to spare the IRS from having to make politically fraught decisions over which presidents to audit.

Little is known publicly about how that process works. A 1998 law makes White House meddling in audits a felony.

Neal told reporters, though, that he had not come across evidence that Trump had tried to influence the agency when it came to examining his taxes.

 The super-rich are typically subject to relatively high audit rates. The IRS says it examined 8.7 percent of those who made more than $10 million in 2019.

People in the top 0.01 percent of incomes, making more than $7.4 million in 2020, paid an average tax rate of 25.1 percent. The average rate that year for everyone was 13.6 percent.

Earlier this year, President Joe Biden reported paying 24.6 percent.

It is highly unusual for lawmakers to forcibly release private tax information, and Trump was not legally required to disclose his taxes.

But he defied a decades-old tradition of presidents volunteering their filings, incensing congressional Democrats who seized his returns under a century-old law allowing the chairs of Congress’s tax committees to examine anyone’s private tax information.

Many Democrats said the public had a right to know about Trump’s finances; they also said they wanted to know how vigorously the IRS was questioning the president’s returns.

Republicans scoffed, saying Democrats were simply looking for ways to embarrass Trump, and warned that Neal’s move would create a precedent that could be used against other people.

 Democrats got his personal returns as well as a handful of business filings from 2015 to 2020 — mostly coinciding with Trump’s time in office.

Some said Democrats did not demand to see enough of Trump’s records, arguing they should also have sought returns from earlier years, before he ran for president, that could still have been under audit when he came to the White House.

Democrats’ impending release of the returns will create an opportunity for a crowdsourced audit, with outside tax experts eager to weigh in, something Trump confidant Michael Cohen told Congress in 2019 that Trump feared would happen if his returns ever became public.

“What he didn’t want was to have an entire group of think tanks that are tax experts run through his tax returns and start ripping it to pieces, and then he’ll end up in an audit and he’ll ultimately have taxable consequences, penalties and so on,” Cohen said.

 

 

 

Friday, December 16, 2022

The Big Potential of Karen Bass’s Homelessness Agenda By Jay Caspian Kang December 16, 2022

On her first day in office, Karen Bass, the newly elected mayor of Los Angeles, declared a state of emergency over the city’s homelessness crisis. The move was accompanied by a touch of political theatre—Bass, who was sworn in by Vice-President Kamala Harris, chose to begin her term at the city’s Emergency Operations Center rather than at City Hall. “My mandate is to move Los Angeles in a new direction with an urgent and strategic approach to solving one of our city’s toughest challenges and creating a brighter future for every Angeleno,” Bass said.

The focus on homelessness should not have come as a surprise. Bass’s closely contested runoff election against the billionaire real-estate developer Rick Caruso was always a referendum on how to deal with the thousands of people who now live in tent encampments, R.V.s, and the city’s overburdened shelter system. Caruso, who promised to “end street homelessness” and wanted to expand the number of police officers on patrol, ran as a maverick and political outsider who would end the do-nothing way of doing things in Los Angeles. His actual policies were hard to pin down—at some point, he suggested that giant tent cities for the homeless could be modelled after migrant holding areas in Texas—but his appeal, outside of the massive amount of money he put into a never-ending advertising blitz, was born of the frustration that many of his fellow-Angelenos felt with homelessness and crime. Something had to change, and Caruso’s argument was that Bass, a veteran of Los Angeles politics, would just mean business as usual.

 Bass ultimately defeated Caruso by a nearly ten-point margin, and the city’s political priorities have not changed. Like nearly every politician in California, Bass will be judged entirely on how she addresses homelessness and the speed with which she gets results. A state of emergency certainly signals Bass’s intentions, but what do results actually look like? During the mayoral race, both candidates engaged in an unhinged arms race of promises about tens of thousands of new shelter beds, lofty permanent-housing goals, and the like. Now Bass faces the nearly impossible task of housing people in a city with nowhere for them to go.

Bass’s agenda, which is still largely undefined, but includes an expansion of permanent supportive housing and temporary-shelter sites, doesn’t vary all that much from progressive or even centrist policies throughout the state. Bass, for example, has not argued for an extension of the city’s eviction moratorium, which has been tied up in court proceedings but is set to expire at the end of January, 2023—a move that would likely stop many people from falling into homelessness, which, as most experts or even basic logic will tell you, is the first step in keeping people off the street.

The revolution that Bass hopes to spark in Los Angeles, then, isn’t quite ideological or even policy-based but, rather, bureaucratic. If you read the text of her state-of-emergency declaration, you’ll find resolution after resolution that states the problem and very little in terms of actual proposals except a section in which she redirects power and oversight over the homelessness issue to the Emergency Operations Organization (E.O.O.), and then declares that as “director of the EOO,” she will “coordinate Citywide planning and response with respect to unsheltered individuals in conjunction with the City Administrative Officer, Los Angeles Homeless Services Authority, Los Angeles City Housing Department, Los Angeles City Planning Department and any and all necessary departments and agencies.” Los Angeles, in other words, now has a homelessness czar who can override civic organizations and clear red tape as she sees fit.

“She is making herself the face of this issue,” Hugo Soto-Martinez, a newly elected member of the city council, told me. “Nobody took charge up to this point, everyone was blaming and pointing the finger. She’s putting her entire reputation on this, and that’s a monumental shift.”

Under the old regime, homelessness responses could vary wildly from one part of the city to another. In some districts, including that of Nury Martinez, the disgraced former head of the city council who resigned last month, sanitation teams would perform sweeps of homeless encampments. In others, activists and residents of the encampments successfully blocked such cleanups. The council member Mike Bonin, who had called for higher standards that would make the sweeps less intrusive and harmful, and who later opposed the actions altogether, faced a recall campaign by angry voters over what they saw as policies that permitted, or even encouraged, homelessness in Venice Beach. (Bonin, whose Black son was called a “little monkey” in the leaked audio tapes that led to Martinez’s resignation, did not seek reëlection in November.)

This fractured state of affairs was made possible by the unusual strength of the city council in years past, but the recent scandal and the ongoing drama around the council member Kevin de León—one of the four people recorded on the leaked audio tapes—and his refusal to resign have thrown everything into chaos. Protesters demanding de León’s resignation now disrupt every council meeting. Last week, a widely distributed video showed de León in a physical altercation with a protester. This past Tuesday, when the council convened and voted unanimously to approve Bass’s state of emergency, the meeting was interrupted several times, and de León ultimately had to cast his vote from a back room.

What a Bass-led homelessness response might actually look like, Soto-Martinez believes, is that the factional way of doing business in Los Angeles will now be replaced, at least in part, by a more comprehensive approach to building and acquiring housing, increasing the capacity of services for the homeless, and “giving a vision of a city-wide approach.” This, of course, is what a mayor should do when faced with a crisis, especially one like homelessness, which requires the coördination of politicians, civic workers, and the fleet of third-party nonprofits that California’s cities have employed to do much of the on-the-ground work. It does no good for one district to clear an encampment if another one pops up a few miles down the road.

In what will almost certainly become the most controversial part of her new emergency powers, Bass now has the ability to bypass a lot of obstacles and approvals that accompany any building project in the city. She could theoretically build a shelter, for example, without going through an endless procession of community meetings, competitive bids, backdoor deals with local representatives, and fights with homeowners-association leaders. She can now spend money to convert a rundown hotel into temporary housing without city-council approval. In an interview with the Los Angeles Times, she even revealed that she will be able to convert city-owned properties into housing. (She also technically has the right to commandeer private property for the same purpose, but has said that she will not.)

 The hypocrisy at the heart of California’s homelessness problem is that everyone says they want to help, but, for many people, this just means getting rid of the homeless: almost nobody wants to live near a place that will actually provide places for unhoused people to live. The outrage that Caruso tapped into is fuelled by a type of magical thinking that posits that Los Angeles’s forty-two thousand homeless residents will somehow disappear with the right mixture of tough love, big-picture thinking, and gumption.

The more mundane truth is that Bass’s emergency powers can easily be thwarted by the sort of hyperlocal homeowner resistance that is the lifeblood of the city’s politics. These groups, which the late historian Mike Davis called the “most powerful ‘social movement’ ” in Southern California, have the power to block every shelter plan or affordable-housing project in their area through a variety of obstructive tactics, whether by threatening their council member, voting in large blocs to remove any local representative who does not do their bidding, or tying up every process in frivolous lawsuits. Perhaps Bass’s streamlined power will allow her to circumvent some of that and deliver real wins for her homelessness agenda, but the homeowners associations will certainly let her know that she is in for a fight.

What all this means is that Bass will almost certainly fall short of the ridiculous expectations that have been placed upon her. But from the perspective of actually solving the homelessness crisis—a process that will take decades, not a single mayoral term—if all Bass does is smooth out the absurdly parochial and bureaucratic nature of Los Angeles city politics, she will have achieved a major victory. Breaking through the power of homeowners in California is like shovelling through bedrock. You can only guess at the depths of the resistance, and even when you break through, you’re never sure what lies just a few feet underneath.

Bass’s swearing-in ceremony was attended by Harris, Governor Gavin Newsom, and the state’s Democratic leadership. The symbolism of the moment should not be lost. Bass is sixty-nine years old, and although this might not be old by Presidential or even congressional or Senate standards, especially in California, this will likely be the highest office she holds. That should give her the freedom to pursue the gamble of appointing herself the face of the city’s most intractable and explosive problem. Her success will require the support of the state and federal governments, which can pass legislation, apply political pressure, and provide funding in the form of housing vouchers.

Fixing homelessness isn’t about big ideas, or tough talk, or getting real, whatever that means. Billions of dollars are spent every year in the state to combat the crisis, and there’s already a slate of proposals that span the political spectrum. Better governance, alone, will not get people indoors. But it’s a fitting task for a seasoned politician like Bass, who seems to understand the need to build a big enough political hammer to blast through the resistance of homeowners. ♦

 

The Fed’s Response to Inflation May Pose a Bigger Threat than the Inflation Itself by Mark Weisbrot

Do Americans understand what is happening with inflation in this country? This is an important question, because the public’s perception can influence national policy and political choices. Before the midterm elections one month ago, 87 percent of likely voters told pollsters that inflation was extremely or very important in deciding their vote.

Let’s take a simple example of what most Americans see in the news, and compare it with the data that economists, and journalists who cover the US economy, are looking at.

This week our government released the November data for the Consumer Price Index (CPI-U), and the headline number was 7.1 percent, down from 7.7 percent the prior month but still a very high rate of inflation for the United States. The phrase “highest levels since the early 1980s” has accompanied much of the reporting for the last few months.

But these numbers are, in some very important ways, out of date.

It’s true that prices as measured by the CPI-U in November this year were 7.1 percent higher than a year earlier.

But if we look at the five months that we have just experienced (July through November), the bigger news is really how much the rate of inflation has been coming down.

For these five months, annualized inflation has been just 2.5 percent. Just to be clear: that 2.5 percent is not the increase in prices over these five months (July through November). It’s the increase we would see over a full year if prices kept rising at the pace of these five months.

 By contrast, annualized inflation for the five months prior to July (February through June) was 11.8 percent.
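For readers who want to see the arithmetic behind these annualized figures, here is a minimal sketch. It assumes the standard compounding convention for annualizing a multi-month change in the price index; the index levels in it are hypothetical placeholders, not actual CPI-U values, chosen only so the output lands near the 2.5 percent discussed above.

    # Illustrative only: how a change in the CPI over a few months is "annualized".
    # The index levels below are hypothetical placeholders, not actual CPI-U data.

    def annualized_rate(cpi_start: float, cpi_end: float, months: int) -> float:
        """Compound the price change over `months` into an equivalent 12-month rate."""
        ratio = cpi_end / cpi_start           # total price change over the span
        return ratio ** (12 / months) - 1     # pace implied over a full year

    # A roughly 1.04 percent rise over five months compounds to about 2.5 percent
    # if prices kept climbing at that pace for a year.
    print(f"{annualized_rate(295.00, 298.07, 5):.1%}")   # ~2.5%

The same formula, applied to the faster price growth of February through June, is what yields the 11.8 percent figure: a shorter-run pace is simply projected forward over twelve months.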

Although the Fed’s target is 2 percent (the Fed uses a different measure of inflation than the CPI, but one that tracks it closely), most economists would not be worried about inflation of 2.5 percent. In fact, some of the world’s most prominent economists would favor 3 percent over 2 percent as a target.

There is a long list of facts and issues known to economists that have led many of us to less alarmist views than those of the wider public, including some politicians. For starters: the spike in inflation over the past 18 months was primarily a result of external shocks. The war in Ukraine was a big one, especially for gasoline prices.

Gasoline prices have been an even bigger issue in California than in most of the country, because they are substantially higher in the state, where they have recently fallen from a record $6.49 per gallon in October to $4.61 this week.

Inflation was also driven higher by supply chain disruptions, and large shifts in demand because of the pandemic and then recovery.

There has been little evidence of any self-reinforcing mechanisms such as a wage-price spiral (where prices cause workers to demand higher wages, further driving up prices), or accelerating changes in the expectations of consumers or investors.

The data also show that, despite the hardships inflation has caused for many people, tens of millions of Americans are economically better off than they were before the change of government two years ago. This is especially true for people who gained employment from the record 10 million jobs that have been created since then. Wages for lower-paid workers rose faster than inflation; workers in the hotel and restaurant sector (production and nonsupervisory) saw their wages rise by 3.8 percent more than inflation since the pandemic began.

And importantly, over the past five months, wages for all production and nonsupervisory workers have risen at a 4.1 percent annualized rate.

The misunderstanding of inflation and the economy distorts American politics and can influence not only elections but also the most important economic policy decisions that our government makes. The Federal Reserve itself has caused most of the US recessions since World War II by raising interest rates, and it may well be on track to do that again in the coming months, potentially throwing millions of people out of work. This would also very likely have serious political consequences.

Such a tragic mistake would be much less likely if the public — including members of Congress and other decision-makers — had a better understanding of the economic reality of the current episode of inflation, as well as the available choices and consequences. The way inflation is looking these past five months, the Fed should take a break from its interest rate hikes before, rather than after, it causes the next recession. That would be a much better choice than the course it is on.

 

Tuesday, December 6, 2022

Louisiana attorney general creates 'protecting minors' tip line to report library books

Louisiana Attorney General Jeff Landry has created an online tip line for residents to report library books they consider inappropriate, in an effort to stop the “taxpayer-subsidized sexualization of children.”

Landry, who is running for governor in 2023, announced the tip line in a Facebook post on Wednesday with few details. In the post, he said he met with residents in Slidell who "want to protect the children" in St. Tammany Parish.

“Since taking office, Attorney General Jeff Landry has been committed to working with Louisiana communities to protect minors from exploitation, including early sexualization, grooming, sex trafficking, and abuse,” Cory Dennis, a spokesperson for Landry’s office, said in an emailed statement.

"Recently, he spoke with parents and grandparents who are concerned about specific books of a sexual nature that are not age-appropriate yet remain accessible to young children within public libraries. This recent discussion touched on the important work our Cyber Crime Unit does every day to protect Louisiana children from exploitation, outlined the very real risks & potential consequences of the early sexualization of children, and encouraged parents and guardians to remain not only engaged in their child's development but also vigilant over their content consumption.

"Our submissions portal was created to give parents across the state a voice in this matter, and we look forward to future discussions."

Deborah Caldwell-Stone, the director of the American Library Association’s Office for Intellectual Freedom, said the tip line is the first the advocacy organization has seen.

"I am both dismayed and puzzled. Libraries have long had policies on the books that allow any library user to raise a concern about a book," she said. "Every book has its reader. Public libraries serve a wide range of information needs for everyone in the community. There are going to be books that people disagree with or don't think are suitable for their kids. But they're there because they serve the information needs of someone in the community."

 Libraries should connect children with data that will help them be more informed, thoughtful and productive members of society, the tip line's website states.

"Librarians and teachers are neither empowering nor liberating our children by connecting them with books that contain extremely graphic sexual content that is far from age appropriate for young audiences," the website states.

“If this type of taxpayer-subsidized sexualization of children has impacted you or your family, tell us about it below. Please use the form to share your experience with librarians, teachers, school board members, district superintendents, and/or library supervisors.”

Dennis did not answer questions from The Daily Advertiser about how the tip line will be monitored, how a complaint will be deemed credible, or what actions could be taken as a result of a submitted tip.

"We would also remind the community that local public libraries are controlled by their local governments, and the community should have a say in those standards," Dennis said in the statement.

Librarians are thoughtful when selecting books for their communities, Caldwell-Stone said. Reporting librarians and library materials to government agencies stigmatizes access to information that some families may need, she said.

“It makes targets out of librarians who are public servants, just trying to serve the information needs of their community,” she said. “But it suggests that, in fact, that information that people want and need, books about gender identity, sexual orientation, sex ed books, are somehow illegal when they are not.”

Caldwell-Stone said the ALA has seen an increase in attacks on librarians in recent months. She also said it has seen an enormous number of demands to censor library books, primarily books that deal with the experiences of LGBTQ+ people, address race, or depict the experiences of BIPOC people.

In June, the Lafayette Public Library Director said the libraries would no longer feature book displays on subjects deemed to be “political” in a bid to take library workers “off of the frontlines” of culture war issues.

This year, at least three LGBTQ+ items were challenged for removal from the library’s collection, though none were removed. None of those materials were part of displays when they were challenged.

After recent challenges to LGBTQ+ materials failed, the library’s Board of Control voted to expand its ability to remove books from the library’s collection since librarians had twice thwarted those efforts in opposition to board members.

A librarian in Livingston Parish filed a defamation lawsuit in August against the owners of two conservative Facebook groups, according to court filings. Amanda Jones spoke against restricting access to LGBTQ+ and sexual health books for youth.

Michael Lunsford, of Lafayette, and Ryan Thames, of Watson, posted in their groups accusing Jones of fighting to "keep sexually erotic and pornographic materials in the kid's section."

The lawsuit was dismissed in September after the judge ruled the statements were matters of opinion and not fact, KTBS reported. Jones' attorney said she would appeal the decision.

Caldwell-Stone said people should remember libraries are important for communities and are places where people can access the internet, prepare for college or new careers, and get information about being an entrepreneur.

“We really urge everyone who supports the library as an institution and as a source of information for everyone in the community to be aware of what’s going on at their local board meetings, to speak up in favor of the work that librarians are doing and the role of the library in the community, to fight back against the idea that just because someone doesn’t agree with a book that it should be off the shelf, particularly books that deal ... with gender identity, sexual orientation,” she said.

She said anyone who has an issue with a book should speak to a librarian and anyone concerned about book censorship can visit www.uniteagainstbookbans.org.