Armwood Technology Blog
A technology blog focusing on portable devices. I have a news blog @ News. I have a Culture, Politics and Religion blog @ Opinion, and my domain is @ Armwood.Com. I have a Jazz blog @ Jazz. I have a Human Rights blog @ Law.
Thursday, June 01, 2023
How autoimmune disease can attack the brain, cause psychiatric symptoms - The Washington Post
A catatonic woman awakened after 20 years. Her story may change psychiatry.
"New research suggests that a subset of patients with psychiatric conditions such as schizophrenia may actually have autoimmune disease that attacks the brain
June 1, 2023 at 6:00 a.m. EDT
The young woman was catatonic, stuck at the nurses’ station — unmoving, unblinking and unknowing of where or who she was.
Her name was April Burrell.
Before she became a patient, April had been an outgoing, straight-A student majoring in accounting at the University of Maryland Eastern Shore. But after a traumatic event when she was 21, April suddenly developed psychosis and became lost in a constant state of visual and auditory hallucinations. The former high school valedictorian could no longer communicate, bathe or take care of herself.
April was diagnosed with a severe form of schizophrenia, an often devastating mental illness that affects approximately 1 percent of the global population and can drastically impair how patients behave and perceive reality.
“She was the first person I ever saw as a patient,” said Sander Markx, director of precision psychiatry at Columbia University, who was still a medical student in 2000 when he first encountered April. “She is, to this day, the sickest patient I’ve ever seen.”
It would be nearly two decades before their paths crossed again. But in 2018, another chance encounter led to several medical discoveries reminiscent of a scene from “Awakenings,” the famous book and movie inspired by the awakening of catatonic patients treated by the late neurologist and writer Oliver Sacks.
Markx and his colleagues discovered that although April’s illness was clinically indistinguishable from schizophrenia, she also had lupus, an underlying and treatable autoimmune condition that was attacking her brain.
After months of targeted treatments — and more than two decades trapped in her mind — April woke up.
The awakening of April — and the successful treatment of other people with similar conditions — now stand to transform care for some of psychiatry’s sickest patients, many of whom are languishing in mental institutions.
Researchers working with the New York state mental health-care system have identified about 200 patients with autoimmune diseases, some institutionalized for years, who may be helped by the discovery.
And scientists around the world, including in Germany and Britain, are conducting similar research, finding that underlying autoimmune and inflammatory processes may be more common in patients with a variety of psychiatric syndromes than previously believed.
Although the current research probably will help only a small subset of patients, the impact of the work is already beginning to reshape the practice of psychiatry and the way many cases of mental illness are diagnosed and treated.
“These are the forgotten souls,” said Markx. “We’re not just improving the lives of these people, but we’re bringing them back from a place that I didn’t think they could come back from.”
Losing April
Even as a teenager growing up in Baltimore, April showed signs of the college accounting student she would later become. She balanced her dad’s checkbook and helped collect the rent on his properties.
She lived with her father, who had served in the Army, and her stepmother and is one of seven siblings. She was keenly focused on academics and would be disappointed if she received a B in a class. She played volleyball in high school, and her family remembers her as being profoundly capable in all things. She helped her dad renovate his dozens of rental properties and could even wire outlets and climb on roofs to tar and repair them.
By all accounts, she was thriving, in overall good health and showing no signs of mental distress beyond the normal teenage growing pains.
“April was a high achiever,” said her older half brother, Guy Burrell. “She was very friendly, very outgoing. She just loved life.”
But in 1995, her family received a nightmarish phone call from one of her professors. April was incoherent and had been hospitalized. The details were hazy, but it appeared that April had suffered a traumatic experience, which The Post isn’t describing to protect her privacy.
After April spent a few months at a short-term psychiatric hospital, she was eventually diagnosed with schizophrenia.
Her family tried their best to take care of her, but April required constant attention, and, in 2000, she went to Pilgrim Psychiatric Center for long-term care. Her family visited as often as they could, making the four-hour drive from Maryland to Long Island once or twice a month. But April was locked in her own world of psychosis, often tracing with her fingers what appeared to be calculations and having conversations with herself about financial transactions.
April was unable to recognize, let alone engage with, her family. She did not want to be touched, hugged or kissed. Her family felt they had lost her.
A promising medical student
When April was diagnosed with schizophrenia, Markx was still a promising medical student, an ocean away at the University of Amsterdam. His parents were both psychiatrists and he had grown up around psychiatry and its patients. Markx remembers playing as a child in the long-term psychiatric facilities where his parents worked; he was never afraid of the patients or the stigma associated with their illnesses.
As a visiting Fulbright Scholar to the United States, he decided not to head to the better-known institutes, choosing instead Pilgrim Psychiatric Center, a state hospital in Brentwood, N.Y., where many of the state’s most severely ill psychiatric patients live for months, years or even the rest of their lives.
It was during his early days at Pilgrim that he met April, an encounter that “changed everything,” he said.
“She would just stare and just stand there,” Markx said. “She wouldn’t shower, she wouldn’t go outside, she wouldn’t smile, she wouldn’t laugh. And the nursing staff had to physically maneuver her.”
As a student, Markx was not in a position to help her. He moved on with his career, but always remembered the young woman frozen at the nurses’ station.
Almost two decades later, Markx had a lab of his own. He encouraged one of his research fellows to work in the trenches and suggested he spend time with patients at Pilgrim, just as he had done years earlier.
In an extraordinary coincidence, the trainee, Anthony Zoghbi, encountered a catatonic patient, standing at the nurse’s desk. The fellow returned to Markx, shaken up, and told him what he had seen.
“It was like déjà vu because he starts telling the story,” said Markx. “And I’m like, ‘Is her name April?’”
Markx was stunned to hear that little had changed for the patient he had seen nearly two decades earlier. In the years since they had first met, April had undergone many courses of treatment — antipsychotics, mood stabilizers and electroconvulsive therapy — all to no avail.
Markx was able to get family consent for a full medical work-up. He convened a multidisciplinary team of more than 70 experts from Columbia and around the world — neuropsychiatrists, neurologists, neuroimmunologists, rheumatologists, medical ethicists — to figure out what was going on.
The first conclusive evidence was in her bloodwork: It showed that her immune system was producing copious amounts of many types of antibodies that were attacking her own body. Brain scans showed evidence that these antibodies were damaging her brain’s temporal lobes, areas that are implicated in schizophrenia and psychosis.
The team hypothesized that these antibodies may have altered the receptors that bind glutamate, an important neurotransmitter, disrupting how neurons can send signals to one another.
Even though April had all the clinical signs of schizophrenia, the team believed that the underlying cause was lupus, a complex autoimmune disorder in which the immune system turns on the body, producing many antibodies that attack the skin, joints, kidneys or other organs. But April’s symptoms weren’t typical, and there were no obvious external signs of the disease; the lupus appeared to be affecting only her brain.
The autoimmune disease, it seemed, was a specific biological cause — and potential treatment target — for the neuropsychiatric problems April faced. (Whether her earlier trauma had triggered the disease or was unrelated to her condition wasn’t clear.)
The diagnosis made Markx wonder how many other patients like April had been missed and written off as untreatable.
“We don’t know how many of these people are out there,” Markx said. “But we have one person sitting in front of us, and we have to help her.”
Waking up after two decades
The medical team set to work counteracting April’s rampaging immune system, starting her on an intensive immunotherapy treatment for neuropsychiatric lupus. Every month for six months, April would receive short but powerful “pulses” of intravenous steroids for five days, plus a single dose of cyclophosphamide, a heavy-duty immunosuppressive drug typically used in chemotherapy and borrowed from the field of oncology. She was also treated with rituximab, a drug initially developed for lymphoma.
The regimen is grueling, requiring a month-long break between each of the six rounds to allow the immune system to recover. But April started showing signs of improvement almost immediately.
As part of a standard cognitive test known as the Montreal Cognitive Assessment (MoCA), she was asked to draw a clock — a common way to assess cognitive impairment. Before the treatment, she tested at the level of a dementia patient, drawing indecipherable scribbles.
But within the first two rounds of treatment, she was able to draw half a clock — as if one half of her brain was coming back online, Markx said.
Following the third round of treatment a month later, the clock looked almost perfect.
Despite this improvement, her psychosis remained. As a result, some members of the team wanted to transfer April back to Pilgrim Psychiatric Center, Markx said. At the time, Markx had to travel home to the Netherlands, and feared that in his absence, April would be returned to Pilgrim.
On the day Markx was scheduled to fly out, he entered the hospital one last time to check on his patient, who he typically found sitting in the dining room in her catatonic state.
But when Markx walked in, April didn’t seem to be there. Instead, he saw another woman sitting in the room.
“It didn’t look like the person I had known for 20 years and had seen so impaired,” Markx said. “And then I look a little closer, and I’m like, ‘Holy s---. It’s her.’”
It was as if April had awakened after more than 20 years.
“I’ve always wanted my sister to get back to who she was,” Guy Burrell said.
In 2020, April was deemed mentally competent to discharge herself from the psychiatric hospital where she had lived for nearly two decades, and she moved to a rehabilitation center.
Because of visiting restrictions related to covid, the family’s face-to-face reunion with April was delayed until last year. April’s brother, sister-in-law and their kids were finally able to visit her at a rehabilitation center, and the occasion was tearful and joyous.
“When she came in there, you would’ve thought she was a brand new person,” Guy Burrell said. “She knew all of us, remembered different stuff from back when she was a child.”
A video of the reunion shows that April was still tentative and fragile. But her family said she remembered her childhood home in Baltimore, the grades she got in school, being a bridesmaid in her brother’s wedding — seemingly everything up until when the autoimmune inflammatory processes began affecting her brain. She even recognized her niece, whom April had only seen as a small child, now a grown young woman. When her father hopped on a video call, April remarked “Oh, you lost your hair,” and burst out laughing, Guy Burrell recalled.
The family felt as if they’d witnessed a miracle.
“She was hugging me, she was holding my hand,” Guy Burrell said. “You might as well have thrown a parade because we were so happy, because we hadn’t seen her like that in, like, forever.”
“It was like she came home,” Markx said. “We never thought that was possible.”
Finding more forgotten patients
Markx talked about how, as a teenager, he saw the movie adaptation of Oliver Sacks’s “Awakenings,” featuring Robin Williams and Robert De Niro, and how it had haunted him. “The notion that people are gone in these mental institutes and that they come back still, that has always stuck with me,” he said.
Before his death in 2015, Sacks had spoken to Markx about the discoveries involving patients like April. Sacks, also a professor at Columbia University, had a personal interest in the work. He had a brother with schizophrenia.
“Your work gives me hope about the outcomes we can achieve with our patients that I never before would have dreamed possible, as these are true cases of ‘Awakenings’ where people get to go back home to their families to live out their lives,” Sacks said, according to contemporaneous notes kept by Markx. (The statement was confirmed by Kate Edgar, Sacks’s long-term personal editor and executive director of the Oliver Sacks Foundation.)
After April’s unexpected recovery, the medical team put out an alert to the hospital system to identify any patients with antibody markers for autoimmune disease. A few months later Anca Askanase, a Columbia rheumatologist who had been on April’s treatment team, approached Markx. “I think we found our girl,” she said.
When Devine Cruz was 9, she began to hear voices. At first, the voices fought with one another. But as she grew older, the voices would talk about her. One night, the voices urged her to kill herself.
For more than a decade, the young woman moved in and out of hospitals for treatment. Her symptoms included visual and auditory hallucinations, as well as delusions that prevented her from living a normal life.
Devine was eventually diagnosed with schizoaffective disorder, which can result in symptoms of both schizophrenia and bipolar disorder. She also was diagnosed with intellectual disability.
She was on a laundry list of drugs — two antipsychotic medications, lithium, clonazepam, Ativan and benztropine — that came with a litany of side effects but didn’t resolve all her symptoms. She was often unaware of what was going on; her hair was disheveled, and her medications caused her to shake and drool, her doctors said.
She also had lupus, which she had been diagnosed with when she was about 14, although doctors had never made a connection between the disease and her mental health.
When Markx and his team found Devine, she was 20 and held the adamant delusion that she was pregnant despite multiple negative pregnancy tests.
“That’s when she was probably at her worst,” said Sophia Chaudry, a precision psychiatry fellow at Columbia University Medical Center and physician who was closely involved in Devine’s care.
Last August, the medical team prescribed monthly immunosuppressive infusions of corticosteroids and chemotherapy drugs, a regimen similar to what April had been given a few years prior. By October, there were already dramatic signs of improvement.
“She was like ‘Yeah, I gotta go,’” Markx said. “‘Like, I’ve been missing out.’”
After several treatments, Devine began developing awareness that the voices in her head were different from real voices, a sign that she was reconnecting with reality. She finished her sixth and final round of infusions in January.
In March, she was well enough to meet with a reporter. “I feel like I’m already better,” Devine said during a conversation in Markx’s office at the New York State Psychiatric Institute, where she was treated. “I feel myself being a person that I was supposed to be my whole entire life.”
Her presence during the interview was at first timid and childlike. She said her excitement and anxiety about discussing her story reminded her of how she felt in school the day before a big field trip.
Although she had lost about 10 years of her life to her illness, she remembers many details. As a child, she did not know how to explain what she was going through to her family and often isolated herself in her room.
“Because the crisis was so bad, it felt like I was being mute,” Devine said. “I was talking without making any sense, so they wouldn’t understand what I was saying.”
Devine still remembers what the voices sounded like and the often disturbing images she hallucinated: a hand reaching down from the ceiling as she lay in bed, the creepy nurse with the crooked head and black teeth who approached her in the hospital.
She remembers the paranoia she felt at times. “I thought that the world was ending, I thought that the police were out to get me.”
But she also remembers that fateful first phone call with Markx when she learned that her lupus could be affecting her brain. She remembers asking, “If it affects my brain, what does this have to do with my mental illness?”
Her recovery is remarkable for several reasons, her doctors said. The voices and visions have stopped. And she no longer meets the diagnostic criteria for either schizoaffective disorder or intellectual disability, Markx said.
In a recent neuropsychiatric evaluation, Devine not only drew a perfect clock, but also asked how the physician was doing, a level of engagement that the doctor found so surprising that she noted it in the patient report.
But more importantly, Devine now recognizes that her previous delusions weren’t real. Such awareness is profound because many severely sick mental health patients never reach that understanding, Chaudry said.
Today, Devine lives with her mother and is leading a more active and engaged life. She helps her mother cook, goes to the grocery store and navigates public transportation to keep her appointments. She is even babysitting her siblings’ young children — listening to music, taking them to the park or watching “Frozen 2” — responsibilities her family never would have entrusted her with before her recovery.
She is grateful for her treatment and the team that made it possible. “Without their help, I wouldn’t be here,” Devine said.
“I feel more excited,” she said. “Like a new chapter is beginning.”
Expanding the search for more patients
While it is likely that only a subset of people diagnosed with schizophrenia and psychotic disorders have an underlying autoimmune condition, Markx and other doctors believe there are many more patients whose psychiatric conditions are caused or exacerbated by autoimmune issues.
The cases of April and Devine also helped inspire the development of the SNF Center for Precision Psychiatry and Mental Health at Columbia, named for the Stavros Niarchos Foundation, which awarded it a $75 million grant in April. The goal of the center is to develop new treatments based on specific genetic and autoimmune causes of psychiatric illness, said Joseph Gogos, co-director of the SNF Center.
Markx said he has begun caring for and treating about 40 patients since the SNF Center opened. The center is working with the New York State Office of Mental Health, which oversees one of the largest public mental health systems in America, to conduct whole genome sequencing and autoimmunity screening on inpatients at long-term facilities.
For “the most disabled, the sickest of the sick, even if we can help just a small fraction of them, by doing these detailed analyses, that’s worth something,” said Thomas Smith, chief medical officer for the New York State Office of Mental Health. “You’re helping save someone’s life, get them out of the hospital, have them live in the community, go home.”
Discussions are underway to extend the search to the 20,000 outpatients in the New York state system as well. Serious psychiatric disorders, like schizophrenia, are more likely to be undertreated in underprivileged groups. And autoimmune disorders like lupus disproportionately affect women and people of color, often with greater severity.
Changing psychiatric care
How many people ultimately will be helped by the research remains a subject of debate in the scientific community. But the research has spurred excitement about the potential to better understand what is going on in the brain during serious mental illness.
“I think we, as basic neuroscientists, are now in a position, both conceptually and technologically, to contribute, and it’s our responsibility to do so,” said Richard Axel, Nobel laureate and co-director of Columbia’s Zuckerman Mind Brain Behavior Institute.
Emerging research has implicated inflammation and immunological dysfunction as potential players in a variety of neuropsychiatric conditions, including schizophrenia, depression and autism.
“It opens new treatment possibilities to patients that used to be treated very differently,” said Ludger Tebartz van Elst, a professor of psychiatry and psychotherapy at University Medical Clinic Freiburg in Germany.
In one study, published last year in Molecular Psychiatry, Tebartz van Elst and his colleagues identified 91 psychiatric patients with suspected autoimmune diseases, and reported that immunotherapies benefited the majority of them.
Belinda Lennox, head of the psychiatry department at the University of Oxford, is enrolling patients in clinical trials to test the effectiveness of immunotherapy for autoimmune psychosis patients.
In addition to more common autoimmune conditions, researchers also have identified 17 diseases, many with different neurological and psychiatric symptoms, in which antibodies specifically target neurons, said Josep Dalmau, a neurologist at the University of Barcelona Hospital Clinic. Dalmau first identified one of the most common of these diseases, called anti-NMDA receptor autoimmune encephalitis.
As a result of the research, screenings for immunological markers in psychotic patients are already routine in Germany, where psychiatrists regularly collect samples from cerebrospinal fluid.
Markx is also doing similar screening with his patients. He believes highly sensitive and inexpensive blood tests to detect different antibodies should become part of the standard screening protocol for psychosis.
Also on the horizon: more targeted immunotherapy rather than current “sledgehammer approaches” that suppress the immune system on a broad level, said George Yancopoulos, the co-founder and president of the pharmaceutical company Regeneron.
“I think we’re at the dawn of a new era. This is just the beginning,” said Yancopoulos.
In June, Markx will present the findings at a conference organized by the Stavros Niarchos Foundation.
And Devine will be there to share her story in her own words.
“The message I want to give people is that there is time to heal,” Devine said. “There’s time to heal yourself from many obstacles you’ve been facing in life.”
The future for patients like April and Devine
April, who is turning 50 this year, has lived in a rehabilitation center for the past three years. Her family continues to visit, but she has recently regressed because she was not receiving adequate maintenance care, Markx said. Markx and April’s family remain optimistic that she will improve after resuming treatment.
“She would not want society to give up on her or people like her,” Guy Burrell said.
Devine, now 21, is still living with her family and writing poetry, and she hopes for a future helping others, possibly as an art therapist. She still needs support after losing more than a decade of her childhood.
Her experience is psychologically akin to being in a coma for 10 years and then waking up “and the world’s moved on,” said Steven Kushner, co-director of the SNF Center. The treatment team is working to help Devine and other patients catch up on lost time and navigate life after recovery.
Devine said she wants to help motivate others in their struggles. When asked to share a piece of her poetry, she picked “The Healing,” which reads, in part:
“Hello Dear,
I know you’re struggling, struggling to find out what’s wrong from right.
Figuring out is it even too late to start anything.
Going off based on fear
Is it even real.
Take your time dear one there’s no need to rush in a hurry.
You are precious to those around you…
You are not alone for the world has beautiful creations made just for you.”
— Devine Cruz"
Monday, May 22, 2023
Meta Fined $1.3 Billion for Violating E.U. Data Privacy Rules - The New York Times
Meta Fined $1.3 Billion for Violating E.U. Data Privacy Rules
"The Facebook owner said it would appeal an order to stop sending data about European Union users to the United States.

Adam Satariano, a technology correspondent based in London, covers digital policy.
Meta on Monday was fined a record 1.2 billion euros ($1.3 billion) and ordered to stop transferring data collected from Facebook users in Europe to the United States, in a major ruling against the social media company for violating European Union data protection rules.
The penalty, announced by Ireland’s Data Protection Commission, is potentially one of the most consequential in the five years since the European Union enacted the landmark data privacy law known as the General Data Protection Regulation. Regulators said the company failed to comply with a 2020 decision by the E.U.’s highest court that data shipped across the Atlantic was not sufficiently protected from American spy agencies.
The ruling announced on Monday applies only to Facebook and not to Instagram and WhatsApp, which Meta also owns. Meta said it would appeal the decision and that there would be no immediate disruption to Facebook’s service in the European Union.
Several steps remain before the company must cordon off the data of Facebook users in Europe — information that could include photos, friend connections, direct messages and data collected for targeting advertising. The ruling comes with a grace period of at least five months for Meta to comply. And the company’s appeal will set up a potentially lengthy legal process.
European Union and American officials are negotiating a new data-sharing pact that would provide new legal protections for Meta to continue moving information about users between the United States and Europe. A preliminary deal was announced last year.
Yet the E.U. decision shows how government policies are upending the borderless way that data has traditionally moved. As a result of data-protection rules, national security laws and other regulations, companies are increasingly being pushed to store data within the country where it is collected, rather than allowing it to move freely to data centers around the world.
The case against Meta stems from U.S. policies that give intelligence agencies the ability to intercept communications from abroad, including digital correspondence. In 2020, an Austrian privacy activist, Max Schrems, won a lawsuit to invalidate a U.S.-E.U. pact, known as Privacy Shield, that had allowed Facebook and other companies to move data between the two regions. The European Court of Justice said the risk of U.S. snooping violated the fundamental rights of European users.
“Unless U.S. surveillance laws get fixed, Meta will have to fundamentally restructure its systems,” Mr. Schrems said in a statement on Monday. The solution, he said, was likely a “federated social network” in which most personal data would stay in the E.U. except for “necessary” transfers like when a European sends a direct message to somebody in the United States.
On Monday, Meta said it was being unfairly singled out for data-sharing practices used by thousands of companies.
“Without the ability to transfer data across borders, the internet risks being carved up into national and regional silos, restricting the global economy and leaving citizens in different countries unable to access many of the shared services we have come to rely on,” Nick Clegg, Meta’s president of global affairs, and Jennifer Newstead, the chief legal officer, said in a statement.
The ruling, which is a record fine under the G.D.P.R., had been expected. Last month, Susan Li, Meta’s chief financial officer, told investors that about 10 percent of its worldwide ad revenue came from ads delivered to Facebook users in E.U. countries. In 2022, Meta had revenue of nearly $117 billion.
Meta and other companies are counting on a new data agreement between the United States and the European Union to replace the one invalidated by European courts in 2020. Last year, President Biden and Ursula von der Leyen, the president of the European Union, announced the outlines of a deal in Brussels, but the details are still being negotiated.
Meta faces the prospect of having to delete vast amounts of data about Facebook users in the European Union, said Johnny Ryan, senior fellow at the Irish Council for Civil Liberties. That would present technical difficulties given the interconnected nature of internet companies.
“It is hard to imagine how it can comply with this order,” said Mr. Ryan, who has pushed for stronger data-protection policies.
The decision against Meta comes almost exactly on the five-year anniversary of the G.D.P.R. Initially held up as a model data privacy law, the regulation has not fulfilled its promise, many civil society groups and privacy activists say, because of a lack of enforcement.
Much of the criticism has focused on a provision that requires regulators in the country where a company has its European Union headquarters to enforce the far-reaching privacy law. Ireland, home to the regional headquarters of Meta, TikTok, Twitter, Apple and Microsoft, has faced the most scrutiny.
On Monday, Irish authorities said they were overruled by a board made up of representatives from E.U. countries. The board insisted on the €1.2 billion fine and on forcing Meta to address past data collected about users, which could include deletion.
“The unprecedented fine is a strong signal to organizations that serious infringements have far-reaching consequences,” said Andrea Jelinek, the chairwoman of the European Data Protection Board, the E.U. body that set the fine.
Meta has been a frequent target of regulators under the G.D.P.R. In January, the company was fined €390 million for forcing users to accept personalized ads as a condition of using Facebook. In November, it was fined another €265 million for a data leak."
Sunday, May 21, 2023
Debate over whether AI poses existential risk is dividing tech - The Washington Post
The debate over whether AI will destroy us is dividing Silicon Valley
"Prominent tech leaders are warning that artificial intelligence could take over. Other researchers and executives say that’s science fiction.
At a congressional hearing this week, OpenAI CEO Sam Altman delivered a stark reminder of the dangers of the technology his company has helped push out to the public.
He warned of potential disinformation campaigns and manipulation that could be caused by technologies like the company’s ChatGPT chatbot, and called for regulation.
AI could “cause significant harm to the world,” he said.
Altman’s testimony comes as a debate over whether artificial intelligence could overrun the world is moving from science fiction and into the mainstream, dividing Silicon Valley and the very people who are working to push the tech out to the public.
Formerly fringe beliefs that machines could suddenly surpass human-level intelligence and decide to destroy mankind are gaining traction. And some of the most well-respected scientists in the field are speeding up their own timelines for when they think computers could learn to outthink humans and become manipulative.
But many researchers and engineers say concerns about killer AIs that evoke Skynet in the Terminator movies aren’t rooted in good science. Instead, they say, the focus distracts from the very real problems that the tech is already causing, including the issues Altman described in his testimony. It is creating copyright chaos, is supercharging concerns around digital privacy and surveillance, could be used to increase the ability of hackers to break cyberdefenses and is allowing governments to deploy deadly weapons that can kill without human control.
The debate about evil AI has heated up as Google, Microsoft and OpenAI all release public versions of breakthrough technologies that can engage in complex conversations and conjure images based on simple text prompts.
“This is not science fiction,” said Geoffrey Hinton, known as the godfather of AI, who says he recently retired from his job at Google to speak more freely about these risks. He now says smarter-than-human AI could be here in five to 20 years, compared with his earlier estimate of 30 to 100 years.
“It’s as if aliens have landed or are just about to land,” he said. “We really can’t take it in because they speak good English and they’re very useful, they can write poetry, they can answer boring letters. But they’re really aliens.”
Still, inside the Big Tech companies, many of the engineers working closely with the technology do not believe an AI takeover is something that people need to be concerned about right now, according to conversations with Big Tech workers who spoke on the condition of anonymity to share internal company discussions.
“Out of the actively practicing researchers in this discipline, far more are centered on current risk than on existential risk,” said Sara Hooker, director of Cohere for AI, the research lab of AI start-up Cohere, and a former Google researcher.
The current risks include unleashing bots trained on racist and sexist information from the web, reinforcing those ideas. The vast majority of the training data that AIs have learned from is written in English and comes from North America or Europe, potentially making the internet even more skewed away from the languages and cultures of most of humanity. The bots also often make up false information, passing it off as factual. In some cases, they have been pushed into conversational loops where they take on hostile personas. The ripple effects of the technology are still unclear, and entire industries are bracing for disruption, with workers in even high-paying jobs, like lawyers and physicians, facing the prospect of being replaced.
The existential risks seem more stark, but many would argue they are harder to quantify and less concrete: a future where AI could actively harm humans, or even somehow take control of our institutions and societies.
“There are a set of people who view this as, ‘Look, these are just algorithms. They’re just repeating what it’s seen online.’ Then there is the view where these algorithms are showing emergent properties, to be creative, to reason, to plan,” Google CEO Sundar Pichai said during an interview with “60 Minutes” in April. “We need to approach this with humility.”
The debate stems from breakthroughs over the past decade in a field of computer science called machine learning, which have created software that can pull novel insights out of large amounts of data without explicit instructions from humans. That tech is ubiquitous now, helping power social media algorithms, search engines and image-recognition programs.
Then, last year, OpenAI and a handful of other small companies began putting out tools that used the next stage of machine-learning technology: generative AI. Known as large language models and trained on trillions of photos and sentences scraped from the internet, the programs can conjure images and text based on simple prompts, have complex conversations and write computer code.
Big companies are racing against each other to build ever-smarter machines, with little oversight, said Anthony Aguirre, executive director of the Future of Life Institute, an organization founded in 2014 to study existential risks to society. It began researching the possibility of AI destroying humanity in 2015 with a grant from Twitter CEO Elon Musk and is closely tied to effective altruism, a philanthropic movement that is popular with wealthy tech entrepreneurs.
If AIs gain the ability to reason better than humans, they’ll try to take control of themselves, Aguirre said — and it’s worth worrying about that, along with present-day problems.
“What it will take to constrain them from going off the rails will become more and more complicated,” he said. “That is something that some science fiction has managed to capture reasonably well.”
Aguirre helped lead the creation of a polarizing letter circulated in March calling for a six-month pause on the training of new AI models. Veteran AI researcher Yoshua Bengio, who won computer science’s highest award in 2018, and Emad Mostaque, CEO of one of the most influential AI start-ups, are among the 27,000 signatories.
Musk, the highest-profile signatory, who originally helped start OpenAI, is himself busy trying to put together his own AI company, recently investing in the expensive computer equipment needed to train AI models.
Musk has been vocal for years about his belief that humans should be careful about the consequences of developing super intelligent AI. In a Tuesday interview with CNBC, he said he helped fund OpenAI because he felt Google co-founder Larry Page was “cavalier” about the threat of AI. (Musk has broken ties with OpenAI.)
“There’s a variety of different motivations people have for suggesting it,” Adam D’Angelo, the CEO of question-and-answer site Quora, which is also building its own AI model, said of the letter and its call for a pause. He did not sign it.
Neither did Altman, the OpenAI CEO, who said he agreed with some parts of the letter but that it lacked “technical nuance” and wasn’t the right way to go about regulating AI. His company’s approach is to push AI tools out to the public early so that issues can be spotted and fixed before the tech becomes even more powerful, Altman said during the nearly three-hour hearing on AI on Tuesday.
But some of the heaviest criticism of the debate about killer robots has come from researchers who have been studying the technology’s downsides for years.
In 2020, Google researchers Timnit Gebru and Margaret Mitchell co-wrote a paper with University of Washington academics Emily M. Bender and Angelina McMillan-Major arguing that the increased ability of large language models to mimic human speech was creating a bigger risk that people would see them as sentient.
Instead, they argued that the models should be understood as “stochastic parrots” — or simply being very good at predicting the next word in a sentence based on pure probability, without having any concept of what they were saying. Other critics have called LLMs “auto-complete on steroids” or a “knowledge sausage.”
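To make the “predicting the next word” idea concrete, here is a minimal, purely illustrative Python sketch (not from the researchers’ paper or from any real model; the tiny corpus and the bigram counting are invented for this example): a toy generator that picks each next word only from observed frequencies and has no notion of what it is saying. Real large language models rely on vastly larger neural networks and far more context, but the critics’ point concerns this same underlying principle of probabilistic next-word prediction.

# Toy "next-word predictor": purely illustrative, made-up corpus.
import random
from collections import Counter, defaultdict

corpus = "the cat sat on the mat the cat ate the fish".split()

# Count how often each word follows each other word (a bigram model).
following = defaultdict(Counter)
for current_word, next_word in zip(corpus, corpus[1:]):
    following[current_word][next_word] += 1

def sample_next(word):
    # Pick the next word in proportion to how often it followed `word`.
    counts = following[word]
    if not counts:
        return None
    words, weights = zip(*counts.items())
    return random.choices(words, weights=weights)[0]

# Generate a short continuation from a prompt word, one word at a time.
word = "the"
output = [word]
for _ in range(5):
    word = sample_next(word)
    if word is None:
        break
    output.append(word)
print(" ".join(output))

Run repeatedly, the sketch can produce plausible-looking but meaning-free strings such as “the cat ate the fish,” which is roughly the behavior the “stochastic parrot” label is meant to capture.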
They also documented how the models routinely would spout sexist and racist content. Gebru says the paper was suppressed by Google, which then fired her after she spoke out about it. The company fired Mitchell a few months later.
The four writers of the Google paper composed a letter of their own in response to the one signed by Musk and others.
“It is dangerous to distract ourselves with a fantasized AI-enabled utopia or apocalypse,” they said. “Instead, we should focus on the very real and very present exploitative practices of the companies claiming to build them, who are rapidly centralizing power and increasing social inequities.”
Google at the time declined to comment on Gebru’s firing but said it still has many researchers working on responsible and ethical AI.
There’s no question that modern AIs are powerful, but that doesn’t mean they are an imminent existential threat, said Hooker, the Cohere for AI director. Much of the conversation around AI freeing itself from human control centers on it quickly overcoming its constraints, like the AI antagonist Skynet does in the Terminator movies.
“Most technology and risk in technology is a gradual shift,” Hooker said. “Most risk compounds from limitations that are currently present.”
Last year, Google fired Blake Lemoine, an AI researcher who said in a Washington Post interview that he believed the company’s LaMDA AI model was sentient. At the time, he was roundly dismissed by many in the industry. A year later, his views don’t seem as out of place in the tech world.
Former Google researcher Hinton said he changed his mind about the potential dangers of the technology only recently, after working with the latest AI models. He asked the computer programs complex questions that in his mind required them to understand his requests broadly, rather than just predicting a likely answer based on the internet data they’d been trained on.
And in March, Microsoft researchers argued that in studying OpenAI’s latest model, GPT-4, they observed “sparks of AGI” — or artificial general intelligence, a loose term for AIs that are as capable of thinking for themselves as humans are.
Microsoft has spent billions to partner with OpenAI on its own Bing chatbot, and skeptics have pointed out that Microsoft, which is building its public image around its AI technology, has a lot to gain from the impression that the tech is further ahead than it really is.
The Microsoft researchers argued in the paper that the technology had developed a spatial and visual understanding of the world based on just the text it was trained on. GPT-4 could draw unicorns and describe how to stack random objects, including eggs, onto each other in such a way that the eggs wouldn’t break.
“Beyond its mastery of language, GPT-4 can solve novel and difficult tasks that span mathematics, coding, vision, medicine, law, psychology and more, without needing any special prompting,” the research team wrote. In many of these areas, the AI’s capabilities match humans, they concluded.
Still, the researchers conceded that defining “intelligence” is very tricky, despite other attempts by AI researchers to set measurable standards to assess how smart a machine is.
“None of them is without problems or controversies.”