Contact Me By Email

Wednesday, February 07, 2024

Earth Struck By Enormous Burst Of Gamma Rays From Two Billion Light-Years Away

“A massive burst of gamma rays produced by the explosion of a star almost two billion light-years away was so powerful that it changed Earth’s atmosphere, according to scientists.

Gamma rays are the shortest-wavelength electromagnetic waves, carrying the most energy. On Earth, they come from lightning, nuclear explosions and radioactive decay. In space, they’re thought to originate from a star exploding as a supernova or from two dense neutron stars, the leftovers of supernovae, colliding with each other.

This gamma-ray burst has come from two billion light-years away, which means it occurred two billion years ago.
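That equivalence holds by definition, since a light-year is the distance light travels in one year. A minimal back-of-the-envelope sketch in Python (which, like the article, ignores cosmological expansion; for a source this distant, the true light-travel time differs somewhat from the simple conversion):

```python
# Back-of-the-envelope check: light from N light-years away takes N years
# to arrive, since a light-year is defined as the distance light travels
# in one Julian year.
C_M_S = 299_792_458.0                  # speed of light, m/s (exact)
JULIAN_YEAR_S = 365.25 * 24 * 3600     # seconds in a Julian year
LIGHT_YEAR_M = C_M_S * JULIAN_YEAR_S   # ~9.46e15 m

distance_ly = 2e9                      # distance quoted for GRB 221009A
distance_m = distance_ly * LIGHT_YEAR_M
travel_time_years = distance_m / C_M_S / JULIAN_YEAR_S
print(round(travel_time_years))        # 2000000000
```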

Significant Disturbance

Published today in the journal Nature Communications, a new paper reveals that on October 9, 2022, an enormous burst of gamma rays—labeled GRB 221009A—caused a significant disturbance in a layer of Earth’s atmosphere called the ionosphere.

The ionosphere sits from around 31 miles (50 kilometers) to 590 miles (950 kilometers) above Earth’s surface. “It was probably the brightest gamma-ray burst we have ever detected,” said Mirko Piersanti, University of L’Aquila, Italy, and lead author of the paper, in a press release.

The burst of energy, which lasted just over 13 seconds, is reckoned to be a once-in-10,000-years event. New research shows that Earth’s ionosphere was disturbed for several hours by the blast. Enough energy arrived at Earth to activate lightning detectors.

Atmospheric Changes

It’s not unusual for Earth’s atmosphere to be affected by space weather. It’s been subjected to a lot of geomagnetic activity in recent months, with a potent solar wind producing a marked increase in the frequency and intensity of auroras. However, that activity comes from the sun.

GRB 221009A appears to have come from two billion light-years away, yet it still had a marked effect on the ionosphere.

“This disturbance impacted the very lowest layers of Earth's ionosphere, situated just tens of kilometers above our planet’s surface, leaving an imprint comparable to that of a major solar flare,” said Laura Hayes, research fellow and solar physicist at ESA.

Earth’s Mass Extinctions

GRB 221009A cannot have been the first time gamma rays from a supernova struck Earth. “There has been a great debate about the possible consequences of a gamma-ray burst in our galaxy,” said Piersanti.

This new research reinforces the idea that a supernova in the Milky Way could affect the ionosphere and even damage the ozone layer, which protects us against dangerous ultraviolet radiation from the sun.

Mass extinctions in Earth’s history may have been caused by something similar to—but much stronger than—GRB 221009A.

Biggest Mysteries In Astronomy

GRB 221009A was detected by the European Space Agency's International Gamma-Ray Astrophysics Laboratory (Integral) space telescope, the first observatory capable of simultaneously observing objects in gamma rays, X-rays and visible light. Launched in October 2002, it studies explosions, radiation, the formation of elements, black holes and other exotic objects, according to ESA, in order to help solve some of the biggest mysteries in astronomy, such as gamma-ray bursts.

The burst’s effect on Earth’s atmosphere was detected by the China Seismo-Electromagnetic Satellite (CSES), also known as Zhangheng, a Chinese-Italian space mission launched in 2018. It monitors the top side of the ionosphere for changes in its electromagnetic behavior.

Wishing you clear skies and wide eyes.”

Monday, January 22, 2024

I love everything about the Apple Vision Pro — except wearing it

A man wears an Apple Vision Pro headset.
Apple

“One week ago, if you had asked me if I planned on getting an Apple Vision Pro, I would have scoffed. Why try typing on a clumsy digital keyboard when I could work much faster on a real one? Why watch a movie on a flat screen inside a headset when I could just turn on my TV? Wildly expensive, superfluous tech is just not for me.

But after one half-hour session with the Vision Pro, I might be a changed man. Maybe it’s just the stunning resolution talking, but I can more easily see a future where the tech enhances some aspects of my life. The only thing I’m still not sold on? Actually wearing the thing.

Learning the ropes

In my hands-on session, Apple would guide me through several impressive experiences across a handful of apps. First things first, though: I’d need to learn how to use it. Compared to the VR headsets I use regularly, the Vision Pro is much easier to put on and get running. I’d pull it over my head, turn a dial to tighten its knit headband, and look at a few dots to set up eye-tracking, all in the span of a few short minutes. With that sparse setup out of the way, I was tossed right into an app selection screen, with nothing but my eyes and hands to control my experience from there.

What immediately stands out is how quickly I was able to navigate menus with barely a moment of tutorial. All I had to do was stare at an app icon, and the headset would highlight it exactly as I had expected. Clicking on it was a simple matter of pinching my fingers. While I wondered how accurate that would be, the Vision Pro barely missed a single gesture, whether my hand was resting on my lap or raised closer to my chest.

The only learning curve came from holding back my fidgety tendencies. Early in the demo, I had a few clumsy interactions from subconsciously mirroring gestures with my other hand or moving my free fingers around. That led to a few hiccups where I’d zoom in on a picture instead of clicking a button. I quickly cut out those bad habits, though it took a few minutes of sitting on my left hand to keep it out of the camera’s view.

My experiences built up in complexity through the half hour. I started by simply scrolling through a photo album. I was immediately struck by how intuitive that experience was. I could grab the bottom right corner of a photo to expand it, pinch with two fingers to zoom in, move my fingers up to scroll, or move the app around by grabbing a white bar at the bottom of the app. I barely had to think about what I was doing; every gesture came naturally.

Using a pricey headset to look at photos is certainly wasteful, but Apple made a much stronger case for the Vision Pro as the demo went on. I first saw that when I opened a panoramic photo that stretched across my field of vision. Panoramas have never made much sense to me on devices like the iPhone, but it feels like they were built for the Vision Pro all along. Those long photos become immersive landscapes in the headset.

What’s significantly more impressive is the headset’s spatial image tech. I opened up a folder full of images and videos captured on the device, mostly family photos. Those images look almost fully 3D, with way more depth than in a flat photo. It’s so uncanny that it’s almost creepy at first. I was even more blown away when I was shown a few videos, complete with spatial audio. One shows a mother blowing bubbles with her child, the bubbles in the foreground almost popping out of the screen. It’s hard to describe (and perhaps playing exactly into Apple’s ideal marketing playbook), but it actually feels like I’m viewing a memory. It’s like living in a piece of sci-fi pulled straight from Blade Runner.

It’s important to note that all of this is only possible because of the Vision Pro’s miraculous display. The micro-OLED display is superior to anything I’ve seen in a headset yet, providing crystal clear visuals. I didn’t even have to fiddle with it to get a clean image; I strapped it on, and the headset took care of the rest after automatically calibrating the eyecups. The headset’s tight fit also significantly cuts down on outside light – I only noticed a very slight leak in the bottom left corner. All of that cuts out any of the technical friction that comes with other headsets that require immersion-breaking fiddling.

More experiences

Now fully acquainted with the headset, I worked my way through a small list of apps with varying degrees of success. I’ll get the low point out of the way first: The Vision Pro isn’t the most intuitive desktop computer. When opening up the Safari browser, I’d click the browser search bar to open a virtual keyboard in front of me. This was the most imprecise part of the demo, as I had to slowly and carefully pluck keys one at a time to write out a website. My taps didn’t always register as expected. When I realized I’d made a typo in my URL, I struggled to single out the wrong letter and backspace it. The UI is simply too small to control with my clumsy fingers (though its microphone would very clearly hear me register my request to write out “Apple dot com”).

The more traditional the experience, the less practical it was. It’s simply more efficient to surf the web or write a document with a mouse and keyboard. While “spatial computing” is the hot term for what the Vision Pro can do, I don’t see many people engaging with the actual computing part of that. It’s like when your Xbox comes with a web browser; it has to be there, and it works when you need it, but that doesn’t mean you’ll ever want to use it.

Apple Vision Pro being worn by a person while using a keyboard.
Apple

The Vision Pro’s power squarely lies in apps built for the device. I’d see plenty of those in action. An experience titled Encounter Dinosaurs put me in an interactive museum exhibit where the wall in front of me opened up, and I got to interact with a dinosaur as it scampered around me. I’d drop a life-sized Alfa Romeo F1 car in the room using the JigSpace app and pluck off its parts with a pinch to examine them closely. I even found myself sitting in a 360-degree recreation of Avengers Tower through the Disney+ app, which I could turn into a personal movie theater. Each of those experiences showed the unique power of mixed reality, making great use of the headset’s shockingly clear passthrough video.

How clear? It’s like looking through a freshly cleaned window. I hardly noticed any unsightly grain, putting it leagues ahead of devices like the Meta Quest 3. Granted, I was testing it in a very brightly lit room surrounded by white walls. The real test will be whether or not the clarity holds up in your average living room. Everything I’m praising here was done in a tightly controlled environment; Digital Trends wasn’t even allowed to take our own photos of the device (an Apple photographer took the images in this article). Whether or not all of this will work as well outside of the perfect test scenario is yet to be seen.

Entertainment seems to be a big focus for Apple at launch, and I saw some strong examples of that during my demo. Using the Apple TV app, I dropped into a virtual movie theater to watch a scene from The Super Mario Bros. Movie. Here, I got a full movie theater simulation where I could choose how close I was to the screen and whether I was viewing it from the floor or balcony. That may sound hokey, but it really did feel like sitting in a theater. Even more impressive, though, was when I watched a reel of 180-degree videos captured in 8K. One quick shot put me right above a soccer net during a game, allowing me to see the action up close.

That single shot convinces me that the Vision Pro could truly be the future; I feel like I’m actually in the moment. I can see a world where I could virtually attend something like Taylor Swift’s Eras Tour and have a blast exploring the stage through a few well-placed cameras.

No pain, no gain

While all of this is impressive, there’s a key element that may torpedo the entire experiment for many: The Apple Vision Pro isn’t terribly comfortable to wear for a long period of time.

It’s not so much the weight that’s the problem, as some have speculated. As a regular VR user, it doesn’t feel that much heavier than my PlayStation VR2. The problem is more in the discomfort that comes with an electronic device squeezing your head. To Apple’s credit, the knit headband goes a long way toward making the experience more comfortable than most headsets. The problem lies more in the front piece. I felt hard materials squeezing down on my temple the entire demo. When my 30 minutes were up, I was relieved to pull it off.

Zeiss lenses inside the Apple Vision Pro headset.
Apple

I hope that’s just the result of me being a novice. Perhaps I could get a more comfortable fit with some more practice. That feeling, however, matches what I’ve felt wearing headsets like Meta Quest 3. No matter how incredible the tech is, it is undeniably annoying to wear a giant helmet for a long period of time. The Vision Pro doesn’t solve that classic issue, which makes it feel like another stopgap device on the road to less intrusive mixed reality tech.

What turns me off even more is one of the Vision Pro’s most defining features: EyeSight. After my demo, Apple pulled me into a room where I got to see someone else wearing the device. Digital eyes appeared on the visor as we talked, replicating his movements and blinks. I’m no technophobe, but talking to someone while staring at their digitized eyes is downright creepy. Granted, it’s a functional feature too. A blue shimmer across the visor told me when the demonstrator had an app open and was no longer looking at me. That’s a handy tool in a world where offices adopt the tech, but I wouldn’t want to work at any place that required me to wear one.

Beyond comfort, any concerns I have with the Vision Pro are existential. As I lounged on a couch watching movies in the Vision Pro, I couldn’t help but think back to Wall-E, its oversized humans floating around a space station in headsets. Just because a device like Vision Pro can exist, does that mean it should? I can’t help but worry about what mass adoption would mean for humanity after having an entire conversation with a person and never seeing their real eyes.

Perhaps that’s a worry for another day, though. With its price point, there’s no universe in which every person will own a Vision Pro. Instead, it’ll be a niche tool for those who want to experiment with incredibly promising tech like spatial video. A lot of the apps I saw felt like convincing proofs of concept. The only question now is whether or not developers will drop thousands of dollars to start creating with it. A great guided demo is barely step one.”

Sunday, January 07, 2024

Uranus and Neptune Reveal Their True Colors - The New York Times

Uranus and Neptune Reveal Their True Colors

"Neptune is not as blue as you’ve been led to believe, and Uranus’s shifting colors are better explained, in new research.

An animation of seasonal color changes on Uranus during two Uranus years (84.02 Earth years), starting just before the southern summer solstice, when its south pole points almost directly toward the sun. At left, Uranus’s color appears as it would to the naked eye; at right, it has been enhanced to make atmospheric features clearer.University of Oxford

Think of Uranus and Neptune, the solar system’s outermost planets, and you may picture two distinct hues: pale turquoise and cobalt blue. But astronomers say that the true colors of these distant ice giants are more similar than their popular depictions.

Neptune is a touch bluer than Uranus, but the difference in shade is not nearly as great as it appears in common images, according to a study published on Friday in Monthly Notices of the Royal Astronomical Society.

The results help to “set the record straight,” said Leigh Fletcher, a professor of planetary science at the University of Leicester in England and an author of the study. “There is a subtle difference in the blue shade between Uranus and Neptune, but subtle is the operative word there.”

The deep blue attributed to Neptune dates back to an artificial enhancement in the 1980s, when NASA’s Voyager 2 became the first (and still the only) spacecraft to visit the two planets.

Scientists at that time cranked up the blue in images of Neptune made by Voyager’s cameras to highlight the planet’s many curiosities, such as its south polar wave and dark spots. But as many sky watchers have known for decades, both Neptune and Uranus appear pale greenish-blue to the human eye.

“Uranus, as seen by Voyager, was pretty bland, so they made it as near to true color as we can,” said Patrick Irwin, a professor of planetary physics at the University of Oxford and an author of the study. “But with Neptune, there’s all sorts of weird things,” he said, that “get a bit washed out” with proper color correction.

Enhanced images of Neptune often include captions that address the artificial color, but the vision of a deep blue planet has endured.

Dr. Irwin and his colleagues used advanced instruments on the Hubble Space Telescope and on the Very Large Telescope in Chile to resolve the colors of the planets as accurately as possible.

They also reviewed an immense observational record of both planets captured by Lowell Observatory in Arizona between 1950 and 2016.

The results confirm that Uranus is only slightly paler than Neptune, because of the thicker layer of aerosol haze that lightens its color.

The Lowell data set also shed new light on the mysterious color shifts that Uranus experiences over its extreme seasons.

A 1986 image of Uranus and a 1989 image of Neptune released shortly after each Voyager 2 flyby, compared with the study’s reprocessed images of the planets that better approximate their true colors.University of Oxford

For years, astronomers have puzzled over why Uranus is tinted green during its solstices but radiates a bluer glow at its equinoxes. The pattern is linked to Uranus’s odd position — tilted almost entirely on its side. Over the course of an 84-year orbit around the sun, Uranus’s poles are plunged into decades of perpetual light or darkness in the summers and winters, while the equatorial regions face the sun near the equinoxes.

Uranus’s shifting colors can be partly explained by atmospheric methane. Because methane absorbs red and green light, the equator ends up reflecting more blue light; by contrast, the poles, which have half as much methane, are tinted slightly green. The new study confirms this dynamic, and shows that a “hood” of ice particles coalesces over the sunlit poles of Uranian summer, boosting the greening effect.

The study “opens the door to many future studies aiming at understanding Uranus’s atmosphere and its seasons,” said Ravit Helled, a professor of theoretical astrophysics at the University of Zurich who was not involved in the research. This work, she added, can “improve our understanding of the internal structure and thermal evolution of the planet.”

For Heidi Hammel, an astronomer who worked on Voyager’s imaging team in 1989, the new study is the latest chapter in a longstanding quest to bring the planet’s real color to light.

“For the public, I hope that this paper can help undo the decades of misinformation about Neptune’s color,” said Dr. Hammel, who now serves as vice president for science at the Association of Universities for Research in Astronomy. “Strike the word ‘azure’ from your vocabulary when discussing Neptune!”

The gap between the public perception and the reality of Neptune illustrates just one of the many ways data is manipulated to emphasize certain features or enhance the appeal of astronomical visualizations. For instance, the stunning images released from the James Webb Space Telescope are composite false-color versions of the original infrared observations.

“There’s never been an attempt to deceive,” Dr. Fletcher said, “but there has been an attempt to tell a story with these images by making them aesthetically pleasing to the eye so that people can enjoy these beautiful scenes in a way that is, maybe, more meaningful than a fuzzy, gray, amorphous blob in the distance.”

Saturday, January 06, 2024

NASA Spacecraft Takes New Images of Jupiter’s Volcanic Moon - The New York Times

New Images of Jupiter’s Moon Io Capture Infernal Volcanic Landscape

"Juno, a NASA mission designed to study Jupiter’s origins, sent back new views of the most eruptive world in the solar system.

A new image of Jupiter’s moon Io captured by the Juno spacecraft on Dec. 30.NASA/SwRI/MSSS

A NASA spacecraft swooped past Io, one of Jupiter’s largest moons and the most volcanically active world in our solar system. The spacecraft, the Juno orbiter, made its closest flyby yet of Io’s turbulent landscape, and sent back snapshots speckled with sharp cliffs, edgy mountain peaks, lakes of pooled lava and even a volcanic plume.

“I was in awe,” said Scott Bolton, a physicist at the Southwest Research Institute and principal investigator of the Juno mission. Dr. Bolton noted how “incredibly colorful” Io is — tinted in orangy browns and yellows because of the presence of sulfur and flowing lava. He likened the moon to a pepperoni pizza.

Studying these features can help scientists figure out what drives Io’s volcanoes, some of which shoot lava dozens of miles into space, and confirm that this activity comes from an ocean of magma hidden beneath the moon’s crust. Deciphering the secrets of the volcanoes may eventually reveal the influence Jupiter has over its eruptions, which could be a clue to how the gas giant and its satellites formed.

The Juno spacecraft, designed to study the origin and evolution of Jupiter, arrived at the planet in 2016. NASA extended the mission in 2021, and the orbiter has since captured photos of the Jovian moons Ganymede, Europa and most recently Io.

It’s not the first time a NASA spacecraft has flown by Io. In 1979, Voyager 1 discovered Io was volcanically active during its journey to interstellar space. Two decades later, NASA’s Galileo mission sent back what Dr. Bolton calls “postage stamps,” or close-ups of specific features on Io’s surface.

Juno conducted a number of more distant observations of Io in recent years. Its latest flyby occurred on Dec. 30, when the spacecraft came within 932 miles of the moon. The images captured during this visit were made with an instrument called JunoCam and are in visible wavelengths. They are some of the highest-resolution views of Io’s global structure. The mission’s managers shared six images of Io on the mission’s website, and members of the public have since uploaded digitally enhanced versions that highlight features on Io’s surface.

Dr. Bolton said he was struck by the sharpness of the edges on some of the mountains in the images, which left him pondering how they get shaped and what it would be like to visit such a place.

“I wonder what it’s like to hike there,” he said, “or to snowboard off that peak.”

Mission scientists are already at work analyzing these images, searching for differences across Io’s surface to learn how often its volcanoes erupt, how bright and hot those eruptions are and how the resulting lava flows. According to Dr. Bolton, the team will also compare Juno’s images to older views of the Jovian moon to determine what has changed on Io over a variety of encounters.

And they’ll get a second set of data to work with in a month, when Juno completes another close flyby of the explosive world on Feb. 3."

Friday, January 05, 2024

A girl was allegedly raped in the metaverse. Is this the beginning of a dark new future? | Nancy Jo Sales | The Guardian

A girl was allegedly raped in the metaverse. Is this the beginning of a dark new future? | Nancy Jo Sales

"The cheerful language with which tech companies describe their platforms is often in stark contrast to the dark possibilities lurking within them. Meta, for example, describes its virtual world, the metaverse, as “the next evolution in social connection and the successor to the mobile internet”, a place where “virtual reality lets you explore new worlds and shared experiences”. But for a young girl in the UK recently, that “shared experience” was an alleged gang rape perpetrated by several adult men.

British police are investigating the sexual assault of the girl, identified only as being under the age of 16, in what is said to be the first investigation of its kind in the UK. The girl was reportedly wearing a virtual reality headset and playing an immersive game in the metaverse when her avatar was attacked.

Was this really rape? some have asked. The comments on an Instagram post for a story about the case in the New York Post were characteristically skeptical: “Couldn’t she have just turned it off?” “Can we focus on real-life crime please?” “I was killed in [the war video game Call of Duty],” one person said sarcastically: “Been waiting for my killer to be brought to justice.”

The difference, of course, is that while Call of Duty players can expect to be virtually killed sometimes as part of the game, the girl had no reason to expect that she would be raped. It isn’t yet known what game she was playing when the alleged assault occurred, but obviously there isn’t an online game where the goal for adult players is to rape children. The fact that they are able to in the metaverse is the issue at the heart of this case, which has attracted international attention.

The question of whether virtual rape is “really rape” goes back to at least 1993, when the Village Voice published an article by Julian Dibbell about “a rape in cyberspace”. Dibbell’s piece reported on how the people behind avatars that were sexually assaulted in a virtual community felt emotions similar to those of victims of physical rape.

As did the girl whose avatar was attacked in the metaverse, according to a senior police officer familiar with the case; he told the Daily Mail: “There is an emotional and psychological impact on the victim that is longer-term than any physical injuries.” In addition, the immersive quality of the metaverse experience makes it all the more difficult for a child, especially, to distinguish between what’s real and what is make-believe.

So while it is necessary for the police to investigate this case – with the courts to decide on the appropriate punishment for the alleged offenders – it is equally important for Meta to be held accountable.

Meta has a notoriously bad track record when it comes to protecting children and teenagers. In 2021, the whistleblower Frances Haugen revealed that Facebook’s own internal research showed how using Instagram (which the company owns) adversely affects teen girls’ confidence and body image. In October of last year, a bipartisan coalition of 33 attorneys general filed a lawsuit against Meta in California, alleging that Facebook and Instagram are responsible for a “national youth mental health crisis”.

If left unchecked, sex crimes in the metaverse, against both children and adults, will become more common. A police investigator told the Daily Mail that the metaverse is already “rife” with sexual offenses. The Meta game Horizon Worlds has reportedly been the site of several sexual assaults. In 2022, the psychotherapist Nina Jane Patel, who does research on the metaverse, wrote of the “surreal nightmare” of being gang-raped in Horizon Venues (now Horizon Worlds). “Unlike in the physical world, there’s a lack of clear and enforceable rules in the metaverse,” said Patel.

A spokesman for Meta has said that users in the metaverse have “an automatic protection called personal boundary, which keeps people you don’t know a few feet away from you”. But apparently this feature isn’t doing enough to protect users from harm. This recent alleged rape of a girl in the metaverse will be an important test for the UK’s new Online Safety Bill, a year-old set of laws to protect children and adults online. Some experts have expressed concerns that the bill doesn’t go far enough, focusing more on the content users publish rather than their actions.

The next generation of kids will spend an estimated 10 years in virtual reality over the course of their lifetimes – close to three hours a day – new research suggests. It may be that lawmakers need to add further protections to keep them safe. In the meantime, Meta could surprise everyone by stepping up and making the metaverse a place that lives up to its upbeat marketing.
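The two figures in that estimate are mutually consistent under a simple assumption about lifespan, which the article does not state; a rough sketch (an ~80-year span is my assumption, not part of the cited research) shows the arithmetic:

```python
# Rough consistency check on "10 years in VR ≈ 3 hours a day".
# The lifespan over which the estimate is spread is not given in the
# article; 80 years is an assumption that makes the numbers line up.
VR_YEARS = 10
LIFESPAN_YEARS = 80                      # assumed, not from the article

total_vr_hours = VR_YEARS * 365.25 * 24  # 10 years expressed in hours
lifespan_days = LIFESPAN_YEARS * 365.25
hours_per_day = total_vr_hours / lifespan_days
print(hours_per_day)                     # 3.0
```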

  • Nancy Jo Sales is the author, most recently, of Nothing Personal: My Secret Life in the Dating App Inferno"

A girl was allegedly raped in the metaverse. Is this the beginning of a dark new future? | Nancy Jo Sales | The Guardian