
World Economic Forum: “Chip Implants” Are Part of “Natural Evolution”

“Are we moving towards a ‘brave new world’?” writes Kathleen Philips, Vice President of R&D at the Interuniversity Microelectronics Center, in a recent World Economic Forum (WEF) article. “As scary as chip implants may sound, they form part of a natural evolution that wearables once underwent.”

Interesting.

Did Philips mention Aldous Huxley’s Brave New World because she knows that the 1932 novel examines a society where citizens are sorted into rigid social classes and the idea of individuality is non-existent?

Did Philips mention a “brave new world” because she recognizes that normalizing the implantation of an electronic component, a chip, into humans is the last nail in the coffin of personal autonomy and privacy in an increasingly surveillance-based America?

Selling the idea of chip implants to everyday people.

Objectively speaking, the article’s central argument is that a technology called “augmentation” is justified as a way to improve the lives of ordinary people. The author claims that “compelling” arguments exist for interfacing our organic human selves with chip implants.

As rationalization, Philips offers a survey of technology currently used for medical purposes. Electroceuticals, for example, produce electrical impulses to treat ailments, and neural probes “interfacing with the brain” help manage epilepsy.

Moreover, the author points to the growth of wearable technology, such as “augmented reality (AR) goggles,” which superimpose layers of extra information in real time over the user’s view of the world to “upgrade” their life experience.

The article coaxes the reader to entertain the idea of a “natural evolution” of augmented technology, which implies normalizing the fusion of our God-given organic biology with manufactured devices, chiefly to enhance our lives and safety.

Philips reasons that:

“Augmentation can be defined as the extension of rehabilitation where technological aids such as glasses, cochlear implants or prosthetics are designed to restore a lost or impaired function.”

And that’s key: such implants serve niche medical purposes and aren’t normative in society. But that’s exactly what the author has in mind:

“Add it [augmentation] to completely healthy individuals and such technology can augment. Night goggles, exoskeletons and brain-computer interfaces build up the picture. The augmenting technology will help in all stages of life: children in a learning environment, professionals at work and ambitious senior citizens. There are many possibilities.”

And here’s what “many possibilities” might mean:

  • connect our brain to the Internet and “download” new information in an increasingly fast-paced society where we don’t have time to learn anymore
  • enable facial recognition without physical ID cards or passports because we can’t responsibly carry these around
  • perform financial transactions without the hassle of getting out our card or smartphone, which could obviously get lost or stolen
  • track our children’s whereabouts because we obviously can’t teach them to be vigilant, and we can’t trust them anymore because they’re bound to lose their smartphone.

For now, Americans generally appear hesitant about implants, in the brain at least. A Pew Research Center poll released in March showed that only 13 percent of the 10,260 U.S. adults surveyed thought that chips implanted in the brain would be a good idea, despite the potential for faster and more accurate information processing.

But pause for a moment if you’re wondering whether implanting chips into healthy people is something out of an alternate dystopian universe.

Chip implants have already taken place in Sweden—and Wisconsin. 

According to reports from the New York Post and NBC News, over 4,000 people in Sweden have had a “microchip” the size of a large grain of rice implanted in their hand.

As a result, many Swedes can use their hand to carry out everyday activities such as financial transactions. They can even scan their hand with their smartphone to access personal COVID-related data or monitor aspects of their health.

Make no mistake: here in the United States, microchipping has already taken place.

According to the Daily Mail and CNBC, around 80 of the 250 employees at the Wisconsin software company Three Square Market had a microchip inserted beneath the skin of their hand.

The microchip transmits data when scanned by an electronic reader and can be used to access the Three Square Market building. The microchips are also linked to employee accounts, so employees can purchase food and drinks by swiping their hand against the reader, with funds automatically deducted to cover the cost of the purchase.

“It is really convenient having the chip in your hand with all the things it can do,” Three Square Market CEO Todd Westby said in an interview.

When asked about who owns the microchip, Westby clarified the company’s position:

“It was never designed to be our property. We decided to put it in employees as a form of convenience for them. When employees leave, we actually consider it an employee retention tool. We do not plan on taking it out. You know, it’s up to the employees.”

Well, for now. Essentially, it’s your choice if you want to skip carrying a heavy stack of access cards and identification documents. That’s less responsibility, and it’s all just so convenient.

Normalization starts with “individual choice” for adults, restricted to a particular company or a niche group of microchip enthusiasts. But before long, an implant becomes compulsory for all employees, and within years the tech world of Silicon Valley has embraced microchipping. Normalization needs to move fast, before people have time to build opposition, and the next thing we know, schools are looking at microchipping our children for their “safety.”

It’s all about the little ones.

Philips actually begins the first paragraph of the WEF article with: “Superheroes have been dominating big and small screens for a while…”

Indeed, following the Great Depression in the 1930s, superhero characters with abilities beyond those of everyday people began to emerge in the form of Superman (1938) and Captain Marvel (1939), to name a few. This trend continued well into the 1960s with The Incredible Hulk, The X-Men and Iron Man, and through the 1990s with Power Rangers and emerging female characters such as Buffy the Vampire Slayer.

Whether decades of superhero characters awash in children’s comic books and movies serve as predictive programming, normalizing the fusion of a child’s body with a chip capable of reproducing a favorite superhero’s enhanced physical or cognitive skills, is a different subject altogether.

Nonetheless, the rest of Philips’ introductory statement is fantastical: “…but there’s a subtle change happening. Many children expect to develop superpowers themselves.”

Really? Since when?

My 7-year-old son might expect to develop flying powers or the ability to see in the dark, but he is pretending to have wings and he is pretending to possess night vision. Indeed, many children want to be superheroes, but they are pretending to possess superpowers of invisibility, 360-degree vision or teleportation.

“The limits on implants are going to be set by ethical arguments rather than scientific capacity,” writes Philips. “For example, should you implant a tracking chip in your child? There are solid, rational reasons for it, like safety. Would you actually do it? Is it a bridge too far?”

Ah, there goes that word. “Safety.” To protect ordinary people from a big, dangerous world where people scam one another, steal identities and kidnap children. And unless they have a chip implant that tracks their location, no child will ever be safe.

But no, it’s never a bridge too far—for as long as the public allows it.

Like Brave New World, George Orwell’s novel 1984 warns about the dangers of totalitarianism: a world governed by censorship and continual surveillance.

To quote Orwell’s final warning: “Don’t let it happen. It depends on you.”

By: Cameron Keegan

Cameron Keegan is an independent researcher and writer on American politics, faith, and culture affecting young people through a conservative disposition. To learn more about Cameron’s work, visit https://ckeeganan.substack.com, and for comments or questions, send an email to ckeeganan@substack.com.

Featured image: Amal Graafstra, CC BY-SA 2.0 <https://creativecommons.org/licenses/by-sa/2.0>, via Wikimedia Commons

This story syndicated with permission from The Blue State Conservative