Augmented Humanity: The Man Machine | Avnet Silica

Augmented Humanity: The Man Machine

Illustration of a man showing digital, virtual world

Psychokinesis, or mind over matter, is a popular dream of sci-fi and horror story writers. In the digital world, dreams often come true and there are signs that soon we may be able to complete tasks without moving off our sofas. Remote control is changing to cerebral control – if you think it, you can do it.

The idea of linking humans with the machines we have created and to each other is intriguing and far from new. Today, many people carry a multitude of electronic signatures and interfaces. The smartphone is the most obvious and most capable, but credit cards also carry chips with personal information and, with the pandemic ongoing, the idea of chipping vaccinated people is not out of the realm of possibility – and let’s not forget our biometric signatures. Despite their usefulness, all of these require manual manipulation. Maybe it’s time to rethink the processes by just thinking about the process.

Science fiction has long traded on the concept of controlling things with the power of the mind – notably Firefox, a Clint Eastwood thriller involving a thought-controlled fighter jet. The reality has been getting closer of late.

To Think it is to Do it: Wacker Neuson challenged visitors to Ars Electronica in Linz to operate its heavy construction equipment entirely by eye movement and thought control. (source ©: Linz GmbH & Co KG)

Facebook shared ideas about thought-to-text capabilities in 2017, shortly after entrepreneur Elon Musk announced his plans to build Neuralink, a firm promising to ‘wire’ brains in the future. Startups and research projects are now emerging across the globe and the key to much of this future is the ‘brain-computer interface’ (BCI).

Alexandre Gonfalonieri, an AI consultant based in Switzerland and head of innovation at DNA Global Analytics, recently wrote in the Harvard Business Review: “The development of BCI technology was initially focused on helping paralysed people control assistive devices using their thoughts, but new use cases are being identified all the time. For example, BCIs can now be used as a neuro-feedback training tool to improve cognitive performance. I expect to see a growing number of professionals leveraging BCI tools to improve their performance at work. For example, your BCI could detect that your attention level is too low, compared with the importance of a given meeting or task, and trigger an alert. It could also adapt the lighting of your office based on how stressed you are or prevent you from using your company car if drowsiness is detected.”

 

Your BCI could detect that your attention level is too low or adapt the lighting based on how stressed you are.

Alexandre Gonfalonieri, DNA Global Analytics

 

Imagine, Gonfalonieri says, if your manager could know whether you actually paid attention in your last Zoom meeting. Imagine if you could prepare your next presentation using only your thoughts. Scenarios like this might soon become a reality thanks to developments in the field.

BCIs may even find their way into such mundane tasks as steering an excavator. At Ars Electronica, an annual expo in Linz, Austria, that celebrates advances in digitalisation, Wacker Neuson, a German manufacturer of heavy construction equipment, challenged visitors to steer a 15-ton backhoe excavator entirely by eye movement and brain power. The company envisages applications for this new technology in controlling heavy machinery in tight or hazardous environments where human operators would be in danger.

 

Release the Beast

A lot comes down to definitions, says Evan Coopersmith, executive VP of data science at AE Studio, a software development and data science venture studio that conducts projects for clients and also funds its own BCI research. “Right now, we think of BCI in terms of something internal to a person manifesting as external action,” he says. Is a cochlear implant BCI? Yes, says Coopersmith, though he admits that others might disagree with that assessment. A BCI could also be a device that notifies the brain of an oncoming seizure. “I would say anything that is a neurological interface that connects the brain to the outside world, in either direction, is BCI and I think those boundaries will increasingly blur,” he explains.

 

The pandemic accelerated our plans for in-home testing but this has been a goal for a long time.

Jennifer L. Collinger, University of Pittsburgh Rehab Neural Engineering Labs (RNEL)

 

The work AE Studio is pursuing is leading edge but without any literal cutting edge involved. Coopersmith says implanting devices in someone’s skull is not the name of the game. Rather, he sees an opportunity to make better use of what can be learned from brain activity to get to a much more powerful kind of interface than is available today. “Our expertise is in software development and machine learning, and in understanding data,” he says. No two brains are identical and they can produce a broad range of electrical responses. The challenge, he believes, is understanding how to interpret those responses and fit that knowledge into other evidence and, thereby, enhance our means of interacting.

Coopersmith’s other aspiration is to see BCI developed by neutral parties rather than big, for-profit enterprises that may be tempted to use what they learn to supercharge their marketing. “The risk is both for some upstart looking to become the next Facebook/Meta, as well as Meta itself attempting to gain even more intimate access to our thought patterns,” he says. “We see this [approach to BCI] as aligning with the goals of a Web3 or Web 3.0 that is more decentralised than what we have today.”
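To make the idea of “interpreting electrical responses” concrete, here is a toy sketch of one classic approach: extracting frequency-band power features from a brain-signal trace so that a downstream classifier can act on them. This is purely illustrative and not AE Studio’s pipeline; the sampling rate, band boundaries and the synthetic sine-wave “signals” are all assumptions for the demo.

```python
import numpy as np

# Illustrative only: band-power feature extraction, a common first step
# when turning raw brain-signal samples into something a classifier can use.
FS = 250  # assumed sampling rate in Hz (typical for consumer EEG)

def band_power(signal, lo, hi, fs=FS):
    """Average spectral power of `signal` between lo and hi Hz."""
    freqs = np.fft.rfftfreq(len(signal), d=1 / fs)
    power = np.abs(np.fft.rfft(signal)) ** 2
    mask = (freqs >= lo) & (freqs < hi)
    return power[mask].mean()

def features(signal):
    # Classic EEG bands: alpha (8-13 Hz, associated with relaxation)
    # and beta (13-30 Hz, associated with active concentration).
    return np.array([band_power(signal, 8, 13), band_power(signal, 13, 30)])

# Synthetic stand-ins: an alpha-dominated trace vs a beta-dominated one.
t = np.arange(FS) / FS                  # one second of samples
relaxed = np.sin(2 * np.pi * 10 * t)    # 10 Hz rhythm -> alpha band
focused = np.sin(2 * np.pi * 20 * t)    # 20 Hz rhythm -> beta band

assert features(relaxed)[0] > features(relaxed)[1]
assert features(focused)[1] > features(focused)[0]
```

Real systems face exactly the problem Coopersmith describes: no two brains produce the same spectra, so the feature thresholds and models must be fitted per person rather than hard-coded as here.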

Pittsburgh’s RNEL lab is investigating the feasibility of using intracortical microelectrode arrays implanted in the motor cortex to provide high-degree-of-freedom control of a robotic arm. (source ©: UPMC, Pitt Health Science)

For now, BCI is still mostly useful for people who have some degree of diminished agency, for example those dealing with paralysis, but here, too, targets are shifting. “We don’t know how we will define agency in 20 to 50 years; we went from a 12-second flight on a beach [the Wright brothers, 1903] to landing on the moon [1969], events separated by just 66 years,” Coopersmith notes. “It is hubristic to predict what BCI will do.” Nonetheless, it seems likely that human beings will increasingly interact with the world and each other through the mediation of technology.

The direction things are heading is obvious. In May of this year, a team of neural engineers at the University of Pittsburgh’s Rehab Neural Engineering Labs (RNEL) published a proof-of-principle for a bidirectional BCI – a type of BCI that not only reads data from the brain but also writes data back, providing sensory feedback.

 

Taking Control

In other words, it enables patients with paralysis to control a robotic arm using their thoughts and lets them feel how hard the arm is clutching an object. Study participants were able to take control, without assistance from the researchers, to perform difficult tasks from home.

Jennifer Collinger, the senior author on this report, believes this proves that BCI studies no longer need to be restricted to an onsite lab. “The pandemic accelerated our plans for in-home testing but this has been a goal for a long time,” she explains. “We need to get the technology into real-world environments. We just want study participants to be able to do the things they want to do with a BCI.”

 

It’s All in Your Head

BCI is in fact already an industry segment of significant size. A recent study, Brain Computer Interface Market Size, Share & Trends Analysis Report by Grand View Research, an Indian consultancy, says the global brain-computer interface market size was valued at $1.2 billion in 2019 and is anticipated to grow at an annual rate of 15.5 percent through 2027.
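As a quick sanity check on those numbers, compounding the report’s 2019 baseline at its forecast growth rate shows what the projection implies for 2027. The figures are the Grand View ones quoted above; the helper function itself is just for illustration.

```python
# Sanity check: what a 15.5% annual growth rate implies for the
# $1.2bn (2019) BCI market figure quoted by Grand View Research.
def project(value, rate, years):
    """Compound an initial value at a fixed annual growth rate."""
    return value * (1 + rate) ** years

# 8 years of compounding, 2019 -> 2027
projected_2027 = project(1.2, 0.155, 2027 - 2019)
print(f"Implied 2027 market size: ${projected_2027:.1f}bn")  # ≈ $3.8bn
```

So the forecast amounts to the market roughly tripling over the period – a steep curve, but modest next to the IDC augmented-humanity numbers below.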

According to this report, brain-computer interface technology is increasingly used in the mobile and virtual gaming industries by integrating BCI within virtual reality (VR) headsets. Virtual gaming has opened a plethora of new opportunities for mind-controlled headsets and gadgets, which is further driving the adoption of brain-controlled interface technology. Manufacturers are increasingly focusing on the development of BCI-enabled video games. Grand View believes innovations such as these are likely to drive the market for brain-controlled computer interface technology over the next few years.

International Data Corporation (IDC) is taking a wider view of the market, one that includes technologies such as augmented and virtual reality (AR/VR), biometrics, exoskeletons, affective computing, ingestibles, injectables, implantables, wearables, and smart devices, as well as brain-computer interfaces. The research firm predicts that the European segment of this augmented humanity (AH) market will reach over $50 billion by the start of 2022 and more than $100 billion by 2025.

The New Reality: Qualcomm’s latest AR headset supports seven cameras, including two internal cameras for eye tracking and four external cameras that give the wearer the capability of spatial mapping.

Augmented humanity removes accessibility barriers that limit humans from performing their daily tasks, according to IDC. AH aims to empower people by developing data-driven, tech-based, innovative solutions to enable them to perform at levels they could not achieve before.

IDC’s The Future of Augmented Humanity in Europe: 2020–2025 Forecast estimates that investments in AH technologies will surge as both people and corporations invest in tech to improve their quality of life and achieve enhanced ways of working.

Apple’s Private Eyes: The much-rumoured Apple Glasses won’t be on the market until 2023, but details have leaked out. A new Apple patent mentions “privacy eyewear” aimed at stopping people from snooping on your iPhone’s display.

Consumer needs change fast but streamlining and automating domestic and routine tasks remains a priority, according to the report. In fact, investments in smart devices and wearables will remain high and will drive a significant share of the overall AH market. At the same time, businesses are searching for tech-based ways to drive innovation and achieve relevant key performance indicators (KPIs), such as lowered costs, increased productivity and improved safety. Integration is key and IDC predicts that in the future we will see strong investments in integrated technologies such as Apple’s AR glasses, set to launch in 2023.

 

A Human Need

“For as long as I can remember, I wanted to be in medicine. I thought about going into surgery, growing up, but I love science and math,” says Dustin Tyler, director of the Human Fusions Institute at Case Western Reserve University (CWRU) in Cleveland, Ohio. His path took an evolutionary twist when he was in high school in the early 1980s and purchased an Atari 800XL, an early home computer. “That explains why I didn’t get married until I was in my 30s,” he quips. His love of computers, even in their primitive form, and trying to understand how to get them to do what he wanted reconnected him to his biological interests and the question of how our brains control our bodies.

Working Together: By directly connecting the human experience to the avatar’s experience, the human and the robotic system become symbiotically linked. The human experiences the world of the robot; the robot becomes an extension of the human and of human intelligence. (source ©: Human Fusions Institute, Case Western Reserve University)

“The brain obviously is fascinating and the duality between brain and computer has been very interesting to me,” Tyler says. In the past, Alan Turing said we created computers to serve us, but then Marvin Minsky came along and said that these machines actually do things like humans. That perception has led some people to think that AI is replacing the brain, he observes. “We can learn from both Turing and Minsky,” he concludes.

 

We can learn from both Turing and Minsky.

Dustin Tyler, Human Fusions Institute Case Western Reserve University

 

While happy to quote theoreticians, Tyler is very much results-oriented. A project he directed allowed a student at CWRU to touch, feel and hold a banana that was 2,300 miles away at the University of California, Los Angeles (UCLA). It was more than just a scientific stunt because it demonstrated the possibility of placing a prosthesis anywhere in the world and, via ‘neural reality’, letting someone literally feel identical sensations remotely. Of more immediate consequence, his team provided the sensation of physical touch to a prosthesis so an amputee could safely pick up his granddaughter or effectively slice a tomato. The Human Fusions Institute team is also aiming to win some, or all, of the Avatar XPrize, a contest sponsored by Japan’s All Nippon Airways, which is focused on the development of an avatar system that will transport a human’s sense, actions and presence to a remote location in real time, leading to a more connected world.

Pursuing similar goals is Raviraj Nataraj at the Stevens Institute of Technology in Hoboken, New Jersey. His laboratory is creating instrumented wearables and virtual reality environments to leverage sensory feedback and cognitive factors for training better movement functions. Clinical solutions are being developed for people with neurotraumas, including spinal cord injury, stroke, traumatic brain injury and amputation. Restorative devices of interest include sensorimotor prostheses and powered exoskeletons.

Nataraj explains that the challenge is to command and control prosthetics more naturally, through biological signals. Until now that has been accomplished through muscle movement, but it could also be done with the brain. “We currently take measurements at the brain but more to see how people respond to the training we are providing,” he says. That effort is especially crucial for people dealing with severe motor disabilities, for whom signaling through muscles is problematic.

 

The principles have been established but now it's about how to do it better.

Raviraj Nataraj, Stevens Institute of Technology

 

“The principles have been established but now it is about how to do it better so we can have an actual impact at the clinical level,” says Nataraj. Cost is also an issue, he admits, because specialised, sophisticated products typically carry a high price tag.

A Step Ahead: Researchers have created a powered, individualized orthosis that can automatically adapt the level of response, but their goal is to eventually get rid of the robot and walk normally without the exoskeleton. (source ©: Stevens Institute of Technology)

 

Gaming the World

Cost associated with new, experimental technology has often been addressed by finding a larger-scale market to drive prices down. While serious and critical needs in human wellness and the military are top targets for BCI, a surprising level of interest has also materialised in ‘fun’ segments such as gaming. The dream here is to create total immersion in a game world by linking the gamer’s cognitive perceptions directly to the virtual environment and allowing direct control through thought processes. This dream, at least, is now close to fruition. Two years ago, computer scientists at the Graz Technical University in Austria showcased versions of such simple, but popular, video console games as Pong and Pac-Man. Users could control the games through an array of sensors attached to a close-fitting cap.

Pointing the Way: The FinchRing is a new kind of wearable that enables hands-free gesture control for AR and VR mixed reality. (source ©: Finch Technologies Ltd)

Last April, Musk’s Neuralink went a step further, implanting a device containing 1,024 electrodes directly into the motor cortex of a monkey’s brain – the region of the cerebral cortex involved in the planning, control and execution of voluntary movements. The animal was taught to play Pong with a joystick and subsequently learned to control the game by brainwaves alone.

Gary Yamamoto, CEO of Finch Technologies, which focuses on VR and AR technologies, asks, “Today’s human-machine interfaces are woefully behind new advances in technology. What is the purpose of all these new, incredible technologies if there aren’t intuitive and natural ways for people to interact with them?”

 

Sooner than most expect, we will see a basic BCI.

Gary Yamamoto, Finch Technologies

 

Finch is looking at how AR and VR applications can include all current human-machine interfaces (HMIs), including BCI, computer vision, voice solutions, inertial measurement units (IMU) for finger and hand tracking, eye tracking and more. Finch’s fusion integrates third-party technologies with its own products to move HMI experiences to the next level and accelerate the realisation of the Metaverse.

“If we don’t do enough to evolve human-machine interfaces with this technology, I fear they may never reach their full mass-market potential,” Yamamoto warns. Despite this, he maintains the company believes that “sooner than most expect” we will see a basic BCI, fused with more conventional inputs, deliver a broader and more immersive sensory VR/AR user experience.

 

Technology will help humans elevate their skills.

Andrea Minonne, IDC Augmented Humanity Launchpad

 

What’s the Point?

While Finch focuses on fun and the researchers at CWRU and at Stevens Institute are keen on restoring capabilities to those with disabilities, other visionaries, such as billionaire Musk, are aiming even higher and have suggested a future in which our thoughts and desires are also communicable and actionable through technology. Most significantly in the case of Musk, the ambition comes with real misgivings about the growing power of artificial intelligence and machine learning and its potential to at least sideline, if not in fact conquer, humans. His Neuralink company, which is working to develop what it calls a brain-machine interface, is about leveling that playing field and giving humans abilities that will help them compete better on a planet with billions of connected things, a growing number of which are also intelligent.

Navigating the Metaverse: Human-machine interfaces (HMIs) could empower users to compete with technology in a connected world.

If anyone is looking for another reason to pay attention, they can adopt the IDC view, articulated by Andrea Minonne, senior research analyst and co-lead of the Augmented Humanity Launchpad at the firm’s UK offices: “Augmented humanity is the advocate of cultural change across the commercial and consumer segments,” he says. “Promoting an AH-oriented culture and complementing human skills with technology will help humans elevate their skills, automate business processes or domestic chores, unlock new capabilities, bring disruption, promote workforce transformation, and enable humanised customer experiences.” Could much the same be said for BCI itself?
