Netflix recently revealed some truly fascinating new technology that lets users control its iPhone app with just their eyes.
It truly was incredible to watch. Still is.
But now there's news about another new technology that could quickly make "eye-controlled" apps seem like something from the days of dial-up Internet. It turns out Samsung is working on something truly mind-blowing: eye-controlled television that could ultimately lead to mind-controlled technology.
Samsung unveiled the progress it's made at a conference last week in San Francisco, and, as CNET reported, it's pretty impressive.
Much like Netflix's new EyeNav functionality, which lets people control the Netflix app simply by staring at it on an iPhone or making other facial expressions, Samsung says its new technology, called Project Pontis, is aimed at making its products more accessible to people with disabilities.
But let's be honest: Nobody imagines this will stop there. Mind-controlled electronics would usher in a completely new era of technology, something straight out of the most advanced science fiction.
The giant electronics manufacturer says it has teamed up on the project with the Center for Neuroprosthetics at the Ecole Polytechnique Fédérale de Lausanne (EPFL) in Switzerland.
"How can we provide accessibility to people who cannot move or who have extreme limitations on their movements?" Ricardo Chavarriaga, a senior scientist at EPFL who works on the project, asked rhetorically at the Samsung Developer Conference last week.
To be sure, this isn't exactly "coming to a television near you" anytime soon. Much like the Netflix technology, it's also in early stages. But as a view of how we could interact with technology within a few years, it's fascinating.
Mind-controlled electronics require an accurate map of the user's brainwaves. Broadly, the Samsung system works like this:
- First, the system collects a baseline of the user's brain activity, so it has something to compare against when the user wants to perform a specific action.
- To accomplish that, the user has to wear a headset that contains 64 brain-scan sensors, while his or her eye movements are also tracked.
- Finally, the current prototype tries to correlate the user's eye movements with his or her brain activity when he or she stares at a particular video profile on a screen, and figure out which brain signals indicate the user's intention, as evidenced by where his or her eyes land.
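The steps above boil down to a simple idea: record a resting baseline, then flag moments when live activity deviates sharply from it. Here is a minimal Python sketch of that idea; the function names, sensor readings, and z-score threshold are all hypothetical illustrations, not Samsung's or EPFL's actual method.

```python
# Illustrative sketch: detect "intent" as a strong deviation from a
# recorded baseline of brain activity. All numbers are made up.
import statistics


def record_baseline(samples):
    """Summarize resting activity from one sensor as a mean and spread."""
    return {
        "mean": statistics.fmean(samples),
        "stdev": statistics.stdev(samples),
    }


def detect_intent(baseline, live_sample, threshold=3.0):
    """Flag an intentional action when a live reading is far from baseline."""
    z = abs(live_sample - baseline["mean"]) / baseline["stdev"]
    return z > threshold


# Resting readings from one hypothetical sensor, then two live samples
resting = [0.9, 1.1, 1.0, 0.95, 1.05, 1.0]
baseline = record_baseline(resting)

print(detect_intent(baseline, 1.02))  # near baseline -> False
print(detect_intent(baseline, 2.5))   # strong deviation -> True
```

A real system would do this across all 64 sensors at once and fuse the result with eye-tracking data, but the core comparison against a per-user baseline is the same.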
For the time being, the user's ultimate selection on the interface is determined by eye positioning. That sounds a lot like what Netflix developed with far less effort, essentially by building a new application on top of the facial recognition software already built into iOS on up-to-date iPhones.
From there, however, the ultimate goal for Samsung and EPFL is to create visual technology that responds purely to a user's thoughts. And that will be truly mind-blowing.
So, what's the timeline? Well, for one thing, the current version of the 64-sensor headset is unwieldy and uncomfortable, and it also "requires a layer of gel applied to the head," which, as CNET dryly observes, is "something consumers likely aren't going to do at home."
And so, as one Samsung official said after the conference: "If it's applicable to us one day as pro couch potatoes, I have no idea."
Meantime, Samsung says the version it unveiled last week is prototype No. 1. It hopes to roll out a more advanced prototype in the first quarter of 2019, and then start testing it in Swiss hospitals.
But the future comes quicker than we think. Bookmark this column and come back in a couple of years. I'll bet the tech will be widespread, and we'll be pining for the days before our electronic devices paid attention to our thoughts.