Nanalyze

Smart Contact Lenses – How Far Away Are They?

When thinking about the potential of emerging technologies, it’s often good to try and visualize the most extreme endpoint that you can think of, where the technology would be fully matured. We’ve done this before when we thought about what real-time virtual reality might look like, or when we thought about what a fully functioning autonomous taxi company would need to operate. Now we’d like to think about the same thing for augmented reality (AR), a technology that we think may present one of the biggest investing opportunities ever.

With augmented reality, it all comes down to the hardware. We saw that Google Glass had some problems with adoption, mainly because people were uncomfortable with the idea of others walking around possibly recording everything. We know that there are at least 13 other pieces of hardware being developed that all promise to spur the adoption of augmented reality. We need to think beyond glasses though. We need to think about what augmented reality hardware will be like at the peak of its maturity. Ideally we can do away with glasses entirely, and go right to smart contact lenses that look something like this:

Source: Popular Science

The idea of smart contact lenses isn’t as far away as you might think. The first problem that crops up is how exactly to power the electronics in a set of “smart” contact lenses. As it turns out, we can use the energy of motion, or kinetic energy: every time the eye blinks, we get some power. Now that we have the power problem solved, there are at least several applications we can think of, in order of easiest first: simply powering electronics on the lens (Level 1), gathering data about the human body (Level 2), augmented reality (Level 3), and virtual reality (Level 4).

So when we ask the question, “how far away are we from having smart contact lenses?”, the answer isn’t that simple. The first level we have already achieved. Now let’s look at companies that are addressing Level 2 smart contact lenses, which gather information about the human body using a method we like to call “EYE-O-T”.

The most notable player in the smart contact lenses game is Google’s parent company, Alphabet. Just last month, its life sciences division, Verily, took in an $800 million investment. Navigating to their website shows a plethora of projects that they’re working on, one of which is a glucose-sensing “smart lens” for diabetics:

While Verily and Novartis had planned to test the product on people last year, this didn’t happen, which led to speculation that the technology may not even be feasible. If you’re at all interested in some controversy around the topic of “smart contact lenses”, then give this article from STAT a read. Here’s a quote, along with a really cool term you can throw around now: “slideware”.

But a former Verily manager recently called the lens “slideware” — a Silicon Valley term for breakthroughs that exist only on PowerPoint images. The company indeed produced a prototype, but it didn’t work, the former manager told STAT.

Why didn’t it work? Some experts are saying it’s because you cannot measure glucose level from tears. We call that a “showstopper”. Here’s another excerpt from the very interesting STAT investigation:

Smith has evaluated more than 30 “noninvasive” technologies that measure glucose from sweat, saliva, and tears. “I saw people working on this, and time after time after time, failing in the same ways or in entirely new ones,” he said in an interview. They all faced a problem no technical advance can overcome, Smith said. None of those fluids offers glucose readings that reflect the levels of glucose in blood.

Smith, referenced above, is a chemist and former chief scientific officer of the LifeScan division of Johnson & Johnson, so definitely subject matter expert material there. This makes us wonder just what data you can gather from the human body using smart contact lenses. Verily isn’t the only company trying to answer that question.

Founded in 2013, Canadian startup Medella Health took in its first round of $1.4 million in the summer of last year to fund a 15-person team that wants to build the exact same glucose monitoring solution as Verily. They claim their smart contact lens will cost “roughly $25/contact lens to create, whereas based on Google’s public disclosures, theirs will cost around $200-300 per lens”. The CEO, Harry Gandhi, is a Thiel Fellow, which means that Peter Thiel gave him $100,000 to drop out of college and run with this idea. According to an interview last year with Communitech News, Mr. Gandhi stated that the company plans to have their smart contact lens ready for testing around November of this year.

To conclude, it’s still debatable whether Level 2 smart contact lenses are technically feasible for monitoring glucose, or what else they might be able to monitor.

Moving on to Level 3 (AR) and Level 4 (VR) smart contact lenses, we’re running into a few problems understanding how the hardware part of this technology might work. In the case of augmented reality, there are two types: location-based and image-based. Location-based AR navigates with GPS, which is how it knows to display things like street directions or a map. Image-based AR needs a video camera, because it has to “see” what it is overlaying digital content onto. There’s no way you could get that sort of hardware onto a contact lens – could you? Maybe you could just place it in a pendant that the user wears around their neck. Or in eyeglasses?

One startup that is developing eyeglasses that project an image onto smart contact lenses is Innovega, a Bellevue, Washington startup which just took in a $3 million seed funding round from Chinese company Tencent. Here’s a look at their hardware:

Note that they’ll need to get FDA approval before they can sell the solution, because it will support prescription eyewear, and obviously Tencent thinks that’s going to happen. eMACULA has entered the regulatory approval process and expects to receive market clearance by early 2018. The contact lenses are expected to cost about 20% more than regular disposable contacts, and the glasses a little more than regular designer spectacles and sunglasses.

When we move from augmented reality to virtual reality, we don’t necessarily need that camera anymore, but we will need motion sensors. You’ll need motion sensors not only for knowing where the user’s gaze is in the virtual world, but also for foveated rendering, which makes VR less computationally intensive by tracking the eye and rendering at full detail only where the user is actually looking. How are you going to do that without utilizing an external piece of hardware? Can motion sensors actually be shrunk small enough to fit on a pair of contact lenses? Nanotechnology can probably make that happen – eventually. Until then, it looks like the Innovega “smart contact lenses” with “smart glasses” solution is going to be the most viable for AR/VR applications.
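To see why foveated rendering matters so much for lightweight hardware, here’s a back-of-the-envelope sketch in Python. The numbers are our own illustrative assumptions (a 100° field of view, a 5° foveal region, periphery drawn at quarter resolution), not figures from any headset maker, and we approximate the regions as squares for simplicity:

```python
# Rough pixel-budget estimate for foveated rendering.
# All numbers here are illustrative assumptions, not vendor specs.

def foveated_pixel_fraction(fov_deg: float, fovea_deg: float,
                            peripheral_scale: float) -> float:
    """Fraction of the full-resolution pixel budget actually rendered.

    Approximates the view and the foveal region as square angular areas.
    peripheral_scale is the linear resolution factor outside the fovea
    (e.g. 0.25 = quarter resolution in each dimension).
    """
    full = fov_deg ** 2                         # full-res pixels, whole view
    fovea = fovea_deg ** 2                      # full-res pixels, fovea only
    periphery = (full - fovea) * peripheral_scale ** 2
    return (fovea + periphery) / full

# 100-degree view, 5-degree fovea, periphery at quarter resolution:
print(f"{foveated_pixel_fraction(100, 5, 0.25):.1%} of the full pixel budget")
```

Under those assumptions the renderer only has to produce about 6.5% of the pixels a brute-force full-resolution pass would, which is exactly why eye tracking is worth the extra sensors.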

As for “Level 2”, where the smart contact lenses read vital information from the human body, like glucose monitoring, the jury is still out as to whether or not that will work. We’ll just have to continue waiting.

Want to get started with Virtual Reality? The HTC Vive and the Oculus Rift are the two best headsets right now, and HTC is currently offering $200 off their headset, which comes with two controllers, a whole bunch of accessories, a $50 gift card for Steam (content and games), and some free content to blow your mind right out of the box, like Everest VR and Star Trek. Maybe the best part is watching how your friends and family react when they try it for the first time! Click here to pick one up.

  • Michael Tyner, OD

    There’s a glaring problem anytime you consider using contacts to project an “overlay” onto human vision.

    It is essentially impossible to project a focused image from an object at the cornea onto the retina. You would need another focusing lens inside the eye.

    • Nanalyze

      Thank you for the information Michael! Sincerely, we hadn’t even thought that it would be more complex than just a simple “smart contact lens”.

      Could we visualize a point in time in the future where part of growing up is just going in and getting your bionic eye modifications?

    • John L Waite

Agreed….but perhaps if you viewed vergence manipulation from a different aspect, much like multifocal contacts, it could be done…..just an abstract idea. John L. Waite, Ph.D

      • Michael Tyner, OD

        Multifocal contacts have the advantage of dealing with objects at realistic distances.

        Focusing an object at 1 m requires a +1 diopter lens.

        Focusing an object at 10 cm requires a +10 diopter lens.

        Focusing at 1 cm requires a +100 diopter lens, 1 mm requires a +1000 diopter lens between the object and the focal plane.

        The crystalline lens inside the eye measures about +20 D.
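The figures in the comment above all follow from the thin-lens relation P = 1/d, where P is the required power in diopters and d is the object distance in meters. A few lines of Python reproduce them (the +20 D crystalline-lens figure is from the comment; everything else is just the formula):

```python
# Thin-lens approximation: power (diopters) needed to focus
# an object at a given distance (meters). P = 1 / d.

def required_power_diopters(distance_m: float) -> float:
    """Lens power required to focus an object at distance_m meters."""
    return 1.0 / distance_m

for d in [1.0, 0.10, 0.01, 0.001]:
    print(f"object at {d * 100:g} cm -> {required_power_diopters(d):+.0f} D")
```

An object sitting 1 mm from the cornea would need on the order of +1000 D of focusing power, versus the roughly +20 D the eye’s crystalline lens provides, which is the “glaring problem” the comment describes.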

  • Cheri Simonne Rubens

    I would love to be a test subject for this innovation. Please do not hesitate to contact me.

    • Nanalyze

      Hi Cheri,

      Fascinating stuff eh?

      You’ll need to contact the companies directly as we’re only a media site!

      • Jordan Kanaylo

        When are the google smart contact lenses coming out

        • Nanalyze

We haven’t heard a peep, but we can tell you that there is tremendous interest from our readers on the topic. Estimates have been all over the place, so let’s just hope Google says something about it at some point. We can’t even confirm they’re still working on it, though you’d think so with $800 million being thrown at it.

