Augmented Reality

Univet Develops Modular AR Glasses with Interchangeable Lenses

From Red Dot Award Winning Tools: Univet 5.0 Augmented Reality Safety Glasses – Core77

Italian safety equipment manufacturer Univet just received a 2017 Red Dot Award for its augmented reality safety glasses. The glasses integrate Sony’s “holographic waveguide technology” into eye protection, allowing wearers to view real-time data without looking up from what they are doing.

A monocular projection system displays data on a holographic screen behind the right protective lens. The screen is clear so the wearer can see through it.
The glasses can use WiFi or Bluetooth to access data on computers, tablets, and smartphones. Information can travel in both directions, so data can be collected from internal sensors such as the GPS and microphone, or from optional sensors such as thermometers and cameras.

Take a look at the pictures and videos.

Omega Ophthalmics turning the human eye into a platform for AR

From Omega Ophthalmics is an eye implant platform with the power of continuous AR | TechCrunch

… lens implants aren’t a new thing. Implanted lenses are commonly used as a solution for cataracts and other degenerative diseases mostly affecting senior citizens; about 3.6 million patients in the U.S. get some sort of procedure for the disease every year.

Cataract surgery involves removing the cloudy lens and replacing it with a thin artificial lens. Co-founder and board-certified ophthalmologist Gary Wortz saw an opportunity here to offer not just a lens but a platform to which other manufacturers could add interactive sensors, drug-delivery devices, and AR/VR integration.

Maybe there’s a surprisingly large audience among the over-60s willing to try to get a second youth through biohacking. Maybe the over-60s will become the first true augmented humans.

Progress in Smart Contact Lenses

From Smart Contact Lenses – How Far Away Are They? – Nanalyze

The idea of smart contact lenses isn’t as far away as you might think. The first problem that crops up is how exactly to power the electronics in a set of “smart” contact lenses. As it turns out, we can use the energy of motion, or kinetic energy: every time the eye blinks, we get some power. Now that we have the power problem solved, there are at least several applications we can think of, ordered from easiest first:

  • Level 1 – Multifocal contact lenses like these from Visioneering Technologies, Inc. (VTI) or curing color blindness like these smart contact lenses called Colormax
  • Level 2 – Gathering information from your body – like glucose monitoring for diabetics
  • Level 3 – Augmenting your vision with digital overlay
  • Level 4 – Complete virtual reality (not sure if this is possible based on the eye symmetry but we can dream a dream)

So when we ask the question “how far away are we from having smart contact lenses” the answer isn’t that simple. The first level we have already achieved.
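The claim that blinking “solves” the power problem deserves a sanity check. A back-of-envelope calculation, in which every number is an invented-but-plausible assumption rather than a measured value, suggests blink harvesting yields microwatt-scale power: potentially enough for a low-power sensor (Level 2), but far short of driving a digital overlay (Levels 3 and 4):

```python
# All figures below are illustrative assumptions, not measured values.
blinks_per_minute = 15        # typical resting blink rate (assumption)
energy_per_blink_uj = 2.0     # microjoules harvested per blink (assumption)

# Average harvested power in microwatts: energy per blink times blink rate.
avg_power_uw = blinks_per_minute / 60 * energy_per_blink_uj
print(f"~{avg_power_uw:.2f} µW average harvested power")
```

Under these assumptions the harvester averages about half a microwatt, which is why the lower levels on the list look much nearer than the higher ones.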

Google Glass Enterprise Edition gets adopted where it was always meant to be

From Google Glass 2.0 Is a Startling Second Act | WIRED

Companies testing EE—including giants like GE, Boeing, DHL, and Volkswagen—have measured huge gains in productivity and noticeable improvements in quality. What started as pilot projects are now morphing into plans for widespread adoption in these corporations. Other businesses, like medical practices, are introducing Enterprise Edition in their workplaces to transform previously cumbersome tasks.

and

For starters, it makes the technology completely accessible for those who wear prescription lenses. The camera button, which sits at the hinge of the frame, does double duty as a release switch to remove the electronics part of the unit (called the Glass Pod) from the frame. You can then connect it to safety glasses for the factory floor—EE now offers OSHA-certified safety shields—or frames that look like regular eyewear. (A former division of 3M has been manufacturing these specially for Enterprise Edition; if EE catches on, one might expect other frame vendors, from Warby Parker to Ray-Ban, to develop their own versions.)

Other improvements include beefed-up networking—not only faster and more reliable wifi, but also adherence to more rigorous security standards—and a faster processor as well. The battery life has been extended—essential for those who want to work through a complete eight-hour shift without recharging. (More intense usage, like constant streaming, still calls for an external battery.) The camera was upgraded from five megapixels to eight. And for the first time, a red light goes on when video is being recorded.

If Glass EE gains traction, and I believe it will if it evolves into a platform for enterprise apps, Google will gain a huge amount of information and experience that it can reuse on the AR contact lenses currently in the works.

While everybody develops bulky AR units, WaveOptics builds micro displays for AR

From WaveOptics raises $15.5 million for augmented reality displays | VentureBeat

While a number of major manufacturers are building the full AR systems (including the optics, sensors, camera, and head-mounted unit), WaveOptics is focused on developing the underlying optics to deliver an enhanced AR experience.

The core of the WaveOptics technology is a waveguide that is able to channel light input from a micro display positioned at the periphery of a lens made of glass — or in the future, plastic. Unlike conventional technologies that rely on cumbersome prisms, mirrors, or scarce materials, WaveOptics’ optical design harnesses waveguide hologram physics and photonic crystals, which enable lightweight design with good optical performance, the company said.

Flight attendants with HoloLens – What could possibly go wrong? 

From Will HoloLens turn air travelers into mixed-reality characters? – GeekWire

Imagine a world where headset-wearing flight attendants can instantly know how you’re feeling based on a computer analysis of your facial expression.

Actually, you don’t need to imagine: That world is already in beta, thanks to Air New Zealand, Dimension Data and Microsoft HoloLens.

In May, the airline announced that it was testing HoloLens’ mixed-reality system as a tool for keeping track of passengers’ preferences in flight – for example, their favorite drink and preferred menu items. And if the imaging system picked up the telltale furrowed brow of an anxious flier, that could be noted in an annotation displayed to the flight attendant through the headset.

Google already failed at this. The only places where AR glasses would be socially accepted are those where personnel wearing equipment are the norm.

It would take years, if not decades, for people to accept the idea that flight attendants must wear special equipment to serve drinks.

Google couldn’t see what we see with glasses, so they are trying through our smartphones

From Google Lens offers a snapshot of the future for augmented reality and AI | AndroidAuthority

At the recent I/O 2017, Google stated that we were at an inflection point with vision. In other words, it’s now more possible than ever before for a computer to look at a scene, dig out the details, and understand what’s going on. Hence: Google Lens.

This improvement comes courtesy of machine learning, which allows companies like Google to acquire huge amounts of data and then create systems that utilize that data in useful ways. This is the same technology underlying voice assistants and, to a lesser extent, even your recommendations on Spotify.
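Google does not publish Lens internals, but the underlying idea of learning from labeled examples and then recognizing new inputs can be sketched with a toy nearest-centroid classifier. Everything here is invented for illustration: a real system would extract feature vectors from images with a deep network, while this sketch hand-writes tiny vectors and labels:

```python
import numpy as np

# Toy "training data": in a real system these feature vectors would be
# produced by a neural network from labeled images; here they are invented.
features = np.array([
    [0.9, 0.1, 0.0],  # flower-like
    [0.8, 0.2, 0.1],  # flower-like
    [0.1, 0.9, 0.2],  # dog-like
    [0.0, 0.8, 0.3],  # dog-like
])
labels = np.array(["flower", "flower", "dog", "dog"])

# "Training": average the feature vectors per label (a nearest-centroid model).
classes = np.unique(labels)
centroids = np.array([features[labels == c].mean(axis=0) for c in classes])

def recognize(vec):
    """Return the label whose centroid is closest to the input features."""
    dists = np.linalg.norm(centroids - vec, axis=1)
    return classes[np.argmin(dists)]

print(recognize(np.array([0.85, 0.15, 0.05])))  # → flower
```

The point of the sketch is the shape of the pipeline, not the model: more data moves the centroids closer to reality, which is exactly the advantage Google’s scale buys it.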