Augmented Reality

From Augmented Reality to Altered Reality

From Dehumanization of Warfare: Legal Implications of New Weapon Technologies:

However, where soldiers are equipped with cybernetic implants (brain-machine interfaces) which mediate between an information source and the brain, the right to “receive and impart information without interference from a public authority” gains a new dimension. There are many technologies which provide additional information to armed forces personnel, e.g., heads-up displays for fighter pilots and the Q-warrior augmented reality helmets from BAE Systems, which are unlikely to impact this right.

However, there are technologies in development which are intended to filter data in order to prevent information overload. This may be particularly relevant where the implant or prosthetic removes visual information from view, or is designed to provide targeting information to the soldier. According to reports, software has been devised in Germany which allows for the deletion of visual information by smart glass or contact lens.

As one futurist was quoted as saying “So if you decide you don’t like homeless people in your city, and you use this software and implant it in your contact lenses, then you won’t see them at all.”

An entire section of this book is dedicated to the legal and ethical implications of using supersoldiers, augmented by bionic prosthetics, augmented reality devices, and neural interfaces, in modern warfare. Highly recommended.

The book is now featured in the “Key Books” section of H+.

AR glasses further augmented by human assistants 

From Aira’s new smart glasses give blind users a guide through the visual world | TechCrunch

Aira has built a service that basically puts a human assistant into a blind user’s ear by beaming live-streaming footage from the glasses’ camera to the company’s agents, who can then give audio instructions to the end users. The guides can present them with directions or describe scenes for them. It’s really the combination of the high-tech hardware and highly attentive assistants that makes the service work.

The hardware the company has run this service on in the past has been a bit of a hodgepodge of third-party solutions. This month, the company began testing its own smart glasses solution called the Horizon Smart Glasses, which are designed from the ground up to be the ideal solution for vision-impaired users.

The company charges based on usage; $89 per month will get users the device and up to 100 minutes of usage. There are various pricing tiers for power users who need a bit more time.

The glasses integrate a 120-degree wide-angle camera so guides can gain a fuller picture of a user’s surroundings and won’t have to instruct them to point their head in a different direction quite as much. It’s powered by what the startup calls the Aira Horizon Controller, which is actually just a repurposed Samsung smartphone that powers the device in terms of compute, battery and network connection. The controller is appropriately controlled entirely through the physical buttons and also can connect to a user’s smartphone if they want to route controls through the Aira mobile app.

Interesting hybrid implementation and business model, but I have serious doubts that a solution depending on human assistants can scale at a planetary level, or retain the necessary quality of service at that scale.

AR glasses competition starts to get real

From Daqri ships augmented reality smart glasses for professionals | VentureBeat

At $4,995, the system is not cheap, but it is optimized to present complex workloads and process a lot of data right on the glasses themselves.

and

The Daqri is powered by a Visual Operating System (VOS) and weighs 0.7 pounds. The glasses have a 44-degree field of view and use an Intel Core m7 processor running at 3.1 gigahertz. They run at 90 frames per second and have a resolution of 1360 x 768. They also connect via Bluetooth or Wi-Fi and have sensors such as a wide-angle tracking camera, a depth-sensing camera, and an HD color camera for taking photos and videos.

Olympus just presented a competing product for $1,500.

Olympus EyeTrek is a $1,500 open-source, enterprise-focused smart glasses product

From Olympus made $1,500 open-source smart glasses – The Verge

The EI-10 can be mounted on all sorts of glasses, from regular to the protective working kind. It has a tiny 640 x 400 OLED display that, much like Google Glass, sits semi-transparently in the corner of your vision when you wear the product on your face. A small forward-facing camera can capture photos and videos, or even beam footage back to a supervisor in real time. The EI-10 runs Android 4.2.2 Jelly Bean and comes with only a bare-bones operating system, as Olympus is pushing the ability to customize it.

It’s really cool that it can be mounted on any pair of glasses. Olympus provides clips of various sizes to adjust to multiple frames. It weighs 66 g.

The manual mentions multiple built-in apps: image and video players, a camera (1280 x 720 px), a video recorder (20 fps, up to 30 min of recording), and a QR scanner. It connects to other devices via Bluetooth or Wi-Fi.

You can download the Software Development Kit here.
It includes a Windows program to develop new apps, an Android USB driver, an Android app to generate QR codes, and a couple of sample apps.
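To make the QR-driven workflow a bit more concrete, here is a minimal sketch of generating a code the EI-10’s built-in scanner could read off a screen, assuming the open-source ZXing core library on the JVM. The “eyetrek://” payload format is invented for illustration; it is not part of the Olympus SDK, whose actual QR tool is a prebuilt Android app.

```kotlin
import com.google.zxing.BarcodeFormat
import com.google.zxing.qrcode.QRCodeWriter

// Encodes a made-up payload into a QR code and prints it as ASCII, so any
// QR scanner (including the one built into the glasses) could read it.
// The payload format below is hypothetical, not documented Olympus behavior.
fun main() {
    val payload = "eyetrek://launch?app=video-recorder&maxMinutes=30"
    val matrix = QRCodeWriter().encode(payload, BarcodeFormat.QR_CODE, 33, 33)

    for (y in 0 until matrix.height) {
        val row = StringBuilder()
        for (x in 0 until matrix.width) {
            row.append(if (matrix.get(x, y)) "██" else "  ")  // dark vs. light module
        }
        println(row)
    }
}
```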

BAE Systems working on eye tracking for military digital helmets

From How wearable technology is transforming fighter pilots’ roles

In the past, eye-tracking technology has had a bad press. “Using eye blink or dwell for cockpit control selection led to the so called ‘Midas touch’ phenomenon, where people could inadvertently switch things on or off just by looking,” says Ms Page. But combine a gaze with a second control and the possibilities are vast. “Consider the mouse, a genius piece of technology. Three buttons but great versatility.” Pilots, she says, could activate drop-down menus with their gaze, and confirm their command with the click of a button at their fingers.
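To make the “gaze plus a second control” idea concrete, here is a minimal sketch of the interaction logic: gaze only highlights a menu item, and a separate physical button commits the selection, so looking at something can never trigger it by itself. The types, thresholds, and names are hypothetical, not any BAE Systems API.

```kotlin
// Gaze highlights; a physical button confirms. No action ever fires from gaze alone,
// which is exactly the "Midas touch" failure mode this pattern avoids.
class MenuItem(val id: String, val activate: () -> Unit)

class GazeConfirmSelector(private val dwellToHighlightMs: Long = 150) {
    private var candidate: MenuItem? = null
    private var candidateSinceMs: Long = 0
    var highlighted: MenuItem? = null
        private set

    /** Fed by the eye tracker with the item currently under the pilot's gaze (or null). */
    fun onGaze(item: MenuItem?, nowMs: Long) {
        if (item?.id != candidate?.id) {          // gaze moved to a different target
            candidate = item
            candidateSinceMs = nowMs
        }
        // Highlight only after a short, steadying dwell; never activate from gaze alone.
        highlighted = if (item != null && nowMs - candidateSinceMs >= dwellToHighlightMs) item else null
    }

    /** Fired by the physical confirm button, e.g. on the stick or throttle. */
    fun onConfirmPressed() {
        highlighted?.activate?.invoke()
    }
}

fun main() {
    val fuelPage = MenuItem("fuel-page") { println("Fuel page opened") }
    val selector = GazeConfirmSelector()
    selector.onGaze(fuelPage, nowMs = 0)
    selector.onGaze(fuelPage, nowMs = 200)   // dwell reached: item is now highlighted
    selector.onConfirmPressed()              // prints "Fuel page opened"
}
```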

In future, eye-tracking might be used to assess a pilot’s physiological state. “There’s evidence parameters about the eye can tell us about an individual’s cognitive workload,” says Ms Page.

Eye-tracking technology could also monitor how quickly a pilot is learning the ropes, allowing training to be better tailored. “Instead of delivering a blanket 40 hours to everyone, for instance, you could cut training for those whose eye data suggest they are monitoring the correct information and have an acceptable workload level, and allow longer for those who need it.”
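As a back-of-the-envelope illustration of how such tailoring could work, here is a toy heuristic. The 40-hour baseline is the figure quoted above; the metric names and thresholds are assumptions about what eye data might feed such a decision, not anything published with the article.

```kotlin
// Toy heuristic only: adjust training hours from a 40-hour baseline based on
// assumed eye-derived metrics. Thresholds and offsets are invented for illustration.
data class EyeMetrics(
    val onTaskFixationRatio: Double,  // share of fixation time on the instruments that matter
    val workloadIndex: Double         // 0..1 proxy, e.g. blended from pupil diameter and blink rate
)

fun recommendedTrainingHours(m: EyeMetrics, baselineHours: Int = 40): Int = when {
    m.onTaskFixationRatio >= 0.8 && m.workloadIndex <= 0.5 -> baselineHours - 10  // ahead: shorten
    m.onTaskFixationRatio < 0.5 || m.workloadIndex > 0.8   -> baselineHours + 10  // struggling: extend
    else                                                    -> baselineHours       // standard plan
}

fun main() {
    println(recommendedTrainingHours(EyeMetrics(onTaskFixationRatio = 0.85, workloadIndex = 0.4)))  // 30
}
```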

Two thoughts:

  • Obviously, human augmentation is initially focusing on vision, but that’s just the beginning. Our brain seems to be capable of processing any input, extracting a meaningful pattern out of it, and using it to improve our understanding of the world. I expect the auditory system to be the next AR focus. I’d assume augmented hearing would be especially useful in ground combat.
  • We are visual creatures, so we are naturally inclined to assume that the large portion of our neocortex dedicated to image processing will be able to deal with even more data coming in. What if that’s a wrong assumption?

Univet Develops Modular AR Glasses with Interchangeable Lenses

From Red Dot Award Winning Tools: Univet 5.0 Augmented Reality Safety Glasses – Core77

Italian safety equipment manufacturer Univet just received a 2017 Red Dot Award for its augmented reality safety glasses. The glasses integrate Sony’s “holographic waveguide technology” into eye protection that allows wearers to view real time data without looking up from what they are doing.

A monocular projection system displays data on a holographic screen behind the right protective lens. The screen is clear so the wearer can see through it.
The glasses can use WiFi or Bluetooth to access data on computers, tablets, and smartphones. Information is able to travel in both directions so data can be collected from internal sensors such as the GPS and microphone or optional sensors such as thermometers and cameras.

Take a look at the pictures and videos.

While everybody develops bulky AR units, WaveOptics builds micro displays for AR

From WaveOptics raises $15.5 million for augmented reality displays | VentureBeat

While a number of major manufacturers are building the full AR systems (including the optics, sensors, camera, and head-mounted unit), WaveOptics is focused on developing the underlying optics to deliver an enhanced AR experience.

The core of the WaveOptics technology is a waveguide that is able to channel light input from a micro display positioned at the periphery of a lens made of glass — or in the future, plastic. Unlike conventional technologies that rely on cumbersome prisms, mirrors, or scarce materials, WaveOptics’ optical design harnesses waveguide hologram physics and photonic crystals, which enable lightweight design with good optical performance, the company said.