Augmented reality (AR) is an emerging technology that enhances, or augments, a user’s perception of the real world. The enhancement can be delivered through sight, sound, or touch and provides additional information about a person’s environment.
Beyond consumer and entertainment applications, AR is highly useful for improving efficiency, safety, and productivity in the workplace by providing a user with important information (e.g. sensor data, inventory levels, heat maps) that is not naturally perceivable from the environment. Many industries, including the construction, medical, manufacturing, and defense sectors, have begun to invest in and develop AR technologies that enhance a user’s ability to complete a task. Recent partnerships between HTC Vive and AECOM, developments by the US military, and projects supported by Medtronic illustrate how AR heads-up displays offer a powerful method for conveying the information needed to plan a construction site, learn a medical procedure, or successfully execute a military training exercise.
In addition to these on-Earth settings, AR has similar applications for improving operational tasks during space exploration. Astronauts stationed on the International Space Station (ISS) have already experimented with Microsoft’s HoloLens to complete tasks.
Feedback from astronauts who tested the HoloLens, along with NASA’s roadmap plans, indicates that AR could be used to (1) superimpose instructions or illustrations that guide an astronaut through maintenance repairs, and (2) enable remote visibility between an astronaut and a ground operator so they can solve an issue together in real time. The HoloLens is also being used to plan the next Mars mission. Additional projects such as AMARIS and open calls for AR solutions highlight NASA’s interest in developing AR technologies and AR’s growing ability to support the unique conditions of aeronautic and space missions.
2018 is off with a bang, and so are we! We spent this past week in Las Vegas demoing our technology at CES and had a great time talking with fellow entrepreneurs and technologists. Here are a few key takeaways from our CES experience.
Great Startups Start Anywhere
During CES, Shantanu spoke on a Techstars panel themed “Great Startups Start Anywhere” about why companies choose to start outside of Silicon Valley. In addition to representing our home state, Shantanu highlighted the importance of the networks and resources we have in Arizona, and how Phoenix is becoming a startup city. The diversity of company headquarters throughout Eureka Park also supports the claim that great startups start anywhere. Companies we talked with came from a wide variety of US cities, including Kansas City, Arlington, Cincinnati, and Detroit. Exhibitors from the Netherlands, Canada, and France, among many other countries, also had a large presence in Eureka Park.
The Da Vinci Surgical System is a robot built by Intuitive Surgical. After being approved for use by the FDA in 2000, it has been adopted by surgeons performing a wide range of minimally invasive procedures, including prostatectomies, cardiac valve repair, and gynecologic procedures. As of June 30, 2014, approximately 3,100 Da Vinci robots were installed worldwide, with each unit costing roughly $2 million. The primary innovation of the Da Vinci system is the surgeon’s console: an immersive visualization system that takes an ordinary laparoscopic image and projects it to a binocular display, enhancing the dexterity with which a surgeon can perform several procedures. For the patient, the Da Vinci system typically means less pain and blood loss, frequently resulting in a shorter hospital stay and faster recovery period. Continue reading “Haptic Feedback in the Da Vinci Surgical System”
Adafruit provides a breakout board for the DRV2605 haptic driver from Texas Instruments. Although the example tutorial included with the product describes a quick way to set up the driver with an eccentric rotating mass (ERM) motor, we prefer using a linear resonant actuator (LRA) for increased precision and enhanced haptic feedback. You can use the breakout board with an Arduino Uno to quickly prototype a system that delivers precise vibrotactile cues.
Step 1: Soldering
Solder the header strip and the LRA onto the breakout board. After this step, your DRV2605 breakout board should look like this:
Step 2: Wiring and Hookup
Connect VIN on the DRV2605 to the 5V supply of the Arduino
Connect GND on the DRV2605 to GND on the Arduino
Connect the SCL pin to the I2C clock SCL pin on your Arduino, which is labelled A5
Connect the SDA pin to the I2C data SDA pin on your Arduino, which is labelled A4
Connect the IN pin to an I/O pin, such as A3
Step 3: Testing and Creating Effects
Adafruit provides a very useful Arduino library for the DRV2605 that you can use to get started. In particular, we recommend looking through the example code to get an idea of the effects you can produce. On pages 57 and 58 of the DRV2605 datasheet, you can find a table of all the effects you can produce “out of the box.”
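To sketch the basic flow, the snippet below initializes the driver for an LRA and plays built-in effect #1 (a strong click) once per second. It is adapted from the spirit of Adafruit’s example; the choice of ROM library 6 (the LRA-tuned library) and effect #1 follow the DRV2605 datasheet, and the one-second delay is an arbitrary choice:

```cpp
#include <Wire.h>
#include "Adafruit_DRV2605.h"

Adafruit_DRV2605 drv;

void setup() {
  drv.begin();                        // start I2C and initialize the driver
  drv.useLRA();                       // tell the chip an LRA (not an ERM) is attached
  drv.selectLibrary(6);               // ROM effect library 6 is tuned for LRAs
  drv.setMode(DRV2605_MODE_INTTRIG);  // effects fire when we call go()
}

void loop() {
  drv.setWaveform(0, 1);  // slot 0: effect #1, "strong click"
  drv.setWaveform(1, 0);  // slot 1: 0 terminates the sequence
  drv.go();               // play the queued effect
  delay(1000);
}
```

Swap the `1` in `setWaveform(0, 1)` for any effect ID from the datasheet table to audition other effects; you can also queue up to eight slots before calling `go()`.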
Step 4: Creating Your Own Waveforms
Since you can also set the intensity of the LRA in real time, you can design your own waveforms and effects by changing the value over time. Adafruit also provides an example of setting the value in real time on GitHub. You can combine this example code with a waveform design tool like Macaron to customize the feedback provided by your new Arduino-powered haptic device!
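In real-time playback (RTP) mode, the driver ignores the ROM effects and drives the LRA at whatever amplitude you last wrote, so a simple loop can shape an envelope by hand. Here is a minimal sketch of the idea, in the spirit of Adafruit’s real-time example; the ramp range and timing are arbitrary choices, not values from the library:

```cpp
#include <Wire.h>
#include "Adafruit_DRV2605.h"

Adafruit_DRV2605 drv;

void setup() {
  drv.begin();
  drv.useLRA();
  drv.setMode(DRV2605_MODE_REALTIME);  // RTP mode: amplitude follows register writes
}

void loop() {
  // Ramp the vibration up and back down to create a "breathing" pulse.
  for (int v = 0; v <= 120; v += 4) {
    drv.setRealtimeValue(v);  // 0 = off; larger values = stronger vibration
    delay(10);
  }
  for (int v = 120; v >= 0; v -= 4) {
    drv.setRealtimeValue(v);
    delay(10);
  }
  drv.setRealtimeValue(0);    // make sure the motor is off between pulses
  delay(500);
}
```

Replacing the two linear ramps with samples exported from a tool like Macaron gives you fully custom waveforms.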
With the rise of Netflix and YouTube as dominant platforms for video consumption, fewer people are visiting theaters to watch movies. As these streaming services grow their libraries, more multimedia content will be designed for the home theater. Netflix users consume content on whichever screen is available: a laptop, tablet, or smartphone. As content consumption shifts toward mobile applications and at-home viewing, the interactive elements of 3D and 4D film previously reserved for movie theaters will transition to technologies easily adopted by households.
Good video is engaging – it tells a compelling story with excellent production value. Since there is increasing competition for viewership between different streaming platforms, devices, and content production studios, there is an increasing demand for differentiated content – content that provides a unique experience to its viewers. Continue reading “The Future of 4D Home Cinema: A Haptic Effects Track”
Vibrotactile pulses (e.g. the buzzing of a cell phone or game controller) can provide users with real-time feedback in a computer interface, but they are not the only way to transmit information through the sense of touch. Modulating the temperature of a device’s surface can also provide additional information to users.
When a current flows through a junction between two different conductors, heat can be generated at or removed from the junction. This phenomenon is called the Peltier effect, named after physicist Jean Charles Athanase Peltier. Different pairs of conductors that exhibit a Peltier effect will generate or remove different amounts of heat, in proportion to the current running through the junction – the Peltier coefficient measures how much heat is carried for every unit of charge flowing through the device. Continue reading “Temperature Feedback with the Thermoelectric (Peltier) Effect”
In this post, we’ll look at the different ways that some of the most popular wearables implement haptics. Outside of the Apple Watch, most wearables use a simple eccentric rotating mass motor for haptic feedback.
The Apple Watch was first introduced in the fall of 2014 and has since become the world’s best-selling wearable device. It marked the introduction of Apple’s “Taptic Engine,” which provides haptic feedback for alerts and notifications. While the design of the Taptic Engine module is proprietary, it is likely a customized linear resonant actuator.
A few weeks ago, I noticed that the aluminum enclosure of my unibody MacBook Pro had a strange texture when I brushed my hand across the surface. After some tinkering, I noticed that this only happened when the device was being used while charging, and only with my shorter, two-prong power cable, leading me to believe there was some sort of current leakage happening.
If we’re honest about shortcomings in human physiology, optical illusions would be labelled “Brain Failures.”
– Neil deGrasse Tyson
When we think about the ways our perception plays tricks on us, optical illusions come to mind first. They’re not the only kind of sensory illusions, though. Tactile illusions also illustrate the fascinating ways that our perception ‘fails’ to reflect reality. In this post, we describe several different types of tactile illusions.