
Augmented Reality: Qualcomm’s SDK Update Tells An Intriguing Silicon- And Software-Agnostic Story

My first exposure to augmented reality (aside from occasional demos at past SIGGRAPHs), or so I thought until earlier this morning, came when I tried out a Nintendo 3DS portable gaming console a few months back, in advance of tearing it down (along with iFixit's Kyle Wiens) in front of a live crowd at the Silicon Valley edition of the Embedded Systems Conference. Admittedly, the whole thing seemed a bit hokey to me; why would anyone be motivated to carry around a game-activating piece of cardboard when the content could alternatively be pre-loaded either in embedded nonvolatile memory or on a removable memory card? But don't take my word for it; take a look at the above enthusiast-filmed video review (plenty of other examples also exist, although curiously not on Nintendo's official YouTube channel; search on keywords such as 'Nintendo 3DS augmented reality' to find them) and decide for yourself.

My second (again, so I thought) exposure to AR came a few weeks later, when I attended Qualcomm's Uplinq conference. Given the company's longstanding advocacy of its Snapdragon SoCs containing Scorpion (and soon, Krait) microarchitecture-based CPU cores (by virtue of the company's ARM architecture license) and Adreno GPU cores (by virtue of the company's Q3 2006 acquisition of then-ATI Graphics' Imageon handheld graphics product line), the preponderance of CPU+GPU-leveraging AR demonstrations at the show wasn't necessarily a surprise…not to mention that AR neatly leverages the now-ubiquitous image sensors on the backsides of modern smartphones. But again, to be honest, I found the elementary gaming demos to be less than completely compelling (at this point, feel free to conclude that I'm just too old to get it, if you wish!).

But one AR-related thing I heard at Uplinq did notably catch my attention and got my imagination churning. Jay Wright, the presenter and Qualcomm's R&D Division Senior Director of Business Development (whom you can see above demonstrating AR and otherwise being interviewed by Mobile Industry Review), mentioned that military applications were among the first users of AR technology. Basically, simply by pointing a camera at a piece of equipment, an operator or technician could pull up relevant on-screen usage, repair and other manuals. At that point, I started thinking about how neat it would be to do this at home. Instead of wading through the jam-packed file cabinet for the wrinkled, faded operational paperwork for my microwave oven, for example, I could simply point my smartphone at the appliance and directly access the documentation that way.
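To make that idea concrete, here's a minimal sketch of how such an app could be structured: a recognition callback maps a detected appliance to its documentation and opens it. To be clear, this is my own illustration, not Qualcomm's SDK; the ArTargetListener interface, the onTargetRecognized() callback, and the hard-coded manual lookup table are hypothetical names standing in for whatever image-target recognition hooks a real AR toolkit provides.

import java.util.HashMap;
import java.util.Map;

// Hypothetical callback interface; a real AR SDK would define its own equivalent.
interface ArTargetListener {
    void onTargetRecognized(String targetName);
}

// Illustrative sketch only: map a recognized appliance image target to its manual.
public class ApplianceManualLookup implements ArTargetListener {

    // Hypothetical table pairing image-target names with documentation URLs.
    private final Map<String, String> manuals = new HashMap<>();

    public ApplianceManualLookup() {
        manuals.put("microwave_frontpanel", "https://example.com/manuals/microwave.pdf");
        manuals.put("dishwasher_controls", "https://example.com/manuals/dishwasher.pdf");
    }

    // Invoked (in this sketch) whenever the camera recognizes a registered image target.
    @Override
    public void onTargetRecognized(String targetName) {
        String url = manuals.get(targetName);
        if (url != null) {
            openDocument(url); // hand off to a browser or PDF viewer
        }
    }

    private void openDocument(String url) {
        // On Android this would typically launch an ACTION_VIEW Intent;
        // stubbed out here to keep the sketch self-contained.
        System.out.println("Opening manual: " + url);
    }

    // Simple demonstration: simulate the camera recognizing the microwave's front panel.
    public static void main(String[] args) {
        new ApplianceManualLookup().onTargetRecognized("microwave_frontpanel");
    }
}

In a real deployment, the recognition step would of course come from the AR toolkit's tracker rather than a simulated call, and the target-to-manual mapping would presumably live on a server instead of being hard-coded.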

And then I consulted augmented reality's Wikipedia entry and quickly realized that I was more familiar with implementations of the technology than I'd thought. Consider, for example, the computer-generated static first-down line that's superimposed on multiple camera-captured images of a football field for television broadcast purposes. Or the CG moving line used during Olympic swimming events to note the pace necessary to break the particular event's existing record time. Or the CG blue dot that helps TV viewers follow the progress of a hockey puck. Or the heads-up displays projected onto an increasing number of automobiles' windshields. So I guess you could say that I'm now a lot more knowledgeable about, and supportive of, AR than I was just a few hours ago.

While my awareness of AR has now increased, that of most folks remains at or near zero. This, I suspect, is why Qualcomm recently (and quietly) augmented (pun intended) its Augmented Reality SDK with support for Apple's iOS, used in the iPad, iPhone, iPod touch, and second-generation Apple TV. Historically, the SDK supported only the Android O/S. And one might rightly wonder why Qualcomm bothered expanding the support umbrella to include iOS, considering that, to the best of my knowledge:

1. Apple's not interested in licensing iOS to other OEMs, and

2. Apple's not interested in using non-Apple (i.e., Qualcomm-sourced) ARM SoCs.

I've asked my Qualcomm contacts for clarification, and if I receive it I'll update my writeup. But for now, here's my supposition. iOS is the second most popular smartphone O/S, after Android. By making it easy for developers to add AR support to iOS apps, Qualcomm does its part to cultivate the overall AR ecosystem, thereby boosting consumers' awareness of the technology…which in the long term benefits all participants in the ecosystem, including both Apple and Qualcomm.

Agree or disagree? Regardless, please share your thoughts in the comments.

Followup: Not five seconds after pressing the 'publish' button (seriously!), the following statement showed up in my inbox from a Qualcomm contact. Like I was saying…

"While we always seek to make sure applications run best on Qualcomm-based devices, the company's augmented reality initiative is intended to drive this new category of vision-based AR across the entire ecosystem. To this end, Qualcomm has made its AR platform available to developers free of charge and simplified the process for developers to create apps for multiple platforms."
