AR + #s = $$$: New Processes and Strategies for Extracting Actionable Business Data from Augmented Reality Systems

“Ars Electronica – Light Tank”, image by Uwe Rieger (Denz), Yinan Liu (NZ) (Arcsec Lab) @ St. Mary’s Cathedral

Perhaps taking two disparate assertions, one tacit and one spoken, completely out of their original contexts and re-mixing and re-applying them to a different set of circumstances can be a helpful means to introduce an emerging and potentially prosperous new trend.

First, someone I know has a relatively smart and empathetic dog who will tilt his head from side to side if you ask him (the dog) something that sounds like a question. His owner claims that this is his dog’s way of non-verbally communicating – – to paraphrase (or parabark, maybe) – – something to the effect of “You know, you’re right. I never really thought of it that way”. Second, in an article in the January 4, 2019 edition of The New York Times entitled The Week in Tech: Amazon’s Burning Problems, by David Streitfeld, there is an amusing quote from a writer for WIRED named Craig Mod describing his 2018 Amazon Kindle Oasis as being “about as interactive as a potato”.

So, let’s take some literary license (and, of course, the dog’s license, too), and conflate these two communications in order to paws here to examine the burgeoning commercial a-peel of the rich business data now being generated by augmented reality (AR) systems.

To begin, let’s look no further than the 2019 Consumer Electronics Show (CES) held last month in Las Vegas. New offerings of AR products and services were all the rage among a number of other cutting-edge technologies and products being displayed, demoed and discussed.¹ As demonstrably shown at this massive industry confab, these quickly evolving AR systems, which assemble and present a data-infused overlay upon a user’s real-world line of sight, are finding a compelling array of versatile applications in a widening spectrum of industries. And, like everything else in today’s hyper-connected world, AR generates waves of data that can be captured, analyzed and leveraged for the benefit and potential profit of many commercial enterprises.

A close and compelling examination of this phenomenon was recently posted in an article entitled Unlocking the Value of Augmented Reality Data, by Joe Biron and Jonathan Lang, on the MIT Sloan Management Review site on December 20, 2018. I highly recommend a click-through and full read if you have an opportunity. I will try to summarize and annotate this piece and, well, augment it with some of my own questions.

[The Subway Fold category of Virtual and Augmented Reality has been tracking a sampling of developments in this area in commerce, academia and the arts for the past several years.]


Uncensored Sensors

Prior to the emergence of the Internet of Things (IoT), it was humans who mostly performed the functions of certain specialized sensors in tasks such as detecting environment changes and then transmitting their findings. Currently, as AR systems are increasingly deployed, people are equipped with phones, headsets and other devices embedded with these sensing capabilities. This “provides uncharted opportunities for organizations” to make use of the resulting AR data-enabled analyses to increase their “operational effectiveness” and distinguish the offerings of their goods and services to the consumer public.

AR’s market in 2019 is analogous to where the IoT market was in 2010, garnering significant buzz and demonstrating “early value for new capabilities”. This technology’s capacity to “visualize, instruct, and interact” can become transformative in data usage and analytics. (See Why Every Organization Needs an Augmented Reality Strategy, by Michael E. Porter and James E. Heppelmann, Harvard Business Review, November – December 2017.)

To thereby take advantage of AR, businesses should currently be concentrating on the following questions:

  • How best to plan to optimize and apply AR-generated data?
  • How to create improved “products and processes” based upon AR users’ feedback?


AR Systems Generate Expanding Spheres of User Data

Looking again to the past for guidance, the introductions of the iPhone in 2007 and Android phones in 2008 were tech industry turning points that produced “significant data about how customers engaged with their brand”. This period also gave engineers a deeper understanding of user requirements. In turn, this inverted the value proposition such that “applications could sense and measure” consumer experiences as they occurred.

Empowered with comparable “sensing capabilities emerging through the IoT”, manufacturers promptly added connectivity, thus driving the emergence of smart, connected products (SCPs). These new devices now comprise much of the IoT. The resulting massive data collection infrastructure and the corresponding data economy have been “disrupting technology laggards ever since”.

Moreover, using “AR-as-a-sensor” to gather large quantities of data holds significant potential advantages. Many AR-enabled devices are already embedded with sensing capabilities including “cameras, GPS, Bluetooth, infrared and accelerometers”. More organically, they also unleash human “creativity, intuition and experience” that cannot otherwise be replicated by the current states of hardware and software.²

What can humans with AR-based devices provide to enhance their experiences? New types of data and “behavioral insights” can be harvested from both SCPs and unconnected products. For example, in the case of an unconnected product, a user with a device equipped to operate as a form of AR-as-a-sensor could examine how the product is used and the accompanying user preferences for it. For an SCP, the AR-equipped user could examine how usage affects performance and whether the product is adaptable to that particular user’s purposes.

As critical added context, it is indeed “human interaction” that provides insights into how SCPs and unconnected devices realistically operate, perform and adapt.
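To make the idea of AR-as-a-sensor a bit more concrete, here is a minimal Python sketch (with entirely hypothetical field names of my own invention) of how interaction events observed through an AR device might be recorded and tallied to surface the kinds of behavioral insights described above:

```python
from collections import Counter
from dataclasses import dataclass

# Hypothetical record of a single AR-observed interaction with a product.
# These field names are illustrative, not from any real AR platform.
@dataclass
class ARUsageEvent:
    product_id: str
    feature: str          # which product feature the user engaged with
    duration_sec: float   # how long the interaction lasted
    connected: bool       # True for an SCP, False for an unconnected product

def feature_usage(events):
    """Tally which features users engage with most, per product."""
    counts = Counter()
    for e in events:
        counts[(e.product_id, e.feature)] += 1
    return counts

events = [
    ARUsageEvent("bike-01", "suspension_setup", 42.0, False),
    ARUsageEvent("bike-01", "suspension_setup", 30.5, False),
    ARUsageEvent("bike-01", "brake_adjustment", 12.0, False),
]
print(feature_usage(events).most_common(1))
# The most-engaged feature for "bike-01" is "suspension_setup", with 2 events
```

Even an unconnected product, like the bicycle in the example, produces this stream as soon as a human with an AR device interacts with it.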

“Rainbow Drops”, Image by Mrs. eNil

Evaluating Potential Business Benefits from AR-Derived Data

This new quantum of AR information further creates a form of feedback loop whereby questions concerning a product’s usage and customization can be assessed. This customer data has become central to “business strategy in the new digital economy”.

In order to more comprehensively understand and apply these AR data resources, a pyramid-shaped model called “DIKW” can be helpful. Its elements are:

  • Data
  • Information
  • Knowledge
  • Wisdom

These are deployed in information management operations to process unrefined AR data into “value-rich knowledge and insights”. By then porting the resulting insights into engineering systems, businesses can enhance their “product portfolio, design and features” in previously unseen ways.
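As a rough illustration of how the DIKW refinement might look in practice, the following toy Python sketch takes raw sensor readings (data), cleans them into labeled records (information), summarizes them into a pattern (knowledge), and turns that pattern into an actionable recommendation (wisdom). The stages, fields and thresholds here are my own illustrative assumptions, not part of any actual AR pipeline:

```python
# Raw AR sensor readings (data); None represents a dropped reading.
raw = [("temp", 21.5), ("temp", None), ("temp", 22.1), ("temp", 80.0)]

def to_information(data):
    # Drop unusable readings and attach units: data -> information.
    return [{"metric": m, "value": v, "unit": "C"} for m, v in data if v is not None]

def to_knowledge(info):
    # Summarize the records into a pattern: information -> knowledge.
    values = [r["value"] for r in info]
    return {"mean": sum(values) / len(values), "max": max(values)}

def to_wisdom(knowledge, safe_max=50.0):
    # Turn the pattern into an actionable recommendation: knowledge -> wisdom.
    return "investigate overheating" if knowledge["max"] > safe_max else "no action"

info = to_information(raw)
print(to_wisdom(to_knowledge(info)))  # -> investigate overheating
```

The point of the pyramid shape is visible even in this toy: each stage yields less volume but more value than the one below it.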

AR data troves can also be merged with IoT-generated data from SCPs to support added context and insights. For unconnected devices or digital-only offerings, humans using AR to interact with them can themselves become sensors similarly providing new perspectives on a product’s:

  • Service usage
  • Quality
  • Optimization of the “user experience and value”

“DSC_445”, Image by Frank Cundiff

Preliminary Use Cases

The following are emerging categories and early examples of how companies are capturing and leveraging AR-generated data:

  • Expert Knowledge Transfer: Honeywell is gathering data from experienced employees and then enhancing their collective knowledge to thereafter be transferred to new hires. The company has implemented this by “digitizing knowledge” about their products only made visible through experience. This enables them to better understand their products in entirely new ways. Further details of this initiative are presented on the firm’s website in a feature, photos and a video entitled How Augmented Reality is Revolutionizing Job Training.
  • Voice of the Product: Bicycle manufacturer Cannondale is now shipping their high-end products with an AR phone app to assist owners and bike shop mechanics with details and repairs. This is intended to add a new dimension to bike ownership by joining its physical and digital components. The company can also use this app to collect anonymized data to derive their products’ “voice”. This will consequently provide them with highly informative data on which “features and procedures” are being used the most by cyclists, which can then be analyzed to improve their biking experiences. For additional information about their products and the accompanying AR app, see Cannondale Habit Ready to Shred with All-New Proportional Response Design, posted on October 9, 2018. There is also a brief preview of the app on YouTube.
  • Personalized Services: AR is being promoted as “transformative” to online and offline commerce since it enables potential buyers to virtually try something out before they buy it. For instance, Amazon’s new Echo Look permits customers to do this with clothing purchases. (See Amazon’s Echo Look Fashion Camera is Now Available to Everyone in the US, by Chris Welch, posted on June 6, 2018.) The company also patented something called “Magic Mirror” in January 2018. Combined with the Echo Look, this will point the way towards the next evolution of the clothing store dressing room. (See Amazon’s Blended-Reality Mirror Shows You Wearing Virtual Clothes in Virtual Locales, by Alan Boyle, posted on January 2, 2018.) The data collected by Echo Look is “being analyzed to create user preference profiles” and, in turn, to suggest purchases based upon them. It is reasonably conceivable that combining these two technologies to supplement such personalized clothing recommendations will produce additional AR-based data, elevating “personalized services and experiences” to a heretofore unattained level.³
  • Quality Control: For quite a while, DHL has been a corporate leader in integrating AR technology into its workers’ daily operations. In one instance, the company is using computer vision to perform bar code scanning. They are further using this system to gather and analyze quality assurance data. This enables them to assess how workers’ behavior “may affect order quality and process efficiency”. (See the in-depth report on the company’s website entitled Augmented Reality in Logistics, by Holger Glockner, Kai Jannek, Johannes Mahn and Björn Theis, posted in 2014.)
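As a side note on the anonymization mentioned in the Cannondale example above, here is a minimal Python sketch of one common approach: stripping the direct identifier from a usage record and replacing it with a salted one-way hash, so repeat usage can still be counted without revealing who the user is. The record fields and the salt handling are hypothetical, not the company’s actual method:

```python
import hashlib

# Illustrative salt; a real pipeline would manage and rotate this secretly.
SALT = "rotate-me-per-release"

def anonymize(record):
    """Return a copy of the record with the direct identifier replaced
    by a salted one-way hash token."""
    scrubbed = dict(record)
    owner = scrubbed.pop("owner_email")  # drop the direct identifier
    scrubbed["user_token"] = hashlib.sha256(
        (SALT + owner).encode()
    ).hexdigest()[:12]
    return scrubbed

rec = {"owner_email": "rider@example.com", "feature": "suspension_setup", "uses": 7}
out = anonymize(rec)
print("owner_email" in out, len(out["user_token"]))  # False 12
```

The same email always yields the same token (for a given salt), so the manufacturer can distinguish one heavy user from many light ones without ever uploading the email itself.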


Integrating Strategic Applications of AR-Derived Data

There is clearly a range of meaningful impacts upon business strategies to be conferred by AR-derived data. Besides the four positive examples above, other companies are likewise running comparable projects. However, some of these efforts may remain constrained from wider exposure because of “technological or organizational” impediments.

With the emergence of AR-generated data resources, those firms that meaningfully integrate them with other established business data systems, such as customer relationship management (CRM) and “digital engagement”, will gain tangible new insights and commercial opportunities. Thus, in order to fully leverage these potential new possibilities, nimble business strategists should establish dedicated multi-departmental teams to pursue these future benefits.

My Questions

  • Because the datastreams from AR are visually based, could this be yet another fertile area to apply machine learning and other aspects of artificial intelligence?
  • What other existing data collection and analysis fields might also potentially benefit from the addition of AR-derived data streams? What about data-driven professional and amateur sports, certain specialties of medical practice such as surgery and radiology, and governmental agencies such as those responsible for the environment and real estate usage?
  • What entrepreneurial opportunities might exist for creating new AR analytical tools, platforms and hardware, as well as integration services with other streams of data to produce original new products and services?
  • What completely new types of career opportunities and job descriptions might be generated by the growth of the AR-as-a-sensor sector of the economy? Should universities consider adding AR data analytics to their curriculum?
  • What data privacy and security issues may emerge here and how might they be different from existing concerns and regulations? How would AR-generated data be treated under the GDPR? Should people be informed in advance, and their consent sought, if AR data is being gathered about them?
  • How might AR-generated data affect any or all of the arts and other forms of creative expression?
  • Might some new technical terms of ARt be needed such as “ARformation”, “sensAR” and “stARtegic”?


1.  Much of the news and tech media provided extensive coverage of this event. Choosing just one report among many, the January 10, 2019 edition of The New York Times published a roundup and analysis of the news and announcements in an engaging article with photos entitled CES 2019: It’s the Year of Virtual Assistants and 5G, by Brian X. Chen.

2.   For an alternative perspective on this question see the November 20, 2018 Subway Fold post entitled The Music of the Algorithms: Tune-ing Up Creativity with Artificial Intelligence.

3.  During Super Bowl 53, played (or, more accurately, snoozed through) on February 3, 2019, there was an ad for a new product called The Mirror. This is a networked full-size wall mirror where users can do their daily workouts directly in front of it and receive real-time feedback, performance readings, and communications with other users. From this ad and the company’s website, this device appears to be operating upon a concept similar to Amazon’s, whereby users receive individualized and immediate feedback.

Smart Dust: Specialized Computers Fabricated to Be Smaller Than a Single Grain of Rice


“Sabotage #4: Mixing Noodles with Rice”, Image by Stefan

Back in 1977, Steve Martin put out a live album of his stand-up comedy performances called Let’s Get Small. Included was one of his signature routines called Well, Excuse Me. It went on to sell more than a million copies. Much of it was laugh-out-loud hilarious.

Now, 38 years later, as the Internet of Things (IoT) is becoming a burgeoning global phenomenon, some very imaginative people have taken this notion (in name only) and applied it in a way that could never have been foreseen back then.

Certainly no excuses needed here. Rather, let’s have a look at this exciting new development.

Researchers at the University of Michigan, led by Professor David Blaauw, have recently fabricated a functional and autonomous computer measuring only 1 millimeter on each side. This device, dubbed the Michigan Micro Mote (M^3), was the subject of a most interesting article by Rex Sakamoto on CNET entitled This Working Computer is Smaller Than a Grain of Rice, posted on April 6, 2015. I will summarize it, add some links and annotations, and pose a few questions. (The CNET article also contains an embedded video of a very informative recent report about this project on CBS News.)

This team’s work has been ongoing for more than ten years. With regard to the IoT, they believe that all of the devices connected to it will require more “intelligence” and networking capabilities integrated into them, and the M^3 could be the means to accomplish this.

The M^3 currently functions as a camera and as a temperature and pressure sensor. The researchers are now exploring a range of potential applications “ranging from medical to industrial” including:

  • Medical: How it can be “injected into the body” to take such temperature and pressure readings, as well as an electrocardiogram (EKG).
  • Energy: Assessing whether an existing oil well still contains any extractable reserves.
  • Consumer Goods: Attaching M^3s to everyday items such as keys and wallets to ensure they are never lost inside or outside of the home.
  • Other potential apps on the project’s website include a platform containing “low-resolution imager, signal processing and memory, temperature sensor, on-board CMOS timer, wireless communication, battery, and solar energy harvesting that are all packaged in a 1mm3 volume through low-cost die stacking and encapsulation.”

To program and power up the M^3, the researchers devised a method using “strobing light at high frequency”. In turn, the M^3’s output is transmitted to an external computer by “conventional radio frequencies”.
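For a purely conceptual illustration of optical programming, the toy Python function below treats a sequence of light on/off samples as bits and packs them into bytes. The M^3’s actual optical protocol is certainly far more sophisticated; this only conveys the general idea of a light beam carrying a program:

```python
def decode_light_pulses(samples):
    """Toy decoder: treat booleans (True = light detected) as bits,
    most significant bit first, and pack complete groups of 8 into bytes."""
    bits = [1 if s else 0 for s in samples]
    out = bytearray()
    for i in range(0, len(bits) - 7, 8):
        byte = 0
        for b in bits[i:i + 8]:
            byte = (byte << 1) | b
        out.append(byte)
    return bytes(out)

# 8 samples spelling the bit pattern 0b01001000, i.e. the byte for "H"
pulses = [False, True, False, False, True, False, False, False]
print(decode_light_pulses(pulses))  # b'H'
```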

The team’s current efforts involve reducing the size of the M^3 even further to a point where it may become the basis for a form of “smart dust”.

Just a few days ago, in the April 10, 2015 Subway Fold post entitled The Next Wave in High Tech Material Science, about metamaterials that can bend sound, light, radar and seismic waves, I speculated about some other potential applications for this emerging technology. Since the M^3 is similarly original and has the potential to generate myriad applications not even considered yet, my questions include:

  • Are there apps where the M^3 and metamaterials can be combined? What about optical networks, where metamaterials are used in the production of fiber cables and M^3s could serve as sensors?
  • Would the M^3 make a viable sensor for transportation infrastructure (roads, bridges, rails and so on), as well as for the bodies, engines and electronics in cars, planes and trains? How about embedding them in buildings for additional safety technologies?
  • What safety and privacy protocols and policies will need to be developed and by whom? How can they be enforced?

Artificial Intelligence Apps for Business are Approaching a Tipping Point


“Algorithmic Contaminations”, Image by Derek Gavey

There have been many points during the long decades of the development of business applications using artificial intelligence (AI) when it appeared that The Rubicon was about to be crossed. That is, this technology often seemed to be right on the verge of going mainstream in global commerce. Yet it has yet to achieve a pervasive critical mass despite the vast resources and best intentions behind it.

Today, with the advent of big data and analytics and their many manifestations¹ spreading across a wide spectrum of industries, AI is now closer than ever to reaching such a tipping point. Consultant, researcher and writer Brad Power makes a timely and very persuasive case for this in a highly insightful and informative article entitled Artificial Intelligence Is Almost Ready for Business, posted on the Harvard Business Review site on March 19, 2015. I will summarize some of the key points, add some links and annotations, and pose a few questions.

Mr. Power sees AI being brought to this threshold by the convergence of rapidly increasing tech sophistication, “smarter analytics engines, and the surge in data”. Further adding to this mix are the incursion and growth of the Internet of Things (IoT), better means to analyze “unstructured” data, and the extensive categorization and tagging of data. Furthermore, there is the dynamic development and application of smarter algorithms to discern complex patterns in data and to generate increasingly accurate predictive models.

So, too, does machine learning² play a highly significant role in AI applications. It can be used to generate “thousands of models a week”. For example, a model premised upon machine learning can be used to select which ads should be placed on what websites within milliseconds in order to achieve the greatest effectiveness in reaching an intended audience. DataXu is one of the model-generating firms in this space.
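To give a feel for how such a model might choose among ad placements within milliseconds, here is a minimal Python sketch that scores candidate sites with a tiny logistic model and picks the highest predicted response. The weights, feature names and site names are entirely made up, and the real systems at firms like DataXu are of course far more elaborate:

```python
import math

# Illustrative pre-trained weights for a logistic click model.
WEIGHTS = {"site_tech": 1.2, "site_news": 0.3, "mobile": 0.8, "bias": -1.0}

def predict_click_prob(features):
    """Logistic response: sum the weights of the active features,
    then squash into a probability between 0 and 1."""
    z = WEIGHTS["bias"] + sum(WEIGHTS[f] for f in features if f in WEIGHTS)
    return 1.0 / (1.0 + math.exp(-z))

# Hypothetical candidate sites for one ad impression, with their features.
candidates = {
    "techblog.example": ["site_tech", "mobile"],
    "dailynews.example": ["site_news", "mobile"],
}

# Score every candidate and place the ad where the model predicts
# the best response -- a lookup and a handful of multiplications,
# easily done within a millisecond budget.
choice = max(candidates, key=lambda site: predict_click_prob(candidates[site]))
print(choice)  # techblog.example
```

Generating “thousands of models a week” then amounts to retraining such weight tables continuously as new response data streams in.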

Tom Davenport, a professor at Babson College and an analytics expert³, was one of the experts interviewed by Power for this article. To paraphrase part of his quote, he believes that AI and machine learning would be useful adjuncts to human analysts (often referred to as “quants”4). Such human experts can far better understand what goes into and comes out of a model than a machine learning app alone. In turn, these people can persuade business managers to apply such “analytical insights” to actual business processes.

AI can also now produce greater competitive efficiencies by closing the time gap between analyzing vast troves of data at high speeds and decision-making on how to apply the results.

IBM, one of the leading integrators of AI, has recently invested $1B in the creation of their Watson Group, dedicated to exploring and leveraging commercial applications for Watson technology. (X-ref to the December 1, 2014 Subway Fold post entitled Possible Futures for Artificial Intelligence in Law Practice for a previous mention and links concerning Watson.) This AI technology is currently finding significant applications in:

  • Health Care: Due to Watson’s ability to process large, complex and dynamic quantities of text-based data, it can “generate and evaluate hypotheses”. With specialized training, these systems can then make recommendations about treating particular patients. A number of elite medical teaching institutions in the US are currently engaging with IBM to deploy Watson to “better understand patients’ diseases” and recommend treatments.
  • Finance: IBM is presently working with 45 companies on apps including “digital virtual agents” to work with their clients in a more “personalized way”; a “wealth advisor” for financial planning5; and “risk and compliance management”. For example, USAA provides financial services to active members of the military and to veterans. Watson is being used to provide a range of financial support functions to soldiers as they move to civilian status.
  • Startups: The company has designated $100 million for introducing Watson into startups. An example is WayBlazer which, according to its home page, is “an intelligence search discovery system” to assist travelers throughout all aspects of their trips. This online service is designed to be an easy-to-use series of tools providing personalized answers and support for all sorts of journeys. At the very bottom of their home page on the left-hand side are the words “Powered by IBM Watson”.

To get a sense of the trends and future of AI in business, Power spoke with the following venture capitalists who are knowledgeable about commercial AI systems:

  • Mark Gorenberg, Managing Director at Zetta Venture Partners which invests in big data and analytics startups, believes that AI is an “embedded technology”. It is akin to adding “a brain”  – – in the form of cognitive computing – – to an application through the use of machine learning.
  • Promod Haque, senior managing partner at Norwest Venture Partners, believes that when systems can draw correlations and construct models on their own, labor is reduced and greater speed is achieved. As a result, a system such as Watson can be used to automate analytics.
  • Manoj Saxena, a venture capitalist (formerly with IBM), sees analytics migrating to the “cognitive cloud”, a virtual place where vast amounts of data from various sources will be processed in such a manner as to “deliver real-time analytics and learning”. In effect, this will promote smoother integration of data with analytics, something that still remains challenging. He is an investor in a startup called Cognitive Scale working in this space.

My own questions (not derived through machine learning), are as follows:

  • Just as Watson has begun to take root in the medical profession as described above, will it likewise begin to propagate across the legal profession? For a fascinating analysis as a starting point, I highly recommend 10 Predictions About How IBM’s Watson Will Impact the Legal Profession, by Paul Lippe and Daniel Katz, posted on the ABA Journal website on October 4, 2014. I wonder whether the installation of Watson in law offices might take on other manifestations that cannot even be foreseen until the systems are fully integrated and running. Might the Law of Unintended Consequences also come into play and produce some negative results?
  • What other professions, industries and services might also be receptive to the introduction of AI apps that have not even considered it yet?
  • Does the implementation of AI always produce reductions in jobs or is this just a misconception? Are there instances where it could increase the number of jobs in a business? What might be some of the new types of jobs that could result? How about AI Facilitator, AI Change Manager, AI Instructor, AI Project Manager, AI Fun Specialist, Chief AI Officer,  or perhaps AI Intrapreneur?


1.  There are 27 Subway Fold posts in the category of Big Data and Analytics.

2.  See the Subway Fold posts on December 12, 2014 entitled Three New Perspectives on Whether Artificial Intelligence Threatens or Benefits the World and then another on December 10, 2014 entitled  Is Big Data Calling and Calculating the Tune in Today’s Global Music Market? for specific examples of machine learning.

3.  I had the great privilege of reading one of Mr. Davenport’s very insightful and enlightening books entitled Competing on Analytics: The New Science of Winning (Harvard Business Review Press, 2007), when it was first published. I learned a great deal from it and this book was responsible for my initial interest in the applications of analytics in commerce. Although big data and analytics have grown exponentially since its publication, I still highly recommend this book for its clarity, usefulness and enthusiasm for this field.

4.  For a terrific and highly engaging study of the work and influence of these analysts, I also recommend reading The Quants: How a New Breed of Math Whizzes Conquered Wall Street and Nearly Destroyed It (Crown Business, 2011), by Scott Patterson.

5.  There was a most interesting side-by-side comparison of human versus automated financial advisors entitled Robo-Advisors Vs. Financial Advisors: Which Is Better For Your Money? by Libby Kane, posted on July 21, 2014.