Printable, Temporary Tattoo-like Medical Sensors are Under Development

There is a new high-energy action and suspense drama on NBC this year called Blindspot. The first episode begins when a woman is left in a luggage bag in the middle of Times Square in New York, covered entirely in tattoos and with absolutely no memory of who she is or how she got there. She is taken in by the FBI, who analyze her tattoos to see if they can figure out who she was before her memory was intentionally destroyed. It turns out that the tattoos are puzzles that, once solved, lead a team of agents assigned to her to a series of dangerous criminal operations.

“Jane,” as they call her, is quickly made a part of this FBI team because, without knowing why, she immediately exhibits professional-level fighting and weapons skills. She is also highly motivated to find out her real identity and has started to experience brief memory flashbacks. All sorts of subplots and machinations have begun to sprout up regarding her true identity and how she ended up in this dilemma.

So far, the show is doing well in the ratings. In my humble opinion, after four episodes it’s off to a compelling and creative start, and I plan to keep watching. (The only minor thing I don’t like is the production team’s heavy use of the shaky cam, which makes me feel a bit seasick at times.)

The lead actress, Jaimie Alexander, who plays Jane, is actually wearing only temporary tattoos on the show. While these cryptic designs are the main device propelling the fictional plot forward in each episode, back in the non-fictional real world, temporary tattoo-like devices are currently being tested by researchers as medical sensors to gather patients’ biological data. This news adds a whole new meaning to the notion of a medical application.

This advancement was reported in a most interesting article on Smithsonian.com, posted on October 8, 2015 entitled Tiny, Tattoo-Like Wearables Could Monitor Your Health, by Heather Hansman. I will summarize and annotate it in an effort to provide a, well, ink-ling about this story, and then pose some of my own questions.

Research and Development

This project, in a field called bio-integrated electronics, is being conducted at the University of Texas at Austin’s Cockrell School of Engineering. The research team is being led by Professor Nanshu Lu (who received her Ph.D. from Harvard).  Her team’s experimental patch is currently being applied to test heart rates and blood oxygen levels.

When Dr. Lu and her team were investigating the possibility of creating these “tattoo-like wearables”, their main concern was the manufacturing process rather than the sensors themselves, because many suitable sensors were already available. Instead, they focused on making these devices both disposable and inexpensive. Prior attempts elsewhere had proven to be more “expensive and time-consuming”.

This led them to pursue the use of 3D printing. (These four Subway Fold posts cover other applications of this technology.) They devised a means to print out “patterns on a sheet of metal instead of forming the electronics in a mold”. They easily found a suitable metal material for this purpose in a hardware store. Essentially, the patterns were cut into it rather than removed from it. Next, this electronic component was “transfer printed onto medical tape or tattoo adhesive”. Altogether, it is about the size of a credit card. (There is a picture of one at the top of the article on Smithsonian.com linked above.)

The entire printing process takes about 20 minutes and can be done without the use of a dedicated lab. Dr. Lu is working to get the cost of each patch down to around $1.

Current Objectives

The team’s further objective is to “integrate multiple sensors and antenna” into the patches in order to capture vital signs, such as heart rate and blood oxygen levels, and wirelessly transmit them to doctors’ and patients’ computing devices.

One of the remaining obstacles to mass-producing the patches is making them wireless using Bluetooth or near-field communication (NFC) technology. At this point, chip producers have not made any commitments to manufacture such chips small enough. Nonetheless, Dr. Lu and her team are working on creating their own chip, which they expect will be about the size of a coin.
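While the team’s own wireless design is still in progress, the general idea of shipping vital-sign readings over a low-bandwidth link like Bluetooth or NFC can be sketched in a few lines. The payload format below is entirely hypothetical, my own invention for illustration, and not the researchers’ actual protocol:

```python
import struct
import time

def encode_vitals(heart_rate_bpm, spo2_percent):
    """Pack a timestamped vital-signs reading into a compact 6-byte
    payload (hypothetical layout: 4-byte Unix timestamp, 1-byte heart
    rate, 1-byte SpO2) -- small enough to fit in a single Bluetooth
    characteristic or NFC record."""
    return struct.pack("<IBB", int(time.time()), heart_rate_bpm, spo2_percent)

def decode_vitals(payload):
    """Unpack the payload back into a readable dictionary on the
    receiving device (e.g., a doctor's or patient's phone)."""
    ts, hr, spo2 = struct.unpack("<IBB", payload)
    return {"timestamp": ts, "heart_rate_bpm": hr, "spo2_percent": spo2}
```

Keeping each reading this small matters because NFC tags and Bluetooth Low Energy characteristics are designed around payloads of only a few dozen bytes.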

My Questions

  • Could this sensor be adapted to measure blood glucose levels? (See a similar line of research and development covered in the June 27, 2015 Subway Fold post entitled Medical Researchers are Developing a “Smart Insulin Patch”.)
  • Could this sensor be adapted to improve upon the traditional patch test for allergies?
  • Could this sensor be adapted for usage in non-vital sign data for biofeedback therapies?
  • Would adding some artwork to these patches make them aesthetically more pleasing and thus perhaps more acceptable to patients?
  • Could this sensor be further developed to capture multiple types of medical data?
  • Are these sensors being secured in such a manner to protect the patients’ privacy and from any possible tampering?
  • Could the production team of Blindspot please take it easy already with the shaky cam?

Visionary Developments: Bionic Eyes and Mechanized Rides Derived from Dragonflies

"Transparency and Colors", Image by coniferconifer

All manner of software and hardware development projects strive diligently to take out every single bug that can be identified¹. However, a team of researchers currently working on a fascinating and potentially valuable project is doing everything possible to, at least figuratively, leave their bugs in.

This involves a team of Australian researchers who are working on modeling the vision of dragonflies. If they are successful, there could be some very helpful implications for applying their work to the advancement of bionic eyes and driverless cars.

When the design and operation of biological systems in nature are adapted to improve man-made technologies, as they are here, such developments are often referred to as biomimetic².

The very interesting story of this, well, visionary work was reported in an article in the October 6, 2015 edition of The Wall Street Journal entitled Scientists Tap Dragonfly Vision to Build a Better Bionic Eye by Rachel Pannett. I will summarize and annotate it, and pose some bug-free questions of my own. Let’s have a look and see what all of this organic and electronic buzz is really about.

Bionic Eyes

A research team from the University of Adelaide has recently developed this system modeled upon a dragonfly’s vision. It is built upon a foundation that also uses artificial intelligence (AI)³. Their findings appeared in an article entitled Properties of Neuronal Facilitation that Improve Target Tracking in Natural Pursuit Simulations, published in the June 6, 2015 edition of the Journal of The Royal Society Interface (access credentials required). The authors include Zahra M. Bagheri, Steven D. Wiederman, Benjamin S. Cazzolato, Steven Grainger, and David C. O’Carroll. The funding grant for their project was provided by the Australian Research Council.

While the vision of dragonflies “cannot distinguish details and shapes of objects” as well as human vision can, it does possess a “wide field of vision and ability to detect fast movements”. Thus, dragonflies can readily track targets even within an insect swarm.
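To give an, um, bug’s-eye view of what “facilitation” means here: the general idea is that detector responses near the target’s last known position get a temporary gain boost, which helps keep a small moving target locked on even when clutter elsewhere produces stronger raw responses. The toy model below is my own drastic simplification of that idea, not the team’s actual model:

```python
import math

class FacilitationTracker:
    """Toy sketch of dragonfly-style "neuronal facilitation": responses
    near the previously detected target location are multiplied by an
    extra gain, favoring continuity of pursuit over raw signal strength."""

    def __init__(self, boost=2.0, radius=2.0):
        self.boost = boost    # peak extra gain at the last-seen position
        self.radius = radius  # spatial spread of the facilitation "spotlight"
        self.last = None      # last detected (x, y), or None before any frame

    def gain(self, x, y):
        """Multiplicative gain at (x, y): 1.0 far from the last detection,
        rising to 1 + boost right at it (Gaussian falloff)."""
        if self.last is None:
            return 1.0
        dx, dy = x - self.last[0], y - self.last[1]
        return 1.0 + self.boost * math.exp(
            -(dx * dx + dy * dy) / (2.0 * self.radius ** 2))

    def step(self, frame):
        """frame[y][x] holds raw detector responses; return the (x, y) of
        the strongest facilitated response and remember it for next time."""
        best, best_val = None, float("-inf")
        for y, row in enumerate(frame):
            for x, v in enumerate(row):
                fv = v * self.gain(x, y)
                if fv > best_val:
                    best_val, best = fv, (x, y)
        self.last = best
        return best
```

In this sketch, a distractor with a slightly stronger raw response far from the last-seen position loses out to a weaker response along the target’s expected path, which is exactly the behavior that makes tracking one insect inside a swarm possible.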

The researchers, including Dr. Steven Wiederman, the leader of the University of Adelaide team, believe their work could be helpful to development work on bionic eyes. These devices consist of an artificial implant placed in a person’s retina that, in turn, is connected to a video camera. What a visually impaired person “sees” while wearing this system is converted into electrical signals that are communicated to the brain. Adding the software model of the dragonfly’s 360-degree field of vision would enable the people using it to more readily detect, among other things, “when someone unexpectedly veers into their path”.

Another member of the research team and a co-author of the paper, a Ph.D. candidate named Zahra Bagheri, said that dragonflies are able to fly so quickly and so accurately “despite their visual acuity and a tiny brain around the size of a grain of rice”⁴. In other areas of advanced robotics development, this type of “sight and dexterity” needed to avoid humans and objects has proven quite challenging to express in computer code.

One commercial company working on bionic eye systems is Second Sight Medical Products Inc., located in California. They have received US regulatory approval to sell their retinal prosthesis.

Driverless Cars

In the next stage of their work, the research team is currently studying “the motion-detecting neurons in insect optic lobes” in an effort to build a system that can predict and react to moving objects. They believe this might one day be integrated into driverless cars in order to avoid pedestrians and other cars⁵. Dr. Wiederman foresees the possible commercialization of their work within the next five to ten years.
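To illustrate the “predict and react” idea in its simplest possible form, here is a constant-velocity extrapolation sketch of my own devising. Real autonomous-driving systems are vastly more sophisticated, and this is not the team’s algorithm, just the bare-bones concept:

```python
import math

def predict_position(p_prev, p_curr, steps=1):
    """Linearly extrapolate an object's future position from its last
    two observed positions (constant-velocity assumption)."""
    vx = p_curr[0] - p_prev[0]
    vy = p_curr[1] - p_prev[1]
    return (p_curr[0] + steps * vx, p_curr[1] + steps * vy)

def on_collision_course(car, obj_prev, obj_curr, safe_dist=5.0, horizon=10):
    """Flag a pedestrian or vehicle whose extrapolated path comes within
    safe_dist of the car's position (held stationary for simplicity)
    at any point over the next `horizon` time steps."""
    for t in range(1, horizon + 1):
        x, y = predict_position(obj_prev, obj_curr, steps=t)
        if math.hypot(x - car[0], y - car[1]) < safe_dist:
            return True
    return False
```

A pedestrian observed at (20, 0) and then (18, 0) is heading toward a car at the origin and would be flagged, while one moving in the opposite direction would not.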

However, obstacles remain in getting this to market. Any integration into a test robot would require a “processor big enough to simulate a biological brain”. The research team believes that it can be scaled down, since the “insect-based algorithms are much more efficient”.

Ms. Bagheri noted that “detecting and tracking small objects against complex backgrounds” is quite a technical challenge. As an example, she cited a baseball outfielder who has only seconds to spot, track and predict where a hit ball will land, all in the midst of a colorful stadium and enthusiastic fans⁶.

My Questions

  • As suggested in the article, might this vision model be applicable in sports to enhancing live broadcasts of games, helping teams review their game-day videos to improve their overall play, and assisting individual players in analyzing how they react during key plays?
  • Is the vision model applicable in other potential safety systems for mass transportation such as planes, trains, boats and bicycles?
  • Could this vision model be added to enhance the accuracy, resolution and interactivity of virtual reality and augmented reality systems? (These 11 Subway Fold posts appearing in the category of Virtual and Augmented Reality cover a range of interesting developments in this field.)

 


1. See this Wikipedia page for a summary of the extraordinary career of Admiral Grace Hopper. Among her many technological accomplishments, she was a pioneer in developing modern computer programming. She also popularized the term computer “bug”.

2. For an earlier example of this, see the August 18, 2014 Subway Fold post entitled IBM’s New TrueNorth Chip Mimics Brain Functions.

3. The Subway Fold category of Smart Systems contains 10 posts on AI.

4. Speaking of rice-sized technology, see also the April 14, 2015 Subway Fold post entitled Smart Dust: Specialized Computers Fabricated to Be Smaller Than a Single Grain of Rice.

5. While the University of Adelaide research team is not working with Google, the company has nonetheless been a leader in the development of autonomous cars with its Self-Driving Car Project.

6. New York’s beloved @Mets might also prove to be worthwhile subjects to model because of their stellar play in the 2015 playoffs. Let’s vanquish those dastardly LA Dodgers on Thursday night. GO METS!