Part II of “On Your Body” Wearable Computing Devices Series
In Part I: “Why Google Glass is just the tipping point”, we explored how mobile is no longer limited to the handheld smartphone and tablet. The building blocks of a thriving new technology market are emerging – on-your-body (OYB) smart wearables that enable hands-free mobile computing anywhere, anytime, in any context. These Internet-connected devices and sensors are not limited to smart glasses but extend to smart wearable clothing, watches, wristbands, shoes, gloves and even pill transmitters. In this article we examine the implications of wearable computing devices for consumers, businesses, and society:
In the history of mankind, a handful of inventions have harnessed the power of human knowledge and propelled the human race to greater productivity and faster innovation – books, the printing press, the typewriter, mainframe computers, PCs, the Internet, and mobile.
At the heart of these inventions is content creation – and the leveraging of that content to spur further innovation. With the Internet era, the knowledge floodgates opened as information became globally accessible at one’s fingertips in seconds. And with mobility, content generation and social sharing accelerated steeply to new heights. According to Twitter, users generate over 400 million tweets per day.
Around the bend is the wearable computing device trend. Computing is no longer something that needs to be held; it’s integrated into our clothing or worn on our bodies. This will have profound consequences for home, office and life paradigms. With OYB smart sensors, data can be measured down to your heartbeat, blood pressure, blood glucose, and more. According to CSC, by 2020 the rapid growth of global data will eclipse 35 zettabytes, equivalent to 35 billion terabytes. What’s probably not factored into that research is the OYB wearable computing device ecosystem, which could generate billions of biometric datapoints per day across the globe. Big data just got even bigger.
Prominent global brands – Apple, Google, Microsoft, Sony, Samsung and others – are racing to market with their smart wearable devices, recognizing that wearables are the next logical evolution of mobile. Google Glass represents one of what will soon be many wearable devices coming to the consumer market, as discussed in “Part I: Why Google Glass is just the tipping point”. Other wearable innovations are emerging in smart contact lenses, watches, wristbands, shoes, clothing, jewelry, and other segments yet to be defined.
That said, not every wearables segment will receive equal attention and corresponding investment in product portfolio strategies and new product lineups. Though the big tech firms are experimenting across the burgeoning wearable categories, certain segments will receive a disproportionately larger portfolio allocation. The question is: which categories will yield the highest ROI?
A helpful case study, as a point of reference, is the evolution of personal telecommunication devices. The first successful consumer pager was Motorola’s Pageboy I, introduced in 1974. It had no display and could not store messages; however, it was portable and notified the wearer that a message had been sent. By 1994, 20 years later, there were over 61 million pagers in use. Upgrading from a pager to a cell phone was not only natural but incremental. From cell phone to smartphone, it was evolution, not revolution. The ubiquitous mass adoption of smart glasses, by contrast, is questionable, at least in the near future, because the category lacks such a consumer precedent. The adoption curve will likely be longer and flatter.
Electronic watches with quartz movements, on the other hand, reached the mass market in 1969 with the Seiko 35 SQ Astron. Since then, the watch category has expanded to include digital watches with LED displays. Affordability and advanced functions propelled the digital watch segment into ubiquity. Analogous to the evolution from cell phone to smartphone, the smart watch category may be best positioned for mass appeal. In April, Tim Cook hinted at “exciting new product categories” coming in the fall and across all of 2014. The revenue opportunity from the smart watch category could easily eclipse the smart glasses segment in the near term. Market intelligence firm ABI Research projects more than 1.2 million smart watch shipments in 2013, and that does not include Apple’s rumored iWatch.
Looking across the other smart wearable categories, it’s worth noting that the tech giants are unlikely to invest heavily in web-enabled clothing and shoes. Rather, the textile and shoe industries will need to collaborate with researchers to commercialize cheap, durable sensors that send data to smart computing devices – watches, glasses, phones and tablets. For consumers to widely adopt wearable sensors, the sensor technologies would need to be sewn into the fabric and withstand the abuse of wash and wear.
· UC San Diego researchers have developed durable biosensors that can be printed directly onto clothing to allow continuous biomedical monitoring of blood pressure and heart rate. Screen-printed carbon electrode arrays are attached to the elastic bands on clothing to measure hydrogen peroxide and the coenzyme NADH.
· Cornell University’s Department of Textiles and Apparel is developing nanofibers – cotton threads coated with semiconductor polymers and nanoparticles that conduct electricity and monitor heartbeat, brainwaves, and other bodily functions, as well as act as sensors that could detect the presence of hazardous materials, such as E. coli or anthrax.
· Researchers at Virginia Polytechnic Institute and State University in Blacksburg have developed a pair of smart pants with wires and fabric sewn together that detect and transmit movements to a computer. Sensors embedded in the fabric measure the speed, rotation and flexibility of the pants with every movement.
· Lunar Design’s web-enabled BLU Jacket, with semiconductors infused into the fabric, turns the jacket into a wearable user interface that can display the wearer’s moods, ads or, in theory, anything on the Web.
Perhaps an even bigger game changer is still in development – input technology. Unlike SixthSense, which uses computer vision for input, Skinput, developed by Microsoft Research’s Computational User Experiences Group, uses bio-acoustic sensing such as bone conduction to localize finger taps on the skin. When augmented with a pico-projector, the device can provide a direct-manipulation graphical user interface on the body without the need to equip the skin with sensors or tracking markers.
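Skinput’s localization step can be thought of as a classification problem: each tap location produces a distinct bio-acoustic signature, and a trained model maps a new signature to a location. Here is a minimal sketch of that idea using a nearest-centroid classifier over invented feature vectors – an illustration of the concept, not Microsoft’s actual pipeline (the feature values and body locations below are hypothetical):

```python
# Illustrative sketch: bio-acoustic tap localization as classification.
# Feature vectors and locations are invented; a real system would train
# a classifier on features extracted from vibration sensors.

def nearest_centroid(features, centroids):
    """Return the label whose centroid is closest (Euclidean) to features."""
    best_label, best_dist = None, float("inf")
    for label, centroid in centroids.items():
        dist = sum((f - c) ** 2 for f, c in zip(features, centroid)) ** 0.5
        if dist < best_dist:
            best_label, best_dist = label, dist
    return best_label

# Hypothetical per-location acoustic signatures (e.g. band-energy averages)
# collected during a short calibration session.
centroids = {
    "forearm": (0.9, 0.2, 0.1),
    "wrist":   (0.3, 0.8, 0.2),
    "palm":    (0.1, 0.3, 0.9),
}

tap = (0.85, 0.25, 0.15)  # features extracted from a new tap
print(nearest_centroid(tap, centroids))  # -> forearm
```

The design choice is the interesting part: because classification happens on acoustic features rather than camera images, the skin itself becomes the touch surface with no markers required.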
Though still a few years out, researchers at the Ulsan National Institute of Science and Technology, in collaboration with other institutions, are working on smart contact lenses with a heads-up display (HUD) and integrated biosensors made with graphene and silver nanowires. This technology could enable Google Glass-like capabilities in the smallest of form factors.
When computer vision and bio-acoustic input technologies are infused into smart glasses, smart watches, armbands and countless other implementations, things get very interesting: the user is no longer constrained to a small form factor for input and output. Personal computing can reach new heights of freedom. Content creation and consumption, in theory, can be done anywhere, anytime.
As elaborated in the “Officeless Society” section below, we envision a world where productivity does not require an employee to sit in front of a PC at a desk to create and interact with content. The notion of an office where people interact and collaborate with one another in person may not necessarily become obsolete; however, we predict that people will spend far less time crammed in an office environment to accomplish their work. Between the driverless car, the connected home and connected public spaces, productivity can take place in short bursts or for extended periods, independent of location.
“The whole is greater than the sum of its parts.”
No single smart device will serve as the all-in-one super OYB device that stomps all other wearables; rather, it is the integration of key wearables that will create a unified user experience. A smart watch with a front-facing camera is great for face-to-face video calling; smart glasses are not. Smart glasses are ideal for taking hands-free, line-of-sight photos and videos, but not for viewing complex or large content, at least in their current form. It is the sum of the devices that brings a rich, interactive user experience to the wearer, analogous to the rich and complex music of a symphony orchestra that cannot be replicated by any one instrument alone.
The clear winner emerging from the smart wearables trend is the healthcare industry. Biometrics on the wrist, hands, feet, eyes and literally anywhere on the body can work symbiotically, in an integrated fashion, to track the body and notify the wearer. Healthcare will shift from an on-demand service paradigm to a real-time, proactive monitoring and treatment model with automated patient notifications for behavioral change and personalized treatment advice, with an emphasis on disease prevention and management.
By 2023, according to a study in Health Affairs, the cost of treating people with at least one chronic disease could rise to $4.3 trillion. Chronic diseases account for about three-quarters of all healthcare spending. In the healthcare home monitoring industry, M&A and VC activity is surging: Alere acquired MedApps, maker of HealthPAL and other wireless home health devices for monitoring patients with chronic diseases; Sequoia Capital-backed Telcare received FDA clearance for a cellular-enabled glucose monitor; and Intel-GE Care Innovations is pursuing the disease management and home monitoring segments. Over time, sophisticated smart wearables and OYB sensors will replace many of these single-purpose home-monitoring devices.
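At their core, such monitoring devices boil down to readings checked against thresholds, with alerts pushed to the patient or caregiver when a value drifts out of range. A toy sketch of that pattern, with metric names and “normal” ranges invented purely for illustration (not clinical guidance):

```python
# Hypothetical biometric readings with simple threshold-based alerting.
# Metric names and ranges are illustrative only, not clinical guidance.

NORMAL_RANGES = {
    "systolic_bp":   (90, 120),    # mmHg
    "blood_glucose": (70, 140),    # mg/dL
    "bmi":           (18.5, 25.0),
}

def check_reading(metric, value):
    """Return None if the value is in range, else a short alert string."""
    low, high = NORMAL_RANGES[metric]
    if value < low:
        return f"{metric} low: {value}"
    if value > high:
        return f"{metric} high: {value}"
    return None

readings = [("systolic_bp", 135), ("blood_glucose", 95)]
alerts = [a for a in (check_reading(m, v) for m, v in readings) if a]
print(alerts)  # -> ['systolic_bp high: 135']
```

Real wearable platforms would add trend analysis and clinician review on top of this, but the reading-plus-threshold loop is the foundation the home-monitoring products above are built on.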
Health screening and biometric monitoring via OYB devices and peripherals can enable the collection of biometric data such as blood pressure, cholesterol levels (full lipid panel), triglycerides, blood glucose, body mass index, and tobacco use to monitor patients with chronic conditions.
Though the most recognizable benefits of wearable computing devices are in the areas of healthcare and home monitoring, many practical applications are emerging in sports, health and fitness, active safety, risk prevention, entertainment, military, and security:
Your sensor-embedded pajamas, through biomedical monitoring, recognize your waking pattern. That data is sent to your smart watch, which in turn signals to the coffee maker to start brewing a fresh pot of coffee. At 6am, your smart watch alarm vibrates with a tone as a wake-up reminder. The nanofiber sensors in your pajamas detect that you are getting out of bed. That message is picked up by your smart watch, which triggers the window shades to slide open to let the morning sunshine in.
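The chain just described – pajama sensors publishing events, the smart watch reacting and commanding appliances – is essentially an event-driven publish/subscribe pattern. A minimal sketch with an in-memory event bus follows; the device and topic names come from the scenario, while the bus itself is a hypothetical stand-in for a real protocol such as MQTT:

```python
# Minimal in-memory publish/subscribe bus illustrating the sensor-to-
# appliance event chain; a real deployment would use a broker (e.g. MQTT).
from collections import defaultdict

class EventBus:
    def __init__(self):
        self.subscribers = defaultdict(list)

    def subscribe(self, topic, handler):
        self.subscribers[topic].append(handler)

    def publish(self, topic, payload=None):
        for handler in self.subscribers[topic]:
            handler(payload)

bus = EventBus()
log = []

# The smart watch reacts to sleep-sensor events by commanding appliances.
bus.subscribe("sleep/waking", lambda _: bus.publish("coffee/brew"))
bus.subscribe("sleep/out_of_bed", lambda _: bus.publish("shades/open"))
bus.subscribe("coffee/brew", lambda _: log.append("coffee maker: brewing"))
bus.subscribe("shades/open", lambda _: log.append("shades: opening"))

bus.publish("sleep/waking")      # pajama sensors detect the waking pattern
bus.publish("sleep/out_of_bed")  # sensors detect getting out of bed
print(log)  # -> ['coffee maker: brewing', 'shades: opening']
```

The appeal of the pattern is loose coupling: the pajamas never need to know a coffee maker exists, which is what lets new wearables and appliances join the ecosystem without rewiring the rest.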
As you walk from the bedroom to the bathroom and then to the kitchen, the lights turn on and off based on your proximity to the light sensors. You catch up on news headlines via your arm-band-mounted mini-projector, which casts them onto the dining table. You head to the bathroom to start getting ready for the day. Your schedule and tasks for the day are projected onto the bathroom wall.
As you walk into the garage, the garage door automatically opens and the car engine starts. As you buckle into your seat, your autonomous car greets you and asks for the destination. You instruct it to drive to your first face-to-face meeting site. In transit, a portion of your windshield becomes a heads-up display area, allowing you to read through your emails, check your calendar and begin to work on your deliverables. Your hand gestures give you full freedom to compose content and to manipulate text, images and videos on the HUD area, all the while the robotic car drives you safely to your destination.
You hold your daily 15-minute standing videoconference call with your regional Sales team. Their faces are organized on the windshield display. You bring up documents to share. The meeting concludes and you arrive early to your client meeting. You step out of the car and walk away. The doors automatically lock. The autonomous car drives itself to a nearby parking space and turns off the engine.
As you are about to walk into the front lobby, you receive a call from your VP of Sales indicating that your presentation needs a couple of changes. You step outside to the side of the building and modify the presentation on the spot, projecting the document on the wall and using hand gestures to make edits. With five minutes to spare, you check in and wait to be escorted upstairs to the conference room.
For the presentation, your interconnected wearable system wirelessly connects to the conference room projector. Wearing your smart glasses, you can view your speaker notes much like a tele-prompter to help you stay on track with your key points. All the while, your smart glasses record the meeting.
During the Q&A, you are able to easily transition between reference materials and the presentation with your hand gestures. At one point, in addition to your presentation on the room projector, you use your wearable projector to project to another wall a customer case study that reinforces a particular point in your presentation.
You meet with the sponsor to review the contract. From your armband smart device, the contract is projected onto the sponsor’s desk. Using hand gestures, she is able to scroll through the document. Due to the urgency of their need, the sponsor decides to sign the contract on the spot. It’s immediately transmitted to back-office operations at the home office for processing.
As you exit the elevator to the main lobby, you instruct your car to pick you up at the front entrance. The self-driving car makes its way from the parking lot to the appointed rendezvous location. The door opens and you enter the car. Once inside, your calendar and your next appointment are displayed on the windshield.
While you are updating your meeting report on the HUD, your smart watch receives a notification from the glucose monitoring system on your glove indicating that your blood glucose level has dropped below your minimum threshold. Your smart watch projects a map on the windshield HUD to the nearest store so you can pick up a snack to normalize your blood sugar level.
Due to the unplanned stop and heavy traffic on the road, you are running behind schedule. Your GPS recalculates your time of arrival. It’s going to be tight. You begin to perspire and your heartbeat begins to race. The nanosensor nodes near your heart and on your driving gloves sense your heightened stress level based on your increased heart rate and perspiration. That triggers your smart watch’s virtual assistant to turn on soothing music and in a calming voice reminds you that you have sufficient time to make the next meeting. This prompts you to take a deep breath and feel reassured that you will be on time.
Your pill transmitter in your stomach sends your credentials in advance to the client site security system for clearance. Upon arrival, the driverless car drops you off in front of the main lobby and self-parks. With security already cleared, you make your way up quickly to the meeting room.
You arrive on time. Your host indicates that their colleagues in the West Coast office need to be videoconferenced into the meeting. Your wearable system connects to the in-room projector and initiates a videoconferencing session for the remote attendees.
As you exit the building, your car is waiting for you. As you enter, recognizing the time, it suggests nearby restaurants based on your personal history and preferences. You make your selection and the car begins to travel.
By the time you arrive you have already reviewed the menu on the HUD dashboard and placed an order in advance to save time. You finish your meal and walk out the door. Your wearable system has already processed your credit card transaction remotely with your preset gratuity added.
You travel to your other onsite client meetings. In between meetings, you continue to work in the car, at a café and even while walking using your smart glasses and arm-band bio-acoustic smart device.
You have guests coming to your home later tonight. You decide to get a head start by remotely enabling the iRobot Roomba vacuum and the dishwasher while commuting. Your smart watch personal assistant carries out the orders.
You stop by the grocery store to pick up some ingredients for dinner. Based on your shopping list, when you enter the store, your smart glasses augment the store layout with icons displaying the location of the items to purchase. As you approach certain aisles, your smart glasses notify you of in-store coupons based on proximity and line of sight. As you bag the groceries, the store payment system scans the RFID tags on the products and then transmits the payment request to your smart watch.
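The checkout step amounts to summing the prices of the scanned RFID tags, applying any matched proximity coupons, and pushing a payment request to the watch for approval. A toy sketch of that flow (tag IDs, prices and coupon values are all invented for illustration):

```python
# Toy RFID checkout: total the scanned tags, apply matched coupons, and
# build a payment request for the smart watch. All identifiers invented.

PRICES = {"tag-001": 3.50, "tag-002": 2.25, "tag-003": 4.00}  # per tag ID
COUPONS = {"tag-003": 0.50}  # proximity coupon discounts, keyed by tag ID

def payment_request(scanned_tags):
    """Return the payment request the store would send to the watch."""
    subtotal = sum(PRICES[tag] for tag in scanned_tags)
    discount = sum(COUPONS.get(tag, 0) for tag in scanned_tags)
    return {"amount": round(subtotal - discount, 2), "items": len(scanned_tags)}

print(payment_request(["tag-001", "tag-002", "tag-003"]))
# -> {'amount': 9.25, 'items': 3}
```

The watch’s only job is then a one-tap (or automatic, under a preset limit) approval, which is what makes the “bag and walk out” experience possible.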
Since you’re ahead of schedule by an hour, you decide to get some exercise on the bike. On your smart glasses, you can see the traffic congestion ahead and decide to take an alternate route. You can view your progress against previous rides to evaluate your performance by segment.
For some readers, this may seem like something right out of a sci-fi movie, but before dismissing it as wishful thinking, note that some of the use cases described above are already being pursued with mobile apps and home, car and office automation. As the smart wearables market matures, much of this will become the “new” reality. It will not be without its challenges – fragmentation in technologies, hardware, OS, and applications – but it will have profound implications for the human race. From the industrial revolution to the digital revolution and now the connected revolution, we are moving into a new age of automation and robotics.
How should business leaders navigate this uncharted course? What should their enterprise mobile strategy look like once it incorporates wearable computing devices? In Part III of this series, we provide a practical framework for short- and long-term strategies to ensure your organization remains relevant and thrives in a dynamically changing landscape.
Achieve IoT Success
We understand the challenge of transforming an organization to embrace the Internet of Things. Let us help you increase your probability of success.
Contact Amyx+ for a free initial consultation.
About Amyx+ IoT Business Transformation | Strategy | Innovation | Product | Data Analytics
- Voted Top IoT Influencer by Skyhook
- Voted Top IoT Rockstar by HP Enterprise
- Voted Top IoT Influencer by Inc. Magazine
- Voted Top in the Business of IoT by Relayr
- Voted Top Global IoT Expert by Postscapes
- Voted Top IoT Authority by the Internet of Things Institute
- Featured as a Top Internet of Things Company by Postscapes
- Voted Most Influential in Smart Cities and IIoT by Right Relevance
- Winner of the Cloud & DevOps World Award for Most Innovative Vendor
Amyx+ is an award-winning IoT business transformation firm specializing in IoT strategy, innovation & product development. As a thought leader in the Internet of Things, Amyx+ has the creative horsepower and the development prowess to execute even the most complex client engagements. Amyx+ works with international and multinational enterprises to help them 1) understand the impact of IoT disruptions, 2) formulate and sharpen their IoT strategy, 3) quantify the business case, 4) experiment, learn, validate, 5) develop game-changing technologies, and 6) launch innovative IoT products and services worldwide. We employ a flexible methodology and approach to fit each client’s needs and objectives while adapting to changing IoT environments. We have a presence in San Francisco, NYC, and throughout Europe.