
Not to be confused with Google Goggles.

Google Glass Explorer Edition

  • Also known as: Project Glass
  • Type: Optical head-mounted display (OHMD), wearable technology
  • Release date: Developers (US): February 2013;[1] public (US): May 2014[2]
  • Introductory price: Explorer version: US$1,500; standard edition:[3]
  • Operating system: Glass OS[4] (Google Xe software[5])
  • CPU: OMAP 4430 system on a chip, dual-core processor[6]
  • Memory: 2 GB RAM[7]
  • Storage: 16 GB flash memory total[6] (12 GB usable)[8]
  • Display: Prism projector, 640×360 pixels (equivalent of a 25 in/64 cm screen viewed from 8 ft/2.4 m)
  • Sound: Bone conduction transducer[8]
  • Input: Voice commands via microphone,[8] accelerometer,[8] gyroscope,[8] magnetometer,[8] ambient light sensor, proximity sensor
  • Controller input: Touchpad, MyGlass mobile app
  • Camera: 5-megapixel photos, 720p video[8]
  • Connectivity: Wi-Fi 802.11b/g,[8] Bluetooth,[8] micro USB
  • Power: 570 mAh internal lithium-ion battery
  • Weight: 36 g (1.27 oz)
  • Compatibility: Any Bluetooth-capable phone; the MyGlass companion app requires Android 4.0.3 (Ice Cream Sandwich) or higher, or iOS 7.0 or higher[8]
  • Related articles: Oculus Rift

Google Glass is a brand of smart glasses, an optical head-mounted display designed in the shape of a pair of eyeglasses. It was developed by X (previously Google X)[9] with the mission of producing a ubiquitous computer.[1] Google Glass displayed information in a smartphone-like, hands-free format.[10] Wearers communicated with the Internet via natural-language voice commands.[11][12]

Google started selling a prototype of Google Glass to qualified "Glass Explorers" in the US on April 15, 2013, for a limited period for $1,500, before it became available to the public on May 15, 2014.[13][14] It had an integral 5-megapixel still/720p video camera. The headset received a great deal of criticism and legislative action due to privacy and safety concerns.

On January 15, 2015, Google announced that it would stop producing the Google Glass prototype, with development tentatively to resume in 2017.[15] In July 2017, it was announced that the Google Glass Enterprise Edition would be released.[16]


Google Glass was developed by Google X,[17] the facility within Google devoted to technological advancements such as driverless cars.[18]

The Google Glass prototype resembled standard eyeglasses with the lens replaced by a head-up display.[19] In mid-2011, Google engineered a prototype that weighed 8 pounds (3.6 kg);[20] by 2013 they were lighter than the average pair of sunglasses.[1]

In April 2013, the Explorer Edition was made available to Google I/O developers in the United States for $1,500.[21]

The product was publicly announced in April 2012.[22] Sergey Brin wore a prototype of the Glass to an April 5, 2012, Foundation Fighting Blindness event in San Francisco.[23][24] In May 2012, Google demonstrated for the first time how Google Glass could be used to shoot videos.[25]

Google provided four prescription frame choices for $225, or free with the purchase of any new Glass unit. Google entered into a partnership with the Italian eyewear company Luxottica, owner of the Ray-Ban, Oakley, and other brands, to offer additional frame designs.[26] In June 2014, the Nepalese government adopted Google Glass to tackle poaching of wild animals and herbs in Chitwan National Park and other parks listed as World Heritage Sites. In January 2015, Google ended the beta period of Glass (the "Google Glass Explorer" program).[27][28]

Release date

In early 2013, interested potential Glass users were invited to use a Twitter message, with hashtag #IfIHadGlass, to qualify as an early user of the product. The qualifiers, dubbed "Glass Explorers" and numbering 8,000 individuals, were notified in March 2013, and were later invited to pay $1,500 and visit a Google office in Los Angeles, New York or San Francisco, to pick up their unit following "fitting" and training from Google Glass guides. On May 13, 2014, Google announced a move to a "more open beta", via its Google Plus page.[29]

In February 2015, The New York Times reported that Google Glass was being redesigned by former Apple executive Tony Fadell, and that it would not be released until he deemed it to be "perfect".[30]

In July 2017, it was announced that the second iteration, the Google Glass Enterprise Edition, would be released in the US for companies such as Boeing.[16] Google Glass Enterprise Edition has already been used successfully by Dr. Ned Sahin to help children with autism learn social skills.[31]


  • Touchpad: A touchpad is located on the side of Google Glass, allowing users to control the device by swiping through a timeline-like interface displayed on the screen.[32] Sliding backward shows current events, such as weather, and sliding forward shows past events, such as phone calls, photos, circle updates, etc.
  • Camera: Google Glass has the ability to take 5-megapixel photos and record 720p HD video.[33]
  • Display: The Explorer version of Google Glass uses a liquid crystal on silicon (LCoS) display (based on an LCoS chip from Himax) with a field-sequential color system and LED illumination.[34] The display's LED illumination is first P-polarized and then shines through the in-coupling polarizing beam splitter (PBS) to the LCoS panel. The panel reflects the light and alters it to S-polarization at active pixel sites. The in-coupling PBS then reflects the S-polarized areas of light at 45° through the out-coupling beam splitter to a collimating reflector at the other end. Finally, the out-coupling beam splitter (which is a partially reflecting mirror, not a polarizing beam splitter) reflects the collimated light another 45° and into the wearer's eye.[35][36][37]


Main article: Glass OS


Google Glass applications are free applications built by third-party developers. Glass also uses many existing Google applications, such as Google Now, Google Maps, Google+, and Gmail. Many developers and companies have built applications for Glass, including news apps, facial recognition, exercise, photo manipulation, translation, and sharing to social networks, such as Facebook and Twitter.[38][39][40] Third-party applications announced at South by Southwest (SXSW) include Evernote, Skitch, The New York Times, and Path.[41]

On March 23, 2013, Google released the Mirror API, allowing developers to start making apps for Glass.[42][43] In the terms of service, it was stated that developers may not put ads in their apps or charge fees;[44] a Google representative told The Verge that this might change in the future.[45]
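As a rough illustration of the Mirror API's REST style: Glassware backends inserted "timeline cards" by POSTing JSON to the API. The Mirror API has since been retired; the endpoint and payload shape below follow its historical documentation, and the access token is a placeholder, so treat this as a sketch rather than working Glassware.

```python
import json
import urllib.request

# Historical Mirror API endpoint for inserting a timeline item.
MIRROR_TIMELINE_URL = "https://www.googleapis.com/mirror/v1/timeline"

def build_timeline_insert(access_token: str, text: str) -> urllib.request.Request:
    """Build (but do not send) a request that would insert a simple
    text card onto the wearer's Glass timeline."""
    body = json.dumps({"text": text}).encode("utf-8")
    return urllib.request.Request(
        MIRROR_TIMELINE_URL,
        data=body,
        headers={
            "Authorization": "Bearer " + access_token,  # OAuth 2.0 token (placeholder)
            "Content-Type": "application/json",
        },
        method="POST",
    )

req = build_timeline_insert("EXAMPLE_TOKEN", "Hello from Glassware!")
```

Sending `req` with `urllib.request.urlopen` would deliver the card to the user's device; here the request is only constructed, since the service no longer exists.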

On May 16, 2013, Google announced the release of seven new programs, including reminders from Evernote, fashion news from Elle, and news alerts from CNN.[46] Following Google's XE7 Glass Explorer Edition update in early July 2013, evidence of a "Glass Boutique", a store that will allow synchronization to Glass of Glassware and APKs, was noted.[47]

Version XE8 debuted on Google Glass on August 12, 2013. It brought an integrated video player with playback controls, the ability to post an update to Path, and let users save notes to Evernote. Other minor improvements included volume controls, improved voice recognition, and several new Google Now cards.

On November 19, 2013, Google unveiled its Glass Development Kit, showcasing the translation tool Word Lens, the cooking program AllTheCooks, and the exercise program Strava, among others, as successful examples.[48][49] Google announced three new programs in May 2014 – TripIt, Foursquare and OpenTable – in order to entice travelers. On June 25, 2014, Google announced that notifications from Android Wear would be sent to Glass.[50]

The European University Press published the first book to be read with Google Glass on October 8, 2014, as introduced at the Frankfurt Book Fair. The book can be read as a normal paper book or, enriched with multimedia elements, with Google Glass or Kindle, or on smartphones and tablets running iOS or Android.[51]


Google offers a companion Android and iOS app called MyGlass, which allows the user to configure and manage the device.[52]

Voice activation

Other than the touchpad, Google Glass can be controlled using just "voice actions". To activate Glass, wearers tilt their heads 30° upward (which can be altered for preference) or simply tap the touchpad, and say "O.K., Glass." Once Glass is activated, wearers can say an action, such as "Take a picture", "Record a video", "Hangout with [person/Google+ circle]", "Google 'What year was Wikipedia founded?'", "Give me directions to the Eiffel Tower", and "Send a message to John"[53] (many of these commands can be seen in a product video released in February 2013).[54] For search results that are read back to the user, the voice response is relayed using bone conduction through a transducer that sits beside the ear, thereby rendering the sound almost inaudible to other people.[55]
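A hotword-plus-command menu of this kind can be thought of as a small dispatch table. The sketch below is purely illustrative (the phrase table and handler names are invented, not Google's implementation): it checks for the "ok glass" hotword and then matches the remaining utterance against known command prefixes.

```python
# Illustrative handlers; real Glass invoked camera, Hangouts, etc.
def take_picture():
    return "photo captured"

def record_video():
    return "video recording started"

def get_directions(destination):
    return f"routing to {destination}"

# Hypothetical phrase-to-handler table mirroring the fixed menu Glass
# displayed after the hotword.
COMMANDS = {
    "take a picture": lambda arg: take_picture(),
    "record a video": lambda arg: record_video(),
    "give me directions to": get_directions,
}

def dispatch(utterance: str):
    """Match a recognized utterance against the command menu."""
    text = utterance.lower().strip()
    if not text.startswith("ok glass"):
        return None  # hotword not detected; ignore the utterance
    text = text[len("ok glass"):].strip(" ,.")
    for phrase, handler in COMMANDS.items():
        if text.startswith(phrase):
            arg = text[len(phrase):].strip()
            return handler(arg)
    return None  # unrecognized command

print(dispatch("OK Glass, take a picture"))  # photo captured
```

The fixed-menu design is what made recognition reliable on-device: the recognizer only had to distinguish a handful of known phrases rather than transcribe open-ended speech.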


Healthcare applications

Several proofs of concept for Google Glass have been proposed in healthcare.

Augmedix developed an app for the wearable device that allows physicians to live-stream the patient visit, claiming it can eliminate electronic health record data-entry problems, possibly saving physicians up to 15 hours a week[56] and improving record quality. The video stream is passed to remote scribes in HIPAA-secure rooms, where the doctor-patient interaction is transcribed, allowing physicians to focus on their patients. Hundreds of users[57] were evaluating the app as of mid-2015.[58]

Doctors Phil Haslam and Sebastian Mafeld demonstrated the first application of Google Glass in the field of interventional radiology. They demonstrated how Google Glass could assist a liver biopsy and fistulaplasty, and the pair stated that Google Glass has the potential to improve patient safety, operator comfort, and procedure efficiency in the field of interventional radiology.[59]

On June 20, 2013, Rafael J. Grossmann, a Venezuelan doctor practicing in the U.S., was the first surgeon to demonstrate the use of Google Glass during a live surgical procedure. In August 2013, Google Glass was used at Wexner Medical Center at Ohio State University. Surgeon Dr. Christopher Kaeding used Google Glass to consult with a distant colleague in Columbus, Ohio. A group of students at The Ohio State University College of Medicine also observed the operation on their laptop computers. Following the procedure, Kaeding stated, "To be honest, once we got into the surgery, I often forgot the device was there. It just seemed very intuitive and fit seamlessly."[61] On June 21, 2013, Spanish doctor Pedro Guillen, chief of trauma service of Clínica CEMTRO of Madrid, also broadcast a surgery using Google Glass.[62]

In July 2013, Lucien Engelen commenced research on the usability and impact of Google Glass in the health care field. As of August 2013, Engelen, based at Singularity University and, in Europe, at Radboud University Nijmegen Medical Centre, was the first healthcare professional in Europe to participate in the Glass Explorer program.[63] His research on Google Glass (starting August 9, 2013) was conducted in operating rooms, ambulances, a trauma helicopter, general practice, and home care, as well as in public transportation for the visually or physically impaired. The research included taking pictures, streaming video to other locations, dictating operative logs, having students watch procedures, and tele-consultation through Hangouts. Engelen documented his findings in blogs,[64] videos,[65] pictures, on Twitter,[66] and on Google+,[67] with research ongoing as of that date.

In Australia, during January 2014, Melbourne tech startup Small World Social collaborated with the Australian Breastfeeding Association to create the first hands-free breastfeeding Google Glass application for new mothers.[68] The application, named Google Glass Breastfeeding app trial, allows mothers to nurse their baby while viewing instructions about common breastfeeding issues (latching on, posture etc.) or call a lactation consultant via a secure Google Hangout, who can view the issue through the mother's Google Glass camera.[69] The trial was successfully concluded in Melbourne in April 2014, with 100% of participants breastfeeding confidently.[70]

In June 2014, Google Glass's ability to acquire images of a patient's retina ("Glass Fundoscopy") was publicly demonstrated for the first time at the Wilmer Clinical Meeting at Johns Hopkins University School of Medicine by Dr. Aaron Wang and Dr. Allen Eghrari.[71] This technique was featured on the cover of the Journal for Mobile Technology in Medicine for January 2015.[72]

In July 2014, the startup company Surgery Academy, in Milan, Italy, launched a remote training platform for medical students. The platform is a MOOC that allows students to join any operating theater via Google Glass worn by the surgeon.[73][74] Also in July 2014, This Place released an app, MindRDR, that connects Glass to a Neurosky EEG monitor to allow people to take photos and share them to Twitter or Facebook using brain signals. It is hoped this will allow people with severe physical disabilities to engage with social media.[75]

Several groups are developing Google Glass-based technologies to help children with autism learn about emotions and facial expressions. The first of these was developed by Brain Power, which published the first academic paper on the use of Google Glass technology in children with autism.[76][77][78]

A visually impaired dancer, Benjamin Yonattan, used Google Glass to overcome his chronic vision condition. In 2015, Yonattan performed on the reality television program America's Got Talent.[79]

Journalism and mass media applications

In 2014, Voice of America Television Correspondent Carolyn Presutti and VOA Electronics Engineer Jose Vega began a web project called "VOA & Google Glass," which explored the technology's potential uses in journalism.[80] This series of news stories examined the technology's live reporting applications, including conducting interviews and covering stories from the reporter's point of view. On March 29, 2014, American a cappella group Pentatonix partnered with Voice of America when lead singer Scott Hoying wore Glass in the band's performance at DAR Constitution Hall in Washington, D.C., during the band's worldwide tour – the first use of Glass by a lead singer in a professional concert.[81]

In the fall of 2014, The University of Southern California conducted a course called "Glass Journalism," which explored the device's application in journalism.[82]

Non-profit NGO

As of mid-2014, the WWF used Google Glass and UAVs to track various animals and birds in the jungle, possibly the first use of the device by a non-profit non-governmental organization (NGO).[83]


In 2014, the International Olympic Committee Young Reporters programme took Google Glass to the Nanjing 2014 Youth Olympic Games and put them on a number of athletes from different disciplines to explore novel point of view filmmaking.


Privacy concerns

Concerns have been raised by various sources regarding the intrusion of privacy and the etiquette and ethics of using the device in public and recording people without their permission.[84][85][86] Google co-founder Sergey Brin acknowledged that Glass could be seen as a way to become even more isolated in public, but said the intent was quite the opposite: Brin viewed the habit of constantly checking a phone for social media as a "nervous tic," whereas Glass delivers important notifications without obstructing the wearer's line of sight.[87]

Additionally, there is controversy over whether Google Glass would cause security problems and violate privacy rights.[88][89][90] Organizations such as the US Federal Trade Commission work to uphold privacy rights through the Fair Information Practice Principles (FIPPs), guidelines representing concepts of fair information practice in an electronic marketplace.[91]

Privacy advocates are concerned that people wearing such eyewear may be able to identify strangers in public using facial recognition, or surreptitiously record and broadcast private conversations.[1] The "Find my Face" feature on Google+ creates a model of the user's face, and of people they know, in order to simplify tagging photos.[92] However, the only app at the time that could identify strangers was MORIS (Mobile Offender Recognition and Identification System), a $3,000 iPhone app used by police officers.

Some companies in the US have posted anti-Google Glass signs in their establishments.[93][94] In July 2013, prior to the official release of the product, Stephen Balaban, co-founder of software company Lambda Labs, circumvented Google’s facial recognition app block by building his own, non-Google-approved operating system. Balaban then installed face-scanning Glassware that creates a summary of commonalities shared by the scanned person and the Glass wearer, such as mutual friends and interests.[95] Also created was Winky, a program that allows a Google Glass user to take a photo with a wink of an eye, while Marc Rogers, a principal security researcher at Lookout, discovered that Glass can be hijacked if a user could be tricked into taking a picture of a malicious QR code, demonstrating the potential to be used as a weapon in cyberwarfare.[96]

In February 2013, a Google+ user noticed legal issues with Glass and posted in the Glass Explorers community about the issues, stating that the device may be illegal to use according to the current legislation in Russia and Ukraine, which prohibits use of spy gadgets that can record video, audio or take photographs in an inconspicuous manner.[97]

Concerns were also raised in regard to the privacy and security of Glass users in the event that the device is stolen or lost, an issue that was raised by a US congressional committee. As part of its response to the committee, Google stated that a locking system for the device is in development. Google also reminded users that Glass can be remotely reset.[47] Police in various states have also warned Glass wearers to watch out for muggers and street robbers.[98]

Lisa A. Goldstein, a freelance journalist who was born deaf, tested the product on behalf of people with disabilities and published a review on August 6, 2013. In her review, Goldstein states that Google Glass does not accommodate hearing aids and is not suitable for people who cannot understand speech. Goldstein also noted the limited options for customer support, as telephone contact was the only channel offered.[99]

Several facilities have banned the use of Google Glass before its release to the general public, citing concerns over potential privacy-violating capabilities. Other facilities, such as Las Vegas casinos, banned Google Glass, citing their desire to comply with Nevada state law and common gaming regulations which ban the use of recording devices near gambling areas.[100] On October 29, 2014, the Motion Picture Association of America (MPAA) and the National Association of Theatre Owners (NATO) announced a ban on wearable technology including Google Glass, placing it under the same rules as mobile phones and video cameras.[101]

There have also been concerns over potential eye pain caused by users new to Glass.[102] These concerns were validated by Google's optometry advisor Dr. Eli Peli of Harvard, though he later partly backtracked due to the controversy which ensued from his remarks.[102][103][104]

Concerns have been raised by cyber forensics experts at the University of Massachusetts who have developed a way to steal smartphone and tablet passwords using Google Glass. The specialists developed a software program that uses Google Glass to track finger shadows as someone types in their password. Their program then converts the touchpoints into the keys they were touching, allowing them to catch the passcodes.[105]

The camera raises further privacy concerns: the device can record events such as conversations, and although a light indicates when recording is in progress, many speculated that an app could be created to disable this indicator.[106]


Concerns have also been raised about operating motor vehicles while wearing the device. On July 31, 2013, it was reported that driving while wearing Google Glass was likely to be banned in the UK, deemed careless driving and therefore a fixed-penalty offence, following a decision by the Department for Transport.[107]

In the US, West Virginia state representative Gary G. Howell introduced an amendment in March 2013 to the state's law against texting while driving that would include bans against "using a wearable computer with head mounted display." In an interview, Howell stated, "The primary thing is a safety concern, it [the glass headset] could project text or video into your field of vision. I think there's a lot of potential for distraction."[108]

In October 2013, a driver in California was ticketed for "driving with monitor visible to driver (Google Glass)" after being pulled over for speeding by a San Diego Police Department officer. The driver was reportedly the first to be fined for driving while wearing a Google Glass.[109] While the judge noted that "Google Glass fell under 'the purview and intent' of the ban on driving with a monitor", the case was thrown out of court due to lack of proof the device was on at the time.[110]

In November 2014, Sawyer et al., from the University of Central Florida and the US Air Force Research Laboratory, published the results of a comparative study in a driving simulator. Subjects were asked to use either Google Glass or a smartphone-based messaging interface and were then interrupted with an emergency event. The Glass-delivered messages moderated, but did not eliminate, distracting cognitive demands. A potential passive cost to drivers merely wearing the Glass was also observed. Messaging with either device impaired driving compared with driving without multitasking.[111]

In February 2014, a woman wearing Google Glass claimed she was verbally and physically assaulted at a bar in San Francisco after a patron confronted her while she was showing off the device, allegedly leading a man accompanying her to physically retaliate. Witnesses suggested that patrons were upset over the possibility of being recorded.[112]

Terms of service

Under the Google Glass terms of service for the Glass Explorer pre-public release program, it specifically states, "You may not resell, loan, transfer, or give your device to any other person. If you resell, loan, transfer, or give your device to any other person without Google's authorization, Google reserves the right to deactivate the device, and neither you nor the unauthorized person using the device will be entitled to any refund, product support, or product warranty." Wired commented on this policy of a company claiming ownership of its product after it had been sold, saying: "Welcome to the New World, one in which companies are retaining control of their products even after consumers purchase them."[113] Others pointed out that Glass was not for public sale at all, but rather in private testing for selected developers, and that not allowing developers in a closed beta to sell to the public is not the same as banning consumers from reselling a publicly released device.[114]

Technical specifications

For the developer Explorer units version 1:

  • Android 4.4[115]
  • 640×360 Himax HX7309 LCoS display[6][34]
  • 5-megapixel camera, capable of 720p video recording[8]
  • Wi-Fi 802.11b/g[8]
  • Bluetooth[8]
  • 16 GB storage (12 GB available)[8]
  • Texas Instruments OMAP 4430 SoC, 1.2 GHz dual-core (ARMv7)[6]
  • 1 GB RAM[116]
  • 3-axis gyroscope[117]
  • 3-axis accelerometer[117]
  • 3-axis magnetometer (compass)[117]
  • Ambient light sensor and proximity sensor[117]
  • Bone conduction audio transducer[8]

For the developer Explorer units version 2, RAM was expanded to 2 GB and prescription frames were made available:

  • all of the features from the Explorer version 1 plus:
  • 2 GB RAM[118]
  • Prescription frames available[119]

The new Google Glass Enterprise Edition improves upon previous editions with the following:[120]

  • Intel Atom processor
  • Dual-band 802.11n/ac Wi-Fi
  • Assisted GPS and GLONASS
  • Barometer
  • 32 GB storage
  • 780 mAh battery

References


  1. ^ Miller, Claire Cain (February 20, 2013). "Google Searches for Style". The New York Times. Retrieved March 5, 2013.
  2. ^ "Gadgets". NDTV. IN.
  3. ^ Coldewey, Devin (February 23, 2013). "Google Glass to launch this year for under $1,600". Gadgetbox. NBC News. Retrieved February 23, 2013.
  4. ^ "KitKat for Glass". Google. February 28, 2014.
  5. ^ Google Glass fans.
  6. ^ Torberg, Scott (June 11, 2013). "Google Glass Teardown". TechRadar. Retrieved June 12, 2013.
  7. ^ Fitzsimmons, Michelle (June 24, 2014). "Google Glass gets more memory, photo-framing viewfinder". TechRadar.
  8. ^ "Tech specs". Google. April 16, 2013. Retrieved April 18, 2013.
  9. ^ Goldman, David (April 4, 2012). "Google unveils 'Project Glass' virtual-reality glasses". Money. CNN. Retrieved April 4, 2012.
  10. ^ Albanesius, Chloe (April 4, 2012). "Google 'Project Glass' Replaces the Smartphone With Glasses". PC Magazine. Retrieved April 4, 2012.
  11. ^ Newman, Jared (April 4, 2012). "Google's 'Project Glass' Teases Augmented Reality Glasses". PC World. Retrieved April 4, 2012.
  12. ^ Bilton, Nick (February 23, 2012). "Behind the Google Goggles, Virtual Reality". The New York Times. Retrieved April 4, 2012.
  13. ^ "Here's your chance to get Google Glass". Gadget Cluster. April 2014.
  14. ^ "Google Glass: $1,500 to buy, $80 to make?". Retrieved January 3, 2018.
  15. ^ "Google Will Stop Selling Glass Next Week". Time. Retrieved January 3, 2018.
  16. ^ Savov, Vlad (July 18, 2017). "Google Glass is back from the dead". The Verge.
  17. ^ Velazco, Chris (April 4, 2012). "Google's 'Project Glass' Augmented Reality Glasses Are Real and in Testing". TechCrunch. Retrieved April 4, 2012.
  18. ^ Houston, Thomas (April 4, 2012). "Google's Project Glass augmented reality glasses begin testing". The Verge. Retrieved April 4, 2012.
  19. ^ Hatmaker, Taylor (April 4, 2012). "Google shows off Project Glass". USA Today.
  20. ^ "Google Glass goes on open sale - while stocks last". Retrieved January 15, 2017.
  21. ^ Mack, Eric (June 28, 2012). "Brin: Google Glass lands for consumers in 2014". CNET. CBS Interactive. Retrieved February 21, 2013.
  22. ^ "Google Glasses Sound As Crazy As Smartphones And Tablets Once Did". Forbes. April 5, 2012. Retrieved April 5, 2012.
  23. ^ Hubbard, Amy (April 6, 2012). "Debut on Google co-founder's face". Los Angeles Times. Retrieved April 6, 2012.
  24. ^ Bohn, Dieter (April 6, 2012). "Google's Sergey Brin takes Project Glass into the wild". The Verge. Retrieved April 6, 2012.
  25. ^ "First Google Project Glass video released via Google+". Future plc. May 25, 2012. Retrieved February 22, 2013.
  26. ^ Rhodan, Maya (March 24, 2014). "Google Glass Getting Ray Ban, Oakley Versions". Time. Retrieved March 25, 2014.
  27. ^ "Google Glass sales halted but firm says kit is not dead". BBC News. January 15, 2015. Retrieved January 15, 2015.
  28. ^ Cellan-Jones, Rory. "Rory Cellan-Jones on Twitter: 'Breaking – Google ends Google Glass Explorer programme, stops selling Glass in present form, still hopes to produce other versions in future'". Retrieved January 15, 2015.
  29. ^ "More open Beta begins in US".
  30. ^ Bilton, Nick (February 4, 2015). "Why Google Glass Broke". The New York Times. Retrieved February 19, 2015.
  31. ^ Sahin NT, Keshav NU, Salisbury JP, Vahabzadeh A (2018). "Second Version of Google Glass as a Wearable Socio-Affective Aid: Positive School Desirability, High Usability, and Theoretical Framework in a Sample of Children with Autism". JMIR Human Factors. 5(1):e1.
  32. ^ "Getting to know Glass".
  33. ^ "Acceptable Google Glass Camera Sizes". Stellarbuild. Retrieved April 14, 2015.
  34. ^ Guttag, Karl (June 23, 2013). "Proof That Google Glass Uses A Himax LCOS Microdisplay". Retrieved February 4, 2014.
  35. ^ US patent application 2013/0070338.
  36. ^ a

Hardware, software, and Glassware for data transmission

Wearable devices allow us to interact with data in new ways and enable fast decision-making. When designing such wearable solutions, the interaction should ideally be a two-way process: the user should be able to see new data and take appropriate action through the wearable that translates into tangible changes in the system. With this aim, we designed a Glassware solution to monitor and control microfluidic and organs-on-a-chip systems, aided by a set of custom-developed hardware and software.

The Google Glass communicates with the sensors via the Google App Engine (Fig. 1A). Data is stored in the Google Cloud, and the Glass periodically checks for new sensor data, such as the latest video of cardiomyocyte beating. A custom-built printed circuit board (PCB) shield (Supplementary Fig. 2) interfaces the sensors and the actuators (electrovalves) to an embedded Linux board (BeagleBone Black). This board was chosen for its low cost, powerful processor, and large number of pins (92 in total), which allow the connection of multiple sensors and controllers. Furthermore, the board, programmed in C++, uses the Open Source Computer Vision (OpenCV) library to plot beating patterns from cardiomyocyte videos, based on our previously published work on the pixel-shift method46,47. To transmit data from the BeagleBone to the Google App Engine we used the cURL library with simple HTTP POST and HTTP GET operations.
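The essence of the pixel-shift method is to quantify beating as frame-to-frame intensity change. The following is a minimal, illustrative sketch (not the authors' OpenCV/C++ code, and using a toy 2×2 "video") of that idea: sum the absolute pixel differences between consecutive grayscale frames, so that peaks in the resulting signal mark contraction events.

```python
def pixel_shift_signal(frames):
    """frames: list of 2-D grayscale frames (lists of lists of ints).
    Returns one motion value per consecutive frame pair; peaks in this
    signal correspond to contraction events (beats)."""
    signal = []
    for prev, curr in zip(frames, frames[1:]):
        # Sum of absolute per-pixel intensity changes between the frames.
        diff = sum(
            abs(c - p)
            for row_p, row_c in zip(prev, curr)
            for p, c in zip(row_p, row_c)
        )
        signal.append(diff)
    return signal

# Toy 2x2 "video": static, then a contraction, then relaxation.
frames = [
    [[10, 10], [10, 10]],
    [[10, 10], [10, 10]],  # no motion  -> 0
    [[12, 10], [10, 14]],  # contraction -> |2| + |4| = 6
    [[10, 10], [10, 10]],  # relaxation  -> 6
]
print(pixel_shift_signal(frames))  # [0, 6, 6]
```

In the real system the same computation runs over full-resolution mini-microscope frames, and the resulting trace is what the Glassware plots as the beating pattern.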

The voice control command (“ok glass”) gives the users access to the designed Glassware custom card (Measurement, Fig. 1B). Once the Glassware is launched, a set of Live Cards can be accessed via swiping, giving the user access to view the pH and temperature values, a video of the cells, and a plot of the beating patterns (Fig. 1B, bottom panel). Additionally, the user has access to a Live Card that actuates electrovalves, which can control the flow direction and addition of drugs in the microfluidic system. For the electrovalves a driver circuit was specifically designed to meet their voltage and current needs, and was controlled via digital output signals from the BeagleBone board. We have also designed custom circuits for temperature and pH sensors (Supplementary Fig. 2) in order to amplify the signals and reduce the noise of the sensors.

Overall, the sensor data is stored in the Google App Engine and retrieved whenever the Glassware application is started, then refreshed periodically. Video files are also stored locally on a computer, so that data from any time point can be retrieved when needed. To actuate the electrovalves, the Google Glass changes the status (on/off) of each electrovalve in the Google App Engine, and the Linux board periodically checks these statuses and makes the corresponding changes. Together, the developed platform allows the Glassware to update the user with data coming from the sensors and enables the user to take action through control of the electrovalves.
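The valve-control loop above can be sketched as follows. All names are hypothetical and an in-memory dictionary stands in for the Google App Engine datastore: the Glassware writes desired valve states to the cloud, and the board periodically reads them back, actuating only the valves whose state has changed.

```python
# Stand-in for the App Engine datastore holding desired valve states.
cloud_store = {"valve1": "off", "valve2": "off"}

def glass_set_valve(valve_id, state):
    """Called from the Glassware Live Card: record the desired state."""
    cloud_store[valve_id] = state

class BoardController:
    """Runs on the embedded Linux board; applies cloud state to hardware."""
    def __init__(self):
        self.applied = dict(cloud_store)  # last states written to the driver
        self.log = []

    def set_gpio(self, valve_id, state):
        # Placeholder for a real digital-output write to the valve driver.
        self.log.append((valve_id, state))

    def poll_once(self):
        """One iteration of the periodic check: act only on differences."""
        for valve_id, desired in cloud_store.items():
            if self.applied.get(valve_id) != desired:
                self.set_gpio(valve_id, desired)
                self.applied[valve_id] = desired

board = BoardController()
glass_set_valve("valve1", "on")  # user toggles a valve on the Glass
board.poll_once()                # board syncs with the cloud
print(board.log)  # [('valve1', 'on')]
```

Polling on state *differences* rather than re-writing every valve each cycle keeps the driver outputs stable between user actions, which matters when the valves gate continuous flow.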

Physical sensing units for real-time temperature and pH monitoring

Microfabricated sensors were obtained using our published miniaturized approach48. Figure 2A shows a schematic of the sensor array measuring 2.2 × 15 mm², which hosts five biosensor platforms, a temperature sensor, and a pH electrode. The entire sensor array was fabricated from biocompatible materials and integrated with a complementary metal-oxide-semiconductor (CMOS) chip for measurements. As a proof of principle, we chose to probe only the temperature and pH responses of the sensor array to characterize the physical microenvironment of our system. The right panel in Fig. 2A shows a magnified view of the temperature and pH sensing units of the sensor array, where the temperature sensor consists of a platinum (Pt) zigzagging path of 0.02 × 16 mm² per turn. Pt was chosen for its linearity within the physiological temperature range and its higher resistivity, which efficiently confined the sensor to a small footprint49. The pH sensor consists of a 250 μm² metal pad electrodeposited with a thin film of iridium oxide (IrOx). Changes in the pH of the surrounding medium were measured via the open-circuit potential of the electrode50,51. To achieve continual monitoring of the microenvironment, the sensor array was enclosed in a microfluidic device, with a 3 × 15 × 0.2 mm³ chamber placed directly on top of the sensing units, resulting in a small working volume of <10 μL (Fig. 2B). Calibration curves for temperature and pH using the enclosed microfluidic device show a linear correlation, with sensitivities of 3.6 Ω °C⁻¹ and −67.9 mV pH⁻¹ at a flow rate of 200 μm h⁻¹ (Fig. 2C,D).
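With linear calibrations like these, converting raw readings to physical values is a matter of inverting the calibration line. The helpers below are hypothetical: only the slopes (3.6 Ω °C⁻¹ and −67.9 mV pH⁻¹) come from the text, while the intercepts `R0_OHM` and `V0_MV` are made-up illustrative values that would in practice come from the calibration curves in Fig. 2C,D.

```python
# Slopes from the calibration curves reported in the text.
TEMP_SLOPE_OHM_PER_C = 3.6      # Pt resistance change per degree C
PH_SLOPE_MV_PER_PH = -67.9      # IrOx electrode potential change per pH unit

# Illustrative intercepts (assumed, NOT from the paper).
R0_OHM = 1000.0   # assumed Pt resistance at 0 degrees C
V0_MV = 475.3     # assumed electrode potential at pH 0

def resistance_to_temperature(r_ohm):
    """Invert R = R0 + slope * T for the Pt temperature sensor."""
    return (r_ohm - R0_OHM) / TEMP_SLOPE_OHM_PER_C

def potential_to_ph(v_mv):
    """Invert V = V0 + slope * pH for the IrOx pH electrode."""
    return (v_mv - V0_MV) / PH_SLOPE_MV_PER_PH

print(round(resistance_to_temperature(1133.2), 1))  # 37.0
print(round(potential_to_ph(-32.8), 2))             # 7.48
```

On the platform, conversions of this kind would run on the embedded board before the values are pushed to the cloud, so the Glassware Live Cards display temperature and pH directly rather than raw resistances and potentials.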

We further integrated a miniature microscope that we recently developed for image and video acquisition46,47,52. The microscope was fabricated from a webcam and off-the-shelf components46. Figure 2E,F show a schematic and a photograph, respectively, of the mini-microscope integrated with a microfluidic bioreactor. The imaging unit was constructed by flipping the webcam lens and re-attaching it to the CMOS sensor, achieving magnification instead of the demagnification a webcam normally performs46. This imaging unit was fitted into a set of poly(methyl methacrylate) (PMMA) frames held together by screws and bolts, with the microfluidic bioreactor placed above the sensor. A mini-microscope image of HepG2 cells in a liver bioreactor is shown in Fig. 2G, where individual cells could be observed, highlighting the high resolution of the mini-microscope.

To transmit the data to the Google App Engine, a custom printed circuit board (PCB) was designed to accommodate the BeagleBone Black and connect it to the pH and temperature sensors as well as the mini-microscope (Fig. 2H,I). The micro-computing unit then records the data and images/videos and constantly transmits them to the Cloud for the Glass to fetch and display.
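The record-and-forward step can be sketched as follows; the JSON field names and the uploader interface are illustrative assumptions, not the paper's actual schema:

```python
import json
import time

def make_reading_payload(temperature_c, ph, timestamp=None):
    """Bundle one sensor reading into a JSON document for the cloud store.
    Field names here are hypothetical; the paper's schema is not published."""
    return json.dumps({
        "timestamp": time.time() if timestamp is None else timestamp,
        "temperature_c": round(temperature_c, 2),
        "ph": round(ph, 2),
    }, sort_keys=True)

def record_and_transmit(payload, local_log, upload):
    """Keep a local copy (so any time point can be retrieved later) and
    push the same document to the cloud via an injected uploader."""
    local_log.append(payload)
    upload(payload)
```

On the actual board, `upload` would POST the document to the Google App Engine endpoint, while image and video files are written to local disk alongside the log.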

Remote monitoring of liver- and heart-on-a-chip platforms

To test the capability of our Google Glass system for visualizing sensor data, we monitored liver- and heart-on-a-chip platforms. The multi-layer microfluidic bioreactor was fabricated following our recently published protocol (Fig. 3A,B)10, with the bioreactor chambers made of polydimethylsiloxane (PDMS). The bioreactor was designed to be re-usable and re-sealable, providing easy access for seeding cells or placing organoids inside the chamber. To construct the liver bioreactor, we seeded HepG2 cells in the bottom chamber at a density of approximately 1000 cells mm−2. The mini-microscope was fitted at the bottom of the culture chamber for continuous monitoring of cell behavior. The images acquired from the mini-microscope were successfully transmitted to the Google Glass wirelessly and visualized in the View Image Live Card (Fig. 3C). Mini-microscope images can be acquired at a pre-set time interval and saved locally on the computer. Importantly, the latest acquired image is wirelessly transmitted to the Google Glass, which then refreshes the Live Card for visualization. This feature enables the user to remotely access microscopic images of the organoids in the bioreactors and monitor their morphological changes as a function of time.

However, in many cases static images cannot be used to follow and measure fast, dynamic cellular behaviors. We next demonstrated the capability of the Google Glass to simultaneously visualize and analyze mini-microscope videos using a simplified heart-on-a-chip platform. Heart-on-a-chip platforms provide a versatile approach for studying cardiac biology and physiology, as well as for screening pharmaceutical compounds for cardiotoxicity26,53,54. The heart-on-a-chip platform was constructed by seeding rat neonatal cardiomyocytes on a piece of glass coated with a thin layer of 5 wt.% gelatin methacryloyl (GelMA) and 1 mg mL−1 carbon nanotubes (CNTs), followed by placement of the construct in the chamber of a cardiac bioreactor (Fig. 3D). CNTs were embedded into the GelMA to promote intercellular connections among the cardiomyocytes, thereby improving the functionality of the fabricated cardiac organoids55,56. Figure 3E shows the View Image Live Card with a mini-microscope image of the cardiomyocytes in the cardiac bioreactor transmitted to the Google Glass, indicating the formation of a confluent layer of cardiomyocytes on the GelMA/CNT substrate.

Neonatal cardiomyocytes beat synchronously, but the beating frequency and pattern can easily be disturbed by administration of drugs/toxins or by changes in the surrounding microenvironment (e.g. temperature). We therefore performed two sets of experiments to perturb the regular beating of the cardiac organoid in the heart-on-a-chip device. We first opened the incubator door for 10 min to allow the temperature to drop from 37 °C to 27 °C, and then closed it again. The temperature sensor detected the drop in real time (Fig. 3F), and the data were transmitted via the integrated system to the Google Glass and visualized in the View Temperature Live Card. The temperature data were consistent with those collected by a commercial sensor directly connected to a National Instruments data acquisition (NI-DAQ) card and LabVIEW (Fig. 3G), highlighting the accuracy and time responsiveness of the sensor data transmitted to the Google Glass. The beating of the cardiac organoid was monitored during the external manipulation of the temperature, recorded with the mini-microscope, and the video wirelessly transmitted to the Google Glass (Supplementary Movie 1). When the heart-on-a-chip temperature decreased to 23 °C, even for only 10 min, the cardiomyocytes showed an irregular and reduced beating rate, which was analyzed and plotted on the Google Glass in the View Beating Live Card (Fig. 3H). Furthermore, when the cardiac bioreactor was completely removed from the incubator and cooled to room temperature for over 30 min, the cardiomyocytes ceased beating entirely (Supplementary Movie 2 and Fig. 3I). As an alternative to changing environmental conditions, the beating of the cardiac organoid was also perturbed by the addition of a cardiotoxic drug. We infused the heart-on-a-chip with 50 μM of doxorubicin (DOX) for 1 h at 37 °C and monitored the beating. DOX, an anti-cancer drug with adverse side effects on cardiac tissues, was shown to induce acute arrhythmia of the cardiac organoid upon treatment at high doses (Supplementary Movie 3 and Fig. 3J)26,47,57,58.
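A simplified stand-in for the beating analysis behind the View Beating Live Card can be sketched as below (the paper does not describe its exact algorithm, so this is an assumption): contracting tissue produces periodic peaks in the frame-to-frame pixel difference of the mini-microscope video, and counting threshold crossings of that signal approximates the beating rate. Frames are modeled as flat lists of grayscale pixel values.

```python
def frame_activity(prev, curr):
    """Mean absolute pixel difference between two consecutive frames."""
    return sum(abs(a - b) for a, b in zip(prev, curr)) / len(curr)

def beating_rate(frames, fps, threshold):
    """Estimate beats per minute from a sequence of grayscale frames by
    counting rising threshold crossings of the frame-difference signal."""
    signal = [frame_activity(frames[i - 1], frames[i])
              for i in range(1, len(frames))]
    beats = sum(
        1 for i in range(1, len(signal))
        if signal[i] >= threshold and signal[i - 1] < threshold
    )
    duration_min = (len(frames) - 1) / fps / 60.0
    return beats / duration_min
```

With this scheme, an arrhythmic or slowed organoid shows up directly as fewer or irregularly spaced crossings in the difference signal.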

The integrated BeagleBone board and software/Glassware suite allowed the data from the microfabricated sensors, as well as still images and videos recorded by the mini-microscope, to be transmitted wirelessly to the Google Glass and visualized. The sensing capabilities of our system were enhanced with real-time analysis and concomitant visualization on the Google Glass. Importantly, the data were simultaneously recorded locally (on the computer to which the BeagleBone board was connected), facilitating on-demand retrieval of the data at any time. In addition to organoids in bioreactors, a range of other static images or dynamic videos, such as micropatterns and microfluidic droplet generation, could also be remotely monitored in real time using the Google Glass (Supplementary Fig. 4 and Movie 4).

Remote control of actuators using Google Glass

Actuators play a pivotal role in microdevices, functioning as gating mechanisms for controlling a variety of devices based on electricity and mechanics59,60. For example, electrovalves are electrically actuated valves that open and close pressure-driven valves, which can be used to conveniently manipulate liquid flows inside a microfluidic device. Here we developed a remote control platform in which the BeagleBone board reads the wirelessly transmitted Google Glass commands and actuates the electrovalves in response (Fig. 1A). The Glassware application has a set of Live Cards that allow each electrovalve switch to be turned on and off upon command (Fig. 4A). Selecting the “Drive Electrovalves” Card brings up a list of eight valves on the screen of the Glass; swiping the touchpad then shows eight Live Cards in sequence, each of which can be individually triggered. To visually demonstrate the working concept, we prepared an array of eight LEDs connected to the outputs of the BeagleBone board (Fig. 2H,I). As shown in Fig. 4B–E, when the switches were selectively activated on the Google Glass, the corresponding LEDs could be turned on and off. The capability to activate and deactivate the LEDs both sequentially and in arbitrary order is further shown in Supplementary Movies 5 and 6.
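The board-side switching can be sketched with the Adafruit_BBIO GPIO module commonly used on the BeagleBone (whether the authors used this library is not stated), with a simulator fallback so the sketch also runs off-board; the pin name is an example, not the paper's pinout:

```python
try:
    import Adafruit_BBIO.GPIO as GPIO  # available on the BeagleBone itself
except ImportError:
    class _SimGPIO:
        """Minimal stand-in mirroring the Adafruit_BBIO.GPIO calls used below."""
        OUT, HIGH, LOW = "out", 1, 0
        pins = {}
        @classmethod
        def setup(cls, pin, mode):
            cls.pins[pin] = cls.LOW   # configure pin as output, initially low
        @classmethod
        def output(cls, pin, value):
            cls.pins[pin] = value     # drive the pin high or low
    GPIO = _SimGPIO

class Electrovalve:
    """One electrovalve (or indicator LED) wired to a BeagleBone header pin."""
    def __init__(self, pin):
        self.pin = pin
        GPIO.setup(pin, GPIO.OUT)
    def set(self, on):
        GPIO.output(self.pin, GPIO.HIGH if on else GPIO.LOW)
```

Each Live Card toggle on the Glass would ultimately resolve to one `set` call like this on the board.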

We subsequently constructed a microfluidic bioreactor consisting of an elliptical chamber, one central inlet for medium circulation, and two side inlets with corresponding pressure-driven pneumatic valves, which, together with linked pressure-driven reservoirs (both activated by electrovalves), accomplish the injection of target agents (Fig. 4F). We first demonstrated the capability to remotely actuate the valves using commands wirelessly transmitted from the Google Glass to the BeagleBone board. As shown in Fig. 4G–K and Supplementary Movie 7, we initially injected medium through the central inlet, with both side channels closed by the valves (Fig. 4G). To drive blue dye from its reservoir into the bioreactor chamber, we sequentially opened the channel by deactivating Valve 1 and pressurized the blue-dye reservoir by activating Valve 2 (Fig. 4H). Reversing these actions and performing the same Google Glass commands on Valves 3 and 4 drove yellow dye into the bioreactor chamber (Fig. 4J). Finally, the valves were reset to the initial configuration, restoring the circulation of the medium through the central inlet (Fig. 4K).
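The valve choreography just described can be written out as ordered command lists, shown here as a hypothetical encoding: valve numbering follows the text, a deactivated electrovalve is taken to leave its channel open, and the driver function is injected rather than tied to any specific hardware.

```python
# Command lists mirroring the sequence of Fig. 4G-K: (valve_id, activated).
# Valve 1 gates the blue-dye channel, Valve 2 pressurizes the blue-dye
# reservoir; Valves 3 and 4 play the same roles for the yellow dye.
BLUE_DYE_INJECTION = [(1, False), (2, True)]     # open channel, pressurize
BLUE_DYE_STOP      = [(2, False), (1, True)]     # depressurize, close channel
YELLOW_DYE_INJECTION = [(3, False), (4, True)]
YELLOW_DYE_STOP      = [(4, False), (3, True)]

def run_sequence(steps, set_valve):
    """Replay one command list through an injected valve driver."""
    for valve_id, activated in steps:
        set_valve(valve_id, activated)
```

Encoding the protocol as data keeps the Glass-side commands and the board-side execution order trivially auditable.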

Additionally, a mini-microscope was fitted at the bottom of the bioreactor for real-time monitoring and connected to the BeagleBone board. The insets in Fig. 4F–J show the change in color of the liquid pumped into the bioreactor chamber, as monitored by the mini-microscope in real time. The dynamic process of synchronized mini-microscope recording, activation of the valves, and injection of reagents into the bioreactor, together with the commands on the Google Glass, can be observed in Supplementary Movie 7.

Simultaneous remote control and monitoring of liver-on-a-chip for drug testing

The liver detoxifies various metabolites, synthesizes proteins and enzymes, and produces the bile necessary for the digestion of food, making hepatotoxicity an important subject of study across multiple fields. Here we introduced human primary hepatocytes into the chamber of the microfluidic bioreactor to construct a liver-on-a-chip platform. Liver spheroids approximately 200 μm in diameter were first formed using a non-adherent microwell array61, then retrieved, suspended in GelMA, and crosslinked to the bottom of the bioreactor chamber. Prior to the experiment, 15 mM acetaminophen (APAP), a hepatotoxic drug, was added to the reservoir of one of the side channels. The liver bioreactor was initially perfused with hepatocyte culture medium from the central inlet for the first 24 h with the side channels closed (Fig. 5A,B). At 24 h we used the Google Glass to inject APAP for 1 min: commands were sent from the Glass deactivating the valve and activating the pressure to inject APAP (Fig. 5C,D). The channel was then closed to stop the APAP injection and restore the regular perfusion with the hepatocyte growth medium (Fig. 5E,F). The mini-microscope fitted at the bottom of the bioreactor monitored the morphology of the liver spheroids, transmitting the data to the Google Glass for real-time, in situ monitoring of the drug treatment effect. Without any drug, the liver organoid remained healthy and tightly agglomerated (Fig. 5G,H). In comparison, 12 h post injection of 15 mM APAP, the liver spheroid micrographs transmitted through the mini-microscope to the Google Glass showed swollen cellular structures (Fig. 5I), clearly indicating a toxic response to the APAP treatment. The decreased liver functionality was further confirmed by an off-chip viability assay and analysis of the damage biomarker glutathione S-transferase α (GST-α, Fig. 5J–L), correlating well with the hepatotoxicity observed under Google Glass-directed drug administration and organoid monitoring.

In this particular demonstration we highlight that the entire process, including the operation of the valves, injection of the drug, restoration of the main perfusion, and monitoring of the morphology with the mini-microscope, was controlled solely by the Google Glass, without direct manipulation of the liver-on-a-chip device or the valve system. Such a seamless interface allows for remote actuation of microfluidic devices and easy access to sensed data, potentially enabling long-term communication and control between humans and microdevices.