Project Description
Questions often arise spontaneously in a curious mind, prompted by an observation about a new or unfamiliar environment. When an expert is right there, prepared to engage in dialog, this curiosity can be harnessed and converted into highly effective, intrinsically motivated learning. TagAlong uses Google Glass to foster learning with the aid of a remote peer: a remote companion can review live footage from the learner's point of view and tag relevant objects in the learner's environment. The tags are then delivered in real time to the heads-up display on Glass.

The application began as a proof of concept centered on language learning, delivering real-time translations of objects in the learner's environment. During the summer of 2013, a prototype consisting of a Google Glass application and a web client was deployed. After initial user trials, the web client was replaced and the project shifted toward a mobile-friendly experience in which any peer could be contacted and engaged through a companion app. This development widened the spectrum of experiences enabled by the TagAlong system, which eventually explored remote expert assistance and social interactions aided by context sharing.