[publication] OmniColor – A Smart Glasses App to Support Colorblind People #iJIM #tugraz

Georg investigated in his master's thesis the usefulness of Google Glass for colorblind people. We published his results in our article "OmniColor – A Smart Glasses App to Support Colorblind People".

Abstract:

Colorblind people and people with a color vision deficiency face many challenges in their daily activities. Their inability to perceive colors correctly leads to frustration when determining the freshness of fruit or the rawness of meat, as well as to problems distinguishing clothes with easily confused colors. With the rise of the smartphone, numerous mobile applications have been developed to overcome these problems and improve quality of life. However, smartphones also have limitations in certain use cases. In particular, activities in which both hands are needed are not well suited to smartphone applications. Furthermore, there are tasks in which continuous use of a smartphone is not possible or not even legally allowed, such as driving a car. In recent years, fairly new devices called smart glasses have become increasingly popular and offer great potential for several use cases. One of the most famous representatives of smart glasses is Google Glass, a head-mounted display produced by Google that is worn like normal eyeglasses. This paper introduces an experimental prototype of a Google Glass application for colorblind people or people with a color vision deficiency, called OmniColor, and addresses the question of whether Google Glass is able to improve the color perception of those people. To show the benefits of OmniColor, an Ishihara color plate test was performed by a group of 14 participants, either with or without the use of OmniColor.

[Link to full article @ ResearchGate]

[Link to full article @ Journal’s Homepage]

Reference: Lausegger, G., Spitzer, M. & Ebner, M. (2017). OmniColor – A Smart Glasses App to Support Colorblind People. In: International Journal of Interactive Mobile Technologies (iJIM), Vol. 11 (5), pp. 161-177
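As a rough illustration of the kind of processing such a color-identification app might perform (not the implementation described in the paper), the following Python sketch maps a single camera pixel's RGB value to the nearest named color from a small reference palette; both the palette and the distance metric are assumptions chosen for brevity.

```python
# Minimal sketch of nearest-named-color lookup, as a color-identification
# aid might do for a single camera pixel. The palette below is a tiny,
# hypothetical example; a real app would use a much richer color set.
REFERENCE_COLORS = {
    "red":    (220, 20, 60),
    "green":  (34, 139, 34),
    "blue":   (30, 144, 255),
    "yellow": (255, 215, 0),
    "brown":  (139, 69, 19),
    "gray":   (128, 128, 128),
}

def name_color(rgb):
    """Return the name of the reference color closest to the (R, G, B) tuple."""
    def squared_distance(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(REFERENCE_COLORS, key=lambda name: squared_distance(rgb, REFERENCE_COLORS[name]))

print(name_color((200, 40, 50)))  # -> "red"
```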

[publication] Project Based Learning: from the Idea to a Finished LEGO® Technic Artifact, Assembled by Using Smart Glasses #edmedia17

The first publication at this year's ED-Media conference is about "Project Based Learning: from the Idea to a Finished LEGO® Technic Artifact, Assembled by Using Smart Glasses". The presentation has been recorded and can be found here.
Abstract:

Smart Glasses and 3D printers are now easily available on the market. The challenge is how to integrate them efficiently into a learning environment. This paper suggests a project-based learning (PBL) scenario describing how to construct, produce and assemble a planetary gear using Open Source tools, LEGO® Technic, 3D printers and Smart Glasses. The whole project-based learning scenario was implemented together with a 16-year-old student. Additionally, the assembly process using Smart Glasses was tested by seven users in a qualitative evaluation. The feedback of the student from the target group, together with the feedback of the other subjects, was considered to improve the PBL scenario and the Smart Glasses (ReconJet) application. The evaluation showed the potential of Smart Glasses to improve hands-free assembly processes and to support the user in understanding the structure and functionality of mechanical objects.

[Draft version @ ResearchGate]

Reference: Spitzer, M. & Ebner, M. (2017). Project Based Learning: from the Idea to a Finished LEGO® Technic Artifact, Assembled by Using Smart Glasses. In Proceedings of EdMedia: World Conference on Educational Media and Technology 2017 (pp. 196-209). Association for the Advancement of Computing in Education (AACE).
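As a small aside to the planetary gear mentioned in the abstract, the following Python sketch computes the basic transmission ratio of a simple planetary gearset with the ring gear held fixed, the sun gear as input and the planet carrier as output; the tooth counts are hypothetical example values, not those of the LEGO® Technic artifact from the paper.

```python
def planetary_ratio(sun_teeth, ring_teeth):
    """Transmission ratio of a simple planetary gearset (ring gear fixed,
    sun gear as input, planet carrier as output): i = 1 + Z_ring / Z_sun."""
    return 1 + ring_teeth / sun_teeth

# Hypothetical tooth counts for illustration only (not from the paper):
# a 16-tooth sun gear and a 56-tooth ring gear give a 4.5:1 reduction.
print(planetary_ratio(sun_teeth=16, ring_teeth=56))  # -> 4.5
```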

[vodcast] From the Idea to a Finished LEGO® Technic Artifact, Assembled by Using Smart Glasses #tugraz #research #edmedia

Because we are not able to attend the ED-Media conference 2017 in Washington this year, we are giving our presentations virtually. The third of four talks is about "From the Idea to a Finished LEGO® Technic Artifact, Assembled by Using Smart Glasses":

[Embedded YouTube video]

[video] Augmented Reality and Mobile Discovery

A very nice augmented reality application for the Nokia S60 is Point & Find:

Point your mobile device’s camera at a landmark or barcode and receive immediate information such as descriptions, phone numbers and reviews, no typing required. Or join businesses like Expotel, Oasis, Joule and others already using Nokia Point & Find (Beta) to reach and interact with customers – easily, quickly and with no coding required. Mobile discovery has never been this easy.

If this is applied to learning applications, I see above all the possibility of providing information about real objects wherever it is available, either via geolocation or via scannable barcodes.
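Purely as an illustration of that idea (not part of Point & Find), here is a minimal Python sketch of how a learning application could look up information about a real object either by a scanned barcode or by proximity to a known geolocation; the object entries, barcodes and coordinates are made-up example data.

```python
import math

# Hypothetical learning content, keyed by barcode and tagged with a location.
OBJECTS = {
    "9001234567890": {"name": "Uhrturm Graz", "lat": 47.0735, "lon": 15.4378,
                      "info": "Clock tower on the Schlossberg, landmark of Graz."},
    "9009876543210": {"name": "Kunsthaus Graz", "lat": 47.0713, "lon": 15.4340,
                      "info": "Contemporary art museum, opened in 2003."},
}

def lookup_by_barcode(code):
    """Return the learning content for a scanned barcode, if known."""
    return OBJECTS.get(code)

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance between two coordinates in meters."""
    r = 6371000.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp, dl = math.radians(lat2 - lat1), math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def lookup_by_location(lat, lon, radius_m=100):
    """Return all objects within radius_m meters of the given position."""
    return [o for o in OBJECTS.values()
            if haversine_m(lat, lon, o["lat"], o["lon"]) <= radius_m]

print(lookup_by_barcode("9001234567890")["name"])                  # -> Uhrturm Graz
print([o["name"] for o in lookup_by_location(47.0736, 15.4380)])   # -> ['Uhrturm Graz']
```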