Our goal was to create a seamless digital tool that merges AR with travel, making foreign environments instantly accessible.
The app empowers users to simply tap on real-world text, whether it's a street sign, a menu, or a historical plaque, and receive instant translations. By removing language barriers, TAYAR transforms the travel experience, allowing for a deeper and more intuitive connection with the local culture.
We approached the project as a complex engineering challenge, integrating Vision, ARKit, and API networking into a high-performance iOS app.
Built using SwiftUI and the MVVM architecture for scalability, the app leverages the Vision framework for accurate text detection and ARKit to overlay translations in real time. By integrating the DeepL API, TAYAR delivers precise, non-intrusive translation bubbles, demonstrating advanced proficiency in combining diverse frameworks into a cohesive user experience.
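
To illustrate how those pieces fit together, here is a condensed sketch of the detect-then-translate flow as an MVVM view model. It assumes DeepL's v2 REST endpoint with a `DeepL-Auth-Key` header, and all type and property names (`TranslationViewModel`, `apiKey`, and so on) are illustrative rather than TAYAR's actual code.

```swift
import Foundation
import CoreGraphics
import SwiftUI
import Vision

// Sketch only: the endpoint, header, and names below are assumptions for illustration.
@MainActor
final class TranslationViewModel: ObservableObject {
    @Published var translatedText = ""
    private let apiKey = "<DeepL auth key>"   // hypothetical: injected from configuration in practice

    // Step 1: run Vision's text recognizer on a camera frame (off the main actor,
    // since recognition is CPU-bound and should not block the UI).
    nonisolated func recognizeText(in frame: CGImage) async throws -> [String] {
        try await withCheckedThrowingContinuation { continuation in
            let request = VNRecognizeTextRequest { request, error in
                if let error { continuation.resume(throwing: error); return }
                let lines = (request.results as? [VNRecognizedTextObservation] ?? [])
                    .compactMap { $0.topCandidates(1).first?.string }
                continuation.resume(returning: lines)
            }
            request.recognitionLevel = .accurate
            request.usesLanguageCorrection = true
            do {
                try VNImageRequestHandler(cgImage: frame, options: [:]).perform([request])
            } catch {
                continuation.resume(throwing: error)
            }
        }
    }

    // Step 2: send the recognized string to DeepL and publish the translation.
    func translate(_ text: String, to targetLang: String = "EN") async throws {
        struct Body: Encodable { let text: [String]; let target_lang: String }
        struct Response: Decodable {
            struct Translation: Decodable { let text: String }
            let translations: [Translation]
        }

        var request = URLRequest(url: URL(string: "https://api-free.deepl.com/v2/translate")!)
        request.httpMethod = "POST"
        request.setValue("DeepL-Auth-Key \(apiKey)", forHTTPHeaderField: "Authorization")
        request.setValue("application/json", forHTTPHeaderField: "Content-Type")
        request.httpBody = try JSONEncoder().encode(Body(text: [text], target_lang: targetLang))

        let (data, _) = try await URLSession.shared.data(for: request)
        translatedText = try JSONDecoder().decode(Response.self, from: data)
            .translations.first?.text ?? text
    }
}
```

In the real app, a layer between the camera feed and this view model decides which detected line the user actually tapped; the sketch only shows the two framework integrations described above.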

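The overlay side can be sketched separately. Assuming a SceneKit-backed `ARSCNView` whose delegate is set to a renderer like the hypothetical one below, the idea is to ray-cast from the tapped screen point, drop an `ARAnchor` at the hit, and return a billboarded text node so the bubble keeps facing the camera; TAYAR's actual bubble rendering may differ.

```swift
import ARKit
import SceneKit
import UIKit

// Hypothetical overlay renderer (names are illustrative, not TAYAR's actual API).
final class TranslationOverlayRenderer: NSObject, ARSCNViewDelegate {
    private var pendingText: [UUID: String] = [:]

    // Called once a translation arrives, with the screen point the user tapped.
    func placeBubble(_ text: String, at screenPoint: CGPoint, in sceneView: ARSCNView) {
        // Ray-cast from the tap into the world to find a surface to anchor on.
        guard let query = sceneView.raycastQuery(from: screenPoint,
                                                 allowing: .estimatedPlane,
                                                 alignment: .any),
              let hit = sceneView.session.raycast(query).first else { return }

        let anchor = ARAnchor(transform: hit.worldTransform)
        pendingText[anchor.identifier] = text
        sceneView.session.add(anchor: anchor)
    }

    // ARKit asks for a node for each new anchor; return the bubble geometry.
    func renderer(_ renderer: SCNSceneRenderer, nodeFor anchor: ARAnchor) -> SCNNode? {
        guard let text = pendingText.removeValue(forKey: anchor.identifier) else { return nil }

        let geometry = SCNText(string: text, extrusionDepth: 0.1)
        geometry.font = UIFont.systemFont(ofSize: 1)
        geometry.firstMaterial?.diffuse.contents = UIColor.white

        let node = SCNNode(geometry: geometry)
        node.scale = SCNVector3(0.01, 0.01, 0.01)      // scale the text down to centimetre size
        node.constraints = [SCNBillboardConstraint()]  // keep the bubble facing the camera
        return node
    }
}
```

Anchoring the bubble to a world-space transform, rather than drawing it in screen space, is what keeps the translation pinned to the sign or menu as the user moves the device.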







