PicGuide@TUM

“Find Your Destination in TUM Through the Pictures!”

What it is about

PicGuide@TUM is a web-based tool designed to address the wayfinding challenges of the complex TUM main campus, particularly benefiting Cartography and Geodesy students and new visitors. Traditional navigation tools often rely on abstract maps and lack the detailed, real-world cues needed to recognize specific locations in multi-layered environments such as multi-story buildings. PicGuide@TUM fills this gap with an intuitive platform that combines interactive maps with real-world photos.
Users can select their start and end points and follow a series of picture-guided paths to their destination. Each step of the path is accompanied by photos of key junctions and landmarks, making it easier to navigate corridors and entryways that might otherwise look similar. The platform also supports user-generated content, allowing individuals to upload new routes and photos, which, upon approval, can be added to the database. This ensures that the tool remains adaptable and up-to-date with users' needs.
The web interface includes features such as dynamic pathfinding, progress indicators, and 360-degree panoramic views at critical points, further enhancing spatial recognition. PicGuide@TUM focuses on improving usability and accessibility, offering a user-friendly navigation experience that bridges the gap between abstract maps and real-world environments.

How we built it

PicGuide@TUM was developed through a combination of modern web technologies, extensive data collection, and a structured workflow. The front-end UI was designed with Bootstrap to ensure responsive and visually appealing layouts, and Leaflet.js was used to implement the interactive map, enabling users to navigate to key locations efficiently. The back end is built in Python with Flask, which handles data processing, pathfinding, and communication between the server and the front end.
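As a minimal sketch of how such a Flask back end might serve route data to the Leaflet front end (the endpoint name, query parameters, and the tiny in-memory tables here are illustrative assumptions, not the project's actual code):

```python
from flask import Flask, jsonify, request

app = Flask(__name__)

# Illustrative in-memory data; the real project loads its points/edges tables.
POINTS = {"P1": (48.1486, 11.5680), "P2": (48.1487, 11.5683)}
EDGES = {("P1", "P2"): {"distance": 25.0, "picture": "P1_P2.jpg"}}

@app.route("/api/route")
def api_route():
    """Return the edge record connecting two points, if any (hypothetical endpoint)."""
    start = request.args.get("start")
    end = request.args.get("end")
    edge = EDGES.get((start, end))
    if edge is None:
        return jsonify({"error": "no direct edge"}), 404
    return jsonify({"start": start, "end": end, **edge})
```

The front end would then fetch this JSON and draw the step on the Leaflet map alongside the associated photo.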
The development process began with identifying key locations and paths across the TUM campus. All team members conducted multiple on-site visits to explore and determine the most efficient routes, carefully noting the details of each point and path. Each location was assigned a unique identifier and name for consistency. Our team then captured high-resolution photos and 360-degree panoramic photos at these points, ensuring the visual data was accurate and comprehensive.
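Since each point carries a latitude and longitude, the straight-line distance between adjacent points can be estimated with the haversine formula (a sketch, assuming WGS84 coordinates; whether the project computed or measured its distances is not stated, so this is only one plausible approach):

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two WGS84 lat/lon coordinates."""
    r = 6371000.0  # mean Earth radius in metres
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))
```

For short indoor corridors the result is effectively planar distance, so this is accurate enough for edge weights.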
To organize and manage the collected data, an adjacency table format was adopted. Points and edges were stored separately for clarity and efficiency in data processing. The points table contained columns for ID, Name, FloorNr, PictureName, Latitude, Longitude, and Display status, ensuring each location’s attributes were clearly defined. The edges table included Node_S (start node), Node_E (end node), Distance, and PictureName, mapping out the connections between points.
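The points/edges layout maps naturally onto a weighted graph over which shortest paths can be computed. A minimal sketch of Dijkstra's algorithm over such an edges table (the sample rows and the assumption that edges are walkable in both directions are illustrative, not taken from the project's data):

```python
import heapq
from collections import defaultdict

# Sample rows mirroring the edges table: (Node_S, Node_E, Distance, PictureName).
edge_rows = [
    ("A", "B", 10.0, "A_B.jpg"),
    ("B", "C", 5.0, "B_C.jpg"),
    ("A", "C", 20.0, "A_C.jpg"),
]

def build_graph(rows):
    """Adjacency list; edges are treated as bidirectional (corridors walk both ways)."""
    graph = defaultdict(list)
    for node_s, node_e, dist, _pic in rows:
        graph[node_s].append((node_e, dist))
        graph[node_e].append((node_s, dist))
    return graph

def shortest_path(graph, start, end):
    """Dijkstra's algorithm; returns (total_distance, [node, node, ...])."""
    pq = [(0.0, start, [start])]
    seen = set()
    while pq:
        dist, node, path = heapq.heappop(pq)
        if node == end:
            return dist, path
        if node in seen:
            continue
        seen.add(node)
        for nbr, w in graph[node]:
            if nbr not in seen:
                heapq.heappush(pq, (dist + w, nbr, path + [nbr]))
    return float("inf"), []
```

The resulting node sequence can then be joined back against the PictureName columns to assemble the photo-guided step list shown to the user.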
After the data collection phase, we moved to the development stage. The web application was built iteratively, with the front-end and back-end code written and continuously tested. Special focus was given to improving the user interface to provide a seamless guidance experience. Once a functional prototype was ready, user testing was conducted: participants were asked to use the website to find specific locations on campus, while team members followed along and recorded their feedback, completion times, and any issues they encountered.
To engage our users, an interactive 3D model of TUM was added to the website under the Lageplan page. We retrieved the 3D model from Google Earth and the background orthophoto “DOP20 RGB” from the free geodata of the Bavarian Surveying Administration. We settled on Blender and Sketchfab to visualize the model and offer a smooth interactive experience. The platform also offers users a customizable VR experience, which could spark more interest in exploring the model from a bird's-eye view.
To enhance the user experience, a brochure was created containing detailed information about transportation, campus layout, entrances, and more. Additionally, Unity was used to develop an augmented reality (AR) feature, which allows users to interact with specific classrooms through an Android app. By scanning designated markers, users can access photos that guide them to the classrooms. A dedicated webpage was also created to provide comprehensive details about the campus layout and other helpful information.
Finally, we conducted user testing and collected feedback through questionnaires. The feedback from these tests was then analyzed, leading to further improvements in both the UI and system function. Enhancements included optimizing the route visualization, refining the photo display, and addressing usability concerns. This comprehensive approach ensured that PicGuide@TUM is not only functional but also highly intuitive and user-friendly.

Challenges we ran into

One significant challenge was the time-consuming process of collecting photos for numerous points and paths. To address this, we divided tasks among team members: recording point and path information, capturing photos, and cross-checking the database and map to ensure accuracy. Additionally, for efficiency, we captured bidirectional path photos at each point, reducing the workload by half.
Another challenge was the initial lack of clarity in the photo guidance, as some photos contained too much information. To resolve this, we spent time annotating the photos with clear blue arrows to indicate directions. This improvement made the guidance much more intuitive and user-friendly, as confirmed in user testing.
For the interactive 3D model of TUM, we initially hoped to model the campus using indoor laser scans captured with a NavVis system. We reached out to Mr. Marcus Hebel and were fortunate to gain access to the TUM-MLS-2016 dataset. Because the data comes as frames from individual LiDAR scanners, we first had to prepare it: we selected the frames needed to model the campus, combined them per scanner, and then merged the point clouds from the two scanners. Even after subsampling to 10 mm, mesh reconstruction in MeshLab failed, and after several trials in Blender the result remained too large for web deployment. We also tried to colorize the mesh with HD aerial photos. After three weeks of trial and error, we decided not to use the point cloud provided by Marcus from Fraunhofer-Gesellschaft. The difficulties included the lack of a UV map, the noise making reconstruction hard, significant server loading times with the Potree viewer, and coarse point resolution for the roof (even after combining with aerial LiDAR).
In the AR interaction design, one challenge was designing effective targets. While the recognition rate for individual targets was high, some targets were similar enough that the system occasionally misidentified one as another, which could affect the overall user experience during the AR interaction. Furthermore, due to the complexity of developing for iOS, we currently offer only a simple Android app, which limits accessibility for users with iOS devices.

What we're proud of

We are particularly proud of our contributions to the frontend and backend code of our website. We often spent long hours brainstorming code structures and designing the database, even pulling all-nighters to fix bugs when inspired by a new idea. Every time the code ran successfully, it brought us immense satisfaction. Seeing users successfully navigate to their destinations during testing made us even prouder, as it showed our project fulfilled its purpose and truly helped others.

What we learned

This was our first time developing a multi-page web application with both a frontend and a backend. We learned how to manage data with Flask in Python and handle data interactions between the frontend and backend. Much of the new knowledge from this semester, such as UI design, geoinformation data structures and algorithms, dashboard design, and user survey testing, was applied to this project, enhancing both our technical skills and practical experience.

What's next

We plan to add more paths and points to expand the project’s coverage, ensuring it includes additional locations across the TUM campus. This expansion will not only make the guide more comprehensive but will also improve the mobile interaction experience by providing users with a broader range of navigational options and up-to-date campus information. Additionally, we aim to further refine the augmented reality (AR) functionality to ensure seamless and accurate target recognition, allowing for a more intuitive user experience.
Looking ahead, we hope that this project can be adopted by future MSc Cartography intakes as a valuable tool to help them quickly familiarize themselves with the campus and classrooms. By providing an easy-to-use guide, we envision that new students will be able to navigate the complex TUM campus with greater efficiency, aiding their academic and extracurricular activities. Furthermore, the project can serve as a foundation for future enhancements, integrating more advanced features such as real-time updates, personalized routes based on individual needs, and even integration with other campus services like transportation or event schedules.

Sources

OpenStreetMap Foundation (2022). Guidelines for Web Map Styling. Retrieved from the OpenStreetMap Wiki.
Leaflet (2023). Leaflet: An Open-Source JavaScript Library for Mobile-Friendly Interactive Maps. Retrieved from the Leaflet documentation.
Fuchsberger, D. (2015, May 17). Turm der TU München [Photograph]. Wikimedia Commons. https://commons.wikimedia.org/wiki/File:Turm_der_TU_M%C3%BCnchen.jpg
Bootstrap (n.d.). Official website. https://getbootstrap.com
Bavarian State Government. Digital Orthophoto RGB 20 cm (DOP20 RGB). https://geodaten.bayern.de/opengeodata/OpenDataDetail.html?pn=dop20rgb
wongchingyeung (2025). TUM_studentModel [3D model]. Sketchfab. https://skfb.ly/ptVxG

Students
Weiyi Gao, Lingyu Tang, Ching Yeung Wong and Ximing Wang

14th intake
Supervisor
Juliane Cron, M.Sc.
Keywords
Picture guide, TUM, Campus, Panoramic, AR
Try it