ABSTRACT
As a UX Designer and Developer for the DiGeKo project, funded by the German Federal Ministry of Education and Research, I led the design thinking workflow to research, design, and develop an app that increases health literacy for people who cannot read or can only read with difficulty. I worked closely with the scientific research team to identify user needs, create personas, and develop a roadmap for the project. I designed an accessible user interface for the app's core feature, an information landscape that can be navigated entirely through audio and visual information. I also developed a flow for the "Anamnesebogen" (medical history form) feature, which aimed to simplify the process of filling out forms at the doctor's office. I developed the app using Flutter and Dart and managed team resources to ensure successful delivery of the project. Furthermore, I illustrated the corporate design (logo, colors).
⦁ Led the design thinking workflow for the DiGeKo project, which aimed to increase health literacy for people who cannot read or can only read with difficulty
⦁ Worked closely with the scientific research team to identify user needs, create personas, and develop a roadmap for the project
⦁ Designed an accessible user interface in Figma for the app's core feature, an information landscape that can be navigated entirely through audio and visual information
⦁ Developed a flow for the "Anamnesebogen" (medical history form) feature, which aimed to simplify the process of filling out forms at the doctor's office
⦁ Developed the app using Flutter and Dart
⦁ Managed team resources to ensure successful delivery of the project
⦁ Illustrated the corporate design (logo, colors)
DOCUMENTATION
I was hired onto this project with a clear goal: we needed a health literacy app for illiterate people. To better understand the needs and potential of our target user group, I guided the team through a design thinking workflow. Based on the scientific interviews with the user group, we created personas. In a convergent analysis we derived needs, keywords, and key contexts. Together with the scientific research team, we then formulated missions and goals. We discussed possible features and weighed their cost against their benefit in order to create a roadmap for the project timeline.
We ended up with two main features. The first is our core: an information landscape that can be navigated entirely through audio and visual information. For this, I had to design a user interface that is accessible to people who cannot read text. To tackle this challenge, we created an interactive prototype in Figma. We used a gamified approach, displaying the health landscape as a city with different buildings. After many iterations and rounds of user testing, we scrapped the building metaphor and settled on a very minimal approach in order to concentrate on the important visual and auditory stimuli. We noticed that even though the user group can hardly read, or not read at all, a text display can still be very useful. To communicate navigation through the landscape, we used illustrations. We tested these illustrations iteratively with the user group to learn how to communicate specific information in the health ecosystem. We developed a tree-branch navigation that is also conveyed through animation, showing users where they are in the process.
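The tree-branch navigation above can be sketched as a simple tree model in Dart, where each topic carries an illustration and an audio clip so the landscape can be explored without reading. All names here are hypothetical, not the actual DiGeKo code:

```dart
// Hypothetical model for the tree-branch navigation: every topic node
// pairs an illustration with a spoken explanation, so a branch can be
// followed using only audio/visual cues.
class TopicNode {
  final String id;
  final String illustrationAsset; // illustration shown for this topic
  final String audioAsset;        // spoken explanation of this topic
  final List<TopicNode> children; // branches the user can follow

  const TopicNode({
    required this.id,
    required this.illustrationAsset,
    required this.audioAsset,
    this.children = const [],
  });
}

// The path from the root to the current node can drive the animation
// that shows users where they are in the landscape.
List<TopicNode> pathTo(TopicNode root, String targetId) {
  if (root.id == targetId) return [root];
  for (final child in root.children) {
    final sub = pathTo(child, targetId);
    if (sub.isNotEmpty) return [root, ...sub];
  }
  return [];
}
```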
The second feature is the "Anamnesebogen", the medical history form a person has to fill out when visiting a doctor. During user research we identified the "Anamnesebogen" as a recurring pain point. For this feature, we designed a flow in which the user starts a dialogue and goes through the form step by step. The user can interact with the form using natural language inputs, such as speech input. Once the form is filled out, the user can print it.
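The step-by-step dialogue could be modeled as a small state machine that advances one question at a time, each question read aloud and answered by speech. This is a minimal sketch under assumed names, not the project's implementation:

```dart
// Hypothetical sketch of the step-by-step "Anamnesebogen" dialogue:
// one question at a time, read aloud, answered by speech or tap.
class FormStep {
  final String question; // read aloud via text-to-speech
  String? answer;        // filled from speech input or a button
  FormStep(this.question);
}

class AnamneseFlow {
  final List<FormStep> steps;
  int _index = 0;
  AnamneseFlow(this.steps);

  FormStep get current => steps[_index];
  bool get isComplete => steps.every((s) => s.answer != null);

  // Record the user's spoken answer and move on to the next question.
  void answerCurrent(String spokenAnswer) {
    current.answer = spokenAnswer;
    if (_index < steps.length - 1) _index++;
  }
}
```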
Finally, we started developing the app using Flutter and Dart. With Flutter, we can use important features such as text-to-speech, so the app can read aloud every text it displays. It can also access native features such as speech-to-text, a function we need for the Anamnesebogen. We developed the app in an agile workflow. Together with my colleague Dennis Przytarski, I identified technical challenges and adjusted the project roadmap accordingly. We managed team resources to form a symbiotic relationship with the academic and media teams, which helped us create the illustrations and video content. Using a lean approach, we tested the app iteratively with the user group, concentrating on one specific feature in each round of testing.
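In Flutter, these two capabilities are typically provided by the community packages flutter_tts and speech_to_text. The sketch below assumes those packages (whose APIs may differ between versions) rather than reflecting the project's actual code:

```dart
// Assumed use of the flutter_tts and speech_to_text packages to read
// displayed text aloud and to capture spoken answers for the form.
import 'package:flutter_tts/flutter_tts.dart';
import 'package:speech_to_text/speech_to_text.dart';

final FlutterTts tts = FlutterTts();
final SpeechToText stt = SpeechToText();

// Read any displayed text aloud, e.g. when the user taps an element.
Future<void> speak(String text) async {
  await tts.setLanguage('de-DE');
  await tts.speak(text);
}

// Capture a spoken answer, e.g. for one step of the Anamnesebogen.
Future<void> listenForAnswer(void Function(String) onAnswer) async {
  if (await stt.initialize()) {
    await stt.listen(
      onResult: (result) => onAnswer(result.recognizedWords),
    );
  }
}
```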
To ensure that the app is maintainable, we integrated a no-code interface that can modify the most important parts of the application. We implemented a Content Management System that connects through a Strapi API and can add, delete, or edit all the content accessible in the app in a modular way.
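Fetching such modular content from Strapi's REST API can be sketched as follows; the collection name `topics` and the response shape follow Strapi's default conventions and are assumptions, not the real endpoint:

```dart
// Minimal sketch of loading modular content from a Strapi CMS over its
// REST API, using the http package.
import 'dart:convert';
import 'package:http/http.dart' as http;

Future<List<dynamic>> fetchTopics(Uri baseUrl) async {
  // 'topics' is a hypothetical collection name for illustration.
  final response = await http.get(baseUrl.replace(path: '/api/topics'));
  if (response.statusCode != 200) {
    throw Exception('CMS request failed: ${response.statusCode}');
  }
  // Strapi (v4) wraps collection results in a "data" array.
  return jsonDecode(response.body)['data'] as List<dynamic>;
}
```

Because all app content flows through this one API, editors can add or rework topics in the CMS without a new app release.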
