WK09 – Project Presentation to the wider class
During the first week after Easter, we had our project presentations, in which we presented our progress to all the supervisors. Everyone prepared a presentation explaining their main idea, the research behind it, how much progress they had made and how they had managed their time. We used Padlet so that supervisors and other students could add comments as we presented.
I got positive feedback on my research and on the product itself. However, I do think the presentation could have been better: I am not sure I managed to communicate the different stages I have gone through or the key aims of the project. I was advised to be very specific when writing my project report and to be careful about the linear way in which I am developing my project. Overall, not many changes were suggested to the actual output of the project, which gave me a green light to keep developing it as I had planned from the beginning.

Continuation of development: re-thinking Tabs navigation
At this stage I had already created my main navigation; however, the logic of the ‘Views’/screens relied on React Router, a library used more for web development than for app development (I realised this when I wanted to create more complex navigations). The library offered an API for navigating between screens, but I could not find many examples or much documentation for using it in this context. Because the interactions between my app’s screens are slightly complex and the documentation was unclear, I decided to switch the whole project to the React Navigation library.
I decided it was easier to start a new project and re-implement the folder structure as well as the opening of the camera, this time using React Navigation for the routing. This allowed me to include a splash screen and an onboarding sequence. At this point I felt comfortable with the file structure and with installing libraries. The logic felt somewhat similar to that of a game, where different ‘states’ are triggered by different user inputs and actions.
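The game-like analogy can be sketched as a small state machine in plain TypeScript. This is a minimal illustration, not the project’s actual code: the state and action names are hypothetical.

```typescript
// Hypothetical sketch: app navigation modelled as a state machine,
// the way game states are triggered by user inputs/actions.
type AppState = "Splash" | "Onboarding" | "Main";
type AppAction = "SPLASH_DONE" | "ONBOARDING_COMPLETE" | "ONBOARDING_SKIPPED";

function nextState(state: AppState, action: AppAction): AppState {
  switch (state) {
    case "Splash":
      // The splash screen finishes on its own and hands over to onboarding.
      return action === "SPLASH_DONE" ? "Onboarding" : state;
    case "Onboarding":
      // Completing or skipping onboarding both lead to the main app.
      return action === "ONBOARDING_COMPLETE" || action === "ONBOARDING_SKIPPED"
        ? "Main"
        : state;
    default:
      // Once in the main app, these top-level actions no longer apply.
      return state;
  }
}
```

Thinking of each screen transition as a state change made it easier to reason about which user action should unlock which screen.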

With a clearer structure for the tabs and the different screens (also called components), I was able to start thinking about the onboarding process. I followed a tutorial that helped me structure my code even further. Within the app there are two key files from which the whole app is structured. App.tsx runs the very first state of the app: this is where I manage the onboarding screen and decide what to show after it has been completed. It also hosts screens that are not part of the tab navigation, which allows me to overlay full screens on top of the bottom tab navigation.
mainContainer.tsx is where I manage the tab navigation and the ‘screen stacks’ that belong to each tab. From these two files I manage the different screens and navigations the user needs to go through. Throughout this process, referring back to Figma has been useful for identifying traps for the user and creating a better user journey.
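The split of responsibilities between the two files can be sketched in plain TypeScript. This is a simplified model under my own assumptions, not the real App.tsx/mainContainer.tsx; the tab and screen names are invented for illustration.

```typescript
// Hypothetical model of the routing split described above.
// App.tsx decides the first root-level screen based on onboarding state;
// mainContainer.tsx owns the tabs and the stack of screens inside each tab.
type RootScreen = "Onboarding" | "MainTabs";

// App.tsx-level decision: what to show when the app starts.
function initialRootScreen(onboardingCompleted: boolean): RootScreen {
  return onboardingCompleted ? "MainTabs" : "Onboarding";
}

// mainContainer.tsx-level data: each tab hosts its own stack of screens
// (names here are placeholders, not the project's actual screens).
const tabStacks: Record<string, string[]> = {
  Home: ["HomeScreen", "DetailScreen"],
  Camera: ["CameraScreen", "PreviewScreen"],
  Settings: ["SettingsScreen"],
};
```

In the real app, React Navigation’s navigators play the role of this data structure: a root stack in App.tsx wraps a tab navigator in mainContainer.tsx, and each tab can nest its own stack.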

Onboarding, Splash Screen and App icon
Once I had the app’s main underlying structure (code-wise), I started to integrate visual elements, beginning with the onboarding process.
The onboarding process consists of three screens, with the option to skip all of them or move to the next one. The draft shows the placement of elements and the interaction between the screens. This resulted from a process of understanding how different elements behave inside a View, how styling works in React Native, and how styles can be made reusable.
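The skip/next behaviour across the three screens can be sketched as a small paging function. Again, this is an illustrative sketch, not the project’s implementation; the function name and shape are assumptions.

```typescript
// Hypothetical sketch of the three-screen onboarding flow:
// "next" advances one page, "skip" ends onboarding immediately.
const ONBOARDING_PAGES = 3;

// Returns the index of the next page, or null when onboarding is finished.
function advance(page: number, action: "next" | "skip"): number | null {
  if (action === "skip") return null; // skipping bypasses the remaining pages
  const next = page + 1;
  return next < ONBOARDING_PAGES ? next : null; // past the last page → done
}
```

Once `advance` returns `null`, the app would record onboarding as complete and hand over to the main tab navigation.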
Developing the splash screen was completely different from building a normal React Native screen. The content had to be placed along other paths, together with the required assets and fonts. This switch of style and folder structure further informed my understanding of the React Native framework, because it demonstrated how code can live in different areas of the project and still be displayed within the app journey.

Generating the app icons also required creating assets for different devices and platforms. I used IconKitchen, which helps automate the creation of those assets, and loaded them into my Android Studio project. Again, this was the result of following a tutorial and documentation on how to add icons to a project.
Next Steps
After creating a more solid structure for my project, my next steps in the development of the app are:
- Integrate camera access (from old code) and camera permissions.
- Integrate a TensorFlow image-detection model within the React Native folder structure.
- Print out the model’s predictions and draw bounding boxes.
