Designing for focus

Design Lead

Avid readers often use timers to immerse themselves fully in a book without losing track of time. Phone timers, however, invite interruptions: notifications vie for attention and tempt you to do something else. After an initial proof of concept, I led the user research, ideation, and UX validation phases to build a timer tool that met users’ expectations. With the reading timer, users can finally focus on their reading goals without checking their phones for a break.

“When I opened the book, it asked me if I wanted to set a timer for 10 minutes. It said that setting a timer and then taking short breaks helps readers focus. I like that. It limits burnout for me.”
User feedback
“I used the timer, which was helpful to get me into the mood for reading, highlighting, and note-taking. I enjoyed the timer as a new feature, and I would highly encourage anyone to give it a chance.”
User feedback
Design: Baruch Pi, Olamide Jegede


After health, time, and relationships, money is one of the most critical assets in our lives. We feel frustrated, even suspicious, if we don’t have immediate access to our bank statements. But when your bank receives too many server requests at once, or when you are stuck with a weak internet connection, the endless spinning wheel of your banking app can drive you insane.

In this series, I want to find alternative loading-state solutions that leave us less frustrated while waiting for an important piece of information.


For the first prototype of this series, I decided to build a banking app passcode screen.

I started by sketching the user interface and the animation flow to get a better understanding of the final product.

Building the Animation

I used two intersecting 3D cubes to symbolise the transmission of data. The motion had to be snappy and loopable to make an excellent replacement for the infamous spinning wheel. The loop duration also could not exceed one second, so the animation would not stand in the way of users who can access the app without any delay.

For the 3D graphic, I focused on keeping the lighting and texturing at a bare minimum to ensure a small file size and therefore fast app performance. However, the 3D details needed to be complex enough to give a sense of discovery every time the loading state takes a bit longer than usual.


In earlier design versions, I placed the 3D graphic above the passcode element as a stand-alone cube (see sketches). But this design direction would have put too much emphasis on the loading state, something that should ideally happen in the background. The final implementation feels like a good middle ground: the 3D graphic is nice eye candy but does not distract from the passcode screen.

In this series, I will continue to explore advanced loading state animations and the implementation of 3D graphics. Stay tuned.


With AR glasses approaching mass-market adoption in the coming five years, we have the chance to reimagine all user interface components from the ground up. In this series, I adapt concepts from nature to find new IX patterns that could replace the spinning wheel on AR glasses. Unlike VR devices, which immerse you in multimedia content for a limited amount of time, AR glasses are meant to be worn over extended periods. For the first time in human history, we will interface with computer tasks directly in front of our eyes instead of on external screens.

That shift will demand a radical rethinking of existing UI paradigms, since we will be much more attuned to any disruption of our workflows. Where before we could put the phone aside while something loaded, we will soon have to wait for the results right in front of our eyes. The way we use loading screens and display background tasks will therefore determine the customer satisfaction and adoption rate of AR glasses as a replacement for the smartphone.

Building the Animation

Nature itself is the rule book for all sorts of motion that is pleasing to the eye and relaxing for the mind. Why not borrow from these concepts and make computing tasks on AR devices less methodical and more in tune with our nature as human beings?

For my first exploration of this series, I looked at the dilation-contraction movement of the human eye to find a motion pattern that could work as a metaphor for the loading process between two user interface states.

I first built the voice assistant’s user interface to define the timing and placement of the loading animation.

Once I had a rough idea of the final animation, I started building the iris dilation-contraction animation, which I would place between the voice command and the search results screen.


Without a test device, it wasn’t easy to define the right amount of opacity for the animation to work in differently lit environments. I turned to the Microsoft HoloLens documentation for advice on UI legibility in AR environments.

I placed the final animation on top of a real-world environment as a proof of concept for different lighting conditions (bright sky vs muted colours) and the legibility of the UI.

For the upcoming explorations of this series, I will try more concepts from nature and explore different ways to display data in the AR environment.


The iPad has come a long way towards replacing the traditional laptop. With the recent updates to the iPad operating system, it has become an excellent computer replacement for the casual user. But to appeal to power users as well, it needs a more straightforward and powerful way to interact with multiple apps at the same time.

When multitasking on the iPad, the application dock at the bottom of the screen is your gateway to opening multiple apps at the same time. Any app not placed in the dock can’t be immediately opened in split view. I often find myself needing to open utility apps like Excel or the calculator. But because I don’t use them frequently enough to keep them in my dock, I have to interrupt my current task, go back to the home screen, and find the utility app first.

If you are lucky and your desired app is in the dock, you still need to drag and drop it next to the already opened app. That often takes a couple of seconds and mentally pulls you out of your current workflow.

But why so complicated? On macOS, you can open multiple apps on your desktop without much hassle. So why not translate this idea to the multitasking view on iPadOS?

My Solution

When long-pressing an app in the multitasking window, you can open multiple apps by tapping their preview windows. A side-by-side picture of the selected apps immediately shows up at the top of the screen, from where you can navigate to your newly created desktop.

In the desktop view, you can easily add and remove apps from the split view by swiping the handles at the top and right of the screen. The latter brings you back to the multitasking window, from where you can choose additional apps.


While creating the mockup, I gained a deep appreciation for the IX team at Apple. How the background gently fades away when you open the multitasking view, and how apps jump into position in split-screen mode, is something I had hardly ever noticed. Only when I tried to replicate the animations for my mockup did I discover the well-paced timing and fluidity of the transitions. Still, the user experience of the current multitasking implementation is not ideal for iPad power users. With the ongoing unification of the iPad and Mac operating systems, I hope Apple will rethink multitasking on the iPad soon.


The initial client brief suggested exploring a visual storytelling approach and highlighting the lifestyle aspect of the product, instead of focusing solely on the app features.


I developed several placeholder layouts to inform the initial design direction. Several sprints followed to map out the site architecture and narrow down the layout options. Once we found a good balance with the content copy for each sub-page, I finalised the designs for production.


I set up the CMS and site infrastructure in Webflow and built out the layout and content components so they could be easily maintained by the client.


After testing and tweaking the responsive layout for desktop, tablet and mobile devices, the website went live in June 2021.
