Jeffrey Slort
BI Developer

Working for All Your BI: The week of Jeffrey


Hi! I’m Jeffrey and I’m a BI developer at All Your BI.
I was trained as a technical business specialist, but moved into IT soon after my studies. In the early days I taught myself programming, and by now I have built up some seven years of experience in building and managing various BI and other platforms.

At All Your BI I work at the interface between business and IT, largely on the technical solutions that help our clients accelerate on the basis of their data. I also work on a number of products we are developing internally.

With this blog post I’d like to show you what my working week looks like.

Monday

Around 7.30 I open up my laptop. Right now I go to our club house – we refuse to call it an office – about once a week. From the 11th floor I greet the sun rising over the port of Rotterdam, in the company of a double cup of coffee. Believe me, it’s a view you never take for granted. At 8.45 I join our team’s daily stand-up, where we discuss who is going to do what that day.

Monday always starts with a quick check of our monitoring systems. Over the weekend, three event-driven Azure Logic Apps ran successfully and the load on our servers stayed nicely stable. A good start to the day!
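For the curious: here is a minimal sketch of how such a weekend check could be scripted. It assumes the azure-identity and azure-mgmt-logic Python packages and uses hypothetical subscription, resource group and workflow names; our actual monitoring setup is a good deal more extensive.

```python
# Sketch: list this weekend's runs of a few Logic Apps and flag failures.
# Assumes `pip install azure-identity azure-mgmt-logic` and that
# DefaultAzureCredential can authenticate (e.g. after `az login`).
from datetime import datetime, timedelta, timezone

from azure.identity import DefaultAzureCredential
from azure.mgmt.logic import LogicManagementClient

SUBSCRIPTION_ID = "<subscription-id>"        # hypothetical
RESOURCE_GROUP = "rg-bi-platform"            # hypothetical
WORKFLOWS = ["la-load-sales", "la-load-finance", "la-load-hr"]  # hypothetical

client = LogicManagementClient(DefaultAzureCredential(), SUBSCRIPTION_ID)
since = datetime.now(timezone.utc) - timedelta(days=2)  # roughly the weekend

for workflow in WORKFLOWS:
    for run in client.workflow_runs.list(RESOURCE_GROUP, workflow):
        if run.start_time and run.start_time >= since:
            marker = "OK  " if run.status == "Succeeded" else "FAIL"
            print(f"{marker} {workflow} {run.start_time:%a %H:%M} -> {run.status}")
```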

Halfway through the morning, it is time for an online introductory meeting with a new client. They use a particle accelerator to produce radioactive substances that can be used, for example, to trace cancer cells in patients. We’re given a digital tour in the form of a presentation, in which we get to see one of these particle accelerators. I’m often astonished by the fantastic projects we contribute to. With a BI Discovery project we will deliver a Proof of Concept based on Excel sources.

Our Club House at Lloydstraat 210 in Rotterdam

Tuesday

Today I plunge head first into a complex issue! A client wants to add data from their accounting software to an existing data set. There is only one teensy-weensy problem: the software package is hosted by the IT supplier, and we cannot get direct access to it from our Azure environment.

But it gets better: the software package turns out to run on an extremely dated database, and the drivers needed to make the connection are no longer supported. And, to top it all, the implementation partner who took care of installing the software back in the day tells me that the contact person has left. This is not looking good, but… never give up, never surrender!

During the conference call with the implementation partner we walk through a number of possible solutions. Approaching the database directly, outside the application, is impossible. There is no API we can use to get data out of the system. The options are getting fewer and fewer… While talking, we hit on the idea of generating automated CSV exports and placing them in a shared location via SFTP. During a first test we discover that it is not possible to set up an SFTP connection from the application’s network. Sometimes it takes a fair whack of creativity to get data from one place to another. In the eventual solution, we receive an email overnight with the encrypted files, and we process those into the client’s data warehouse. This solution is far from ideal, but at least we have been able to get the client a step further.
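To give you an idea, here is a minimal sketch of what that overnight processing step could look like. The mailbox, host and table names are hypothetical, the decryption step assumes PGP files handled via the python-gnupg package, and the warehouse load is reduced to a single pandas to_sql call; the real pipeline wraps error handling and logging around every step.

```python
# Sketch: pull last night's encrypted CSV attachments from a mailbox,
# decrypt them, and load them into a staging table in the warehouse.
import email
import imaplib
from pathlib import Path

import gnupg                      # pip install python-gnupg
import pandas as pd
from sqlalchemy import create_engine

IMAP_HOST = "imap.example.com"            # hypothetical
MAILBOX_USER = "dataloads@example.com"    # hypothetical
MAILBOX_PASS = "<secret>"                 # use a vault in real life
ENGINE = create_engine("mssql+pyodbc://dwh")  # hypothetical warehouse DSN
gpg = gnupg.GPG()

with imaplib.IMAP4_SSL(IMAP_HOST) as imap:
    imap.login(MAILBOX_USER, MAILBOX_PASS)
    imap.select("INBOX")
    _, data = imap.search(None, "UNSEEN")          # last night's mail
    for num in data[0].split():
        _, msg_data = imap.fetch(num, "(RFC822)")
        msg = email.message_from_bytes(msg_data[0][1])
        for part in msg.walk():
            name = part.get_filename()
            if not name or not name.endswith(".csv.gpg"):
                continue
            encrypted = Path("/tmp") / name
            encrypted.write_bytes(part.get_payload(decode=True))
            decrypted = encrypted.with_suffix("")  # strips the .gpg suffix
            with encrypted.open("rb") as f:
                gpg.decrypt_file(f, output=str(decrypted))
            df = pd.read_csv(decrypted, sep=";")
            df.to_sql("stg_accounting", ENGINE, if_exists="append", index=False)
```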

Tired but satisfied, I shut my laptop.

Wednesday

I spend the early part of the day processing feedback from an earlier Proof of Concept brainstorming session.

We mark the middle of the week by delivering a Power BI Discovery for Rotterdam City Council. In the morning I go over the presentation once more, make a few minor amendments and quickly grab a coffee before starting the call.

For the Proof of Concept we produced a combined data set on the basis of data extracts from two systems, for example to calculate the Overall Equipment Effectiveness (OEE) of the assets managed by the client. On the dashboard we are delivering, it is possible to drill into the causes of reduced availability of these assets, for example. This can also provide insight into the effectiveness of the maintenance plan and the budgets required in the coming years to tighten it up.
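For context: OEE is conventionally the product of three ratios – availability, performance and quality. A minimal sketch, with made-up numbers (in the actual solution this calculation lives in the Power BI data model):

```python
# Sketch: classic OEE calculation as the product of three ratios.
def oee(run_time_h, planned_time_h, actual_output, theoretical_output, good_output):
    availability = run_time_h / planned_time_h        # share of planned time spent running
    performance = actual_output / theoretical_output  # actual speed vs. ideal rate
    quality = good_output / actual_output             # share of output without defects
    return availability * performance * quality

# Example: 7h of run time in an 8h shift, 850 of a possible 1000 units, 830 good.
print(f"OEE = {oee(7, 8, 850, 1000, 830):.1%}")   # -> OEE = 72.6%
```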

By sharing my screen, I guide the client through our process and give a brief demonstration of the first dashboard. It’s beautiful how you can provide organisations with valuable insights using relatively little data. From the demo we slip into a brainstorming session about what is possible with the outcomes: for example, what other insights might become available if we added more sources to the data model.

The energy and enthusiasm during these sparring sessions are generally off the scale, and that creates mountains of possibilities. After the call I immediately get cracking on processing the feedback, and I start on the follow-up process based on the brainstorming session.

Thursday

Today I will focus on one specific project: the development of our data hub. We are building a hub (we’re still looking for a good name) to which every organisation can connect its systems (for now Exact Online, AFAS, Salesforce and NMBRS), giving them direct access within a few seconds to a large number of standard reports in Power BI.

First I start with a BIG coffee. Then I crank up the laptop and open my editor, and cheerfully coloured bits of abracadabra (code, that is) for our data hub appear on my screen. Headphones on (the neighbours are still asleep!) and a little browse through Spotify’s Daily Mixes. I select the best playlist in a tried-and-tested but totally random way, and I’m good to go.

The programme for the day: developing a connection to Exact Online. Together with the team, I have already worked out the prerequisites and the required data model.

It concerns a REST API with an OAuth2 authentication layer. Reading through the documentation, I’m amazed once more by the massive differences between comparable APIs of different applications. Personally, I use the “tracer bullet” method, as Andy Hunt and David Thomas call it in The Pragmatic Programmer, to develop such a connection. On a Jurassic piece of paper, I produce a rough outline of the architecture of the solution. Then I set up the skeleton of the application as quickly as possible, and through iterations I arrive at the end product.
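To make the tracer-bullet idea concrete, here is a minimal sketch of such a skeleton: one thin, end-to-end path from token refresh to a single API call. The URLs follow Exact Online’s public OAuth2 documentation, but treat the endpoint, division code and credentials as illustrative placeholders rather than our production connector.

```python
# Sketch: tracer-bullet skeleton for an Exact Online connector.
# One end-to-end path: refresh the OAuth2 access token, call one
# endpoint, print what comes back. Everything else is iterated in later.
import requests

TOKEN_URL = "https://start.exactonline.nl/api/oauth2/token"
API_BASE = "https://start.exactonline.nl/api/v1"
DIVISION = 123456                               # illustrative division code
CLIENT_ID, CLIENT_SECRET = "<id>", "<secret>"   # from the app registration
REFRESH_TOKEN = "<refresh-token>"               # obtained via the authorization flow

def get_access_token() -> str:
    """Exchange the stored refresh token for a fresh access token."""
    resp = requests.post(TOKEN_URL, data={
        "grant_type": "refresh_token",
        "refresh_token": REFRESH_TOKEN,
        "client_id": CLIENT_ID,
        "client_secret": CLIENT_SECRET,
    })
    resp.raise_for_status()
    return resp.json()["access_token"]

def fetch(endpoint: str) -> list:
    """Fetch one page of results from a REST endpoint as JSON."""
    resp = requests.get(
        f"{API_BASE}/{DIVISION}/{endpoint}",
        headers={
            "Authorization": f"Bearer {get_access_token()}",
            "Accept": "application/json",
        },
    )
    resp.raise_for_status()
    return resp.json()["d"]["results"]   # OData-style response body

# The tracer bullet: if this prints a count, the whole path works.
print(len(fetch("crm/Accounts")), "accounts fetched")
```

The point of the tracer bullet is that this one thin path touches every layer – authentication, transport and parsing – before any real functionality is built on top of it.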

An attempt at a Thursday afternoon smoothie 😉

Friday

Fridays are always special. Every other week, we work together on an “All Your BI Lab”. These are internal products, such as the data hub I worked on yesterday. This week we will all get to work on a liquidity forecast.

Last week, Jesse prepared the data set so we can make a flying start. We split into two teams of 3 people each, and before we start some sharp comments fly across the table. Game on!

Our team will attempt to forecast liquidity on the basis of X aspects. We split the assignment into small parts and get on with it. First up is predicting income on the basis of concluded contracts and their corresponding average payment terms; a rough sketch of that step follows below. As soon as we have implemented this successfully, we add predictions based on the current sales funnel. The second step is predicting expenditure, and that is proving more difficult. The data comes from the AFAS environment of one of our clients. Over the last three years this client grew quickly, and that makes predicting expenditure, even over a period of less than three months, a massive challenge.
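As promised, a rough sketch of that first income step, assuming a pandas DataFrame of contract invoices. The column names and the 30-day fallback payment term are hypothetical, and the real model layers the sales-funnel predictions on top:

```python
# Sketch: expected cash-in per month from concluded contracts,
# shifting each invoice date by the debtor's average payment term.
import pandas as pd

# Hypothetical input: one row per contract invoice line.
invoices = pd.DataFrame({
    "debtor": ["A", "A", "B", "C"],
    "invoice_date": pd.to_datetime(
        ["2021-01-05", "2021-02-05", "2021-01-20", "2021-02-10"]),
    "amount": [10_000, 10_000, 4_500, 7_200],
})
# Average payment term per debtor, learned from historical payments.
payment_terms = {"A": 14, "B": 45}   # in days; debtor C is unknown

terms = invoices["debtor"].map(payment_terms).fillna(30)   # fallback: 30 days
invoices["expected_receipt"] = (
    invoices["invoice_date"] + pd.to_timedelta(terms, unit="D"))

cash_in = (
    invoices.groupby(invoices["expected_receipt"].dt.to_period("M"))["amount"]
    .sum()
    .rename("expected_cash_in")
)
print(cash_in)   # expected incoming cash per calendar month
```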

The second team gets to work with a calculation model that one of the CFOs in our network produced in Excel. They translate the calculations into Power BI DAX code and add parameters for choosing between various scenarios.

At the end of the afternoon, we present our solutions to each other. Sadly we have to admit that the solution of team 2 is way more accurate than ours. You can’t win ‘em all.

Even in these times, we keep our Friday afternoon get-together with the team going via a call. We have a beer, go over both solutions and come up with some improvements. If we invest a little more time, we can arrive at a general model for preparing a liquidity forecast. A successful Lab! We decide to put it on our backlog, and then it’s time to wrap up the week.