Our final week of training (eek)!!! Join me for my weekly reflection on life at the Data School.
Monday - APIs
Our final week of training got started with APIs, which we had looked at before for one of our client projects but hadn't really understood. An API (Application Programming Interface) is a way for two programs to talk to each other, and they're probably something you've used before without realizing it. For example, when you log into a website using your Google login, the two programs need to talk to each other, and they do so through APIs! We spent the day looking at different APIs, such as this one on Brewdog beers, extracting and cleaning the data so that it's in a usable format.
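If you're curious what calling an API actually looks like, here's a rough sketch in Python using the requests library. The endpoint and field names here are just assumptions for illustration, not necessarily the exact API we used:

```python
import requests

# Hypothetical endpoint for a beers API (the exact URL is an assumption)
URL = "https://api.punkapi.com/v2/beers"

# Ask the API for a small page of results
response = requests.get(URL, params={"page": 1, "per_page": 5}, timeout=10)
response.raise_for_status()

# The API answers in JSON: here, a list of beer records (dictionaries)
beers = response.json()

# Pull out a tidy subset of fields so the data ends up in a usable, table-like format
for beer in beers:
    print(beer.get("name"), beer.get("abv"))
```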
Tuesday - Apps in Alteryx
Tuesday was a pretty technical day in which we learnt how to create an app in Alteryx. An analytical app is similar to a macro, but it reduces the amount of work the user has to do. Instead of the user dropping a macro into another workflow, an app presents them with an interface, letting them control certain aspects of the workflow and get the output they want based on the parameters they choose. This felt like a pretty daunting task, and I found apps quite fiddly to configure exactly how you want, but it was definitely worth it in the end.
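Alteryx is all drag-and-drop rather than code, but the idea behind an analytic app is easier to see with a rough Python analogy: a "workflow" wrapped in user-supplied parameters. Everything in this sketch (file name, columns, function) is made up for illustration:

```python
import pandas as pd

def beer_report(data_path: str, min_abv: float, max_rows: int) -> pd.DataFrame:
    """A 'workflow' whose behaviour is controlled by user-supplied parameters,
    much like the interface tools of an Alteryx analytic app."""
    df = pd.read_csv(data_path)             # the input data
    filtered = df[df["abv"] >= min_abv]     # one parameter controls a filter
    return filtered.head(max_rows)          # another controls the size of the output

# The 'interface': the user just picks the parameters, the workflow does the rest
print(beer_report("beers.csv", min_abv=5.0, max_rows=10))
```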
To test what we'd gone through on apps, I tried Alteryx Challenge 115, in which I had to create some nested interface elements that let users select an option and then change parameters within it.
Wednesday - Advanced LODs and Parameter Actions
Wednesday was our final Tableau lesson, in which we went into advanced LODs and set up some more parameter actions. Throughout training, Tableau days have always been my favorite, as they're a great chance to practice some creativity and get inspiration for Makeover Mondays. We went over parameter actions such as chart-swapping parameters (which I used in my Makeover Monday this week), dynamic zone visibility (which can actually now be used instead of chart-swapping parameters), and creating buttons to change said parameters!
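If LODs are new to you, a FIXED LOD like {FIXED [Region] : SUM([Amount])} computes an aggregate at a set level of detail and attaches it to every row. Here's a rough pandas analogy of that idea (the data and column names are made up, and this isn't Tableau syntax):

```python
import pandas as pd

# Hypothetical sales data: region, order id, amount
sales = pd.DataFrame({
    "region": ["North", "North", "South", "South", "South"],
    "order_id": [1, 2, 3, 4, 5],
    "amount": [100, 250, 80, 120, 300],
})

# The FIXED LOD equivalent: aggregate per region, broadcast back to every row
sales["region_total"] = sales.groupby("region")["amount"].transform("sum")

# Each row can now be compared against its region-level aggregate
sales["share_of_region"] = sales["amount"] / sales["region_total"]
print(sales)
```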
Thursday - Web Scraping
Our last full day of training! Sob! Today we looked at web scraping in Alteryx, which is a relatively quick way of extracting information from a website, similar to using an API. Web pages are built on HTML, a markup language you can see if you right-click on a page and select 'Inspect'. To extract data from a website, you paste the URL into a Text Input tool in Alteryx, then pull the page's HTML with the Download tool. From there you can start parsing out the information you need, building up a table with whatever you want to extract. I actually really enjoyed trying this out myself, and managed to scrape some data on which movies passed or failed the Bechdel test from this website.
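For anyone curious, here's roughly what the same download-then-parse approach looks like in Python. The URL and the HTML pattern below are made up for illustration, not the actual site I scraped:

```python
import re
import requests

# Hypothetical URL standing in for the page that lists movies and their results
URL = "https://example.com/bechdel-movies"

# Step 1: download the page's HTML (the equivalent of Alteryx's Download tool)
html = requests.get(URL, timeout=10).text

# Step 2: parse out the bits you care about - here, a made-up pattern where each
# movie sits in a list item like <li class="movie">Title (Pass)</li>
rows = re.findall(r'<li class="movie">(.*?) \((Pass|Fail)\)</li>', html)

# Step 3: build the results up into a table
for title, result in rows:
    print(f"{title}\t{result}")
```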
Friday - Personal Development
Tomorrow we have time for personal development before dashboard week starts next week. I'm going to focus on some personal projects I've been working on, and hopefully get closer to producing something interesting!