I’ve been a Microsoft Data professional for over 20 years. Most of that time I’ve spent in the SQL Server stack: the core query engine, SSIS, SSRS, and a little SSAS. But times changed, and the business problems grew more complex. As they did, I looked at other technologies to try to answer those questions…
Category: Modern Data Estate
Notebooks Explore Data
On a recent engagement, I was asked to provide best practices. I realized that many of the best practices hadn’t been collected here, so it’s time I fix that. The client was early in their journey of adopting Databricks as their data engine, and a lot of the development they were doing was free-form. They…
The Big Cost in Data Science
You hear time and time again that 50 to 80 percent of a data science project is spent on data wrangling: munging and transforming raw data into something usable. For me personally, I’ve automated a lot of those steps. I’ve built tools over the last 20 years that help me do more in less time…
NOAA Radar and Severe Weather Data Inventory
After I finished evaluating the Storm Events Database from NOAA, I was convinced we needed to look for machine-recorded events. When you start poking around the NOAA site looking for radar data, you’ll find a lot of information about how they record this data in a binary block format. Within this data, you’ll find measurements…
Data Quality Issues
This entry picks up the story behind my first data science project: predicting hail damage to farms. In this article we identify data quality issues in our first data source. Property and Crop Damage: In the NOAA documentation, these two columns are meant to record how much property and crop damage occurred in a given…
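The excerpt cuts off before the specifics, but as a rough illustration of the kind of cleanup the post is describing, here is a minimal Python sketch. It assumes the two damage columns arrive as suffixed strings such as "10.00K" or "2.5M"; the column names and sample values are placeholders of mine, not details taken from the post.

```python
# Hypothetical sketch: normalizing damage columns that arrive as strings
# such as "10.00K", "2.5M", or "" (blank). Column names are assumed examples.
import pandas as pd

MULTIPLIERS = {"K": 1_000, "M": 1_000_000, "B": 1_000_000_000}

def parse_damage(value):
    """Convert a damage string like '10.00K' into a numeric dollar amount."""
    if value is None or str(value).strip() == "":
        return 0.0
    value = str(value).strip().upper()
    suffix = value[-1]
    if suffix in MULTIPLIERS:
        return float(value[:-1]) * MULTIPLIERS[suffix]
    return float(value)

# Example usage against the two columns discussed in the post.
events = pd.DataFrame({"DAMAGE_PROPERTY": ["10.00K", "2.5M", ""],
                       "DAMAGE_CROPS": ["0.00K", "", "1B"]})
for col in ["DAMAGE_PROPERTY", "DAMAGE_CROPS"]:
    events[col + "_USD"] = events[col].apply(parse_damage)
print(events)
```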
Data Science Project 1: Predicting Hail Damages
Early on in my new role, I was asked to find out how risky it would be to offer hail insurance for a given property. If you haven’t worked with insurance before, here are the basics: you’re placing a bet that something bad is going to happen. The insurer is betting that it won’t…
Data Analysis…can we automate this?
As some of you know, I’ve moved from consulting back to being a full-time employee at Crop Pro Insurance. There is so much opportunity in this role. First of all, it gives me my first full-time data science credit. I also get to build a team to support data science projects. On top of that,…
Metadata Model Update
As I began learning Biml, I developed my original metadata model to help automate as much of my BI development as I could. That model still works today, but as I work with more file-based solutions in Azure Data Lake and some “Big Data” solutions, I’m discovering its limitations. Today I’d like to talk…
U-SQL: Automating Schema on Read
Moving to U-SQL for your ETL can feel like a step back from the drag-and-drop functionality we have in SSIS. But there is one great thing about your ETL being defined in text rather than a UI: you can automate it! Today I’m going to show you how you can automate the part…
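The excerpt is truncated before the how-to, but the core idea, treating the script as text and rendering it from metadata, can be sketched in a few lines of Python. This is only an illustration under assumed names; the build_extract helper, the sample columns, and the file path are mine, not from the post.

```python
# Hypothetical sketch of the idea in the post: because a U-SQL script is plain
# text, an EXTRACT statement can be generated from metadata instead of typed
# by hand. The rowset name, columns, and file path below are made-up examples.

def build_extract(rowset_name, file_path, columns):
    """Render a U-SQL EXTRACT ... USING Extractors.Csv() statement from column metadata."""
    col_lines = ",\n            ".join(f"{name} {usql_type}" for name, usql_type in columns)
    return (
        f"@{rowset_name} =\n"
        f"    EXTRACT {col_lines}\n"
        f"    FROM \"{file_path}\"\n"
        f"    USING Extractors.Csv(skipFirstNRows: 1);\n"
    )

columns = [("EventId", "int"), ("State", "string"), ("DamageProperty", "decimal?")]
print(build_extract("storm_events", "/raw/storm_events/{*}.csv", columns))
```

Point the metadata at a table or config file instead of a hard-coded list and the same function can emit the EXTRACT for every file in the lake.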
U-SQL and ETL Processing–Part 2
We’ve been going through a simple U-SQL script to perform some ETL processing in Azure Data Lake. Last time, we started by covering some basic syntax like variables and expressions. Now, we’ll pick up with some transformations. SSIS v U-SQL: In traditional data warehouse ETL, we’ve been spoiled by the ease of drag and drop…