Delta Sharing – Data Providers

Posted on January 31, 2023 (updated February 18, 2023) by slowder

Setting up Delta Sharing in Databricks is straightforward once you understand the diagram provided in the Azure Databricks documentation.

Delta Sharing is implemented as part of Databricks' Unity Catalog. Unity Catalog is the official data governance solution for Databricks; you can think of it as an extension of the metastore, or as Databricks' version of a data catalog. It lets you define datasets, columns, and permissions from a single service. Setting up Unity Catalog is outside the scope of this article, but I'll put together a walkthrough later. For the rest of this article, I'll assume you have Unity Catalog set up and your Databricks workspace connected to it.

In Unity Catalog, you can define databases and objects; you can see these represented down the left side of the diagram. Delta Shares are also defined in Unity Catalog. Once a share is defined, you can add tables to it and then grant recipients access to the share. Once configured, those recipients can read the data you've shared.

Define the Share

From your workspace, click Data in the left pane. You can do this from either the Data Engineering or SQL experience.

When the page loads, you'll see all the datasets defined in your Unity Catalog. If you haven't defined any datasets yet, go do that now. Then click Delta Sharing in the left half of the Data Explorer.

Once the page loads, you'll see any shares you've defined previously, as well as any shares you have access to in the current workspace. Click Share Data to define your new share.

Next, you’ll be prompted for a share name and comment. Share names cannot contain symbols or spaces. After you’ve entered your share name, click create.

You can add any table already defined in your Unity Catalog. Click Add tables to select the tables you wish to share.

You can filter your table list to a specific database and schema. There’s also a free-form text box to help you find what you’re looking for. In my case, I’m going to share the NYC taxi data. If you open the Advanced table options section, you can also define an alias for your schema or table if you need a more human-readable name.

You can also limit the shared rows by adding a column filter (think column = 'value'). It's also possible to enable Change Data Feed, allowing consumers to query the data by version. If the data you're sharing gets batch updates, this lets users see each batch's changes. That can be useful for feeds like the US unemployment rate, which has a monthly release cycle.

Once you have selected your options, click Save to add your tables to the share.
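For reference, the same add-table step can be expressed in SQL. The sketch below uses placeholder names (a hypothetical main.nyctaxi.yellow_trips table with a trip_year partition column); the PARTITION, AS, and WITH HISTORY clauses correspond to the row filter, alias, and Change Data Feed options described above.

ALTER SHARE nyc_taxi_share
  ADD TABLE main.nyctaxi.yellow_trips      -- three-level catalog.schema.table name (placeholder)
  PARTITION (trip_year = '2022')           -- optional: share only the matching partitions
  AS nyctaxi.yellow_trips_2022             -- optional: more readable schema.table alias
  WITH HISTORY;                            -- optional: lets recipients query the data by version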

When you return to the share page, click Add recipient. If this is your first recipient, click Create new recipient. Otherwise, you can select an existing recipient to add to this share.

On the next screen, you need to identify the recipient. This isn't an email address or login; it's just an identifier to remind you who this recipient is. It can be a company name, a user name, or anything else. Again, no symbols or spaces are allowed.

If the recipient is going to access your data through another Databricks workspace, and that workspace is attached to a Unity Catalog metastore, you can request their metastore identifier. They can get it by running the following query in their workspace.

SELECT CURRENT_METASTORE()

This will return an identifier in the format <cloud>:<region>:<GUID>. Paste that into the Sharing identifier box. Finally, you can add a description to help you remember who this recipient is later. Once you've entered all your information, click Create and add recipient.
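Recipient creation can also be scripted. The sketch below uses a made-up name and sharing identifier; the GRANT line corresponds to the Add step described next, and leaving out USING ID creates an open recipient instead, which is the path that produces a credential file.

CREATE RECIPIENT IF NOT EXISTS contoso_analytics
  USING ID 'azure:westus2:12a34567-89b0-1234-cdef-56789abcde01'  -- their CURRENT_METASTORE() value
  COMMENT 'Contoso analytics team';

GRANT SELECT ON SHARE nyc_taxi_share TO RECIPIENT contoso_analytics;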

When you return to the Add recipient screen, click Add to continue. You’ll then get a link to download the share file you’ll give to the recipient to access your data.

You can give this link to your recipient, or follow it yourself to get the share file. The link lands on an activation page with a Download Credential File button; clicking it downloads the share file in your browser.

Once the consumer downloads that file, all the code from the previous blog entry works; they just reference this new share file instead.
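If your recipient is another Databricks workspace instead (the metastore-identifier path above), they don't need a credential file at all; they can mount the share directly as a catalog on their side. A rough sketch with placeholder provider, share, and table names:

-- Run in the recipient's workspace; SHOW PROVIDERS lists the provider name to use
CREATE CATALOG IF NOT EXISTS nyc_taxi_shared
  USING SHARE acme_provider.nyc_taxi_share;

SELECT * FROM nyc_taxi_shared.nyctaxi.yellow_trips_2022 LIMIT 10;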

Conclusion

Creating Delta Shares in Databricks is pretty straightforward, as long as you already have Unity Catalog set up. As usual, if you have any questions, please let me know.
