Biml and Oracle — Connections and Data Types

I’ve had two different clients approach me in the past few weeks asking how to deal with some odd behavior in Biml when working with Oracle sources. I have to be honest, working with Oracle is painful. The good news is there are ways to work around the pain. You can even build your own library to deal with the pain automatically, once you know where the problems are. This blog entry is to help you identify where those problems are when dealing with data types.

Getting Started

When you get started with your first Oracle instance, you will want to grab a couple of things from Oracle's website. First is Oracle Developer Tools for Visual Studio 2017; my most recent client that uses Oracle is still on VS 2017, and I'm using version 18.3.0 of the developer tools. I prefer ADO.Net connections, so I also grabbed Oracle Net Services version 12.1.0.2. And since being able to query the source is helpful on occasion, also snag a copy of Oracle SQL Developer.

With those installed, and after rebooting, you should be safe to carry on. Some people have reported problems with mismatched 32-bit and 64-bit versions. On my latest dev machine I have not seen those issues, and all my generated artifacts run in 64-bit mode. So, your mileage may vary here.

Connections

In order to extract metadata from your Oracle instance, you're going to need to define at least one connection in Biml. You can define either an ADO.Net or an OLEDB connection; as long as the driver is on your machine, you can use it. In the following demo I've defined both so you can switch between the two and see that the end result doesn't change. I store the following script in 00_BuildConnections.biml.

<#@ template tier="0" designerbimlpath="Biml" #>
<Biml xmlns="http://schemas.varigence.com/biml.xsd">
    <Connections>
        <AdoNetConnection Name="AdoNetOracleSource"
            ConnectionString="Data Source=oracleSourceServer;User ID=OracleUser;Password=0r4c13SUX!;Persist Security Info=True;Unicode=True"
            Provider="System.Data.OracleClient" />
        <OracleConnection Name="OracleSource"
            ConnectionString="Data Source=oracleSourceServer;User ID=OracleUser;Password=0r4c13SUX!;Persist Security Info=True;" />
    </Connections>
</Biml>

With these defined, we can create a script that will extract our metadata and fix a couple of the known issues.

Metadata

Most of you who use Biml are familiar with what metadata is: it's the data that describes your data. In SQL Server terms you may think of sys.tables and sys.columns; those two views describe the tables (and views) as well as the columns in those objects. Many people work through example Biml code that uses the GetDatabaseSchema function to interrogate, or "pull in," that source metadata for use. With Oracle you can still use that function, but you're going to run into problems.

All of these problems can be solved by separating your interrogation step from your artifact generation step (the step where you create your packages). To do this, I'll often create a BimlScript file to build out a Biml representation of my source objects. Once this script has my metadata clean and usable, I create separate scripts to generate assets from this Biml representation, rather than directly from the source metadata.
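To make that split concrete, here's a minimal sketch of what a downstream script could look like: it runs at a higher tier and reads the cleaned-up tables straight off the RootNode instead of touching the Oracle connection again. The package shape here is only a placeholder; the real generation scripts are the topic of the next article.

<#@ template tier="2" designerbimlpath="Biml" #>
<Biml xmlns="http://schemas.varigence.com/biml.xsd">
    <Packages>
        <# foreach (var table in RootNode.Tables) { #>
        <Package Name="Load_<#=table.Name#>" ConstraintMode="Linear">
            <!-- Dataflow tasks built from the cleaned-up table metadata would go here. -->
        </Package>
        <# } #>
    </Packages>
</Biml>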

Let’s walk through some of the bugs you’re going to hit, and how you can address them in a BimlScript.

column.GetBiml() Sometimes Returns Nothing

I create a script named 01_BuildTables.biml with the following code:

<#@ template tier="1" designerbimlpath="Biml" #>
<#@ import namespace="System.Collections.Generic" #>
<Biml xmlns="http://schemas.varigence.com/biml.xsd">
<#
    var sourceConnection = RootNode.AdoNetConnections["AdoNetOracleSource"];

    // I define two lists for includedSchemas and includedTables to limit the results.
    // The values below are placeholders – swap in your own schema and table names.
    var includedSchemas = new List<string> { "MYSCHEMA" };
    var includedTables = new List<string> { "MYTABLE" };

    var importResult = sourceConnection.GetDatabaseSchema(includedSchemas, includedTables, ImportOptions.None);
#>
    <Tables>
        <# foreach (var table in importResult.TableNodes) { #>
        <Table Name="<#=table.Name#>" SchemaName="<#=table.SchemaName#>">
            <Columns>
                <# foreach (var column in table.Columns) { #>
                <#=column.GetBiml()#>
                <# } #>
            </Columns>
        </Table>
        <# } #>
    </Tables>
</Biml>

Running this code on my instance hits a table that has a column named ID_N with an Oracle data type of NUMBER(10,0). When you read the Biml for that column you see:

<Column Name="ID_N" Length="22" />

What I've found is that any time there is no scale defined for the NUMBER column, you won't get a data type back. So how do I fix it? First of all, I don't use the GetBiml() function for columns when dealing with Oracle data sources; I always build them out more explicitly. Replace that <#=column.GetBiml()#> with something like this:

<Column Name="<#=column.Name#>" DataType="<#=column.DataType#>" />

Just making that one change handles the missing data type problem:

<Column Name="ID_N" DataType="Int32" />
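If your sources also have a lot of string or decimal columns, you may want to carry length, precision, and scale through at the same time. Here's a rough sketch of that, reusing the same column properties; the type checks are just illustrative and assume the DbType names Biml reports for those columns:

<# var typeName = column.DataType.ToString(); #>
<# if (typeName == "String" || typeName == "AnsiString") { #>
    <Column Name="<#=column.Name#>" DataType="<#=column.DataType#>" Length="<#=column.Length#>" />
<# } else if (typeName == "Decimal") { #>
    <Column Name="<#=column.Name#>" DataType="Decimal" Precision="<#=column.Precision#>" Scale="<#=column.Scale#>" />
<# } else { #>
    <Column Name="<#=column.Name#>" DataType="<#=column.DataType#>" />
<# } #>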

But then you'll find the next problem: sometimes you get unexpected data types.

Unexpected Data Types

In my source system, I have some columns with the type Number, but no Precision or Scale defined. Literally NULL is stored for those values. In our code above, column.DataType will return Int64, and that might work for you. The problem is that in at least one case I've hit, the source system had decimal values with 27 decimal places. We can't store those in an Int64, at least not without losing the precision of the original values. In this case we could decide that any time we get a source column of type Int64, with NULL for both Precision and Scale, we want to change the data type to Double.

<# if (column.DataType.ToString() == "Int64" && column.Precision == null && column.Scale == null) { #>
    <Column Name="<#=column.Name#>" DataType="Double" />
<# } else { #>
    <Column Name="<#=column.Name#>" DataType="<#=column.DataType#>" />
<# } #>

You may want to be a little more controlled than simply changing all your Int64s over to Doubles.
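One way to do that (purely a sketch; the key format and the sample entries below are made up for illustration) is to keep an explicit override list and only remap the columns you've named, letting everything else keep whatever data type the import returned:

<#
    // Declared once near the top of the script: a hypothetical override list.
    // Key is "SCHEMA.TABLE.COLUMN"; value is the Biml data type to force.
    // The entries are examples only – use your own schema, table, and column names.
    var dataTypeOverrides = new Dictionary<string, string>
    {
        { "HR.EMPLOYEES.COMMISSION_PCT", "Double" },
        { "HR.EMPLOYEES.SALARY", "Decimal" }
    };
#>
<#
    // Then, inside the column loop from 01_BuildTables.biml:
    var overrideKey = table.SchemaName + "." + table.Name + "." + column.Name;
    string forcedType;
    if (dataTypeOverrides.TryGetValue(overrideKey, out forcedType)) { #>
    <Column Name="<#=column.Name#>" DataType="<#=forcedType#>" />
<# } else { #>
    <Column Name="<#=column.Name#>" DataType="<#=column.DataType#>" />
<# } #>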

Conclusion

The point I need you to understand is that this interrogation step needs to produce an accurate Biml representation of the source metadata. Your logic can be as simple as a substitution, or as complex as a custom class that addresses all the metadata problems you could possibly hit when dealing with Oracle.
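If you do go the custom-class route, a BimlScript class feature block is one place it could live. The sketch below is purely illustrative: the method name and the mapping rules are just examples of the kind of decisions you'd encode, assuming you have the raw Oracle type name, precision, and scale available.

<#+
    // Illustrative only: one possible shape for a reusable Oracle-to-Biml type mapper.
    // The method name and the mapping rules are examples, not a finished library.
    public static string MapOracleDataType(string oracleTypeName, int? precision, int? scale)
    {
        if (oracleTypeName == "NUMBER")
        {
            // Undeclared NUMBER: no precision or scale in the Oracle catalog.
            if (precision == null && scale == null) return "Double";

            // Whole numbers: size the integer type by precision, fall back to Decimal.
            if (scale == null || scale == 0)
            {
                if (precision <= 9)  return "Int32";
                if (precision <= 18) return "Int64";
                return "Decimal";
            }

            return "Decimal";
        }

        if (oracleTypeName == "VARCHAR2" || oracleTypeName == "NVARCHAR2" || oracleTypeName == "CHAR")
            return "String";

        if (oracleTypeName == "DATE" || oracleTypeName.StartsWith("TIMESTAMP"))
            return "DateTime";

        // Default to a string rather than guessing silently; tighten as needed.
        return "String";
    }
#>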

That latter option might be something the Varigence team wants to tackle internally.

If you don't handle this interrogation separately from artifact generation, the scripts that generate your artifacts become a lot harder to build. That generation step is what I'm going to cover in the next article!
