Counting logical lines of T-Sql code in a DacPac

There are two general ways of counting lines of code: the first is to simply count the lines in each file that makes up a project; the second is to count the logical statements as defined by the language.

Counting lines of T-SQL code is typically quite hard. Personally, I am interested in the statements in stored procedures and functions as a metric for lines of code, rather than including columns and tables, which for me are a separate concern.
I thought this would be a great example of how to use the DacFx API and the T-SQL ScriptDom together to create a tool that I, and hopefully the other person who has visited this blog, would find useful!

What this tool does is count the logical lines of code in any stored procedure, function (scalar or table-valued) and DML trigger. If anyone thinks of other items of code that should be included, it is open source, so feel free to check it out, add them in and submit a pull request; it really is trivial to add other types of code.

To count the number of statements, I use the T-SQL visitor class to break each code file into statements.

Feel free to download the tool from:

https://raw.githubusercontent.com/GoEddie/DacFxApi/master/Download/CountLinesOfSqlCodeInDacPac.zip

The source code is in my DacFx Api GitHub repo:

https://github.com/GoEddie/DacFxApi

How does this work?

The first step is to open the dacpac file:

var model = new TSqlModel(fileName, DacSchemaModelStorageType.File);

I then need to query the model for the objects of the specific types I am looking for:

model.GetObjects(DacQueryScopes.Default, Procedure.TypeClass)
model.GetObjects(DacQueryScopes.Default, ScalarFunction.TypeClass)

Each of these calls returns an IEnumerable<TSqlObject>, which I pass to a helper function to get the name and the script:

Name = codeUnit.Name.ToString();

string script;
if (codeUnit.TryGetScript(out script))
{
    return script;
}

Once I have the names and contents of all the scripts I am interested in, I need to parse the scripts and break them into statements. This is made really simple using the TSqlFragmentVisitor class.

The way the TSqlFragmentVisitor class works is that you create a class that inherits from it:

public class SqlVisitor : TSqlFragmentVisitor
{
}

There is then a set of methods you can override, which are called every time a statement of that type is encountered. For example, if you use:

public override void Visit(TSqlStatement node)
{
}

This will be called for every statement; it is what I use in this tool, as I don’t care what type of statement it is, as long as it is a statement.
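Putting those two pieces together, the counting visitor can be as small as this. This is just a sketch of the idea rather than the exact class in the tool; the class and property names are mine:

using Microsoft.SqlServer.TransactSql.ScriptDom;

// Minimal sketch of a statement-counting visitor - the names here are
// illustrative, not necessarily what the tool itself uses.
public class StatementCountVisitor : TSqlFragmentVisitor
{
    public int StatementCount { get; private set; }

    // Called once for every statement in the fragment, whatever its type
    public override void Visit(TSqlStatement node)
    {
        StatementCount++;
    }
}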

There are other overridable methods such as:

public override void ExplicitVisit(SelectStatement node)

This is only called for SELECT statements, so it really is powerful and so simple to use!

Once I have the statement count, I tally up the lines of code for all code units and I end up with the total lines of logical code for just code units in a dacpac file!
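To give an idea of what that looks like end to end, here is a rough sketch of parsing a single script and adding its statements to a running total. I am assuming the TSql120Parser here, so use whichever parser version matches your project; script is the text returned by TryGetScript above, totalStatements is a running total and StatementCountVisitor is the visitor sketched earlier:

using System.Collections.Generic;
using System.IO;
using Microsoft.SqlServer.TransactSql.ScriptDom;

// Sketch only - parse one code unit's script and count its statements
IList<ParseError> errors;
TSqlFragment fragment;

var parser = new TSql120Parser(true); // true = treat quoted identifiers as on

using (var reader = new StringReader(script))
{
    fragment = parser.Parse(reader, out errors);
}

var visitor = new StatementCountVisitor();
fragment.Accept(visitor);

totalStatements += visitor.StatementCount;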

This is obviously really basic; if you have an example where the lines of code count is incorrect, send it to me and I will fix the tool, or fix it yourself and send a pull request on GitHub!

Happy DacFx + TransactSql ScriptDom’ing!

 

P.S. To see how many lines are in each code unit, use the /detailed switch!

DacFx API – use it, write better tools for SQL Server!

The DacFx API is an API for querying the object model that both dacpac files and SSDT use. It is a giant leap forward in the available tooling for SQL Server, and hopefully there will be a raft of great tools that make T-SQL development better.

I really enjoy writing C# code. I typically use Visual Studio with ReSharper and add NUnit and Moq to my test projects; this setup makes developing and testing really simple and removes a lot of the boring repetition you sometimes get. I also love developing T-SQL, but the development environment is nowhere near as smooth.

When I first started looking at the DacFx API, I was a little confused as to how it worked, even after reading the introductory blog (http://blogs.msdn.com/b/ssdt/archive/2013/12/23/dacfx-public-model-tutorial.aspx). When I finally realised the implications of the painless-sounding sentence “The public model API is loosely typed: the TSqlModel contains loosely typed TSqlObjects that represent all the elements in your schema”, my heart sank a little, as it felt very similar to every object being returned as a System.Object and then having to cast it to the correct type (you don’t, by the way; you leave it as a TSqlObject).

What this means is that all objects in the model, whether a table, schema, procedure etc., are the same type: a TSqlObject. To get any information, such as which columns belong to which table or index, or whether an index is clustered or not, you need to query the objects and ask for the information you want. In reality, what you have with the DacFx API is a queryable set of classes; you just have to know what you have and what you want, and although it is a different paradigm to pretty much every API I have ever seen, it is fairly straightforward. I have put together an example application which opens a dacpac file and dumps out a list of indexes, which gives an overview of how to use the API to get the information you want.

To open a dacpac file and get a TSqlModel object, which is the entry point to the API, you just need to call:

var model = new TSqlModel(filePath, DacSchemaModelStorageType.File);

When you have the model, you can query it for all the objects you would like. If you want all the indexes in the model, just do:

model.GetObjects(DacQueryScopes.Default, Index.TypeClass)

GetObjects searches for and returns a list of objects of a specific type. The query scope here means all objects; you can limit it to just system objects or just user objects etc. Finally, the TypeClass causes a list of just indexes to be returned.

The way the TypeClass works is that every object, such as a table, index or column, has a TypeClass, and the static class that holds the TypeClass, such as Index or Table, also has a number of properties which allow you to access the actual properties you want. If this sounds a little strange, I think it is, but it is fairly straightforward with an example! We will call the class that holds the TypeClass and the other property classes the container class, as that is what MSDN seems to call them. If you retrieve a list of all the indexes, and you have a TSqlObject that you got by using Index.TypeClass (as above), then you can query the index for its properties. The Index container class has, amongst others, these properties:

Index.TypeClass

Index.AllowPageLocks

Index.Clustered

You can then use these properties to get the value you want from your index TSqlObject instance, so if you do:

foreach (TSqlObject index in model.GetObjects(DacQueryScopes.Default, Index.TypeClass))
{
    var isClustered = index.GetProperty<bool?>(Index.Clustered);
}

You will get a nullable bool saying whether the index is clustered or not. If you do not know the type of the value you want, you can call the non-generic version, index.GetProperty(Index.Clustered). The example at https://github.com/GoEddie/DacFxApi/tree/master/DumpIndexProperties (repo: https://github.com/GoEddie/DacFxApi/) shows some extra things, like how to query the relationships to get the columns and included columns that are on the index (in, on, used by??) and how to generate the T-SQL script to deploy the index.
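As a taster, the relationship side looks something like the rough sketch below. I am assuming the Index.Columns and Index.IncludedColumns relationship classes and the same TryGetScript call used earlier; check the example project for the exact details:

foreach (TSqlObject index in model.GetObjects(DacQueryScopes.Default, Index.TypeClass))
{
    // Columns the index key is defined on
    foreach (TSqlObject column in index.GetReferenced(Index.Columns))
    {
        Console.WriteLine("key column: " + column.Name);
    }

    // Columns included at the leaf level of the index
    foreach (TSqlObject column in index.GetReferenced(Index.IncludedColumns))
    {
        Console.WriteLine("included column: " + column.Name);
    }

    // Ask the model for the T-SQL to create the index
    string createScript;
    if (index.TryGetScript(out createScript))
    {
        Console.WriteLine(createScript);
    }
}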

I strongly believe that as SQL Server developers we have the ability to really help move our tooling forward, and if you throw the T-SQL scripting stuff into the mix, the next few years are going to be really exciting times for us SQL devs!

Speed up TDD in SSDT (and other acronyms)

tl;dr: I have written a free Visual Studio package to deploy files quickly from SSDT to SQL Server, to speed up the red-green TDD cycle.

I find that writing code for SQL Server using TDD can be a little frustrating. The process I have is that I use SSDT to manage and deploy database projects, including tests written using tSQLt, following the standard red-green cycle:

  1. In SSDT (currently Visual Studio 2013), write a tSQLt test
  2. In SSDT I write an NUnit test that calls tSQLt.Run, passing in the name of the new test (there is a sketch of what this looks like after the list).
  3. I then deploy the tSQLt test and run it and expect it to fail (Red)
  4. I then write the code which satisfies the test
  5. I then deploy the code and watch for the test to pass (Green)
  6. If the test doesn’t pass, I either change the code and re-deploy or change the test and re-deploy that
  7. The cycle continues until the test is passing correctly.
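For step 2, the NUnit wrapper test can be as dumb as the sketch below. The connection string, test class and test names are made up for illustration; it relies on tSQLt raising an error when a test fails, which NUnit then reports as a failure:

using System.Data.SqlClient;
using NUnit.Framework;

[TestFixture]
public class DatabaseTests
{
    // Illustrative connection string - point this at your test server
    private const string ConnectionString = "Server=.;Database=MyTestDb;Integrated Security=true";

    [Test]
    public void GetCustomer_returns_one_row()
    {
        using (var connection = new SqlConnection(ConnectionString))
        using (var command = new SqlCommand("EXEC tSQLt.Run 'CustomerTests.[test GetCustomer returns one row]'", connection))
        {
            connection.Open();

            // tSQLt raises an error if the test fails, which surfaces as a
            // SqlException and fails this NUnit test
            command.ExecuteNonQuery();
        }
    }
}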

The frustrating part for me is that building the project and deploying, although I have automated it, takes quite a long time (45 seconds to 1 minute for a reasonably sized project).

To help with this process I have written the SQL TDD extension for Visual Studio. You can install it from:

http://goo.gl/U3s0il

Once it is installed, when you right-click on a .sql file you are given two new options. The first option, “Generate Deploy Script”, creates a .sql file which includes a drop and create script.

The second option generates the script in the background and deploys it to your test SQL Server (the connection string is configurable).

Both options replace any SQLCMD variables ($(VariableName) style) that have been configured in the project with their default values.

This isn’t intended to replace build and publish but to supplement it: you can quickly deploy single files to help speed up the red-green cycle.

Once you have the test working, I would certainly recommend building, re-deploying and re-running your tests to verify that the whole project is correct, but the time you spend waiting for SSDT to update its internal model and deploy is greatly reduced.

Not convinced?

Here is a comparison of working with SSDT with and without this plugin.

To deploy a file in SSDT:

  1. Write the changes
  2. Build the solution
  3. Fix any build errors, anywhere in the solution
  4. Perform a schema comparison
  5. Decide what you want to include in the update
  6. Update the destination
  7. Re-perform the schema comparison

You could just have done a publish at step 4 but that wouldn’t have been as compelling!

To deploy a file with this plugin:

  1. Write the changes
  2. Build the solution if you want to validate the model or just save the files
  3. Right click the files in solution explorer and choose “deploy file”

Simple!

This is a really simple plugin, so I am afraid you shouldn’t expect bells and whistles like only activating on certain projects (maybe in a later version). All messages are sent to the output window, so check there if you want to see what is going on.

This is published on GitHub (http://github.com/GOEddie/SSDTExtension); if you like it and want it to do more, either ask me or fork it and do it yourself!

There are a few articles and blogs on SSDT, but if you have any tips on making the most of it, please, please do share them with everyone!

 

Disclaimer: I run it on my work machine and home machine and it seems to work. Please don’t cry if you get any issues; let me know and I will try to fix them!