Learning PowerShell as a DBA: Debugging PowerShell - It's easier than you think

2017-04-23 09:36

I was going to start my PowerShell series with some less meaty topics, but I have had a couple of requests for debugging pop up, and so that’s where our journey begins.

It’s a little backwards, but I hope you learn a bit about Set-PSBreakpoint and how to ask the shell a few more questions about what the heck it is doing.

Debugging often comes up when you get beyond the super trivial questions and start asking “what is this magical thing doing for me?”

I wanted to include a few terms that may be familiar or may not be (and that’s ok!), but they are good to know when approaching a debugger.

Glossary of terms:
  • REPL – Read – Evaluate – Print – Loop – A method to facilitate exploratory programming by reading your input line by line, evaluating it in the programming language of choice, printing the result, and then doing it all over again.
  • Debugger – Often a specialized form of a REPL that allows you to step through your code as if you were personally typing it in line by line.
  • Breakpoint – A specific callout to the debugger to stop the world when it hits a line; with more advanced debuggers (like PowerShell’s) you can also set conditions to break on, hitting the “brakes” so you can check out the state of your program.
  • Step-Into – The default step command; it always goes line by line, no matter where the execution path leads. This can sometimes take you much deeper than you want, at which point you might reach for Step-Over or Step-Out.
  • Step-Over – Executes whatever is on the current line but stays at the current level/context. This lets you skip over function/method calls without getting into their inner workings.
  • Step-Out – Executes whatever context you are currently in and then steps up one level in the hierarchy of calls. This can sometimes cause the application to exit if your current context is the parent program.

Most commonly when people start debugging their programs they reach for what they know, which is printing stuff to the screen. This is generally considered “printf debugging” (from the old C programming language days), and it has a fairly predictable approach:

  • Add a print statement (in PS, this is best done with a call to Write-Verbose or Write-Debug, making sure to set $VerbosePreference = "Continue" or $DebugPreference = "Continue" to see the output.)
  • See if the error happens before or after the print statement.
  • Add another print statement.
  • Continue until you figure it out or give up.
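In PowerShell, that loop looks something like this sketch (the function and server names are all made up for illustration):

```powershell
# "printf debugging", PowerShell style; everything here is hypothetical.
$VerbosePreference = "Continue"   # without this, Write-Verbose output stays hidden

function Get-ServerReport {
    param([string[]]$Servers)

    Write-Verbose "Starting report for $($Servers.Count) server(s)"
    foreach ($server in $Servers) {
        Write-Verbose "About to process $server"   # did we get here before the error?
        # ... the real work would happen here ...
    }
    Write-Verbose "Finished without error"
}

Get-ServerReport -Servers "sql01", "sql02"
```

Whichever Write-Verbose line you never see tells you roughly where things went wrong.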

This is a completely valid way of debugging a problem and one I use regularly (especially in SQL), but as you may know there are other methods such as tracing/profiling and in our case, a great debugger!

Learning any new set of commands can be a complex task, but thankfully with PS you really only need to spend some time on Set-PSBreakpoint; everything else either supports that command or is outside the scope of this blog post :)

Breakpoint Related Commands
  • Set-PSBreakpoint – Allows you to trigger a breakpoint at a line, command, or variable, one type of breakpoint at a time (though you can pass multiple items of that type.)
  • Get-PSBreakpoint – Returns a list of any existing breakpoints (verifying what you did helps!) and lets you easily pipe those breakpoints to Remove-PSBreakpoint or Disable-PSBreakpoint so that you can start fresh or quickly disable breakpoints for a “clean run”.
  • Remove-PSBreakpoint – Removes any breakpoint you pass to it.
  • Disable-PSBreakpoint – Disables a breakpoint you have created with Set-PSBreakpoint. I don’t often use this or Enable-PSBreakpoint because of the debug-script technique I describe a bit later.
  • Enable-PSBreakpoint – Enables a disabled breakpoint you originally created with Set-PSBreakpoint.
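As a quick sketch of how those fit together (the script name here is hypothetical):

```powershell
# Set a couple of breakpoints, verify them, then clean everything up.
Set-PSBreakpoint -Script .\myscript.ps1 -Line 10                  # break before line 10 runs
Set-PSBreakpoint -Script .\myscript.ps1 -Command "Invoke-Sqlcmd"  # break on a command call

Get-PSBreakpoint                            # verify what you just did
Get-PSBreakpoint | Disable-PSBreakpoint     # keep them around, but get a "clean run"
Get-PSBreakpoint | Remove-PSBreakpoint      # or start completely fresh
```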

Non-breakpoint commands, but excellent to know for debugging.
  • Set-PsDebug -Trace 2 – Turns on deep tracing: each line, function call, and script invocation is reported as it executes, so internal commands tell you much more about what they are doing.
  • Set-StrictMode -Version 2.0 – Gives you a stricter interpretation of what is valid code in your script. StrictMode will complain about references to variables you never initialize, references to properties of an object which do not exist, variables without names, and a few less important things.
  • Trace-Command – Allows you to trace the execution of a command (included for completeness, but it is a bit outside of our scope right now; we will come back to it in a later post.)
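A minimal sketch of turning those two on (and tracing back off again):

```powershell
Set-PSDebug -Trace 2          # trace every line, variable assignment, and function call
Set-StrictMode -Version 2.0   # make sloppy-but-silent code fail loudly

# With StrictMode on, a line like this now throws instead of quietly returning $null:
# $total = $nonExistentVariable + 1

Set-PSDebug -Off              # turn tracing back off when you are done
```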

Let’s look at an example, and show you what I am talking about!

Say you are writing some PS and run into an error:
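For instance, a tiny hypothetical script like this one will do it (every name here is made up):

```powershell
# myscript.ps1 - a tiny, made-up script that blows up partway through.
$servers = "sql01", "sql02"
Write-Output "Checking $($servers.Count) servers"
$missing.Property = "boom"   # line 4: $missing was never created, so this errors
```

The error output names both the script and the offending line (something like At C:\temp\myscript.ps1:4 char:1).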

You may notice that the line number itself is included in the error message, which is one of the things that helps us as humans when debugging an issue in our code. In this case we can also use the line number (and the script) to tell PS to break at that point in the execution and let us start doing things.

So, let’s start the debugger BEFORE my error occurs and watch it happen :)

First we use Set-PSBreakpoint to break before the line number that threw the error (I forgot to add the script part at first, so PS helpfully asked what I was talking about.)
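The command itself is a one-liner (script name and line number are illustrative):

```powershell
# The error pointed at line 4, so stop the world just before that line runs.
Set-PSBreakpoint -Script .\myscript.ps1 -Line 4
```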

We re-run the command, and now we can check out the commands available at this point with ? or h; this breaks out which shortcuts are used to step into, step over, step out, continue, quit, and detach.

Use l to list the source code; where you are in the script execution will be marked with a star.

If we start stepping into the code with the s (step into) key (and Enter), we can see the first line executing (with additional debug information printing the next line), and then the error message, just as before when we ran the entire script.
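The debug prompt session looks roughly like this (the script and output here are illustrative, not captured verbatim):

```
[DBG]: PS C:\temp>> l          # list the source; the current position is marked

    1:  # myscript.ps1 - a tiny, made-up script
    2:  $servers = "sql01", "sql02"
    3:  Write-Output "Checking $($servers.Count) servers"
    4:* $missing.Property = "boom"

[DBG]: PS C:\temp>> $servers   # inspect anything, just like a normal shell
sql01
sql02

[DBG]: PS C:\temp>> s          # step into line 4 - the error fires, as expected
```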

We can also run our own commands and treat the current debugging context as a normal shell. That means you can call internal variables, functions, you name it; if the script can use it, you can use it. (This does not apply to $PSBoundParameters or the pipeline variable $_; if you want to check the values of these you must assign them to something within your script.)

If the problem gets more complex than that, I set up a simple “debug and invoke” script, so I can add additional breakpoints at will.
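Mine is nothing fancier than something like this (the script name and breakpoint targets are hypothetical):

```powershell
# debug-myscript.ps1 - set up all my breakpoints, then run the target script.
Get-PSBreakpoint | Remove-PSBreakpoint      # start from a clean slate

Set-PSBreakpoint -Script .\myscript.ps1 -Line 4
Set-PSBreakpoint -Script .\myscript.ps1 -Command "Invoke-Sqlcmd"
Set-PSBreakpoint -Script .\myscript.ps1 -Variable "Server" -Mode ReadWrite

.\myscript.ps1                              # and off we go
```

Editing one file and re-running it beats juggling Enable-PSBreakpoint and Disable-PSBreakpoint by hand.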

I wanted to show you the shell version because I want to constantly reinforce that PowerShell is really built for you to use as a shell first. These debugging commands are among the easiest I have ever used in a programming language; I didn’t have to suss out any weird flags, and the REPL is super friendly.

I also want you to understand that when you use Visual Studio Code, the PowerShell ISE, or some other tool, the debugging available there is generally the same debugging you always have anyway; you don’t need some heavy tool to figure out what is wrong with your program!

If you want to get fancier (and potentially emulate some of the heavier-hitting features of VS Code, like watching the value of variables change), Set-PSBreakpoint can also set conditions and fire script blocks when you hit your condition, allowing you to perform certain actions instead of just breaking.

A common case is to add additional tests to your breakpoints so that you can say things like “if the value of $var is X, then break”, for situations like “I want to break the fiftieth time this runs” or “if this is true, then break”.

The Get-Help Set-PSBreakpoint page shows a lot of really interesting additional examples that I wanted to briefly trot out for completeness, and to remind you that PowerShell help is solid as heck.

Check out more in the online help for Set-PSBreakpoint.

Example 2: Set a breakpoint on a function
PS C:\>Set-PSBreakpoint -Command "Increment" -Script "sample.ps1"

This command creates a command breakpoint on the Increment function in the Sample.ps1 script. The script stops executing immediately before each call to the specified function.

Example 3: Set a breakpoint on a variable
PS C:\>Set-PSBreakpoint -Script "sample.ps1" -Variable "Server" -Mode ReadWrite

This command sets a breakpoint on the Server variable in the Sample.ps1 script. It uses the Mode parameter with a value of ReadWrite to stop execution when the value of the variable is read and just before the value changes.

Example 4: Set a breakpoint on every command that begins with specified text
PS C:\>Set-PSBreakpoint -Script Sample.ps1 -Command "write*"

This command sets a breakpoint on every command in the Sample.ps1 script that begins with write, such as Write-Host.

Example 5: Set a breakpoint depending on the value of a variable
PS C:\>Set-PSBreakpoint -Script "test.ps1" -Command "DiskTest" -Action { if ($Disk -gt 2) { break } }

This command stops execution at the DiskTest function in the Test.ps1 script only when the value of the $Disk variable is greater than 2.

It uses the Set-PSBreakpoint cmdlet to set a command breakpoint on the DiskTest function. The value of the action is a script block that tests the value of the $Disk variable in the function.

The action uses the BREAK keyword to stop execution if the condition is met. The alternative (and the default) is CONTINUE.

Constantine Kokkinos

Learning PowerShell as a DBA: First Steps and A Bit Of History

2017-04-14 07:26

I am going to kick off a series of blog posts that I hope will help you understand the usefulness and power of PowerShell (or at least come away with some tips and tricks.)

Before I start showing off keyboard shortcuts or breaking down some of the peculiarities you may need to understand to get the full benefits of PowerShell, I want to talk a little about history and why the shell is good.

In the Beginning was the Command Line

Before there were modern graphical user interfaces there was the command line.

There were many types of machines leading up to the giant time-share computers of the 60s and 70s (and even later in some cases), but back then it was fairly common to write a program ahead of time, encode it on punch cards or otherwise, and then use a shared “terminal” (sometimes just a keyboard or input device, usually at your place of business or school) to submit your work.

After that, you would often get your output printed back to you, or in more modern cases, could even see that output on a screen.

Very quickly on multi-user machines it became clear that allowing direct access to the system’s functions was not only dangerous, but difficult to use and prone to error. Many programs were written to solve these problems and ease the burden of a fairly complex task.

Heroes In A Half Shell

To facilitate the more “modern” terminals (maybe even supporting interactive input!), many different approaches were taken, and over time a large set of conventions and methods came to define what a shell was to computer operators.

These predecessor programs actually have an outsized influence on the applications of today, and many shells are descendants that contain some of the zany choices which had to be made due to hardware or software limitations of 30 or more years ago.

As a result, many people think that the shell is something from an earlier time, or a place only hackers go. Some of you may have tried to accomplish your goals there, but because the shell is so separate from the standard experience, it was probably slow going.

While the shell is definitely closer to the bare-metal experience than many of us are used to, it was at one time designed to be the primary user interface, and most shells have many methods to help you deal with the problems baked into a command line interface.

If you’ve never used a shell before, or you are familiar with other shells (bash, zsh, cmd, etc.), I want to show you not only that PowerShell is not scary, but how it will remove drudgery from your day-to-day work, and how using it doesn’t need to feel kludgy.

If you are a SQL professional like me, I want to show you how you can use features of PowerShell to:

  • Manage your ever-growing code base and server farm.
  • Automate tasks which seem annoying (text processing!)
  • Report on multiple machines and transform the results into useful input.
  • Solve problems that used to take thousands of lines of SQL in tens of lines of PowerShell.


I also want you to know that there is an active community of people who want to help you when you run into problems. Please come join us at the SQL Community Slack and head on over to #PowerShellHelp or #dbatools.

You can find me there talking as @ck, I would love to hear from you!


I also want to acknowledge a few people as this series begins who have helped and influenced me greatly in these endeavors:

  • Chrissy LeMaire – As the project leader for dbatools.io, she has shown tireless enthusiasm in demonstrating both the usefulness and the fun that can be had when authoring PowerShell.
  • Rob Sewell – A tireless contributor and head of the dbareports project; his desire to demonstrate and to help those with less knowledge seems boundless.
  • Klaas Vandenberghe – A tireless supporter of quality code and best practices, Klaas has an encyclopedic knowledge of PowerShell, PowerShell publications, and more; he has always driven me to ask more questions about my code and methods.
  • Friedrich Weinmann – A PowerShell wizard and hyper contributor to dbatools, his deeper dives into some of the technologies and his advice have gotten me out of a few weird places.
  • Cláudio Silva – Cláudio’s constant pushing for more and better code to bring back to the community is always impressive, and his enthusiasm leaks through the screen any time I work with him.
  • Carlos Chacon – For (indirectly) giving me the idea for this series of posts. I truly saw his enthusiasm for learning more about using PowerShell when he learned I wasn’t a PowerShell expert, but instead a SQL professional like himself (with a lot less experience, of course!)
  • Neal Stephenson – For some great books and a great quote.

Contributing to the SQL Server Data Partners Podcast

2017-04-12 07:04

Just wanted to mention a great podcast that some of the dbatools team and I were lucky enough to be invited to attend: the SQL Server Data Partners Podcast.

You can listen to the specific episode below:
SQL Server Data Partners Podcast : dbatools

If you haven’t checked the podcast out before, it features two SQL Server professionals, Carlos Chacon and Steve Stedman, who do a great job of keeping the conversation light-hearted and the momentum moving during some of the more in-depth technical conversations. I really enjoy their interview style and I highly recommend checking them out.

Different episodes range widely on the podcast and cover topics such as career advice and experiences, code best practices, and personal interviews from some of the titans of the SQL Server community.

The focus of this episode was dbatools; topics ranged from how the project started and gained acceptance, to bringing in new people and maintaining interest in the project, to getting a little deeper into why dbatools matters to us.

It was really fun to be invited and I was happy to be able to chip in about learning and growing as one of the more junior PowerShellers of the team.

Chrissy and I had such a fun time we are working towards releasing our own dbatools oriented “people talking on the internet type thing” soon!


You committed files to the wrong branch (master), and now you want to apply your changes to the right one (development)

2017-03-14 15:40

I wanted to share a scenario and solution for a problem I am seeing arise in the dbatools project:

  • Commits are registered against the “wrong” branch
  • A pull request is submitted
  • The file comparison shows hundreds of changes instead of 1-2

There are many ways this can happen, but usually it is due to forgetting to create a feature branch, or accidentally creating said branch from master instead of development.

To fix this problem, you will want to check the log of commits to determine the hash of the commit you want, and then create a new branch (from the correct source) and apply those commits.

git cherry-pick is a command that allows you to apply a commit to a different branch than the one you originally committed it to.

To fix a pull request where you have more files than you want:

  • Make sure you are in the branch where you had the problem.
  • git status shows your current branch, and git checkout branchname will change your context to the branch you want.
  • Use the git log command to see a list of commits; you should see your changes and a relevant hash.
  • Record the first 10-12 characters of the commit hashes of your changes (technically you only need enough of the values to be unique.)
  • Checkout the correct source branch (development) with git checkout development
  • Create the correct feature branch with git checkout -b new-feature-name (this will also check it out for you and set it to be the current branch)
  • Cherry pick the commits with git cherry-pick hash
  • Push the new branch to your repository git push origin new-feature-name
  • You should now be able to Pull Request/Compare your code.
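The steps above can be sketched end to end; this uses a throwaway local repo so the commands actually run, and the branch and file names are made up:

```shell
# Recovery sketch: a fix landed on master by mistake; move it to a feature
# branch cut from development. Runs entirely in a temporary scratch repo.
set -e
tmp=$(mktemp -d)
cd "$tmp"
git init -q
git symbolic-ref HEAD refs/heads/master      # pin the default branch name
git config user.email "you@example.com"
git config user.name "You"

# Set the stage: master and development exist, then a fix hits master by mistake.
echo base > readme.txt
git add readme.txt && git commit -qm "initial commit"
git branch development                       # the branch the fix SHOULD target
echo "my fix" > fix.txt
git add fix.txt && git commit -qm "fix committed to the wrong branch"

# Find the hash of the stray commit (git log shows it; -1 grabs the latest).
hash=$(git log -1 --format=%h master)

# Branch from the correct source and cherry-pick the commit over.
git checkout -q development
git checkout -qb new-feature-name
git cherry-pick "$hash" >/dev/null

# new-feature-name now carries exactly the one fix commit on top of development,
# ready for: git push origin new-feature-name
git log --oneline
```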

A note on comparisons: any time you make a new Pull Request, GitHub will automatically list the files changed and the number of commits you put in place. If this seems different from what you did, give me a jingle and we can figure out which part of this process went sideways :)


Troubleshooting PowerShell and .NET: When error messages are not enough.

2017-03-05 13:18

Recently we had a report in the dbatools project: there was an issue installing from the PowerShell Gallery (via Install-Module.)

Error “Invalid characters in path.”

We had a few scattered notifications of the issue and even one of the dbatools developers affected internally, but it proved to be a stubborn bug and the people involved had tried numerous methods in tracking it down.

We knew that one of the paths our module provided to Windows was bad, but the error message sucked and everything worked almost everywhere; we were really confused about what was going on.

It was clear that, like any new issue, it was a product of new code and changes to the project, but it was unclear which of the hundreds and hundreds of new additions was the culprit.

We had been adding new things at a frenetic pace and this was not something that came up during the standard build process… the final count was 185+ files which might have something to do with this error.

I finally was able to get the source of the problem through attaching my new favorite program dnSpy to my PowerShell process, and I wanted to show you how to do it too.


First, you need to download and extract dnSpy:

  • Launch dnSpy.exe – you may have to run dnSpy as an administrator to attach to processes or rewrite DLLs; I always launch it as administrator.
  • Switch to a PowerShell window with your command ready.
  • From the debug menu, choose “Attach to Process…”
  • Find your powershell.exe in the list, and choose “Attach”
  • Back in your PowerShell window, run your command; you should see a new colored bar at the bottom of the dnSpy window indicating the attachment worked and it is running.
  • Your PowerShell command should fail, and dnSpy should throw the exception and wait for input.
  • Click the “Call Stack” tab
  • Right-click in the Call Stack area and make sure “Show Parameter Values” is checked.
  • Success!!

You should now be able to read the methods that were called to get you to this error, and hopefully be able to suss out which methods got which parameters!

Further reading for dbatools problems:

When the module loaded and was analyzed, one of the strings in our psm1 file was interpreted as a file path (but only in WMF 5.0; as stated, 5.1 does not have this issue.)

The team was interested in keeping the module installation and setup process as speedy as possible, and in that vein the following tweet and blog post came up.

From the article:

Solution? There are few options (like pipe Get-Content to Invoke-Expression), but I ended up with dot-sourcing a script block created from content of the script.

It looks like while this was (and probably is) a great idea, certain versions of Install-Module will actually interpret the dot in the dot-sourcing as the start of a path and attempt to evaluate it as such, throwing an absolute tizz.

In the meantime, dbatools has made changes to effectively do the same thing without using a dot prefix for the line (even though WMF 5.1 is not affected by this issue), and thankfully, with a few other changes, one of our core contributors Fred was able to find a fix without losing too much time.

