
Select Indistinct

Would you be involved with PASS without a SQL Saturday?

I wanted to bring you a quote from the #SQLSaturday channel in the SQL Community Slack that I found amazing:

“They [the PASS board] are already signaling they don’t want to spend $$ on SQL Sat and a few of the board members would just as soon SQL Sats die.”

Having missed the Q/A at the Summit, this blew my mind.

Everyone I know who joined PASS had a requisite SQL Saturday or local user group to get them interested and welcomed into the community.
Even after my second SQL Saturday I didn’t really recognize what PASS was (beyond my local UG) and if someone had asked about the international org, I would have responded with (at best) “uh, sure, I guess?”

In my opinion, the local user groups are the primary thing that matters. The national/international components of the org may put out a few high quality webinars on a weekly basis, but 99% of that content is not related to my problems at hand. The local UG/SQL Saturday gives me valuable time with experts, without my having to spend thousands of dollars to get on a plane or buy a ticket to the Summit.

If you ask the question “Would the SQL PASS Summit exist without SQL Saturday?” I think the answer is no, or if yes, then a much smaller, less welcoming affair.

Agree or disagree?

I would love to hear more from the community: do you cherish SQL Saturday and think it should be funded, or does it seem like a dead-end activity for PASS?

Slack Asks: Scripting out all objects from a SQL Server database with PowerShell and dbatools

@brianszadek says: I am looking to automate the scripting of objects with PowerShell for easier source control integration.

I thought this would be a little bit easier, but it looks like the SMO way of doing this is still fairly complicated. Maybe something I can look at simplifying for the project in a bit 🙂

For completeness I also included my favorite tool (SchemaZen) and Microsoft’s new scripting tool if you want a quick shell approach.

PowerShell: (dbatools)

$SqlInstance = "192.168.0.23,1433"
$DatabaseName = "breakme"
$Server = Connect-DbaInstance $SqlInstance
$Database = $Server.Databases[$DatabaseName]

# Define an empty list
$ObjectList = @()
$Options = New-DbaScriptingOption
$Options.AllowSystemObjects =
  $Options.AppendToFile = $false

$Options.Triggers =
  $Options.ToFileOnly =
  $Options.ScriptBatchTerminator =
  $Options.DriAll =
  $Options.Indexes =
  $Options.Default =
  $Options.SchemaQualify =
  $Options.ClusteredIndexes =
  $Options.NonClusteredIndexes = $true

# Surprisingly, $Database.EnumObjects() returns a set of strings, not SMO objects, which is kind of sucky.
$ObjectList += foreach ($ob in $Database.EnumObjects()){ # Get all objects
  if ($ob.Schema -notin "sys","INFORMATION_SCHEMA"){ # Not in system schemas
    $o = $Database.Resolve($ob.Urn) # Resolve the object path to an SMO object
    if (($o | Get-Member | Select-Object -exp Name) -contains "Script"){ # If it has a script property
      $o # Return it
    }
  }
}


function Rename-ObjectToSqlFile {
  param ($ObjectName)
  # Some object names (like event notifications) are URIs; keep only the last segment
  if ($ObjectName -match '.*\/(.*)') {
    $Name = $matches[1]
  } else {
    $Name = $ObjectName
  }
  return $Name + ".sql"
}

# Iterate, get one file per object, decide on how you want to name them, because things like event notifications are uris
foreach ($obj in $ObjectList) {
  $Name = Rename-ObjectToSqlFile -ObjectName $obj.name
  Export-DbaScript $obj -ScriptingOptionsObject $Options -Path $Name # Export it with the dbatools powers
}
# Or export based on category
Export-DbaScript $ObjectList -ScriptingOptionsObject $Options # Export object categories to each file

Note: Export-DbaScript currently has a bug that prevents servers using SQL Authentication from scripting; this has been fixed for the next version.

Command line executables

Schema Zen (C#) (Free, OSS)

If you haven’t heard of SchemaZen, it’s a great and insanely fast tool for exporting all the objects in a database to a folder and, if needed, creating a database back from that same folder.

There are also some experimental data export features.
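As a rough sketch of its command line (double-check the flags against the project README; the server name and paths here are illustrative):

```shell
# Script every object in the database out to a folder
schemazen script --server localhost --database breakme --scriptDir ./db-scripts

# Later, rebuild a database from that same folder
schemazen create --server localhost --database breakme_copy --scriptDir ./db-scripts
```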

MSSQL Scripter (python) (Free, OSS)

If you haven’t used the MSSQL Scripter, it’s Microsoft’s tool that utilizes the new SqlToolsService to request and script out DDL information. They have a lovely architecture page if you want to learn more.
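A minimal invocation looks something like this (install via pip; the server, database, and credential values are illustrative, so check mssql-scripter --help for your setup):

```shell
pip install mssql-scripter

# Script the DDL for every object in the database to a single file
mssql-scripter -S 192.168.0.23,1433 -d breakme -U sa -f ./breakme-schema.sql
```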

“Fixing” Visual Studio Code or PowerShell ISE Cached Variables

One of the features of both Visual Studio Code and the PowerShell Integrated Scripting Environment is allowing you to run a line of code with the F8 key.

This is super cool and useful, but it has a side effect: to be able to run a specific line of code, the shell often needs to have kept all the results produced up to that point.

In my experience, I have received all sorts of strange behavior from both PowerShell ISE and Visual Studio code that came down to “oh, I remembered the last time you used that variable and now I am going to cause you problems.”

When I am writing code, I expect that before I declare a variable it won’t exist, and after I declare it, it will. We can see that behavior below:

 

However, in both VS Code and the ISE, if you run the same code again, you might be surprised to see the number of variables cease to change.

While the benefits are clear and deliberate, I prefer to be able to run from a clean slate each time, and PowerShell gives us a few mechanisms to do that. I thought about Get/Remove variable, but there’s no clear properties that I can see to distinguish user variables from environment ones, so this is what I came up with:
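Here is a minimal sketch of that approach (the variable name is my own illustration): wrap the work in a scriptblock and invoke it with &, so each F8 run gets its own scope.

```powershell
& {
    # Everything in here runs in its own child scope
    $myNewVariable = 42
    "Inside the scriptblock: $myNewVariable"
}

# Back in the parent scope the variable is gone, even after repeated F8 runs
Test-Path variable:myNewVariable   # False
```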

You can see, that now that we defined our code in a scriptblock, the variables do not leak out of their context and we can run our code over and over without worrying about existing variables.

Generating Tables of Contents for Github Projects with PowerShell

Table Of Contents

 

Initial Idea

Let’s generate a table of contents for a GitHub Flavored Markdown document in PowerShell (because nobody else seems to have.)

 

Clarifying Our Requirements

After talking with Claudio, he didn’t just want a TOC for a particular markdown file (though that’s nice); he also wanted a TOC for a set of files in a folder. I decided to do the first one, and it looks like Claudio just finished the second.

I did some googling and found that GitHub Flavored Markdown supports a special form of anchor linking that isn’t supported in vanilla Markdown. I also found that beyond this there wasn’t a good anchor link shortcut, so I took this tack (which cares more about readability than speed or length.)

You can use the function below, which was also run on this blog post to generate itself.

 

The Code

function Convert-StringToMarkdownTableOfContents {
    param ([string]$MarkdownDocument)

    $nl = [System.Environment]::NewLine
    $TOC = "## Table Of Contents$nl$nl"

    foreach ($line in $MarkdownDocument.Split( $nl )){
        if ($line -match "^(#+)\s(.*)$") {
            $Count = $Matches[1].Length
            $Content = $($Matches[2])

            $EncodedContent = ($Content -replace " ",'-').ToLower()

            if ($Count -eq 1){
                $TOC += "- [$Content](#$EncodedContent)$nl"
            }
            else {
                $TOC += "$("  " * $Count)* [$Content](#$EncodedContent)$nl"
            }

            $MarkdownDocument = $MarkdownDocument -replace "(($line)\w*)",$(
                '<a href="#' + $EncodedContent + '"></a>' +
                $nl +
                $line
            )
        }
    }
    return $TOC + $nl + $MarkdownDocument
}
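To run it against a file on disk (paths here are illustrative):

```powershell
# -Raw keeps the file as a single string instead of an array of lines
$doc = Get-Content -Raw -Path .\post.md

Convert-StringToMarkdownTableOfContents -MarkdownDocument $doc |
    Set-Content -Path .\post-with-toc.md
```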

You can download the code that generated this post here.

 

Special Thanks

Special thanks to Cláudio Silva (PowerShell MVP) for giving me this fun idea!

Note:
Unlike GitHub Flavored Markdown, WordPress Markdown does not implement the anchor links, so on my blog they won’t be clickable like they would be on GitHub, sorry!

Automate and Simplify DBCC CHECKDB

T-SQL Tuesday Logo

As a part of the T-SQL Tuesday run by Rob Sewell this month, I wanted to throw in a little post about using Test-DbaLastBackup, which I will also be showing off at this year’s SQL Summit!

 

Come to the Precon

If you like this post, don’t forget to sign up for Chrissy LeMaire’s and my precon at the 2017 SQL Summit for more awesome PowerShell <3 SQL Server magic with dbatools!

Test-DbaLastBackup

As data professionals we know that safeguarding our data is one of our most important jobs, and yet a significant number of us continue to find excuses not to verify the integrity of our data.

I want to show you an uber simple method to automate testing your entire estate’s worth of backups.

Before we move forward, I want to note this process does perform a restore and DBCC checkdb on the server you specify, so please don’t run this in production and come to me upset that your IO and CPU time shot through the roof 😀

However, if you only have a smallish or sleepy production environment, this will run just fine side by side; otherwise, I demonstrate a parameter to run this on a different server.

Lastly, before we begin, this assumes you’ve already installed and imported dbatools, because why wouldn’t you have?

Code

"sql2012","sql2005" | Test-DbaLastBackup -Destination "Sql2016"

This:

  • Defines a list of two servers (PowerShell lists are as easy as "","")
  • Pipes them to the Test-DbaLastBackup command.
  • Which then:
    • Gathers information about the last full backups for all of your databases on that instance.
    • Restores the backups to the Destination with a new name. If no Destination is specified, the originating server will be used.
    • The database is restored as “dbatools-testrestore-$databaseName” by default, which you can change with the -Prefix parameter.
    • The internal file names are also renamed to prevent conflicts with original database.
    • A DBCC CHECKTABLE is then performed.
    • And the test database is finally dropped.

If you are super stuck on the restore portion but would like to do a VERIFYONLY, there is a -VerifyOnly parameter switch.

Some other useful parameters you might need to use in a typical scenario:

-Database Allows you to specify a list of databases to specifically check, useful if you have just one in mind.

-MaxMB Prevents databases over X MB from being restored.

-ExcludeDatabase Exclude specific databases (for that one gigantic one you know you don’t want to touch.)

-DataDirectory Alternate data directory (besides default) you want to restore the database files to test.

-LogDirectory Alternate log directory (besides default) you want to restore the database files to test.
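Putting a few of those parameters together with splatting (the server names, database names, and paths here are illustrative):

```powershell
$params = @{
    Destination   = "Sql2016"          # restore and check on a non-production server
    Database      = "Payroll","CRM"    # only test these databases
    MaxMB         = 1024               # skip anything over ~1 GB
    DataDirectory = "E:\RestoreTest"
    LogDirectory  = "F:\RestoreTest"
}

"sql2012","sql2005" | Test-DbaLastBackup @params
```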

Watch

If you like video, wait no more; here’s a YouTube demo of Test-DbaLastBackup.

Thanks for your time, and I hope you enjoyed.

Learning PowerShell as a DBA: Debugging PowerShell – It’s easier than you think

I was going to dive into some less meaty topics to start my PowerShell series, but I have had a couple of requests for debugging pop up, and so that’s where our journey begins.

It’s a little backwards, but I hope you learn a bit about Set-PsBreakpoint and how to ask the shell a few more questions about what the heck it is doing.

Debugging often comes up when you get beyond the super trivial questions and start asking “what is this magical thing doing for me?”

I wanted to include a few terms that may be familiar or may not be (and that’s ok!), but they are good to know when approaching a debugger.

Glossary of terms:

  • REPL – Read – Evaluate – Print – Loop – A method to facilitate exploratory programming by allowing a line by line read of your input, evaluate it in the programming language of choice, print the result, and then do it all over again.
  • Debugger – Often times a specialized form of a REPL that allows you to step through your code as if you were personally typing it in line by line.
  • Breakpoint – A specific callout to the debugger to stop the world when it hits a line; with more advanced debuggers (like PowerShell) you can also set conditions to break on, hitting the “brakes” to check out the state of your program.
  • Step-Into – The “default” “expected” step command, this just always chooses to go line by line no matter where the execution path leads. This may sometimes lead you much deeper than you want and then you might want to use Step Over or Step Out.
  • Step-Over – This executes whatever is on the current line but returns to the current level/context. This allows you to skip over functions/method calls and not get into their inner workings.
  • Step-Out – This executes whatever context you are currently in and then steps to one level up in the hierarchy of calls. This can sometimes lead to the application exiting if your current context is the parent program.

Most commonly when people start debugging their programs they reach for what they know, which is printing stuff to the screen. This is generally considered “printf debugging” (from the old C programming language days), and it has a fairly predictable approach:

  • Add a print statement (in PS, this would be best as a call to Write-Verbose or Write-Debug, making sure to set $VerbosePreference="Continue" or $DebugPreference="Continue" to get results.)
  • See if the error happens before or after the print statement.
  • Add another print statement.
  • Continue until you figure it out or give up.
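That loop can be sketched in PowerShell like so (the function and values are my own illustration):

```powershell
$VerbosePreference = "Continue"   # without this, Write-Verbose output is suppressed

function Get-Answer {
    param ($Value)
    Write-Verbose "Get-Answer called with Value = $Value"   # checkpoint 1
    $result = $Value * 2
    Write-Verbose "About to return $result"                 # checkpoint 2
    return $result
}

Get-Answer -Value 21
```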

This is a completely valid way of debugging a problem and one I use regularly (especially in SQL), but as you may know there are other methods such as tracing/profiling and in our case, a great debugger!

Any new set of commands can be a complex task, but thankfully with PS you really only need to spend some time on Set-PsBreakpoint, everything else is just supporting that command or outside the scope of this blog post 🙂

Breakpoint Related Commands

  • Set-PsBreakpoint – Allows you to trigger a breakpoint at a line, command, or variable, one type of breakpoint at a time (but you can invoke multiple items of that type.)
  • Get-PsBreakpoint – Allows you to return a list of any existing breakpoints (verifying what you did helps!) and also to easily pipe these breakpoints to Remove-PsBreakpoint or Disable-PsBreakPoint so that you can start fresh or quickly disable breakpoints for a “clean run”.
  • Remove-PsBreakPoint – Removes any breakpoint you pass to it.
  • Disable-PsBreakpoint – Disables a breakpoint you have created with Set-PsBreakpoint. I don’t often use this (or Enable-PsBreakpoint) because of the debug-script technique I show a bit later.
  • Enable-PsBreakpoint – Enables a disabled breakpoint you have originally created with Set-PsBreakpoint.

Non-breakpoint commands, but excellent to know for debugging.

  • Set-PsDebug -Trace 2 – Turns on script tracing for EVERYTHING: each line, variable assignment, function call, and script invocation is reported as it runs, so internal commands tell you much more about what they are doing.
  • Set-StrictMode -Version 2.0 – This allows you to have a more strict interpretation of what is valid code in your script. StrictMode will complain about references to variables you never initialize, references to properties of an object which do not exist, variables without names and a few less important things.
  • Trace-Command – Allows you to trace execution of a command, (included for completeness but it is a bit outside of our scope right now, we will come back to it in a later post.)

Let’s look at an example, and show you what I am talking about!

Say you are writing some PS and run into an error:

You may notice that the line number itself is included in the error message, which is one of the things that helps us humans when debugging an issue in our code. In this case we can use the line number (and the script) to tell PS to break at that point in the execution and let us start doing things.

So, let’s start the debugger BEFORE my error occurs and watch it happen 🙂

First we set the Set-PsBreakpoint to break before the line number that threw the error (I forgot to add the script part, so PS helpfully asked what I was talking about.)
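In shell form, that step looks something like this (the script name and line number are illustrative):

```powershell
# Break just before the line that threw the error
Set-PSBreakpoint -Script .\myscript.ps1 -Line 4

# Re-run the script; execution now stops at the breakpoint
.\myscript.ps1
```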

We re-run the command, and now we can check out the commands available at this point with ? or h; this breaks out which shortcuts are used to stepInto, stepOver, stepOut, continue, quit, and detach.

Use l to list the source code and where you are in the script execution will be marked by a star.

If we start stepping into the code with the s (step into) key (and enter) we can see the first line executing (and showing additional debug information printing the next line) and then the error message, just as before when we ran the entire script.

We can also run our own commands and treat the current debugging context as a normal shell. That means you can call internal variables, functions, you name it; if the script can use it, you can use it. (This does not apply to $psboundparameters or the pipeline variable $_; if you want to check the values of these you must assign them within your script.)

If the problem gets more complex than that, I set up a simple “debug and invoke” script, so I can add additional breakpoints at will.
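A sketch of such a debug-and-invoke script (file names and breakpoint targets are my own illustration):

```powershell
# debug.ps1 - clear old breakpoints, set the ones I care about, then run the target
Get-PSBreakpoint | Remove-PSBreakpoint

Set-PSBreakpoint -Script .\target.ps1 -Line 10
Set-PSBreakpoint -Script .\target.ps1 -Command "Invoke-Sqlcmd"

.\target.ps1
```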

I wanted to show you the shell version because I want to constantly reinforce that PowerShell is really built for you to use as a shell first. These debugging commands are among the easiest I have ever used in a programming language: I didn’t have to suss out any weird flags, and the REPL is super friendly.

I also want you to understand that when you use Visual Studio Code, the PowerShell ISE, or some other tool, the debugging available is generally the same debugging you always have anyway; you don’t need some heavy tool to figure out what is wrong with your program!

If you want to get fancier (and potentially emulate some of the additional heavy hitter features of VS Code like watching the value of variables change), Set-PsBreakpoint can also do things like set conditions and fire scriptblocks when you hit your condition and allow you to perform certain actions instead of just break.

A common case is to add additional tests to your breakpoints so that you can ask things like “If the value of $var is X, then break” for things like “I want to break on the fiftieth time this runs” or “If this is true then break”.

The Get-Help Set-PsBreakpoint page shows a lot of really interesting additional examples that I wanted to briefly trot out for completeness, and to remind you that PowerShell help is solid as heck.

Check out more on the online help for Set-PsBreakpoint

Example 2: Set a breakpoint on a function

PS C:\> Set-PSBreakpoint -Command "Increment" -Script "sample.ps1"

This command creates a command breakpoint on the Increment function in the Sample.ps1 script. The script stops executing immediately before each call to the specified function.

Example 3: Set a breakpoint on a variable

PS C:\> Set-PSBreakpoint -Script "sample.ps1" -Variable "Server" -Mode ReadWrite

This command sets a breakpoint on the Server variable in the Sample.ps1 script. It uses the Mode parameter with a value of ReadWrite to stop execution when the value of the variable is read and just before the value changes.

Example 4: Set a breakpoint on every command that begins with specified text

PS C:\> Set-PSBreakpoint -Script Sample.ps1 -Command "write*"

This command sets a breakpoint on every command in the Sample.ps1 script that begins with write, such as Write-Host.

Example 5: Set a breakpoint depending on the value of a variable

PS C:\> Set-PSBreakpoint -Script "test.ps1" -Command "DiskTest" -Action { if ($Disk -gt 2) { break } }

This command stops execution at the DiskTest function in the Test.ps1 script only when the value of the $Disk variable is greater than 2.

It uses the Set-PSBreakpoint cmdlet to set a command breakpoint on the DiskTest function. The value of the action is a script block that tests the value of the $Disk variable in the function.

The action uses the BREAK keyword to stop execution if the condition is met. The alternative (and the default) is CONTINUE.

Learning PowerShell as a DBA: First Steps and A Bit Of History

I am going to kick off a series of blog posts that I hope will help you understand the usefulness and power of PowerShell (or at least come away with some tips and tricks.)

Before I start showing off keyboard shortcuts or breaking down some of the peculiarities you may need to understand to get the full benefits of PowerShell, I want to talk a little about history and why the shell is good.

In the Beginning was the Command Line

Before there were modern graphical user interfaces there was the command line.

There were many types of machines leading up to the giant time-share computers of the 60s and 70s (and even later in some cases), but back then it was fairly common to write a program ahead of time, encode it on punch cards or otherwise, and then use a shared “terminal” (sometimes just a keyboard or input device, usually at your place of business or school) to submit your work.

After that, you would often get your output printed back to you or, in more modern cases, could even see it on a screen.

Very quickly on multi-user machines it became clear that allowing direct access to the system’s functions was not only dangerous, but difficult to use and prone to error. Many programs were written to solve these problems and ease the burden of a fairly complex task.

Heroes In A Half Shell

To facilitate the more “modern” terminals (maybe even supporting iterative input!) many different approaches were taken, and over time a large set of conventions and methods took over the idea of what a shell was to computer operators.

These predecessor programs actually have an outsized influence on the applications of today, and many shells are descendants that contain some of the zany choices which had to be made due to hardware or software limitations of code written 30 or more years ago.

As a result, many people think the shell is something from an earlier time, or a place only hackers go. Some of you may have tried to accomplish your goals there, but because the shell is so separate from the standard experience, it was probably slow going.

While the shell is definitely closer to the bare metal than many of us are used to, it was at one time designed to be the primary user interface, and most shells have many methods to help you deal with the problems baked into a command line interface.

If you’ve never used a shell before or you are familiar with other shells (bash, zsh, cmd, etc) I want to show you that not only is PowerShell not scary, but instead how it will remove drudgery from your day to day work and how using it doesn’t need to feel kludgy.

If you are a SQL professional like me, I want to show you how you can use features of PowerShell to:

  • Manage your ever-growing code base and server farm.
  • Automate tasks that seem annoying (text processing!).
  • Report on multiple machines and transform the results into useful output.
  • Solve problems that used to take thousands of lines of SQL in tens of lines of PowerShell.

Community

I also want you to know that there is an active community of people who want to help you when you run into problems. Please come join us at the SQL Community Slack and head on over to #PowerShellHelp or #dbatools.

You can find me there talking as @ck, I would love to hear from you!

Acknowledgements

I also want to acknowledge a few people as this series begins who have helped and influenced me greatly in these endeavors:

  • Chrissy LeMaire – As the project leader for dbatools.io, she has shown tireless enthusiasm in demonstrating both the usefulness and the fun that can be had when authoring PowerShell.
  • Rob Sewell – A tireless contributor and head of the dbareports project, his drive to demonstrate and to help those with less knowledge seems boundless.
  • Klaas Vandenberghe – A tireless supporter of quality code and best practices, Klaas has an encyclopedic knowledge of PowerShell, PowerShell publications, and more; he has always driven me to ask more questions about my code and methods.
  • Friedrich Weinmann – A PowerShell wizard and hyper contributor to dbatools, his deeper dives into some of the technologies and his advice have gotten me out of a few weird places.
  • Cláudio Silva – Cláudio’s constant pushing for more and better code to bring back to the community is always impressive, and his enthusiasm leaks through the screen any time I work with him.
  • Carlos Chacon – For (indirectly) giving me the idea for this series of posts. I truly saw his enthusiasm for learning more about PowerShell when he learned I wasn’t a PowerShell expert, but a SQL professional like himself (with a lot less experience, of course!).
  • Neal Stephenson – For some great books and a great quote.

Contributing to the SQL Server Data Partners Podcast

Just wanted to mention a great podcast that some of the dbatools team and I were lucky enough to be invited to attend, the SQL Server Data Partners Podcast.

You can listen to the specific episode below: SQL Server Data Partners Podcast – dbatools (Episode 91)

If you haven’t checked the podcast out before, it features two SQL Server professionals, Carlos Chacon and Steve Stedman, who do a great job of keeping the conversation light-hearted and the momentum moving during the more in-depth technical discussions. I really enjoy their interview style and I highly recommend checking them out.

Episodes range widely, covering topics such as career advice and experiences, coding best practices, and personal interviews with some of the titans of the SQL Server community.

The focus of this episode was dbatools; topics ranged from how the project started and gained acceptance, to bringing in new people and maintaining interest in the project, to a deeper look at why dbatools matters to us.

It was really fun to be invited and I was happy to be able to chip in about learning and growing as one of the more junior PowerShellers of the team.

Chrissy and I had such a fun time that we are working towards releasing our own dbatools-oriented “people talking on the internet” type thing soon!

You committed files to the wrong branch (master), and now you want to apply your changes to the right one (development)

I wanted to share a scenario and solution for a problem I am seeing arise in the dbatools project:

  • Commits are registered against the “wrong” branch.
  • A pull request is submitted.
  • The file comparison shows hundreds of changes instead of 1–2.

There are many ways this can happen, but it is usually due to forgetting to create a feature branch, or accidentally creating that branch from master instead of development.

To fix this problem, you will want to check the log of commits to determine the hash of each commit you want, and then create a new branch (from the correct source) and apply those commits to it.

git cherry-pick is a command that allows you to apply a commit to a branch other than the one you originally committed it to.

To fix a pull request where you have more files than you want:

  • Make sure you are on the branch where you had the problem: git status shows your current branch, and git checkout branchname will switch you to the branch you want.
  • Use the git log command to see a list of commits; you should see your changes and their hashes.
  • Record the first 10–12 characters of the commit hashes of your changes (technically you only need enough characters to be unique).
  • Check out the correct source branch (development) with git checkout development.
  • Create the correct feature branch with git checkout -b new-feature-name (this also checks it out and makes it your current branch).
  • Cherry-pick the commits with git cherry-pick hash.
  • Push the new branch to your repository: git push origin new-feature-name.
  • You should now be able to Pull Request/Compare your code.
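The steps above can be rehearsed end to end in a throwaway repository before touching your real fork. This is a minimal sketch, assuming git is on your PATH; the branch and file names (fix.sql, fix-stored-procedure) are made up for illustration and are not part of the dbatools workflow:

```shell
# Minimal sketch: recover a commit made on master onto a proper feature branch.
set -e
repo=$(mktemp -d)
cd "$repo"
git init -q
git config user.email demo@example.com
git config user.name demo

# Shared history on master, with development branched from it
echo 'shared history' > readme.txt
git add readme.txt
git commit -qm 'initial commit'
git branch -M master
git branch development

# The mistake: a commit lands directly on master instead of a feature branch
echo 'my fix' > fix.sql
git add fix.sql
git commit -qm 'Fix stored procedure'

# Step 1: find the hash of the stray commit with git log
stray=$(git log -1 --format=%H master)

# Steps 2-3: start the feature branch from the correct source (development),
# then replay the stray commit onto it
git checkout -q development
git checkout -qb fix-stored-procedure
git cherry-pick "$stray"

# The stray commit now sits on the feature branch, ready to push
git log --oneline
```

After this, a git push origin fix-stored-procedure would give you a branch whose comparison against development shows only the files you actually changed.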

A note on comparisons: any time you make a new Pull Request, GitHub will automatically list the files changed and the number of commits you put in place. If this seems different from what you did, give me a jingle and we can work out which part of this process needs another pass 🙂