I had the chance to spend a couple of hours with Paul Galjan (@PaulGaljan) last fall while in Seattle for Microsoft's Tech Ready.  We got on the topic of what makes effective technology demonstrations and eventually started discussing ways to generate SQL Server workloads.  I mentioned that I often use Windows Performance Monitor to log perfmon counters to a SQL Server database.  Paul thought I should share this idea with a blog post on the Everything Microsoft site.  This morning an email thread popped up inside EMC on exactly the same topic.  I figured that was a sign to get busy and write this up to find out if anyone else finds it useful.

 

Software and hardware testing is a much more interesting field of work than many people might realize.  A large part of my job at EMC is to show customers how Microsoft applications work with EMC hardware and software.  I spend a lot of time talking about SQL Server performance, reliability and recovery.  When I need to show something like hardware-based remote replication with RecoverPoint, it really helps to have a live database under load during the demonstration.  Most of what I do is feature and functionality testing that doesn't require a workload large enough to test the limits of large hardware configurations or enterprise software.  For those situations, let me show you how I use Windows Performance Monitor to generate a modest but steady load.

 

The first step is to set up a database on the server that you want to use as the load target, for example, the primary replica of an Always On Availability Group.  Use SQL Server Management Studio (SSMS) or a script file to generate an empty test database.  Make sure that you pay attention to the initial data and log file size settings.  If you want to show what happens with small files and autogrow enabled, you can keep the default data and log file settings.  If you don't want to be impacted by autogrow operations, create files sized appropriately.  I usually size my files to be about 10% smaller than the physical disk and set the maximum file size to 95% of the disk size.  That way if I fail to notice that the disk is running out of space, SQL Server will shut down before Windows freaks out.  I have used this type of setup to show how thin LUN expansion on VNX works.  It is very effective because most DBAs have seen this happen.
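If you would rather script the database than click through SSMS, here is a minimal sketch of what I mean.  The database name (PerfmonDemo), file paths, and sizes are all placeholders; I'm pretending the data disk is 100 GB and the log disk is 20 GB, so the files start at roughly 90% of the disk and are capped at 95%.

    -- Hypothetical layout: 100 GB data disk, 20 GB log disk.
    -- Initial size ~10% below the disk size, MAXSIZE at ~95%,
    -- so SQL Server runs out of room before Windows does.
    CREATE DATABASE PerfmonDemo
    ON PRIMARY
    (
        NAME = N'PerfmonDemo_data',
        FILENAME = N'E:\SQLData\PerfmonDemo_data.mdf',
        SIZE = 90GB,
        MAXSIZE = 95GB,
        FILEGROWTH = 1GB
    )
    LOG ON
    (
        NAME = N'PerfmonDemo_log',
        FILENAME = N'F:\SQLLogs\PerfmonDemo_log.ldf',
        SIZE = 18GB,
        MAXSIZE = 19GB,
        FILEGROWTH = 512MB
    );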

 

Here is a look at the New Database dialog from SSMS showing the default (ridiculously small) file sizes.  I'm going to use the defaults for creating the screen shots I need for this article.  You will want to change the initial database size and autogrowth settings to match how large you expect the database to grow.

Number 1.png
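If you go ahead with the defaults and change your mind later, you can resize the files and tame autogrowth after the fact.  The logical file names and sizes below are placeholders; the sys.database_files query will tell you what yours are actually called.

    -- Check the logical file names and current sizes first.
    USE PerfmonDemo;
    SELECT name, type_desc, size * 8 / 1024 AS size_mb
    FROM sys.database_files;

    -- Then adjust size, cap, and growth (names and sizes are examples only).
    ALTER DATABASE PerfmonDemo
        MODIFY FILE (NAME = N'PerfmonDemo_data', SIZE = 10GB, MAXSIZE = 12GB, FILEGROWTH = 1GB);
    ALTER DATABASE PerfmonDemo
        MODIFY FILE (NAME = N'PerfmonDemo_log', SIZE = 2GB, MAXSIZE = 4GB, FILEGROWTH = 512MB);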

The next step is to go to Administrative Tools | ODBC Data Sources on the computer that you are going to use for generating load.  This can be either the same machine that is running SQL Server or a remote machine.  If you are using a remote machine, make sure that Windows Firewall on both machines allows the remote SQL connection on the TCP/IP port that SQL Server uses, or that the firewall is turned off if this is a secure, dedicated testing network.  In this example I'm configuring the connection on a single server that will be both the load generator and the SQL Server system under test (SUT).

Number 2.png
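A quick aside on the firewall point above: if you are not sure which TCP port the instance is listening on (named instances often end up on a dynamic port), you can ask any existing connection.  This is just a convenience check, not part of the setup.

    -- Returns the TCP port this connection arrived on
    -- (NULL if you connected over shared memory or named pipes).
    SELECT local_tcp_port
    FROM sys.dm_exec_connections
    WHERE session_id = @@SPID;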

 

I'm not going to go through every screen needed to create a Data Source Name (DSN), but I did want to include a shot of the first couple to show that I'm using a System DSN and selecting the SQL Server driver.

Number 3.png

Make sure you change the default database for the ODBC connection to the blank database that you created above.  Otherwise you will end up creating some new tables and logging your perfmon data to the MASTER database.  Not too terrible in a test environment, but it might take you a while to figure out where the data is going, because everything from here on out will still work as long as the user authenticating to SQL Server has rights to MASTER.

Number 4.png
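If you ever suspect the data went to the wrong place, it only takes a second to check.  Performance Monitor's SQL log format creates three tables (CounterData, CounterDetails and DisplayToID), so a quick look in MASTER tells you whether they landed there by mistake.

    -- Did the perfmon logging tables end up in MASTER?
    SELECT name, create_date
    FROM master.sys.tables
    WHERE name IN (N'CounterData', N'CounterDetails', N'DisplayToID');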

I always use the Test Connection option before completing the New DSN wizard to ensure there are no authentication, network or firewall issues.

Number 5.png
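The wizard's test only proves out the ODBC side.  If you also want confirmation from within SQL Server of which login you are using, which database the session landed in, and which server you actually hit, this one-liner works from any query window opened with the same credentials.

    -- Sanity check: login, current database, and server name.
    SELECT SUSER_SNAME() AS login_name,
           DB_NAME()     AS current_database,
           @@SERVERNAME  AS server_name;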

Now that we have a DSN for the database that will be the destination of our perfmon data, everything else happens in Performance Monitor, so let's take a look at how that gets set up.  The first thing you'll need to do is create a new user-defined Data Collector Set.  I always start with the System Performance template.

Number 6.png

I creatively called my Data Collector Set "Performance Data".  You now need to change the location and type of logging destination.

Number 7.png

Drill into the Data Collector Set until you get to the performance counter collector that was created when you chose a template above.  Right-click it to get the Properties dialog.  For Log format choose SQL, and for Data source name choose the DSN you created above.  You can also add and remove counters from the default set and change the sample interval, both of which will impact the amount of data you write to the target database.

Number 8.png

Save those settings and then open the properties for the Data Collector Set itself (Performance Data as shown above).  The most important setting is the Run As account.  If you want to write to a SQL Server instance across the network, you need an account with that privilege.  I'm using a domain account with sysadmin privileges on the target SQL Server.  Any time you change properties of the data collector you will need to supply the Run As credentials again.  You will also want to check out the Schedule and Stop Condition tabs to make sure your collector runs for an appropriate amount of time.

Number 9.png
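Since this is a demo environment I just use sysadmin.  If you would rather not hand that out, my assumption is that owning only the target database is enough for the collector to create its tables and insert rows, but verify that in your own environment before relying on it.  A sketch, with DOMAIN\svc_perfmon and PerfmonDemo as placeholders:

    -- Placeholder domain account and database name; adjust for your environment.
    USE master;
    CREATE LOGIN [DOMAIN\svc_perfmon] FROM WINDOWS;

    USE PerfmonDemo;
    CREATE USER [svc_perfmon] FOR LOGIN [DOMAIN\svc_perfmon];
    ALTER ROLE db_owner ADD MEMBER [svc_perfmon];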

Now you can right-click on the Data Collector Set and choose Start, or wait for the scheduled start time you configured above.  Go to SQL Server Management Studio and look at the list of user tables for the target database.  You should see the three tables that were created the first time you started the data collector, as shown below.

Number 10.png
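You can do the same check from a query window instead of Object Explorer.  The three tables Performance Monitor creates are CounterData, CounterDetails and DisplayToID, and rows should start arriving within a sample interval or two.

    USE PerfmonDemo;  -- placeholder name for the target database

    -- The collector should have created these on first start.
    SELECT name, create_date
    FROM sys.tables
    ORDER BY name;

    -- And samples should already be landing in CounterData.
    SELECT COUNT(*) AS samples_so_far
    FROM dbo.CounterData;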

 

 

Right-click on the database name and select Reports | Standard Reports | Disk Usage by Top Tables.

Number 11.png
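If you would rather script that check than click through the report, a query along these lines gives you rows and reserved space per table.  It reads the standard catalog views, so nothing here is specific to the perfmon tables.

    -- Rows and reserved space per user table, largest first.
    SELECT OBJECT_NAME(ps.object_id)                                         AS table_name,
           SUM(CASE WHEN ps.index_id IN (0, 1) THEN ps.row_count ELSE 0 END) AS row_count,
           SUM(ps.reserved_page_count) * 8 / 1024                            AS reserved_mb
    FROM sys.dm_db_partition_stats AS ps
    JOIN sys.tables AS t
      ON t.object_id = ps.object_id
    GROUP BY ps.object_id
    ORDER BY reserved_mb DESC;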

That's it: you are logging perfmon data to a SQL Server database.  Now you can play with Always On, Replication, Backup and Restore, Log Shipping, etc. with a live database, and it only took about 15-20 minutes to set up the whole thing.
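As a final sanity check, here is the kind of query I run against the logging tables to watch the counters come in.  The column names come from the schema Performance Monitor creates (CounterDateTime is stored as a character string, which is why it sorts as text), and the two counter names are just examples; swap in whatever your collector is gathering.

    -- Most recent samples for a couple of familiar counters.
    SELECT TOP (20)
           d.ObjectName,
           d.CounterName,
           d.InstanceName,
           c.CounterDateTime,
           c.CounterValue
    FROM dbo.CounterData AS c
    JOIN dbo.CounterDetails AS d
      ON d.CounterID = c.CounterID
    WHERE d.CounterName IN (N'% Processor Time', N'Available MBytes')
    ORDER BY c.CounterDateTime DESC;

Good luck and have fun with this.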