Cannot enable the Microsoft .NET Framework 3.5 feature on Windows Server 2012

I was building a new SQL Server cluster and, moving a little too fast, forgot to enable the .NET Framework 3.5 feature on the Windows server before I installed SQL Server.  .NET 3.5 SP1 is a requirement for SQL Server 2012 when you select Database Engine, Reporting Services, Master Data Services, Data Quality Services, Replication, or SQL Server Management Studio, and it is no longer installed by SQL Server Setup.

No problem, I thought–I know how to enable that using the Add Roles and Features wizard.  Only a weird thing happened on the way–the installation failed, and it didn’t even include an error message.  Weird, I thought. When I tried the install using DISM, as suggested by Albert Morillo, I got the following:

The operation completed, but the NetFx3 feature was not enabled.  Ensure the parent feature(s) are enabled first.

Parent features?  This was news to me.  Then I found there are a couple of security updates that cause this to fail.  If you are getting the operations folks to build the servers, they will have to hold off on installing security updates 2966827 and 2966828 until you enable the .NET Framework 3.5 feature.  Once those updates were removed, I was able to enable the 3.5 framework and continue my installation.
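For reference, the sequence might look like the following from an elevated command prompt–a sketch, assuming the install media is mounted on D: (adjust the source path for your environment):

```shell
:: Remove the security updates that block the feature install
wusa /uninstall /kb:2966827 /quiet /norestart
wusa /uninstall /kb:2966828 /quiet /norestart

:: Enable .NET Framework 3.5 from the sources\sxs folder on the install media
DISM /Online /Enable-Feature /FeatureName:NetFx3 /All /LimitAccess /Source:D:\sources\sxs
```

Once the feature is enabled, the security updates can be reinstalled.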

Looks like I have another item to add to my server build checklist.

All the World’s a Stage

I didn’t realize Pinal Dave had published his site stats last year; I recently came upon them and was very impressed with those numbers.  Kudos to him for all his effort.  My blog, on the other hand, would need him to chop off about six digits and we might be getting a little closer.  My blog won’t ever be that popular; however, it is interesting to see where people are coming from, and it makes me think the few posts I have up are helping someone.  The SQLFamily has always been about helping others and I am glad to play a small part.

Thanks Mom for all the hits.  🙂

TempDB Configuration Matters

TempDB Database

Every SQL Server installation includes several accompanying system databases that help keep SQL Server running and functional.  They store information about security, what the server is doing, and how the system should process certain functions.  The TempDB database doesn’t keep permanent information–in fact, each time SQL Server restarts, the TempDB database gets purged and re-built as if it were a new database.  Why then should I care about the TempDB database?  Well, I am glad you asked.

Lots of Hats

The TempDB database handles several functions, including sorting, local and global temporary tables, index reorganization, hash comparisons, XML variables, spooling, triggers, snapshot isolation, and other internal operations.  In short–it is the workhorse of the environment, and ALL databases on the system will interact with it at some point, so its configuration becomes very important.

Transaction Dependent

Because local table variables and hash joins are stored in the TempDB database, queries that use these objects/functions have part of their processing done in the TempDB database.  When a query needs to move a process to TempDB, it creates a worktable and writes data out.  What happens when I have 1,000 transactions per second and all of them have a hash join?  I will have a line out the door of threads waiting for their turn to write to TempDB.  I can potentially have GAM (Global Allocation Map) and SGAM (Shared Global Allocation Map) contention.  Also, because transactions depend on these processes completing, the speed at which they can be written and read becomes important.

Providing some relief

There are two recommendations that can provide immediate relief to your TempDB environment.  The first is to create multiple TempDB data files of the same size.  This will help alleviate the SGAM contention.  The second is to move the TempDB data files to the fastest disks in the system and/or spread them across as many spindles as possible.  TempDB is the first candidate for flash drives should the entire system not be able to take advantage of those disks.
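As a sketch, assuming a dedicated T: drive and illustrative sizes (the logical names beyond tempdev are assumptions), adding equally sized files might look like this:

```sql
-- Resize the existing file, then add equally sized files (values are illustrative)
ALTER DATABASE tempdb MODIFY FILE (NAME = tempdev,  SIZE = 2048MB, FILEGROWTH = 512MB);
ALTER DATABASE tempdb ADD FILE (NAME = tempdev2, FILENAME = 'T:\TempDB\tempdev2.ndf', SIZE = 2048MB, FILEGROWTH = 512MB);
ALTER DATABASE tempdb ADD FILE (NAME = tempdev3, FILENAME = 'T:\TempDB\tempdev3.ndf', SIZE = 2048MB, FILEGROWTH = 512MB);
ALTER DATABASE tempdb ADD FILE (NAME = tempdev4, FILENAME = 'T:\TempDB\tempdev4.ndf', SIZE = 2048MB, FILEGROWTH = 512MB);
```

Keeping the files the same size matters because SQL Server’s proportional fill algorithm only spreads allocations evenly when the files have the same free space.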

How large should my TempDB files be? Determining the size of the TempDB files may be a bit of trial and error; however, if you have databases already on the system, one way to help make a decision is to run DBCC CHECKDB WITH ESTIMATEONLY to get the size the CHECKDB command uses.
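For example, running the estimate against an existing database (the database name here is illustrative) reports the TempDB space CHECKDB expects to need:

```sql
DBCC CHECKDB ('MyDatabase') WITH ESTIMATEONLY;
```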

How many TempDB files should I have? While there are varying ideas, the one I hold to is one file per CPU core, up to eight files.  Monitoring will help provide insight into whether more are needed in the future.

The Importance of Backups

A re-post of a posting at

Too many organizations do not have adequate protection of their data and are susceptible to data loss.  While security is important and you might think this post is about implementing policy to limit your exposure to hacking, I am talking about something much more basic than that–I am talking about database backups.  A database backup is the first step in ensuring data availability and limiting exposure to corruption or human error.

What kind of backup strategy do you need?  Well, what is your tolerance for data loss?

To help you answer this question, there are two components you need to consider.

RPO – Recovery Point Objective

TechTarget defines RPO as “the age of files that must be recovered from backup storage for normal operations to resume if a computer, system, or network goes down as a result of a hardware, program, or communications failure.”  My definition would be something like–the moment in time you want to be able to restore to.  If you experience corruption or disk failure, how close do you need to get to that point in time?  Defining this metric, which will vary from system to system, will give you your RPO.

RTO – Recovery Time Objective

TechTarget defines RTO as “the maximum tolerable length of time that a computer, system, network, or application can be down after a failure or disaster occurs.”  My definition would be–the time needed to restore a system to usable functionality. I should note this time would include the alerting and response of your team if user intervention is required.

It would be easy to say, I want less than 1 second of data loss (RPO) and less than 5 minutes of downtime (RTO); however, don’t expect to pay TGI Fridays prices for Ruth’s Chris service.  Microsoft has made great strides in giving SQL Server many options for high availability and disaster recovery and the ability to keep the system up; however, none of these solutions removes the requirement to take backups.  The amount of history you keep will depend on your business requirements and the costs associated with keeping that storage.
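As a sketch of what that looks like in practice, a tight RPO is usually met with frequent log backups layered on periodic full backups.  The database name, paths, and cadence below are illustrative, and log backups require the FULL recovery model:

```sql
-- Nightly full backup (name and path are illustrative)
BACKUP DATABASE MyDatabase
TO DISK = 'X:\Backups\MyDatabase_Full.bak'
WITH CHECKSUM, COMPRESSION;

-- Log backup, scheduled every few minutes; the interval bounds your data loss (RPO)
BACKUP LOG MyDatabase
TO DISK = 'X:\Backups\MyDatabase_Log.trn'
WITH CHECKSUM, COMPRESSION;
```

The RTO side is driven by how fast these can be restored, which is worth testing on a schedule as well.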

If your organization does not have the RPO and RTO points defined, it is definitely time to make it happen.

Unable To Login to SQL Server Error 18456

I installed SQL Server on a machine the other day, connected from my desktop, finished my normal server setup routine, and life went on. Later, I connected to the machine via Remote Desktop, fired up SSMS, and got this error message.

With an accompanying error in the log:
Login failed for user ‘MyAccount’. [CLIENT: ]
Error: 18456, Severity: 14, State: 11.

Very strange, I thought to myself. Turns out I had added my account to the local Administrators group but had not created a login on the SQL Server. (It also appears I missed a step on my server checklist, as I normally remove the Builtin\Administrators group, but who is keeping score?)

I created the login, specified the group, and I was good to go.

CREATE LOGIN [Domain\Account] FROM WINDOWS;
EXEC sp_addsrvrolemember 'Domain\Account', 'sysadmin';

Newest Regional Mentor for PASS

This month I am pleased to accept the invitation to serve as a Regional Mentor for the Mid-Atlantic region of the PASS organization. It was not that long ago I stepped into my first chapter meeting in Richmond and discovered the unique mix that is the SQLFamily. Many local meetings, several SQLSaturdays, and the SQLSummit have only added to the experience and I am happy to be able to share some of these experiences with others.

As a regional mentor, I will serve as a resource for the local chapters, providing help as they operate their chapters. I will be looking to grow the number of chapters in the region, strengthen the existing ones, and maybe even add another SQLSaturday to the mix (looking at you, Norfolk or Lynchburg). I am looking forward to it and, as always–I will see you on the SQL trail.

Verizon Fios setting

I recently switched to Verizon FIOS and spent more time than I care to admit getting my printer set up.  The printer is attached to the network and was working prior to my change; however, when I tried to connect to the printer from my laptop I would get a connection failure error.

The resolution was to disable the IGMP Proxy under the router’s advanced settings and ta-da! I could connect again.


SSIS Conditional Logic Based On Stored Procedure Result

I was looking to implement some conditional logic in an SSIS package. In my case, I wanted the package to halt if some validation rules were not met. The logic for these validation rules is found in a stored procedure, and I needed a way to return a flag to the SSIS package indicating the success or failure of the validation. I found I could do this with the Result Set tab on an Execute SQL Task. Here is what I did to set this up.

First, I created a stored procedure with the validation rules. For simplicity, I have created a stored procedure that returns a value. You will notice I have named the returned result ‘ReturnResult’.

    -- Wrapped as a procedure so it can be called from the package; the name is illustrative
    CREATE PROCEDURE dbo.usp_ValidationRules
    AS
    DECLARE @ReturnVariable SMALLINT;
    SET @ReturnVariable = 1;

    SELECT @ReturnVariable AS ReturnResult;


Second, I created an SSIS package with a variable of type Int32 named ProcedureResults and an Execute SQL Task component. After establishing the connection on the SQL Task, I changed the ResultSet property to Single Row. I then clicked on the Result Set ‘tab’ on the left and linked the returning value from my stored procedure to the variable in the package.


Third, I added two data flow tasks and linked them with precedence constraints so the Execute SQL Task would be the first item executed. I only want a data flow task to execute based on the results of the stored procedure.

Fourth, I modified the Evaluation Operation of the constraint to be of type Expression and added the expression it should use when the package runs. In the success constraint the value was @ProcedureResults > 0.

The failure constraint? You guessed it: @ProcedureResults <= 0.

**Interesting side note.  My stored procedure returns a SMALLINT.  When I created this on a client system, the stored procedure returned a SMALLINT data type, just like my example above; however, the Execute SQL Task was unable to process the result set.  No matter what I returned in the stored procedure, the package would stop at the Execute SQL Task and report success.  It was not until I changed my SMALLINT to an INT that it was able to process the result and continue through the package.  When I created the package on my local machine, I was unable to recreate the issue.  I haven’t been able to put my finger on the difference between the environments, but it seems to have been fixed somewhere.  🙂

Free Space In Database Files

File growth can cause system outages, as SQL Server has to wait for the file to grow before it can resume operations.  Of course, this depends on the size of the database, the growth rate, etc.; however, the free space in your database files is something you will want to keep an eye on.  The script below gives you this information so you can monitor growth rates and grow your databases during the maintenance window.

This query will only grab the information for the database you are connected to.

SELECT  s.name AS [Name]
, s.physical_name AS [FileName]
, s.size * CONVERT(FLOAT, 8) AS [Size] /*File size on disk, in KB*/
, CAST(FILEPROPERTY(s.name, 'SpaceUsed') AS FLOAT) * CONVERT(FLOAT, 8) AS [UsedSpaceInKB]
, CAST(FILEPROPERTY(s.name, 'SpaceUsed') AS FLOAT) * CONVERT(FLOAT, 8) / 1024 AS [UsedSpaceInMB]
, ((s.size * CONVERT(FLOAT, 8)) - (CAST(FILEPROPERTY(s.name, 'SpaceUsed') AS FLOAT) * CONVERT(FLOAT, 8))) / 1024 AS 'Available Free Space In MB' /*Matches SSMS file report*/
, ((s.size * CONVERT(FLOAT, 8)) - (CAST(FILEPROPERTY(s.name, 'SpaceUsed') AS FLOAT) * CONVERT(FLOAT, 8))) / (s.size * CONVERT(FLOAT, 8)) * 100 AS 'Percent Free'
, growth
, max_size
, state_desc
FROM    sys.master_files AS s
WHERE   s.database_id = DB_ID()

Getting a little fancy, we can grab the information for each database on the instance, but you would have to insert this into a table to do any ordering.

EXEC dbo.sp_MSforeachdb 'USE [?];
SELECT s.name AS [Name],
s.physical_name AS [FileName],
s.size * CONVERT(float,8) AS [Size], /*File size on disk, in KB*/
CAST(FILEPROPERTY(s.name, ''SpaceUsed'') AS float)* CONVERT(float,8) AS [UsedSpaceInKB]
, (CAST(FILEPROPERTY(s.name, ''SpaceUsed'') AS float)* CONVERT(float,8)) / 1024 AS [UsedSpaceInMB]
, ((s.size * CONVERT(float,8) - CAST(FILEPROPERTY(s.name, ''SpaceUsed'') AS float)* CONVERT(float,8))) / 1024 AS ''Available Free Space In MB'' /*Matches SSMS file report*/
, (s.size * CONVERT(float,8) - CAST(FILEPROPERTY(s.name, ''SpaceUsed'') AS float)* CONVERT(float,8)) / (s.size * CONVERT(float,8)) * 100 AS ''Percent Free''
, growth, max_size, state_desc
FROM sys.master_files AS s
WHERE s.database_id = db_id()'
**I was delayed in getting this post out and I no longer remember where I got the original code. I am pretty sure it came from a trace file capturing what SSMS did to pull this information. If this is your script (or started as your script), I am sorry I did not note it above.**

Excel Connection Manager

I find myself needing to import files from time to time and usually use SSIS to import the data–normally to a temp table.  I did so the other day and got this error.

SSIS package “MyTest.dtsx” starting.
Information: 0x4004300A at 2008 10K, SSIS.Pipeline: Validation phase is beginning.
Error: 0xC00F9304 at MyTest, Connection manager “Excel Connection Manager”: SSIS Error Code DTS_E_OLEDB_EXCEL_NOT_SUPPORTED: The Excel Connection Manager is not supported in the 64-bit version of SSIS, as no OLE DB provider is available.
Error: 0xC020801C at 2008 10K, Excel Source [1]: SSIS Error Code DTS_E_CANNOTACQUIRECONNECTIONFROMCONNECTIONMANAGER.  The AcquireConnection method call to the connection manager “Excel Connection Manager” failed with error code 0xC00F9304.  There may be error messages posted before this with more information on why the AcquireConnection method call failed.
Error: 0xC0047017 at 2008 10K, SSIS.Pipeline: component “Excel Source” (1) failed validation and returned error code 0xC020801C.
Error: 0xC004700C at 2008 10K, SSIS.Pipeline: One or more component failed validation.
Error: 0xC0024107 at 2008 10K: There were errors during task validation.
SSIS package “MyTest.dtsx” finished: Failure.

I normally just run the package in Visual Studio, so I changed the 64-bit option in the package properties (Run64BitRuntime = False), and that allowed me to execute the package successfully.  Should you need to save the package and run it in a scheduled job, you will need to make this change elsewhere, as noted here.
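If the package does have to run outside Visual Studio, another option is to call the 32-bit DTExec directly, since the 32-bit tools live under Program Files (x86).  A sketch, assuming a default SQL Server 2012 installation and an illustrative package path:

```shell
"C:\Program Files (x86)\Microsoft SQL Server\110\DTS\Binn\DTExec.exe" /File "C:\Packages\MyTest.dtsx"
```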

This setting can be found by clicking on Properties/[Package Name] properties . . . Where [Package Name] is the name of the package you are working with.