Truncate a large table in Availability Group

Recently, I had to truncate a 3 TB table that was part of an availability group. Since TRUNCATE TABLE deallocates extents rather than logging individual row deletions (as DELETE does), it generates far less transaction log. Still, because we run a synchronous AG, I wanted to research this first so we would not flood the secondary replicas with too much log.

I came across an article from Paul Randal about this exact issue. Thanks, Paul. Link to Paul's article: https://www.sqlskills.com/blogs/paul/the-curious-case-of-log-generated-during-a-drop-table/

To confirm this, I executed the TRUNCATE command on a test server (it does not have an AG) and tracked the log backup size. Since there was no other database activity at the time, all the log growth could be attributed to this event. The math worked out very well; it was exactly in the range mentioned in Paul's blog. I also read the log backup contents to verify.
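A minimal sketch of that measurement, assuming a quiet test database (the table name is a placeholder; any concurrent activity, including the deferred-drop background task that deallocates the extents, will shift the numbers):

```sql
-- Capture log space used before and after the truncate,
-- running in the test database with no other activity.
DECLARE @before BIGINT, @after BIGINT;

SELECT @before = used_log_space_in_bytes
FROM sys.dm_db_log_space_usage;

TRUNCATE TABLE dbo.BigTable;   -- placeholder table name

SELECT @after = used_log_space_in_bytes
FROM sys.dm_db_log_space_usage;

SELECT (@after - @before) / 1048576.0 AS log_generated_mb;
```

A log backup taken before and after, as described above, gives a cleaner end-to-end number because it also captures the log generated by the background deallocation after the truncate commits.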

So, we executed the task in production during off-peak hours, and the secondary replicas caught up without any hiccups.

This post is for future reference. Thanks, Paul, for the wonderful article; the timing could not have been better and the math was exact.


Master database in single user mode after restore

Recently I had to restore the master database on a SQL Server instance. One way to restore master is:

  1. Restore the master database as a user database on an instance running the same version of SQL Server.
  2. Detach the database from that instance.
  3. Copy the files to the original instance.
  4. Restart the original instance.

Make sure the logical and physical file names are the same as the original master database's when restoring it in step 1.
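The restore-and-detach steps might look like this on the helper instance (the backup path and target folder are placeholders; master's logical file names are typically "master" and "mastlog"):

```sql
-- Step 1: restore the master backup under a different database name,
-- keeping the original logical file names.
RESTORE DATABASE master_copy
FROM DISK = N'C:\Backups\master.bak'      -- placeholder path
WITH MOVE N'master'  TO N'C:\Temp\master.mdf',
     MOVE N'mastlog' TO N'C:\Temp\mastlog.ldf';
GO

-- Step 2: detach it. If connections had to be dropped (which puts the
-- database in SINGLE_USER mode), set it back to MULTI_USER first so the
-- copied files do not come up in single-user mode on the original instance.
ALTER DATABASE master_copy SET MULTI_USER;
EXEC sp_detach_db @dbname = N'master_copy';
```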

Anyway, after I did the above steps, I noticed the master database was coming up in single-user mode on the original instance (step 4).

I eventually realized it was due to how I was detaching the database. When detaching, I had selected the "drop connections" option in the UI, and SQL Server accomplishes this by putting the database in single-user mode before it detaches.

That is why it was coming up in single-user mode on the original instance once the files were copied over.

Custom SQL Job Failure Notifications

Job failure notifications are a common way to alert when a SQL Agent job fails, and setting them up is fairly straightforward. Sometimes, however, there is a need for custom logic around when these notifications are sent. Since job history is logged in the msdb database, we can query its tables and build custom solutions. One requirement I had recently was to send a notification email only if the job failed five successive times.

To do this, whenever the job fails, it should execute a step (part of the job itself) that checks the final outcome (step_id = 0) of the previous four runs; if they all failed, it sends an email notification.

The code below shows how this can be done.

DECLARE @JobID uniqueidentifier;
SET @JobID = CONVERT(uniqueidentifier, $(ESCAPE_SQUOTE(JOBID)));

-- run_status = 0 means the run failed; if the MAX over the previous
-- four final outcomes is 0, all four runs failed.
IF ( SELECT MAX(run_status)
     FROM ( SELECT A.run_status
            FROM msdb.dbo.sysjobhistory A
            INNER JOIN msdb.dbo.sysjobs B ON A.job_id = B.job_id
            WHERE A.step_id = 0 AND B.job_id = @JobID
            ORDER BY A.instance_id DESC
            OFFSET 0 ROWS FETCH NEXT 4 ROWS ONLY ) A ) = 0
BEGIN
    EXEC msdb.dbo.sp_send_dbmail
        @profile_name = '<<SQLMailProfile>>',
        @recipients   = 'youremail@email.com',
        @body         = '<<email body>>',
        @subject      = '<<email subject>>';
END

 

Registering SPN for SQL Server Service Accounts

When setting up a new SQL Server, one of the important steps is registering the SPN for the service account. This registration is not required if the service account is a domain administrator, or if the account is given explicit permission to self-register its SPN; neither option is wise. So any time a new SQL Server is set up or the service account is changed, we have to register the SPN manually on the domain. This task needs to be done by someone with domain admin rights.

Registering SPNs enables Kerberos authentication and delegation. In double-hop scenarios such as linked servers, Kerberos lets you impersonate the actual user; otherwise you have to embed a SQL login, which can become a security loophole in your system.

Below are the steps to enable Kerberos delegation:

1. Register the SPN for the service account with all possible combinations (note the SPN format is service/host, with a forward slash):
SetSPN -A MSSQLSvc/ComputerA.domainname.com domain\serviceaccount
SetSPN -A MSSQLSvc/ComputerA domain\serviceaccount
SetSPN -A MSSQLSvc/ComputerA.domainname.com:1433 domain\serviceaccount
SetSPN -A MSSQLSvc/ComputerA:1433 domain\serviceaccount

For a clustered instance, specify the virtual SQL cluster name (without the instance name), and include the TCP port SQL Server is listening on.

2. Enable the service account to be trusted for delegation. This is a setting in Active Directory; you can trust the account for all delegation or, if you prefer, restrict it to specific services.

3. Make sure the TCP/IP protocol is enabled and Named Pipes is disabled.
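Once the SPNs are in place, you can verify from inside the instance that connections are actually using Kerberos:

```sql
-- auth_scheme shows KERBEROS when the SPN is registered correctly,
-- and NTLM when authentication has fallen back.
SELECT session_id, auth_scheme
FROM sys.dm_exec_connections
WHERE session_id = @@SPID;
```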

If you have other SQL Server components such as Analysis Services or Reporting Services, you can register SPNs for them as well to use Kerberos.

Example of SSRS SPN registration:
SetSPN -A http/computername.domainname.com:80 domainname\serviceaccount
SetSPN -A http/computername.domainname.com domainname\serviceaccount
SetSPN -A http/computername domainname\serviceaccount
SetSPN -A http/computername:80 domainname\serviceaccount

If you use the Performance Dashboard reports, you need Kerberos authentication for SSRS. These dashboards are very useful and can be downloaded from GitHub.

Also, if you want to list all the SPNs registered for a service account, use:
SetSPN -L domainname\serviceaccount

If you want to delete an SPN, use:
SetSPN -D MSSQLSvc/ComputerA.domainname.com domain\serviceaccount
SetSPN -D MSSQLSvc/ComputerA domain\serviceaccount
SetSPN -D MSSQLSvc/ComputerA.domainname.com:1433 domain\serviceaccount
SetSPN -D MSSQLSvc/ComputerA:1433 domain\serviceaccount

Issues when migrating stored procedures to natively compiled

Recently, I started working on a project to migrate all disk-based tables to In-Memory OLTP and regular stored procedures to natively compiled stored procedures.

While migrating the stored procedures, I encountered several issues that required code changes. The Native Compilation Advisor flags the issues that must be addressed before a stored procedure can be natively compiled.

This is not a complete list, but these are some of the issues I came across in my project:

1. Cannot use SELECT *.
2. Cannot reference user-defined functions.
3. Cannot reference tables without a schema name.
4. Cannot use CASE expressions, CHOOSE, or IIF.
5. Cannot use LIKE (including the % wildcard).
6. Cannot have debug statements such as PRINT.
7. Cannot have CREATE/DROP/TRUNCATE TABLE.
8. Cannot use regular table variables (use memory-optimized table variables instead).
9. Cannot reference objects in other databases.
10. Cannot use linked server queries.
11. Cannot have explicit BEGIN TRANSACTION/COMMIT/ROLLBACK.
12. Lock hints are not allowed.

I will add more exceptions here as I come across them.
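For reference, a minimal natively compiled procedure skeleton showing the required options (the procedure, table, and column names here are made up; dbo.Orders would have to be a memory-optimized table):

```sql
CREATE PROCEDURE dbo.usp_GetOrder      -- hypothetical name
WITH NATIVE_COMPILATION, SCHEMABINDING
AS
BEGIN ATOMIC WITH
(
    TRANSACTION ISOLATION LEVEL = SNAPSHOT,
    LANGUAGE = N'us_english'
)
    -- Two-part names are required and SELECT * is not allowed,
    -- per the restrictions listed above.
    SELECT OrderId, OrderDate
    FROM dbo.Orders
    WHERE OrderId = 1;
END;
```

NATIVE_COMPILATION, SCHEMABINDING, and the BEGIN ATOMIC block (with an isolation level and language) are all mandatory for natively compiled procedures.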

Table Variable vs Temp Table

One of the fundamental differences between a temp table and a table variable is the inability to create statistics or indexes on table variables, and this can significantly affect query performance.

Without statistics, the query optimizer has no idea of the data distribution when choosing the best query plan. Let's consider a simple example:

1. Table variable: you can see in the execution plan picture below that the estimated number of rows returned is 1, whereas the actual number of rows returned is 10,000.

DECLARE @TestTable TABLE (sno INT);
DECLARE @a INT = 1;

WHILE (@a <= 10000)
BEGIN
    INSERT INTO @TestTable (sno)
    SELECT 1;
    SET @a = @a + 1;
END

--Get Execution Plan
SET STATISTICS XML ON;
SELECT * FROM @TestTable;
SET STATISTICS XML OFF;

[Execution plan screenshot: table variable]

2. Temp table: you can see in the execution plan picture that both the estimated and actual number of rows returned are 10,000.

CREATE TABLE #TestTable (sno INT);
GO
INSERT INTO #TestTable (sno)
SELECT 1;
GO 10000
CREATE STATISTICS CS_TestTable ON #TestTable (sno);
--Get Execution Plan
SET STATISTICS XML ON;
SELECT * FROM #TestTable;
SET STATISTICS XML OFF;

[Execution plan screenshot: #TestTable temp table]
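One common mitigation for the table variable case, sketched below: OPTION (RECOMPILE) lets the optimizer see the table variable's actual row count at compile time, so the estimate matches the 10,000 rows instead of the default guess of 1 (it still has no column statistics, though):

```sql
DECLARE @TestTable TABLE (sno INT);

-- Load 10,000 rows in one set-based insert.
INSERT INTO @TestTable (sno)
SELECT TOP (10000) 1
FROM sys.all_objects a CROSS JOIN sys.all_objects b;

-- RECOMPILE makes the cardinality estimate reflect the actual row count.
SELECT * FROM @TestTable
OPTION (RECOMPILE);
```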

TSQL to get Actual & Estimated Execution Plan


Create Table myTable(sno int identity(1,1),sname varchar(20),scity varchar(30))
Insert into myTable values('SomeName','SomeCity')
Create NonClustered Index NC_sname on myTable(sname)
GO

--GETS THE ESTIMATED EXECUTION PLAN AND THE QUERY IS NOT EXECUTED
Set SHOWPLAN_XML ON
GO
Select SName from myTable WHERE SName='SomeName'
GO
Set SHOWPLAN_XML OFF
GO

--GETS THE ACTUAL EXECUTION PLAN AND THE QUERY IS EXECUTED
Set Statistics XML ON
Select SName from myTable WHERE SName='SomeName'
Set Statistics XML OFF
GO
Drop Table myTable

Primary Key Script

The script below generates the primary key creation script for every table in the database.

;with CTE as (select distinct COLUMN_NAME,ORDINAL_POSITION,TABLE_NAME,b.name,c.index_id,
b.[object_id] as ObjectID,b.parent_object_id as [TableObjectID]
from INFORMATION_SCHEMA.KEY_COLUMN_USAGE A Inner Join sys.key_constraints B on
A.TABLE_NAME=Object_name(B.parent_object_id) and SCHEMA_NAME(SCHEMA_ID)=TABLE_SCHEMA
and A.Constraint_Name=B.Name
left outer join sys.indexes C on C.object_id=B.parent_object_id and is_primary_key=1)
,CTE2 as (select distinct T2.[objectid],T2.TableObjectID,T2.index_id,
STUFF((Select ','+Column_Name
from CTE T1
where T1.[objectid]=T2.[Objectid] and T1.[Tableobjectid]=T2.[TableObjectid]
and T1.[index_id]=T2.[index_id] order by ORDINAL_POSITION asc
FOR XML PATH('')),1,1,'') as [keycolumns] from CTE T2 )

Select distinct Table_Schema+'.'+Table_Name as [TableName],CONSTRAINT_NAME,
'ALTER TABLE '+Table_Schema+'.'+Table_Name +' ADD CONSTRAINT '+Name+ ' PRIMARY KEY '+
case when index_id=1 then ' CLUSTERED '
when index_id>1 then ' NONCLUSTERED '
when index_id=0 then '' End
+'('+keycolumns +');' as [Create Script]
from INFORMATION_SCHEMA.KEY_COLUMN_USAGE A Inner Join sys.key_constraints B on
A.TABLE_NAME=Object_name(B.parent_object_id) and A.CONSTRAINT_NAME=B.Name
Inner join CTE2 C on C.ObjectID=B.object_id and C.TableObjectID=B.parent_object_id