Thursday, August 1, 2019

Graph Table Enhancements: Edge Constraints, Utility Functions


The second part of the article https://visakhm.blogspot.com/2017/05/graph-tables-in-sql-2017.html can be found here

https://social.technet.microsoft.com/wiki/contents/articles/53162.graph-table-enhancements-edge-constraints-system-utility-functions.aspx

Here I have explained the concept of edge constraints and how they can be used to enforce the relationships between node tables within an edge table.

The article also explains certain system functions that can be used in conjunction with node and edge tables to extract information from the pseudo columns in graph tables, which can then be utilized in the filter conditions of queries.
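To give a flavour of the idea (the table and constraint names below are illustrative, not taken from the article), an edge constraint can be declared like this in SQL Server 2019:

```sql
-- Two node tables and an edge table; the CONNECTION edge constraint
-- restricts the edge to connecting a Person node to a City node only
CREATE TABLE Person (ID int PRIMARY KEY, PersonName varchar(100)) AS NODE;
CREATE TABLE City   (ID int PRIMARY KEY, CityName   varchar(100)) AS NODE;

CREATE TABLE livesIn
(
    CONSTRAINT EC_livesIn CONNECTION (Person TO City)
) AS EDGE;
```

With the constraint in place, an INSERT into livesIn that references node pairs other than Person-to-City is rejected.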

Wednesday, April 10, 2019

T-SQL Tips: Generating Nested XML Structures Efficiently Using FLWOR Expressions

Introduction

I have been thinking about publishing an article on this for quite a while now. Of late I have been too busy (or maybe lazy is a better word!) to write on it. Finally I thought of breaking the inertia.
Previously I blogged about how FOR XML PATH can be used to generate nested XML structures in the below two articles



There is an alternative method that can be used to generate these nested XML structures using FLWOR expressions. For large data volumes this is much more efficient than using nested subqueries.

Illustration

Using the same example as in the previous article, we can see how a FLWOR expression can be applied to get the same result.

CREATE TABLE [dbo].[Orders](
[OrderID] [int] IDENTITY(1,1) NOT NULL,
[CustName] [varchar](100) NULL,
[OrderDate] [date] NULL,
[ReferredBy] [varchar](100) NULL,
[AgentGrp] [varchar](30) NULL,
PRIMARY KEY CLUSTERED 
(
[OrderID] ASC
)WITH (PAD_INDEX  = OFF, STATISTICS_NORECOMPUTE  = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS  = ON, ALLOW_PAGE_LOCKS  = ON) ON [PRIMARY]
) ON [PRIMARY]
GO
SET ANSI_PADDING OFF
GO
SET IDENTITY_INSERT [dbo].[Orders] ON
INSERT [dbo].[Orders] ([OrderID], [CustName], [OrderDate], [ReferredBy], [AgentGrp]) VALUES (1, N'ABC Corp', CAST(0xB8350B00 AS Date), N'Agent 1', N'110')
INSERT [dbo].[Orders] ([OrderID], [CustName], [OrderDate], [ReferredBy], [AgentGrp]) VALUES (12353, N'R & K Associates', CAST(0x7A370B00 AS Date), N'Agent 5', N'105')
INSERT [dbo].[Orders] ([OrderID], [CustName], [OrderDate], [ReferredBy], [AgentGrp]) VALUES (13345, N'Zyng Enterprises', CAST(0x5D370B00 AS Date), N'Agent 3', N'110')
INSERT [dbo].[Orders] ([OrderID], [CustName], [OrderDate], [ReferredBy], [AgentGrp]) VALUES (15789, N'Maxim Bailey', CAST(0x7A370B00 AS Date), N'Agent 1', N'120')
INSERT [dbo].[Orders] ([OrderID], [CustName], [OrderDate], [ReferredBy], [AgentGrp]) VALUES (22345, N'Kyzer', CAST(0xA5370B00 AS Date), N'Agent 2', N'120')
INSERT [dbo].[Orders] ([OrderID], [CustName], [OrderDate], [ReferredBy], [AgentGrp]) VALUES (29398, N'ABC Corp', CAST(0x54370B00 AS Date), N'Agent 4', N'105')
SET IDENTITY_INSERT [dbo].[Orders] OFF
GO
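A query along the following lines produces such a nested structure. This is a sketch of the two-step technique described below; the grouping order (AgentGrp, then ReferredBy) and all element names are illustrative assumptions.

```sql
-- Step 1: build a flat XML of all the attributes involved
DECLARE @x xml =
(
    SELECT AgentGrp, ReferredBy, OrderID, CustName, OrderDate
    FROM dbo.Orders
    FOR XML PATH('Order'), ROOT('Orders'), TYPE
);

-- Step 2: reshape it with a FLWOR expression inside the query() method
SELECT @x.query('
<AgentGroups>
{
  for $g in distinct-values(/Orders/Order/AgentGrp)
  return
    <AgentGroup name="{$g}">
    {
      for $a in distinct-values(/Orders/Order[AgentGrp = $g]/ReferredBy)
      return
        <Agent name="{$a}">
        {
          for $o in /Orders/Order[AgentGrp = $g][ReferredBy = $a]
          return
            <Order id="{data($o/OrderID)}" customer="{data($o/CustName)}" date="{data($o/OrderDate)}"/>
        }
        </Agent>
    }
    </AgentGroup>
}
</AgentGroups>') AS NestedXml;
```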







The result will be as below


As seen from the result above, the FLWOR expression helps us build the XML in the structure we want.
The first part of the query, using FOR XML, builds a simple XML structure with all the attributes involved. We then apply the query() function on top of this simple XML and use a FLWOR expression inside it to shape the XML into the format we desire.

Another illustration is given below

declare @catalog table
(
catalogid int identity(1001,1),
catalogdescription nvarchar(100),
catalogcreationdate datetime
)


insert @catalog (catalogdescription,catalogcreationdate)
values
(N'mens wear',getdate()-120),
(N'womens wear',getdate()-35),
(N'sports wear',getdate()-90),
(N'accessories',getdate()-25),
(N'Beauté',getdate()-20)


declare @products table
(
ProductID  int identity(10001,1),
ProductDesc nvarchar(100),
CatalogID int
)

insert @products (ProductDesc,CatalogID)
values ('Crop Tops',1002),
('Sweat Shirts',1002),
('Bodyfit Jeans',1001),
('Golden Perfume',1005),
('Jerseys',1003),
('Pendant with Earstuds',1004),
('Anklet',1004),
('Shorts',1001)


declare @productattributeTypes table
(
AttributeTypeID int identity(10001,1),
AttributeTypeDesc  nvarchar(100)
)

insert @productattributeTypes (AttributeTypeDesc)
values ('Shoe Size'),('Belt Size'),('Base Material'),('Color'),('Pattern'),('Size')

declare @productattributevalues table
(
AttributeID int identity(10001,1),
ProductID  int ,
AttributeTypeID int,
AttributeValue nvarchar(100)
)

insert @productattributevalues (AttributeTypeID,ProductID,AttributeValue)
values (10003,10001,'Cotton'),
(10003,10002,'Polyester'),
(10006,10002,'XL'),
(10006,10003,'32'),
(10005,10003,'Slim fit'),
(10003,10004,'Cologne'),
(10006,10004,'100 ml'),
(10003,10005,'Polyester'),
(10003,10006,'Black Metal'),
(10003,10007,'White metal with stones'),
(10006,10008,'Cotton')

Here's the query
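A sketch of such a query, using the same two-step pattern (run in the same batch as the table variables above; all element names are illustrative):

```sql
-- Step 1: flatten catalogs, products and attribute values into one XML
DECLARE @x xml =
(
    SELECT c.catalogid, c.catalogdescription,
           p.ProductID, p.ProductDesc,
           t.AttributeTypeDesc, v.AttributeValue
    FROM @catalog c
    LEFT JOIN @products p ON p.CatalogID = c.catalogid
    LEFT JOIN @productattributevalues v ON v.ProductID = p.ProductID
    LEFT JOIN @productattributeTypes t ON t.AttributeTypeID = v.AttributeTypeID
    FOR XML PATH('row'), ROOT('root'), TYPE
);

-- Step 2: nest products under catalogs, and attributes under products
SELECT @x.query('
<Catalogs>
{
  for $c in distinct-values(/root/row/catalogid)
  return
    <Catalog id="{$c}" description="{data((/root/row[catalogid = $c]/catalogdescription)[1])}">
    {
      for $p in distinct-values(/root/row[catalogid = $c]/ProductID)
      return
        <Product id="{$p}" description="{data((/root/row[ProductID = $p]/ProductDesc)[1])}">
        {
          for $r in /root/row[ProductID = $p][AttributeValue]
          return
            <Attribute type="{data($r/AttributeTypeDesc)}" value="{data($r/AttributeValue)}"/>
        }
        </Product>
    }
    </Catalog>
}
</Catalogs>') AS NestedXml;
```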


with the result as below




Conclusion

From the two illustrations above, it's evident that the FLWOR expression based method can be used effectively to generate nested XML structures.

Artifacts

The full code for generating the XML structures can be found below


Monday, October 29, 2018

SQL Tips: String Or Binary Data Truncated Error Message Enhancement

Introduction

I'm sure many of us who have been developing on SQL Server for years would agree that the error

String or binary data would be truncated.
The statement has been terminated.
is one of the most frustrating errors you will ever come across in T-SQL. Especially in the case of long stored procedures with lengthy INSERT...SELECT statements, it was always a daunting task to find the column that was the root cause of the above error. And most times the error happens at a later stage, due to the absence of any breaking data at the time of implementation.

A good majority of senior developers have long complained about this ambiguity, and multiple Connect items were logged for the issue, which received pretty good support as well.

Implementation

When the feedback platform was moved to feedback.azure.com, Microsoft opened a request there for the Connect items, and this too got a good number of votes.


Accordingly, MS started background work to fix this, and the fix was finally released in SQL Server 2019 to enhance the error message with more information. Based on this, the error message has been modified to the below:

String or binary data would be truncated in table 'XXXXXXXXXX', column 'YYYYYYY'. Truncated value: 'ZZZZZZ'.
The statement has been terminated.

This was really good news for all of us, but there was still a small concern that we would have to wait a while to see it in action, as most of the currently implemented instances were on SQL 2016 and SQL 2017.
But it seems MS read our minds on this, and now I'm really happy to see the announcement that this enhancement has been backported to SQL 2017 CU12 and to SQL 2016 SP2 CU.
To enable this, for now a trace flag has also been introduced (trace flag 460), which can be enabled at the session level or at the server level itself. Once set, it replaces the older error message with the new one above for truncation exceptions raised. Future SQL 2019 releases should have this message as the default and won't require setting the trace flag explicitly.
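For instance, the flag can be set at either scope like this:

```sql
-- Session scope: affects only the current connection
DBCC TRACEON (460);

-- Server (global) scope: affects all sessions
DBCC TRACEON (460, -1);

-- Revert to the old message
DBCC TRACEOFF (460);
```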

Illustration

Let's see an illustration of the above error message on a sample table.

The code will look like below

--setting the trace flag for the session
DBCC TRACEON  (460);  
GO  

--sample table
declare @t table(
v varchar(10)
)

-- insert values
insert @t 
values
('test'),
('this is a long value to check for truncation error') -- this raises the error

Now let's check the result



As expected, we get the new error message, which gives a clear indication of the table, column, and value that caused the truncation exception. This makes it much easier to debug and fix the issue.

Now if you turn off the trace flag and try again, you can see that it reverts to the old error message.



Conclusion

As seen from the above illustration, this new enhancement to the truncation error is a real life saver for anyone developing or debugging Transact-SQL code, and is sure to save many hours of development effort on truncation issues, one of the most frequent issues we come across in ETL and data warehousing projects.

The official announcement regarding the release can be found in the below link


Let me end this article by conveying a big thanks to Pedro Lopes (@SQLPedro), Senior Program Manager, and the entire MS team for the help and support provided in addressing this issue and coming up with the release.

Tuesday, October 16, 2018

SQL Agent Tips: Using JOBID Token in T-SQL Job Steps To Fetch The Job GUID

Introduction

One of the pretty cool features available in SQL Server Agent is the ability to use tokens to return the values of many system parameters at runtime (for example Server Name, Job Name, Database Name, etc.).
Recently an issue was reported in the forums regarding the usage of the JOBID token for getting the details of a job within a job step. This prompted me to analyse the feature further.
This article explains how you can use the JOBID token to return the job identifier inside a job step.

Illustration

There are quite a few scenarios where we need job-related information in scripts inside SQL Agent job steps. Tokens are available for this purpose and can be used to return data from the job metadata.
One common use case is when we need to capture the job id, for logging purposes, inside a step within the same job.
The code can be given as below

DECLARE @jobID varchar(100)

SET @jobID = '$(ESCAPE_SQUOTE(JOBID))'

EXEC [dbo].[notifyjobfailure]

    @jobID

The notifyjobfailure procedure looks like this

CREATE proc  [dbo].[notifyjobfailure]
@jobid uniqueidentifier
as
insert jobnotify(jobname)
select @jobid
GO

Where jobnotify is a simple table for capturing the job id along with the current system time.
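A minimal definition of such a table could look like the below (a sketch; the post does not show the actual column definitions):

```sql
CREATE TABLE dbo.jobnotify
(
    jobname uniqueidentifier,            -- the job id captured from the JOBID token
    logtime datetime DEFAULT GETDATE()   -- the current system time
);
```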

Include this within a step of a job and try executing it. We will find that the job step fails as below.


As seen from the image above, we get a conversion error trying to use the token inside job step.

To understand why this was happening, I tweaked the datatype of the column within the table and of the procedure parameter to be of type varchar.

So the modified code will look like this

ALTER proc  [dbo].[notifyjobfailure]
@jobid varchar(100)
as
insert jobnotify(jobname)
select @jobid

GO

Try executing the job, and we can see that it now executes fine.
Check the table, and we can see the below data populated


Checking the data, we can see that the token value gets passed in hexadecimal format rather than as a GUID.
Taking this into consideration, we can tweak the code again to add a conversion step to ensure the value gets passed as a valid unique identifier (GUID). Accordingly, the code will look like this


DECLARE @jobID uniqueidentifier

SET @jobID = CONVERT(uniqueidentifier,$(ESCAPE_SQUOTE(JOBID)))

EXEC [dbo].[notifyjobfailure]
    @jobID

The procedure code will also be modified as below

CREATE proc  [dbo].[notifyjobfailure]
@jobid uniqueidentifier
as
insert jobnotify(jobname)
select @jobid

GO

Now go ahead and execute the job, and you will find that it executes successfully.

Conclusion

Based on the illustration above, we can take away the below points:

1. The value of the JOBID token gets passed as a hexadecimal value.
2. When saving the value to a table column of type uniqueidentifier, always make sure to do an explicit conversion. Otherwise the code will break, as seen from the illustration.



Tuesday, August 14, 2018

SSIS Tips: Enforcing TLS 1.2 For SSIS 2012 Connections

Impetus


The main inspiration behind this article comes from a recent issue faced in one of my projects while configuring TLS 1.2 based connectivity to an HTTP endpoint, and the steps taken to resolve it.

Scenario


In one of my projects, a SQL Agent job suddenly started failing. There were no changes made to any of the core functionality, so it was evident that the failure had something to do with changes at the destination end. The failure was attributed to a task that was using an HTTP connection manager to connect to a URL and download a response in XML format.

The failure message looked like below in the SQL Agent history




i.e.
 Error: The underlying connection was closed: An unexpected error occurred on a send.
  Unable to read data from the transport connection: An existing connection was forcibly closed by the remote host.
    An existing connection was forcibly closed by the remote host

On analysis, and after checking with the admin team, we came to know that they had enabled TLS 1.2 on the server endpoint. This caused the connection to fail, as our SQL Server was on the 2012 version and was still using a TLS 1.0 based connection.

The challenge was to see how this could be sorted out. This article discusses a quick solution that can be applied in scenarios like the above to ensure a successful connection.

Solution

The solution involves forcing the connection to use TLS 1.2. This can be done using the below single line of code inside a Script Task in your SSIS package.

This should be the first task inside the package and will contain the single line of code shown below:

    System.Net.ServicePointManager.SecurityProtocol = (System.Net.SecurityProtocolType)3072;

Here's how code looks inside the Script Task


3072 corresponds to the TLS 1.2 protocol in the SecurityProtocolType enumeration within the System.Net namespace:

namespace System.Net
{
    [System.Flags]
    public enum SecurityProtocolType
    {
       Ssl3 = 48,
       Tls = 192,
       Tls11 = 768,
       Tls12 = 3072,
    }
}
And you need to have .NET Framework 4.5 installed for this to work correctly.

Once this is done, TLS 1.2 is enforced for the connections that follow, and they will succeed.

Conclusion

As shown above, this method can be used in SSIS 2012 to ensure the TLS 1.2 protocol is enforced when connecting to an endpoint that uses enhanced encryption.