Channel: SQL BI / Data Access Technologies

Error While Deploying SSIS 2012 project from SSDT to SSIS Catalog: System.ComponentModel.Win32Exception: A required privilege is not held by the client

A .NET Framework error occurred during execution of user-defined routine or aggregate "deploy_project_internal": System.ComponentModel.Win32Exception: A required privilege is not held by the client. I ran across this issue in one of my newly...(read more)

Syntax error, permission violation, or other nonspecific error

While developing a package in BIDS 2008 if we use SNAC 10.0 driver in OLE DB Source connection, call to parameterized stored procedure fails parsing with error: Syntax error, permission violation, or other nonspecific error If we change the connection...(read more)

Could not load package because of error 0x80070002 while upgrading from SSIS 2008 to SSIS 2012

When doing an In-place upgrade from Microsoft SQL Server 2008 or 2008 R2 to SQL Server 2012, be aware of this little caveat with SSIS. Last week I upgraded (note, I did an in-place upgrade and not a side-by-side installation) my Sql 2008 instance to Sql...(read more)

Error ‘Microsoft Office Excel cannot access the file’ while accessing Microsoft Office 11.0 Object Library from SSIS

Folks, yet another stumble with SSIS and Excel. This time I am using Microsoft Office 11.0 Object Library. The code runs fine on Windows Server 2003. Below is the code sample. ======================================================= using Microsoft.Office...(read more)

Leveraging a Hadoop cluster from SQL Server Integration Services (SSIS)

With the explosion of data, the open source Apache™ Hadoop™ Framework is gaining traction thanks to its huge ecosystem that has arisen around the core functionalities of Hadoop distributed file system (HDFS™) and Hadoop Map Reduce. As of today, being...(read more)

Import / Deployment / Execution of SSIS packages fails after you migrate SSISDB from SQL Server 2012 to SQL Server 2014.

In this blog, I would like to discuss one of the interesting scenarios that you may witness after you migrate your existing SSISDB catalog from SQL Server 2012 to SQL Server 2014. Issue: Whenever you try to deploy your new Integration Services...(read more)

SQLAllocHandle on SQL_HANDLE_HENV, from Linux SQLNCLI driver


While connecting to SQL Server from a client application running on a Linux server, using the 'SQL Server Native Client 11.0 (ODBC) driver for Linux', we were getting the following error message.

Error message

[root@axxxxxazureapi]# isql -v <DSN> <USERNAME> <PASSWORD>

[IM004][unixODBC][Driver Manager]Driver’s SQLAllocHandle on SQL_HANDLE_HENV failed

[ISQL]ERROR: Could not SQLConnect

 

In my case the issue was related to a wrong installation path of the sqlncli driver library: when we tried to connect to SQL Server from Linux, the application was trying to load the library from a different path.

 

The best way to get to the root cause is to take an ODBC trace while the issue is happening; it will give you more details about the failing API.

Before getting the trace, check the unixODBC version by running odbcinst -j from the Linux command prompt. In our case this verified that unixODBC 2.3.0 (exactly that version, not older or newer) was installed.

Output is following:

$ odbcinst -j

unixODBC 2.3.0

DRIVERS............: /etc/odbcinst.ini

SYSTEM DATA SOURCES: /etc/odbc.ini

FILE DATA SOURCES..: /etc/ODBCDataSources

USER DATA SOURCES..: /home/odbcuser/.odbc.ini

SQLULEN Size.......: 8

SQLLEN Size........: 8

SQLSETPOSIROW Size.: 8

How to get the ODBC trace:

The ODBC Driver for SQL Server on Linux supports tracing of ODBC API call entry and exit.

To trace your application behavior, first add the following line to the odbcinst.ini file:

Trace=Yes

Then start your application with strace. For example:

strace -t -f -o trace_out.txt executable

In place of executable, just use the command that you use to run your application. In our case it is:

strace -t -f -o trace_out.txt isql -v <DSN> <USERNAME> <PASSWORD>

Once you reproduce the error, check the trace file. It will give you more information about the connection failure, and on that basis you can troubleshoot the issue.
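As an illustration, here is a small Python sketch (the file name and library hint are hypothetical) that scans an strace output file for open()/openat() calls that failed with ENOENT, which is exactly how a wrong driver-library path shows up in the trace:

```python
import re

def find_failed_opens(trace_path, library_hint):
    """Return the paths of open()/openat() calls in an strace log that
    failed with ENOENT (file not found) and mention the given library."""
    pattern = re.compile(r'open(?:at)?\((?:AT_FDCWD, )?"([^"]+)".*= -1 ENOENT')
    failed = []
    with open(trace_path) as trace:
        for line in trace:
            match = pattern.search(line)
            if match and library_hint in match.group(1):
                failed.append(match.group(1))
    return failed

# e.g. find_failed_opens("trace_out.txt", "sqlncli") lists every
# location the application searched for the driver and missed.
```

Each path in the result is a location the driver manager tried and missed; comparing it against where the driver files actually landed points straight at the mismatch.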

In my case, the sqlncli driver library could not be found because the application was looking in one path while the library was installed in another. After moving the file to the expected location, our application worked fine.

Author: Mukesh (MSFT), Support Engineer, Microsoft

Reviewed by: Snehadeep (MSFT), SQL Developer Technical Lead, Microsoft

SSIS package fails with error: The step did not generate any output. The return value was unknown


One of my customers recently ran into a very strange issue. They had many SSIS packages running as SQL Server jobs.

Of late, every job that ran an SSIS package started failing, while all other SQL jobs executed without any issues. All the failing SSIS jobs reported the same error message.

Message

Executed as user: <Account>. The step did not generate any output. The return value was unknown. The process exit code was -1073741502. The step failed.

It was a really generic error message with no accompanying information. Everything ran fine from BIDS or the command prompt; the issue arose only when the package ran as a job.

We created a new package with a simple Execute SQL Task, and even that failed as a job. So we enabled SSIS logging for the package; unfortunately, it wasn't even creating the log file. That suggested the package was failing before it even reached the point of execution.

We ran Process Monitor to check if that could give us any lead, but even Procmon had no information on it, and there were no error messages in the Event Viewer logs either. Then I noticed an informational message being logged in the Event Viewer:

Application Popup: DTExec.exe - Application Error: The application was unable to start correctly (0xc0000142). Click OK to close the application.

As expected, it was failing at the start-up stage.

So finally, after some research, we found that this issue can occur when the desktop heap runs out of memory. Every desktop has a single desktop heap associated with it. The desktop heap stores certain user-interface objects, such as windows, menus, and hooks. When an application requires a user-interface object, functions within user32.dll are called to allocate it; if an application does not depend on user32.dll, it does not consume desktop heap. So if the desktop heap is out of memory, the process won't start.

This behavior is kind of explained in http://support.microsoft.com/kb/824422/en-us

The resolution is basically to increase the desktop heap size.

HOW TO CONFIGURE DESKTOP HEAP

First, back up the registry; see KB 322756 (How to back up and restore the registry in Windows).

1. At a command prompt, type REGEDT32.EXE to start Registry Editor.

2. In Registry Editor, locate the following registry key: HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\Session Manager\SubSystems

3. In the right pane of Registry Editor, click Windows.

4. On the Edit menu, click Modify.

5. In the Edit String dialog box, locate the SharedSection parameter string in the Value data box, and then specify a larger value for the SharedSection parameter.

Note: The SharedSection parameter specifies the system and desktop heaps by using the following format, where <xxxx> defines the maximum size of the system-wide heap (in kilobytes), <yyyy> defines the size of each desktop heap, and <zzzz> is the size of the desktop heap for each desktop that is associated with a non-interactive window station:

SharedSection=<xxxx>,<yyyy>,<zzzz>

Change the value from:

SharedSection=1024,3072,768

to:

SharedSection=1024,3072,1536 (doubling the non-interactive desktop heap)

6. Restart the machine.
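The SharedSection edit in step 5 is just string surgery on the third value of the registry string. Here is a minimal Python sketch (a hypothetical helper, not what Registry Editor does internally) of the parse-and-double step:

```python
def double_noninteractive_heap(shared_section):
    """Given a 'SharedSection=x,y,z' string, double the third value,
    which is the desktop heap size (KB) for non-interactive window
    stations -- the heap SQL Agent job steps run under."""
    name, values = shared_section.split("=")
    parts = [int(v) for v in values.split(",")]
    parts[2] *= 2  # e.g. 768 -> 1536
    return name + "=" + ",".join(str(p) for p in parts)

# double_noninteractive_heap("SharedSection=1024,3072,768")
# -> "SharedSection=1024,3072,1536"
```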

And yaaayyy, it worked!

Happy Learning!! :)

Author: Mahaja (MSFT), Support Engineer, Microsoft

Reviewed by: Snehadeep (MSFT), SQL Developer Technical Lead, Microsoft


UNPIVOT TRANSFORMATION WITH A COMBINATION OF SINGLE AND MULTIPLE DESTINATION COLUMNS


While using the Unpivot transformation in SSIS, it is not uncommon to need to unpivot columns into multiple destination columns and into a single destination column at the same time. Dealing with them individually is easy; doing both together sounds complicated when you think about it, but that isn't the case.

Let’s take an example:

You have a source table like this:

[Image: source table]

You want your destination table to look like this:

[Image: desired destination table]

How would we normally design our package, keeping in mind that we need to unpivot columns into multiple destination columns?

[Images: initial package design and the Unpivot Transformation configuration]

When you design the package as above, you would get the error:

“PivotKeyValue is not valid. In an UnPivot transform with more than one unpivoted DestinationColumn, the set of PivotKeyValues per destination must match exactly.”

The simple reason is that you cannot have multiple destination columns and a single destination column in the same Unpivot transformation. More broadly, you cannot have different sets of unpivoted destination columns in the same transformation.

Let us now see how we can overcome this hurdle.

We need a separate Unpivot transformation for each distinct set of unpivoted destination columns. Pass the values through and use them in the subsequent Unpivot transformation as your requirement demands.

As in our example:

We need to modify the package and have it as below:

[Image: modified package with two Unpivot Transformation tasks]

Unpivot Transformation Task 1 Configuration (Multiple Destination columns):

[Image: Unpivot Transformation Task 1 configuration]

Unpivot Transformation Task 2 (Single Destination Column):

[Image: Unpivot Transformation Task 2 configuration]

Since our final aim was to move all the B's into column B, and keeping in mind that A and B are the multiple destination columns, we took the available A/B pairs (pairs in the sense of sharing a common pivot key value), combined the paired B's with the left-over B's, and moved them to the destination through a separate Unpivot transformation.

Let us do some arithmetic: a calculation of the number of rows that get loaded into the destination.

From the source we extract: 3 rows.

After Unpivot Transformation Task 1, we have 3 combinations of A's and B's, so the number of rows after this task is 3*3 = 9 rows.

After Unpivot Task 2, we have the 2 left-over B columns (B4 and B5) plus the combined B passed through, so 3 columns in total; thus the number of rows successfully loaded into the destination is 9*3 = 27 rows.

An alternative solution, in terms of our example, is to add A4 and A5 with all null values, use just a single Unpivot task, and give all the A's the destination column 'A' and all the B's the destination column 'B'. This solution is feasible when you have a key -> value kind of relationship, since the destination table will hold null values for keys that have no associated value in the source. The former solution is feasible when the values are associated with all the keys rather than with a single key. With the alternative solution, we would get 3*5 = 15 rows (3 being the number of source rows and 5 the number of A/B pairs).
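The row arithmetic above can be sanity-checked with a tiny Python simulation of the two chained unpivots (column names and values are illustrative only):

```python
def unpivot(rows, mappings):
    """Emit one output row per input row per mapping; each mapping sends
    source columns to destination columns (pass-throughs map to themselves)."""
    return [{dest: row[src] for src, dest in mapping.items()}
            for row in rows
            for mapping in mappings]

row = {"A1": "a1", "A2": "a2", "A3": "a3",
       "B1": "b1", "B2": "b2", "B3": "b3", "B4": "b4", "B5": "b5"}
source = [dict(row) for _ in range(3)]          # 3 source rows

# Task 1: three A/B pairs -> destinations A and B; B4 and B5 pass through
task1 = unpivot(source, [
    {"A1": "A", "B1": "B", "B4": "B4", "B5": "B5"},
    {"A2": "A", "B2": "B", "B4": "B4", "B5": "B5"},
    {"A3": "A", "B3": "B", "B4": "B4", "B5": "B5"},
])

# Task 2: the combined B plus the left-over B4 and B5 -> single destination B
task2 = unpivot(task1, [{"A": "A", "B": "B"},
                        {"A": "A", "B4": "B"},
                        {"A": "A", "B5": "B"}])

print(len(task1), len(task2))  # 9 27
```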

Enjoy Unpivoting!!

Author: Deepak (MSFT), Support Engineer, Microsoft

Reviewed by: Snehadeep (MSFT), SQL Developer Technical Lead, Microsoft

SSIS packages executed from Visual Studio Business Intelligence Development Studio – BIDS [design time] on a 64-bit development server become unresponsive during package execution.


In this blog, I would like to address a common scenario that you may run into when executing a complex SSIS package from within the BIDS environment on a 64-bit development server, where the Visual Studio/BIDS environment hangs during package execution.

When I say a COMPLEX SSIS package, what I really mean is a package that contains a couple of blocking transformations. In simple terms, a blocking transformation is one that must read and process all input records before creating any output records, and thus has a great impact on the performance of a package execution. Sort, Aggregate, and Fuzzy Lookup transformations are good examples.

When you execute these packages, you often see temporary files being created and deleted on your physical drives. This is what we call buffer spooling. Now, what could cause a buffer to swap? There are two possible causes: a memory allocation fails, or Windows signals the low-memory resource notification event. Both trigger SSIS to reduce its working set, which it does by moving buffer data to disk. You can track this during package execution using the SSIS performance counter named Buffers spooled. When Buffers spooled > 0, memory is being swapped to disk, which is a good indication that your package execution is going to be delayed.

Now consider that you are developing and running these packages on a 64-bit machine using Visual Studio – Business Intelligence Development Studio [BIDS]. The SSIS debug runtime process that executes the package for you is DtsDebugHost.exe. Typically, when you want your SSIS packages to execute under the 64-bit runtime, you go to Project -> right-click Properties [Project Property Pages] -> select the Debugging pane and change the Debug Options property Run64BitRuntime to True. This makes the SSIS packages under the project run under the 64-bit SSIS debug runtime, i.e. the 64-bit DtsDebugHost.exe. More details are available under 64-bit Considerations for Integration Services: http://technet.microsoft.com/en-us/library/ms141766(v=sql.105).aspx

[Image: the Run64BitRuntime property on the project's Debugging page]

Note: The Run64BitRuntime project property applies only at design time.

But what if your complex SSIS package hangs in the middle of execution? You encounter buffer spooling, and your package takes a very long time to complete, or never completes even after hours of waiting, even though the same package worked fine on another machine or a colleague's machine. When you bring up Windows Task Manager on your 64-bit machine, you notice something different from what you were expecting: DtsDebugHost.exe *32 [the 32-bit version of the SSIS debug runtime] is being called, not the 64-bit DtsDebugHost.exe. Also notice that the memory consumed by DtsDebugHost.exe *32 is somewhere around 2.5 to 3.0 GB, and it stagnates there after buffer spooling.

[Image: Task Manager showing DtsDebugHost.exe *32 during package execution]

Now you start wondering whether the project setting you made to execute the package under the 64-bit runtime really came into effect. The description pane for this property clearly states that it is ignored if the 64-bit SSIS runtime is not installed on the machine. But didn't we assume that when we selected Business Intelligence Development Studio or Management Tools – Complete on the Feature Selection page during SQL Server setup on our 64-bit machine, a 64-bit SSIS debug runtime [64-bit DtsDebugHost.exe] would be installed as well?

[Image: SQL Server setup Feature Selection page]

On further investigation, you would find the 32-bit SSIS debug runtime [DtsDebugHost.exe *32] under <Installation Drive>\Program Files (x86)\Microsoft SQL Server\100\DTS\Binn\DtsDebugHost.exe, but the 64-bit SSIS debug runtime [DtsDebugHost.exe] is NOT available under <Installation Drive>\Program Files\Microsoft SQL Server\100\DTS\Binn. Does that mean our SQL client installation didn't complete successfully? No. This is by design of the SQL Server installer.

To install the 64-bit version of DtsDebugHost.exe on a 64-bit computer, you need to select the following options on the SQL Server installation Feature Selection page:

- Select Integration Services to install the Integration Services service, which runs packages outside the design environment, and to install the 64-bit SSIS debugging environment, viz. the 64-bit DtsDebugHost.exe.

- On a 64-bit computer, selecting Integration Services installs only the 64-bit runtime and tools. If you also have to run packages in 32-bit mode, you must select an additional option to install the 32-bit runtime and tools: i.e. select Business Intelligence Development Studio or Management Tools – Complete as well.

[Image: Feature Selection page with Integration Services selected]

INFERENCE:

If you intend to use the 64-bit SSIS runtime when executing SSIS packages from the Visual Studio BIDS environment, you need the following:

a. In the project properties of an Integration Services project, select 64-bit execution by setting the value of the Run64BitRuntime property to True on the Debugging page. By default, the value of this property is True. When the 64-bit version of the Integration Services runtime is not installed, this setting is ignored.

b. You must install the Integration Services feature on the machine for the 64-bit DtsDebugHost.exe [the 64-bit SSIS debug engine] to be triggered during package execution.
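As a quick way to verify point (b), here is a Python sketch (the helper and its parameters are hypothetical; the relative path assumes the default layout under the version-100 folder used by SQL Server 2008/2008 R2) that reports which DtsDebugHost.exe editions are installed:

```python
import os

def installed_dts_debug_hosts(program_files, program_files_x86, version="100"):
    """Check the default Binn locations under both Program Files roots
    for the 32-bit and 64-bit SSIS debug runtimes (DtsDebugHost.exe)."""
    rel = os.path.join("Microsoft SQL Server", version, "DTS", "Binn",
                       "DtsDebugHost.exe")
    return {
        "32-bit": os.path.isfile(os.path.join(program_files_x86, rel)),
        "64-bit": os.path.isfile(os.path.join(program_files, rel)),
    }

# e.g. installed_dts_debug_hosts(r"C:\Program Files",
#                                r"C:\Program Files (x86)")
# If the 64-bit entry is False, Run64BitRuntime=True is silently
# ignored and the 32-bit DtsDebugHost.exe runs the package instead.
```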

Shipping the 64-bit DtsDebugHost.exe in a separate installer, as part of the Integration Services feature, is per the Microsoft SQL Server product licensing policy. Only the 32-bit SSIS designer/debug runtime [DtsDebugHost.exe *32] ships with Visual Studio – Business Intelligence Development Studio. If the user wants to use the 64-bit SSIS debug runtime [64-bit DtsDebugHost.exe], he/she needs to install the licensed Integration Services feature on the machine.

Also the recommended installation on the SSIS package developer machine would be the following,

· Integration Services to install the Integration Services service and to run packages outside the design environment and for 64bit DTS Debugging Engine [64bit DTSDebugHost.exe]  [if the SSIS package requires a 64bit Debugging Engine as in our case]

· Business Intelligence Development Studio to install the tools for designing packages. [MUST]

· Client Tools SDK to install managed assemblies for Integration Services programming. [if necessary]

Referenced Articles:

1. Considerations for Installing Integration Services: http://msdn.microsoft.com/en-us/library/ms143731(v=sql.105).aspx

2. 64-bit Considerations for Integration Services: http://msdn.microsoft.com/en-us/library/ms141766(v=sql.105).aspx

3. Introducing Business Intelligence Development Studio: http://msdn.microsoft.com/en-us/library/ms173767(v=sql.105).aspx

Author: Krishnakumar (MSFT), Support Engineer, Microsoft

Reviewed by: Snehadeep (MSFT), SQL Developer Technical Lead, Microsoft

"… an error occurred during the pre-login handshake." & "[DBNETLIB][ConnectionOpen (SECDoClientHandshake()).]SSL Security error" when connecting to SQL Server.


This blog covers one of the most commonly faced issues when connecting to SQL Server. Mostly you may run into this issue after an improper Windows security update (say, KB2655992 in my case) or an improper application of the POODLE security fix.

 

ISSUE DESCRIPTION FROM SQL CONNECTIVITY STANDPOINT:

When we try to connect to the SQL Server instance using the SQL Server Management Studio, it may fail with the following error message,

Error:

TITLE: Connect to Server

——————————

Cannot connect to <mySQLServer>.

——————————

ADDITIONAL INFORMATION: 

A connection was successfully established with the server, but then an error occurred during the pre-login handshake. (provider: Shared Memory Provider, error: 0 – No process is on the other end of the pipe.) (Microsoft SQL Server, Error: 233)

For help, click: http://go.microsoft.com/fwlink?ProdName=Microsoft+SQL+Server&EvtSrc=MSSQLServer&EvtID=233&LinkId=20476

 ——————————

BUTTONS:

OK

——————————

 

Using a UDL file, we saw the following error message,

—————————

Microsoft Data Link Error

—————————

Test connection failed because of an error in initializing provider. [DBNETLIB][ConnectionOpen (SECDoClientHandshake()).]SSL Security error.

—————————

OK  

————————— 

 

Note: Forcing the connection to use any other protocol like TCP, Named Pipes & Shared Memory also throws the same error message. The Dedicated administrator connection (DAC) was also throwing a “Login timeout expired.” error.

 

OUR FINDINGS & INFERENCE:

 

There are numerous reasons why you may witness these error messages, but if you see these two specific errors from SQL Server Management Studio and a UDL file, it is worth checking the settings below.

 

Per the error messages received ("… an error occurred during the pre-login handshake." and "[DBNETLIB][ConnectionOpen (SECDoClientHandshake()).]SSL Security error."), the client application was able to complete the TCP 3-way handshake (hence the text "A connection was successfully established with the server"). During the pre-login handshake that follows, the client and SQL Server agree on the TDS protocol version to be used for further communication, the login passed by the client application (Windows-authenticated or SQL-authenticated), and whether the connection is encrypted on the client or server side using SSL certificates or TLS. If SQL Server does not respond to this request in a timely fashion, or fails to respond due to an internal machine-level issue, we end up with this particular error message (viz. "… an error occurred during the pre-login handshake").

 

These error messages are thrown by the actual SQL Server drivers/providers used to establish the connection to SQL Server, e.g. the OLE DB provider for SQL Server, SQL Server Native Client, etc. Different drivers/providers therefore throw different error messages for the same issue. When we tried the connection from the UDL file we saw a different error message for this reason, but that error message was more straightforward.

 

Error:

Test connection failed because of an error in initializing provider. [DBNETLIB][ConnectionOpen (SECDoClientHandshake()).]SSL Security error.

 

Hence, we jumped directly to the SCHANNEL registry hive to check the values, viz.

Target hive:

HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\SecurityProviders\SCHANNEL

 

CAUSE:

 

By default, you may not find the registry keys below at all, which is completely fine. But in my case, when we checked these registry keys on the target SQL Server, the values were as follows.

 

i) HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\SecurityProviders\SCHANNEL\Protocols\SSL 2.0\Server

DisabledByDefault  was set to 1

Enabled  was set to 0

 

ii) HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\SecurityProviders\SCHANNEL\Protocols\SSL 3.0\Server

DisabledByDefault  was set to 1

Enabled  was set to 0 

 

iii) HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\SecurityProviders\SCHANNEL\Protocols\TLS 1.0\Server

DisabledByDefault  was set to 0

Enabled  was set to 0 

 

Based on these values, we figured out that none of the security provider protocols was enabled. We confirmed that SSL 2.0, SSL 3.0 and TLS 1.0 were all disabled, which is not a workable configuration.

 

RESOLUTION:

 

We enabled the TLS 1.0 protocol by setting the following value.

Under,

HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\SecurityProviders\SCHANNEL\Protocols\TLS 1.0\Server

Set Enabled to 1

 

Then we rebooted the SQL Server machine for the changes to take effect.

After the reboot, we were able to connect to SQL Server locally without any issues. We also tested the connection from a remote server machine and confirmed that connections to SQL Server worked as expected. SQL Server was just a victim here, like any other application that uses the Windows security providers: since all of the providers' protocols were disabled, SQL Server could not accept any new connection request.
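The decision we reconstructed from the registry values can be sketched in a few lines of Python (a simplified model of the rule, not an SCHANNEL implementation; in this model only Enabled=0 takes a protocol out of play):

```python
def usable_protocols(settings):
    """A protocol is usable only if Enabled is nonzero; DisabledByDefault
    alone just keeps it from being offered unless explicitly requested."""
    return [name for name, values in settings.items()
            if values.get("Enabled", 1) != 0]

# Values found under SCHANNEL\Protocols\...\Server on the broken machine
before = {
    "SSL 2.0": {"DisabledByDefault": 1, "Enabled": 0},
    "SSL 3.0": {"DisabledByDefault": 1, "Enabled": 0},
    "TLS 1.0": {"DisabledByDefault": 0, "Enabled": 0},
}
print(usable_protocols(before))   # [] -> every secure handshake fails

after = dict(before)
after["TLS 1.0"] = {"DisabledByDefault": 0, "Enabled": 1}   # the fix
print(usable_protocols(after))    # ['TLS 1.0']
```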

 

 

Please drop in your comments or connect with Microsoft BI-ONE CSS team if you are still encountering the same issue even after performing the above steps.

Happy troubleshooting!!!!

 

Author: Krishnakumar Rukmangathan, Technical Advisor, SQL Server BI-ONE Developer team, Microsoft

Reviewed by: Sunil Kumar B.S, Escalation Engineer, SQL Server BI-ONE Developer team, Microsoft.

Import / Deployment / Execution of SSIS packages fails after you migrate SSISDB from SQL Server 2012 to SQL Server 2014.


 

In this blog, I would like to discuss an interesting scenario that you may witness after you migrate your existing SSISDB catalog from SQL Server 2012 to SQL Server 2014.

Issue: You may experience the following error message whenever you try to deploy a new Integration Services project to the migrated SSISDB catalog (of SQL Server 2014) using Microsoft Visual Studio – SQL Server Data Tools [SSDT], import your old SSIS packages into the newly migrated SSISDB (of SQL Server 2014) directly from the SSISDB catalog of SQL Server 2012 using SQL Server Management Studio [SSMS], or execute already-migrated SSIS packages on the new SSISDB on SQL Server 2014:

 

  ERROR MESSAGE:

 The required components for the 64-bit edition of Integration Services cannot be found.  Run SQL Server Setup to install the required components.

A .NET Framework error occurred during execution of user-defined routine or aggregate "start_execution_internal":

System.Data.SqlClient.SqlException: The required components for the 64-bit edition of Integration Services cannot be found.  Run SQL Server Setup to install the required components.

System.Data.SqlClient.SqlException:

   at System.Data.SqlClient.SqlConnection.OnError(SqlException exception, Boolean breakConnection, Action`1 wrapCloseInAction)

   at System.Data.SqlClient.SqlCommand.RunExecuteNonQuerySmi(Boolean sendToPipe)

   at System.Data.SqlClient.SqlCommand.InternalExecuteNonQuery(TaskCompletionSource`1 completion, String methodName, Boolean sendToPipe, Int32 timeout, Boolean asyncWrite)

   at System.Data.SqlClient.SqlCommand.ExecuteToPipe(SmiContext pipeContext)

   at Microsoft.SqlServer.Server.SqlPipe.ExecuteAndSend(SqlCommand command)

   at Microsoft.SqlServer.IntegrationServices.Server.ServerConnectionControl.RaiseError(SysMessageId messageId, SysMessageSeverity severity, Object[] args)

   at Microsoft.SqlServer.IntegrationServices.Server.ServerApi.GetDtsPath(Int16 use32bitOn64, String& path)

   at Microsoft.SqlServer.IntegrationServices.Server.ServerApi.GetISServerExecPath(Int16 use32bitOn64, String& path)

   at Microsoft.SqlServer.IntegrationServices.Server.ServerApi.StartExecutionInternal(SqlInt64 projectId, SqlInt64 executionId, SqlInt64 versionId, SqlInt16 use32BitRuntime)

. (Microsoft SQL Server, Error: 27222)

 

  

Fig 1. Error message thrown when executing the SSIS packages on migrated SSISDB of SQL Server 2014.

  

CAUSE:

You may have backed up, restored and moved the SSISDB from your existing SQL Server 2012 instance to your new SQL Server 2014 instance following the article below (either using automated T-SQL scripts or walking through the SSMS screens described in the article):

Backup, Restore, and Move the SSIS Catalog:     http://msdn.microsoft.com/en-us/library/hh213291.aspx

 

But you may have missed or not implemented action item #8 under "Restore the SSIS Database", viz.:

"8. Determine whether the SSISDB catalog schema and the Integration Services binaries (ISServerExec and SQLCLR assembly) are compatible, by running catalog.check_schema_version."

This stored procedure checks whether your restored SSISDB catalog schema and the Integration Services binaries (ISServerExec and the SQLCLR assembly) are compatible.

 

When we actually run this stored procedure, you may notice that it fails with the exact same error message as above,

 

USE [SSISDB]
GO

DECLARE @return_value int

EXEC @return_value = [catalog].[check_schema_version]
    @use32bitruntime = 0

SELECT 'Return Value' = @return_value
GO

Fig 2. Same error message thrown when running the above stored procedure on the newly migrated SSISDB of SQL Server 2014

 

Further, if you look at the properties of the restored SSISDB on your SQL Server 2014, the schema build still points to SSISDB 2012 rather than SSISDB 2014.

 

Fig 3. Right-click the SSISDB under Integration Services Catalogs and go to Properties to check the Schema Version & Schema Build.

                        

On your SSISDB properties window:

Schema Version: 2

Schema Build: 11.0.XXXX.0 -> points to SQL Server 2012 SP XXX (the SSISDB version of the source SQL Server)

Ideally, it should point to the SQL Server 2014 version, viz. 12.0.XXXX.0.

 

 

 Fig 4. Ideal Catalog Properties window with the Schema Version & Build pointing to SQL Server 2014.

 

Thus there is a mismatch between the version of the SSISDB catalog schema (still set to SQL Server 2012) and the Integration Services binaries (ISServerExec and the SQLCLR assembly, set to SQL Server 2014), because of which any activity on your newly migrated SSISDB fails.

This effectively makes the entire SSISDB unusable: deploying an SSIS project from SSDT/SSMS, importing SSIS packages, and executing any package inside SSISDB all fail with the same error message.
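What catalog.check_schema_version effectively guards against is a major-version mismatch between the catalog schema build and the server binaries. A small Python sketch of that comparison, with illustrative (hypothetical) version strings:

```python
def schema_matches_binaries(schema_build, server_version):
    """Compare major versions: 11 is SQL Server 2012, 12 is SQL Server 2014.
    A mismatch is what makes deploy/import/execute fail on the migrated catalog."""
    return schema_build.split(".")[0] == server_version.split(".")[0]

print(schema_matches_binaries("11.0.XXXX.0".replace("XXXX", "0"), "12.0.2000.8"))  # False
print(schema_matches_binaries("12.0.2000.8", "12.0.2000.8"))                       # True
```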

 

RESOLUTION/ WORKAROUND:

If you use the MSDN article as-is, or scripts that automate the backup, restore and movement of the SSISDB from SQL Server 2012 to SQL Server 2014 based on it, it will fail at action item #8; at present this appears to be a known behavior.

The article (or automated scripts) for migrating the SSISDB works flawlessly when you move the SSISDB between the same versions of SQL Server (SQL 2012 to SQL 2012, SQL 2014 to SQL 2014), but not across different versions of SQL Server.

 

At present (as of the time this blog was written), there is no supported way to change the schema of the SSISDB directly from SQL Server 2012 to SQL Server 2014. This may become available in a future release of the product. For now, the only workaround available is as below:

 

Workaround:

1. Backup the existing SSISDB on your SQL Server 2014 (Optional)

2. Delete the existing SSISDB on your SQL Server 2014

3. Recreate a new catalog from scratch. Check the properties of the SSISDB and confirm that the Schema build points to SQL Server 2014.  (Check NOTE, if you run into any issues with creating a new catalog)

4. Redeploy the SSIS Projects from the SQL Server 2012 environment or import the packages as required.

5. Check the working of the SSIS packages now.

 

NOTE:

If “Create Catalog…” fails with the following error message,

TITLE: Microsoft SQL Server Management Studio

——————————

An exception occurred while executing a Transact-SQL statement or batch. (Microsoft.SqlServer.Management.IntegrationServices)

——————————

ADDITIONAL INFORMATION:

The server principal '##MS_SQLEnableSystemAssemblyLoadingUser##' already exists.

Cannot find the login '##MS_SQLEnableSystemAssemblyLoadingUser##', because it does not exist or you do not have permission. (Microsoft SQL Server, Error: 15025)

For help, click: http://go.microsoft.com/fwlink?ProdName=Microsoft%20SQL%20Server&ProdVer=12.00.2000&EvtSrc=MSSQLServer&EvtID=15025&LinkId=20476

——————————

 

Go to Security -> Logins, check for any login with the name '##MS_SQLEnableSystemAssemblyLoadingUser##', delete that login, and then recreate the catalog.

 

 

Referenced Articles:

 1. Backup, Restore, and Move the SSIS Catalog:     http://msdn.microsoft.com/en-us/library/hh213291.aspx

 2. catalog.check_schema_version :      http://msdn.microsoft.com/en-us/library/hh479596.aspx

  

Please drop in your comments or connect with Microsoft BI-ONE CSS team if you are still encountering the same issue even after performing the above steps.

 

Author: Krishnakumar Rukmangathan, Technical Advisor, SQL Server BI-ONE Developer team, Microsoft

Reviewed by: Sunil Kumar B.S, Escalation Engineer, SQL Server BI-ONE Developer team, Microsoft.

Report Manager: “System.InvalidOperationException: This implementation is not part of the Windows Platform FIPS validated cryptographic algorithms.”


Applies To: Tested on SQL Server Reporting Services 2008, 2008 R2, 2012 and 2014.

When you browse the Report Manager URL, you get an HTTP 500 error or a blank page (if you have disabled friendly HTTP error messages) in the browser window. When you check the Reporting Services log files, you will find the below error being logged:

ERROR: System.Web.HttpException: Error executing child request for Error.aspx. —> System.Web.HttpUnhandledException: Exception of type ‘System.Web.HttpUnhandledException’ was thrown. —> System.InvalidOperationException: This implementation is not part of the Windows Platform FIPS validated cryptographic algorithms.

In contrast, you are able to browse the Report Server URL successfully, and it lists all the reports, data sources, folders, etc. without any issues.

 

Cause:

This is happening because FIPS is enabled on the Reporting Services server and Report Manager does not support the Local Security Policy “System cryptography: Use FIPS compliant algorithms for encryption, hashing, and signing”. (https://technet.microsoft.com/en-us/library/ms345220%28v=sql.105%29.aspx)

To ascertain that FIPS is enabled you can:

(1)    Check the registry key:

HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\Lsa\fipsalgorithmpolicy

If FIPS is enabled, this value is set to 1.

(2)    Alternatively, open the Local Security Policy console (Start -> Run -> secpol.msc), go to “Security Settings -> Local Policies -> Security Options”, and in the right-hand pane locate the policy “System cryptography: Use FIPS compliant algorithms for encryption, hashing, and signing”. If FIPS is enabled, this policy shows as Enabled.

 

For more information on why FIPS is not supported you can refer:

https://support.microsoft.com/en-us/kb/911722

http://blogs.technet.com/b/secguide/archive/2014/04/07/why-we-re-not-recommending-fips-mode-anymore.aspx

 

How to resolve this issue:

If you do not need FIPS, change the above-mentioned registry value from 1 to 0, or change the local security policy from Enabled to Disabled.

If you cannot disable FIPS, you can still work around the issue. As described in https://support.microsoft.com/en-us/kb/911722, you have to edit Report Manager’s web.config file as explained below.

File to be edited:

<system-drive>\Program Files\Microsoft SQL Server\MSRS<version>.<instance>\Reporting Services\ReportManager\Web.config

What to do?

(1)    In the Web.config file, locate the <system.web> section.

(2)    Add the following <machineKey> element inside the <system.web> section:

<machineKey validationKey="AutoGenerate,IsolateApps" decryptionKey="AutoGenerate,IsolateApps" validation="3DES" decryption="3DES"/>

(3)    Save the Web.config file.

 

Once the file has been changed, restart the Reporting Services service for the change to take effect.

Recommendation: Take a backup of the web.config file prior to making the change.

Note: Because Reporting Services 2008 and above does not use IIS, this configuration change has to be made in Report Manager’s web.config file, and it is the Reporting Services service that needs to be restarted after making the change, not IIS (IISRESET).

I have also added this note to https://technet.microsoft.com/en-us/library/ms345220%28v=sql.105%29.aspx as a comment.

 

HTH!

 

Author:  Deepak Lakhotia – SQL Server BI-ONE Developer team, Microsoft

Reviewer:  Krishnakumar Rukmangathan- Technical Advisor, SQL Server BI-ONE Developer team, Microsoft

Error while connecting to SQL Server –“Could not connect because the maximum number of ‘1’ user connections has already been reached.”


Applies To: Tested on SQL Server 2008, 2008 R2, 2012 and 2014.

Issue:

In this blog, I would like to discuss one of the most common issues you may encounter when connecting to SQL Server. When you try to connect, you may get the error below. The error can occur while connecting to SQL Server from any custom application as well as from SQL Server Management Studio (SSMS). You may not be able to connect to SQL Server locally either.

Error:

A connection was successfully established with the server, but then an error occurred during the login process. (provider: Shared Memory Provider, error: 0 – No process is on the other end of the pipe.) (Microsoft SQL Server, Error: 233).

If you go to the SQL Server machine and check the SQL Server Error logs (default location: C:\Program Files\Microsoft SQL Server\MSSQLXX.<InstanceName>\MSSQL\Log), you will see the below error message.

Error:

2015-07-07 14:19:14.82 Logon       Could not connect because the maximum number of ‘1’ user connections has already been reached. The system administrator can use sp_configure to increase the maximum value. The connection has been closed. [CLIENT: XX.XX.XX.XX] 

Cause:

The above error message from the SQL Server Error log says that SQL Server is configured to accept a maximum of one user connection.

Resolution:

In order to solve the issue, you have to close all existing active connections and change the maximum number of concurrent connections to unlimited from the command prompt.

Steps:

1. Stop the SQL Server service, then start SQL Server in single-user mode so that only a SQLCMD connection is allowed (this closes all other concurrent connections). To do this, run the command below from a command prompt window (cmd).

 "C:\Program Files\Microsoft SQL Server\MSSQLXX.MSSQLSERVER\MSSQL\Binn\sqlservr.exe" -sMSSQLSERVER -mSQLCMD -c

 (The above command is for the default instance of SQL Server (MSSQLSERVER); for a named instance, provide the instance name after the -s switch. Also, this is the default path of sqlservr.exe; if you have installed SQL Server in a different path, use that path instead.)

2. Open another command prompt (cmd) as an administrator and execute the command below to connect SQL Server.

sqlcmd -E

(In the above command, we are connecting to a default instance of SQL Server using windows authentication. Please check the below article for the other available switches)

Use the sqlcmd Utility:  https://msdn.microsoft.com/en-us/library/ms180944(v=sql.120).aspx

3. Once connected to SQL Server, execute the commands below to change the maximum number of concurrent connections to 0 (0 means an unlimited number of connections).

sp_configure 'show advanced options', 1;
go
reconfigure
go
sp_configure 'user connections', 0
go
reconfigure
go
exit

4. Exit the SQL command prompt.

5. Stop the single-user instance of SQL Server (press Ctrl+C in the first command prompt window) and restart the SQL Server service.

6. Try connecting to SQL Server.

 

Note: You can also change the maximum number of concurrent SQL Server connections from SQL Server Management Studio (SSMS) (Server Properties -> Connections -> Maximum number of concurrent connections), but in my case, since the connection from SSMS was not successful, we made the change using the command prompt.
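Once a connection is available, the effective setting can also be inspected with T-SQL; a quick sketch:

```sql
-- 'value' is the configured value, 'value_in_use' is the running value.
-- 0 means an unlimited number of user connections.
SELECT name, value, value_in_use
FROM sys.configurations
WHERE name = N'user connections';
```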

Please drop in your comments or connect with Microsoft BI-ONE CSS team if you are still encountering the same issue even after performing the above steps.

 

Author: Jaideep Saha Roy – SQL Server BI-ONE Developer team, Microsoft

Reviewer: Krishnakumar Rukmangathan – SQL Server BI-ONE Developer Team, Technical Advisor, Microsoft

 

 

 

The OLE DB provider "MSDASQL" for linked server "" supplied inconsistent metadata for a column. The column "" (compile-time ordinal 2) of object "" was reported to have a "DBCOLUMNFLAGS_ISLONG" of 0…


Applies To: SQL Server 2012 and above (requires support for multi-subnet failover).

Understanding of the Issue:

Let us consider a situation where we want to create a linked server to pull data from a SQL Server Always On availability group (using the listener name) into our local SQL Server (or another SQL Server), while also leveraging the “MultiSubnetFailover” property. How can we achieve this?

To achieve this, we create an ODBC DSN using the SQL Server Native Client 11.0 driver, pointing it at the Always On listener name, and check the “MultiSubnetFailover” checkbox during the DSN setup. We then create a linked server that connects through that ODBC DSN.

NOTE: The SQL OLE DB providers do not support the “MultiSubnetFailover” property. Refer to this article:  https://msdn.microsoft.com/en-us/library/gg471494(v=sql.120).aspx
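For illustration, the linked server over such a DSN can be created with T-SQL as sketched below (the DSN name MyListenerDSN is a placeholder for the DSN you created above):

```sql
-- Linked server over the MSDASQL (OLE DB-to-ODBC bridge) provider,
-- pointing at the ODBC DSN that has MultiSubnetFailover enabled.
EXEC master.dbo.sp_addlinkedserver
    @server     = N'mylinkedserver',
    @srvproduct = N'',
    @provider   = N'MSDASQL',
    @datasrc    = N'MyListenerDSN';

-- Map logins; here the current login's own credentials are used.
EXEC master.dbo.sp_addlinkedsrvlogin
    @rmtsrvname = N'mylinkedserver',
    @useself    = N'True';
```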

Now, let us consider an instance where the source table is something like this:

CREATE TABLE [dbo].[testtable](

 [id] [int] IDENTITY(1,1) NOT NULL,

 [value] [nvarchar](max) NULL
)

 

And assume the data in it includes [value] entries longer than 4000 characters.

 

If we try and pull this data from the above configuration using a query like:

select * from openquery(mylinkedserver, 'select id, value from master.dbo.testtable')

We would receive the below error:

The OLE DB provider “MSDASQL” for linked server “mylinkedserver” supplied inconsistent metadata for a column. The column “value” (compile-time ordinal 2) of object “”master”.”dbo”.”testtable”” was reported to have a “DBCOLUMNFLAGS_ISLONG” of 0 at compile time and 128 at run time.

Cause:

This issue occurs because we are trying to pull in a column whose size is set to max. This becomes a problem here as we are switching providers (MSDASQL bridging to the ODBC driver), and there is an upper limit on the number of characters that can be converted for row data of a given data type. See the resolution section for further information.

Resolution:

How can we confirm that the data-conversion upper limit on the number of characters is the cause, and identify the limit on the number of characters allowed?

If we change the above OPENQUERY to cast the nvarchar(max) column to nvarchar(10000):

select * from openquery(mylinkedserver, 'select id, cast(value as nvarchar(10000)) from master.dbo.testtable')

We would hit the below error:

OLE DB provider “MSDASQL” for linked server “mylinkedserver” returned message “[Microsoft][ODBC SQL Server Driver][SQL Server]Statement(s) could not be prepared.”.

OLE DB provider “MSDASQL” for linked server “mylinkedserver” returned message “[Microsoft][ODBC SQL Server Driver][SQL Server]The size (10000) given to the convert specification ‘nvarchar’ exceeds the maximum allowed for any data type (4000).”.

The above error clearly helps us ascertain that the limit for nvarchar is 4000 characters for the data conversion to occur. The limits for the other data types can be found similarly.

 

What can we do to overcome the issue?

We can try below options:

(a) If we have the option to change the source table, alter it so that it does not use nvarchar(max)-based columns. Use nvarchar(size) instead, where size can be at most 4000. The same holds true for other data types and their respective limits.

(b) If we do not have the option to change the source table, cast the incoming column to nvarchar(4000), i.e. cast(value as nvarchar(4000)), in the OPENQUERY statement. This trims the data after 4000 characters and allows the source data to be converted successfully, as below:

select * from openquery(mylinkedserver, 'select id, cast(value as nvarchar(4000)) from master.dbo.testtable')
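The trimming behavior of such a cast can be verified locally with a small sketch:

```sql
-- Build a 5000-character nvarchar(max) string, then cast it down to nvarchar(4000).
DECLARE @long nvarchar(max) = REPLICATE(CAST(N'a' AS nvarchar(max)), 5000);
SELECT LEN(@long)                         AS original_length,  -- 5000
       LEN(CAST(@long AS nvarchar(4000))) AS trimmed_length;   -- 4000
```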

(c) If neither (a) nor (b) is acceptable (for example, because trimming loses data), we can pull the data in successive 4000-character chunks, concatenate them into a variable, and then print it in the query output window or use it elsewhere as required. The script below (adapt it to your requirements) achieves this:

SQL Script:

-- Drop the temp table if it exists from a previous run.
if object_id('tempdb..#temp_table') is not null
    drop table #temp_table

-- Create the temporary table into which we will push all the data.
create table #temp_table(
    ID int,
    Value nvarchar(max))

-- Declare a cursor over the source ids.
DECLARE @oneid int
DECLARE the_cursor CURSOR FAST_FORWARD
FOR select * from openquery(testbradtest, 'select id from testdatabase.dbo.table1 order by (id)')

OPEN the_cursor
FETCH NEXT FROM the_cursor INTO @oneid

-- Loop through all the rows one by one.
WHILE @@FETCH_STATUS = 0
BEGIN
    declare @length int, @sqlqueryforlength nvarchar(max)

    -- Find the length of the column value for the current row.
    set @sqlqueryforlength = 'select @val1=lengths from openquery(testbradtest, ''select len(value) as lengths from testdatabase.dbo.table1 where id =' + cast((@oneid) as NVARCHAR) + ''')'
    exec sp_executesql @sqlqueryforlength, N'@val1 NVARCHAR(max) output', @val1=@length OUTPUT

    declare @currentvalue nvarchar(4000)  -- Current chunk of up to 4000 characters.
    declare @finalvalue nvarchar(max)     -- Final value, a replica of the source row (possibly more than 4000 characters).
    declare @startindex int, @sqlquery nvarchar(max)
    select @startindex = 0
    select @finalvalue = NULL

    -- Loop until we have covered all the characters in the row value, taking 4000 at a time.
    while (@length > 0)
    BEGIN
        -- While more than 4000 characters remain.
        if @length > 4000
        BEGIN
            set @length = @length - 4000
            set @sqlquery = 'select @val1=value1 from openquery(testbradtest, ''select cast((substring(value,' + cast((@startindex + 1) as NVARCHAR) + ',4000)) as NVARCHAR(4000)) as value1 from testdatabase.dbo.table1 where id =' + cast((@oneid) as NVARCHAR) + ''')'
            exec sp_executesql @sqlquery, N'@val1 NVARCHAR(max) output', @val1=@currentvalue OUTPUT
            set @finalvalue = COALESCE(@finalvalue, '') + @currentvalue
        END -- if
        -- Deal with the last chunk of characters.
        else
        begin
            set @sqlquery = 'select @val1=value1 from openquery(testbradtest, ''select cast((substring(value,' + cast((@startindex + 1) as NVARCHAR) + ',' + cast((@length) as NVARCHAR) + ')) as NVARCHAR(4000)) as value1 from testdatabase.dbo.table1 where id =' + cast((@oneid) as NVARCHAR) + ''')'
            exec sp_executesql @sqlquery, N'@val1 NVARCHAR(max) output', @val1=@currentvalue OUTPUT
            set @finalvalue = COALESCE(@finalvalue, '') + @currentvalue
            BREAK
        end -- else

        -- Advance the starting index.
        set @startindex = @startindex + 4000
    END

    -- Insert the reassembled value into the temporary table.
    insert into #temp_table (Id, Value)
    select @oneid, @finalvalue

    -- Fetch the next row.
    FETCH NEXT FROM the_cursor INTO @oneid
END

CLOSE the_cursor
DEALLOCATE the_cursor

-- Select the data from the temp table.
select * from #temp_table

 

HTH!

 

Author:  Deepak Lakhotia – SQL Server BI-ONE Developer team, Microsoft

Reviewer:  Krishnakumar Rukmangathan- Technical Advisor, SQL Server BI-ONE Developer team, Microsoft


SQLCMD 2014 fails to authenticate via Kerberos


 

Understanding of the issue:

When we try to execute a query against a SQL Server from a remote machine using SQLCMD.exe, the connection goes via NTLM rather than Kerberos, even though all the prerequisites for Kerberos authentication are fulfilled.

 


When using SQLCMD of version 2012 and below, the connection goes via Kerberos provided the prerequisites are fulfilled.

 


This behavior is seen only in SQLCMD, but when the same query is run through SQL Server Management Studio (SSMS), it goes via Kerberos irrespective of the SSMS version.

 


 

Cause:

This is by-design behavior in SQL Server 2014: SQLCMD requires an SPN registered with the NetBIOS name in order to authenticate via Kerberos.

 

Resolution:

Create an SPN using the hostname/NetBIOS name (for example: setspn -S MSSQLSvc/<NetBIOSHostName>:1433 <Domain>\<SQLServiceAccount>, where the bracketed names are placeholders), in addition to the default SPNs that use the Fully Qualified Domain Name (FQDN).

 


 

In case you are still facing the same issue, please reach out to the CSS team.

 

Author:  Chetan KT – SQL Server BI-ONE Developer team, Microsoft

Reviewer:    Sunil Kumar B.S. Escalation Engineer, SQL Server BI-ONE Developer team, Microsoft

 

 

 

 

How to perform a silent installation with SQL Server Migration Assistant (SSMA)


In this blog, I would like to discuss how we can perform a silent installation (without constant interaction or prompts) of SQL Server Migration Assistant (SSMA). Please note that this method is not a documented or supported scenario, and hence this post is provided “AS IS” with no warranties, conferring no rights.

As you would know, Microsoft SQL Server Migration Assistant (SSMA) is a tool to automate migration from Microsoft Access database(s) / Oracle database / MySQL database / Sybase ASE / DB2 databases to SQL Server and Azure SQL DB. SSMA automates all aspects of migration including migration assessment analysis, schema and SQL statement conversion, data migration as well as migration testing to reduce cost and reduce risk of your database migration project.

 

Now, back to the topic. The following are the download pages (please check online for the most recent versions available):

Download SQL Server Migration Assistant (SSMA) v.6.0:

Microsoft SQL Server Migration Assistant v6.0 for Access: http://www.microsoft.com/en-in/download/details.aspx?id=43690

Microsoft SQL Server Migration Assistant v6.0 for Oracle: http://www.microsoft.com/en-in/download/details.aspx?id=43689

Microsoft SQL Server Migration Assistant v6.0 for MySQL: https://www.microsoft.com/en-us/download/details.aspx?id=43688

Microsoft SQL Server Migration Assistant v6.0 for Sybase: https://www.microsoft.com/en-us/download/details.aspx?id=43691

Microsoft SQL Server Migration Assistant v6.0 for DB2: https://www.microsoft.com/en-in/download/details.aspx?id=45296

 

All the documented methods for silent installation use the Microsoft Windows Installer (MSIEXEC.EXE) executable, which interprets Installer MSI packages and installs products. Msiexec enables you to specify property values from the command line. In this blog, we use SSMA for Oracle as an example. The catch here is that the SSMA download page ships only the SSMA for Oracle.6.0.0.exe and SSMA for Oracle.6.0.0.ExtPack.exe packages, with no MSI files. The following method shows how you can extract the MSI file from the EXE installer and then perform a silent installation using Microsoft Windows Installer (MSIEXEC.EXE) with property values from the command line.

 

Step 1:

Extract the MSI File from the SSMA for Oracle.6.0.0.exe Installer EXE program:

There are numerous methods discussed online to achieve this, but the one that we had tested and was working is as below,

In this method, we use the fact that most installers extract the .MSI file to the temp folder during the installation process.

1. Start the SSMA for Oracle.6.0.0.exe.

[Screenshot: SSMA for Oracle installation wizard]

2. Now when you see the above window, please do not click Next in this window and do not close it.

3. Start the Run command and traverse to the temp folder. You can type in %temp% and hit enter. (C:\Users\<LoggedInUser>\AppData\Local\Temp)


4. Now inside the temp folder, sort the folder by Date Modified field.

5. Now check for any file with the .TMP extension. Note that this file will have a name like SS***.tmp.


 

6. Copy this file to any other local folder and then rename the file extension from .TMP to .MSI.

7. Now we have the MSI file ready with us. (You can verify that the file is intact by double-clicking it. Also make sure that you close it after verification)

 

Step 2:

Use MSIEXEC with property values to perform the silent installation:

This is a simple, straightforward method. The entire set of command-line options is available in the following articles:

Msiexec (command-line options):           https://technet.microsoft.com/en-in/library/cc759262(v=ws.10).aspx

Standard Installer Command-Line Options:         https://msdn.microsoft.com/en-us/library/windows/desktop/aa372024(v=vs.85).aspx

In our case, we just need to run the following command.

1. From an elevated command prompt window, please traverse to the local folder where you had stored the local copy of *.msi file.

2. Please execute the following command (here SSM8AED.msi is the renamed MSI file extracted in Step 1):

Command:       msiexec /I SSM8AED.msi /qn /Lv* "C:\SSMA\installlog.txt"


Command-line properties used:

/I –  </package | /i> <Product.msi>
Installs or configures a product

/qn – /q[n|b|r|f]
Sets user interface level
n – No UI
b – Basic UI
r – Reduced UI
f – Full UI (default)

Logging Options
/l[i|w|e|a|r|u|c|m|o|p|v|x|+|!|*] <LogFile>
i – Status messages
w – Nonfatal warnings
e – All error messages
a – Startup of actions
r – Action-specific records
u – User requests
c – Initial UI parameters
m – Out-of-memory or fatal exit information
o – Out-of-disk-space messages
p – Terminal properties
v – Verbose output
x – Extra debugging information
+ – Append to existing log file
! – Flush each line to the log
* – Log all information, except for v and x options

3. As mentioned above, the /Lv* option enables verbose logging. If you open the log file, you will see entries like the following, confirming that the product was installed successfully.

———-

MSI (s) (3C:48) [14:04:50:414]: Windows Installer installed the product. Product Name: Microsoft SQL Server Migration Assistant for Oracle. Product Version: 6.0.0. Product Language: 1033. Manufacturer: Microsoft Corporation. Installation success or error status: 0
———-

If you are still having any issues with the installation, please leave your comments below.

Happy Migration !

NOTE: This method is not documented or supported, and hence this post is provided “AS IS” with no warranties, conferring no rights.

 

Author: Krishnakumar Rukmangathan, Technical Advisor, SQL Server BI-ONE Developer team, Microsoft

Reviewed by: Sunil Kumar B.S, Escalation Engineer, SQL Server BI-ONE Developer team, Microsoft.

 

 

Connection timeout issue with .NET Framework 4.6.1 – TransparentNetworkIPResolution


 

Understanding of the issue:

A client application using .NET Framework 4.6.1 fails to connect to SQL Server with an error like the following.

Error:

‘Connection Timeout Expired. The timeout period elapsed while attempting to consume the pre-login handshake failed or the server was unable to respond back in time. The duration spent while attempting to connect to this server was – [Pre-Login] initialization=11977; handshake=5138;’

Connection string example:

Server=myServerName;Database=myDataBase;Trusted_Connection=True

Another behavior to observe: the connection works well with lower versions of the .NET Framework.

 

What is the issue?

In short, the issue is caused by a new connection-string parameter introduced in .NET Framework 4.6.1, ‘TransparentNetworkIPResolution’, which is set to true by default.

A little background on why this parameter was introduced:

There is a known design limitation in the way SQL Server connections work with SQL Server Availability Groups / Always On in a multi-subnet environment. Whenever the Availability Group listener resolves via DNS to two different IP addresses on different subnets, the first connection attempt goes to the first IP address. If that is not the primary, the connection will time out.

By default, the SQL client libraries try all IP addresses returned by the DNS lookup one after another (serially) until either all of the IP addresses have been exhausted, a connection is made, or a connection timeout threshold is reached. This becomes an issue when the first IP address returned is not the primary: the connection may time out before it can attempt the next IP address.

To overcome this behavior, we had introduced a connection property called MultiSubnetFailover which we had to manually set it to true in such scenarios.

In .NET Framework 4.6.1, a new parameter “TransparentNetworkIPResolution” is introduced which would eliminate the need of this MultiSubnetFailover property in connection string. With this property being set, an initial connection attempt to the first-returned IP address is still made, but that attempt is timed-out after only 500ms, and then connection attempts to all the IP addresses are attempted in parallel. By default, this property is set to true.

In the scenario at hand, where the connection fails with .NET Framework 4.6.1, the DNS lookup takes longer than 500 ms. Because this property is enabled by default (without checking whether the target is an Always On listener), the attempt times out at 500 ms, causing the connection to fail irrespective of whether Always On is involved.

Tracert result will be something like following:

C:\Users\testuser>tracert test
Tracing route to test.xyz.com [158.1.2.14]
over a maximum of 30 hops:
1    <1 ms    <1 ms    <1 ms  158.1.2.3
2     1 ms    <1 ms    <1 ms  test1.xyz.com [158.1.2.4]
3   555 ms   552 ms   576 ms  test2.xyz.com [158.1.2.5]
4   604 ms   595 ms   559 ms  test3.xyz.com [158.1.2.6]
5   560 ms   559 ms   554 ms  158.1.2.7
6   553 ms   553 ms   559 ms  158.1.2.8
7   765 ms   783 ms   780 ms  158.1.2.9
8   776 ms   813 ms   842 ms  158.1.2.10
9   775 ms   854 ms   783 ms  158.1.2.11
10   832 ms   777 ms   772 ms  158.1.2.12
11   791 ms   819 ms   801 ms  158.1.2.13
12   773 ms   807 ms   771 ms  test.xyz.com [158.1.2.14]


 

Resolution:

This issue can be resolved by modifying the connection string to set the parameter ‘TransparentNetworkIPResolution’ to false.

Example:

Server=myServerName;Database=myDataBase;Trusted_Connection=True;TransparentNetworkIPResolution=False

When we set it to false in the connection string, the attempt does not time out at 500 ms and the connection succeeds.

Specifications of this property:

  • TransparentNetworkIPResolution is enabled by default.
  • If the MultiSubnetFailover parameter is already set in the connection string, TransparentNetworkIPResolution is ignored.
  • If the database is mirrored, the parameter is also ignored.

Please reach out to the CSS team in case you are still experiencing this issue after the above changes.

 

Author:         Chaitra Hegde – SQL Server BI-ONE Developer team, Microsoft

Reviewer:    Krishnakumar Rukmangathan, Technical Advisor, SQL Server BI-ONE Developer team, Microsoft

 

 

 

A new release of ODBC for Modern Data Stores

$
0
0

 

After more than 15 years since the last release, Microsoft is looking at updating the Open Data Base Connectivity (ODBC) specification.

ODBC was first released in September of 1992 as a C-based call-level interface for applications to connect to, describe, query, and update a relational store. Since its introduction, ODBC has become the most widely adopted standard interface for relational data, fostering an ecosystem of first-party data producers as well as 3rd party vendors that build and sell ODBC Drivers for a variety of data sources.

ODBC was designed for relational databases conforming to the ISO SQL-92 standard, and has not undergone a significant revision since the release of ODBC 3.5 in 1997. Since that time, not only have relational data sources evolved to support new data types, syntax, and functionality, but a variety of new sources of data have emerged, particularly in the cloud, that don’t conform to the constraints of a relational database.

The prevalence of tools, applications, and development environments that consume ODBC have led a number of vendors to create ODBC drivers to cloud-based and non-relational data sources. While this provides connectivity to existing tools, having to flatten the data into normalized relational views loses fidelity, functionality, semantics, and performance over a more natural representation of the data.

So we started to consider what it would look like to extend ODBC to more naturally support more modern Relational, Document-oriented, and other NoSQL stores. Borrowing and extending syntax and semantics from structured types in SQL-99, we identified a relatively small set of extensions necessary to support first-class representation of the modern data sources, as well as a set of general enhancements to improve the ability for applications to write interoperable code across ODBC drivers.

To be part of ODBC, these extensions would have to be backward compatible with existing drivers and applications, and extend ODBC in ways natural and consistent with the existing API.

From these efforts a new version of ODBC slowly took shape. We started to discuss these extensions with a few ODBC vendors to flesh out requirements and vet designs, and have reached the point that we’re ready to start talking more broadly about ODBC 4.0 – the first significant update to ODBC in over 15 years.

Thoughts/comments/ideas? Please share! And stay tuned here for more updates…

 

Michael Pizzo
Principal Architect, Microsoft Data Group

ODBC Driver 11 and above fails with Output parameter of type SQL_VARIANT


Problem:

If you have an application which is using ODBC driver 11 or above and is executing a stored procedure with an output parameter of type SQL_VARIANT, the execution fails with the following exception:

An error encountered: SQLState = HY000, NativeError: 0, ErrorMessage: [Microsoft][ODBC Driver 11 for SQL Server]Protocol error in TDS stream.

Please note that the issue happens only when you use the ODBC Driver 11 (or above) with the SQL_VARIANT type. The same code logic succeeds without any errors when executed with the SQL Server Native Client driver.
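A minimal sketch of the failing pattern (the procedure name and value are illustrative):

```sql
-- A stored procedure with a sql_variant OUTPUT parameter. Executing this
-- through ODBC Driver 11 or later with an output parameter binding
-- reproduces the "Protocol error in TDS stream" failure described above.
CREATE PROCEDURE dbo.GetVariantValue
    @result sql_variant OUTPUT
AS
BEGIN
    SET @result = CAST(42 AS int);
END
```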

 

Cause / Resolution:

Our product team has confirmed that this is an issue with our ODBC 11 and above drivers. We’re working towards including the fix as part of August CU. The release details will be updated as soon as the fix is released.

 

Author: 

Selvakumar Rajakumar
Escalation Engineer
Microsoft SQL BI-ONE Developer Team

 

 

 
