Channel: SQL BI / Data Access Technologies

Enable TLS 1.2 protocol for Reporting Services with custom .NET application


 

If you have a custom application that is integrated with a payment gateway, you may want all the communication between them to use a secure protocol, and you may be thinking of using TLS 1.2 for that communication. Let us consider a scenario: if your custom application has integrated reports from Reporting Services, then you need to make sure that TLS 1.2 communication is happening between your custom application and Reporting Services, as well as between Reporting Services and the Report Server databases. You may also need to consider the report data source communication if the Report Server databases and the report databases are hosted on different SQL Server instances. This can be easily explained with the following diagram.


 

Reporting Services Related configuration (SERVER 2)

Before we start configuring the changes, I would like to tell you that until a couple of months ago this was not a supported scenario; Reporting Services can now communicate with SQL Server using TLS 1.2.

1. Install the Windows Patches

Before you install this patch, you need to make sure that the SQL Server TLS 1.2 support updates are already installed. You can find these updates in the following article.

https://support.microsoft.com/en-us/kb/3135244

After this, based on the operating system you are on, you need to install the latest .NET Framework patch. The download link is present in the following KB article.

https://support.microsoft.com/en-us/kb/3154520

If you have already installed this patch, you will see the following message when you run the installer again.

 

2. Reporting Services configuration

In Reporting Services Configuration Manager, you need to explicitly disable the HTTP URL (port 80) and keep only the HTTPS URL (port 443) enabled. You need to do that for both the Report Server and Report Manager URLs.

In the screenshot below, notice that we have only one HTTPS URL and no HTTP URL.

 

NOTE:  

  • If you are testing this in a lab environment, you can create and use a self-signed certificate.
  • In a production environment, you will typically already have a certificate installed.
  • In both cases, make sure the certificate is installed under Trusted Root Certification Authorities and is trusted.

 

3. Registry Changes

Once you have installed the above Windows patches on the SSRS server, it is capable of initiating a communication over TLS 1.2. By default, however, it would still initiate the communication over TLS 1.0. Making the following registry changes will force it to use TLS 1.2 only.

Think about a scenario where a custom application is hard-coded to connect using TLS 1.2: it can still connect to SSRS even without the registry changes below; it is only the connection from SSRS to SQL Server that may still use TLS 1.0. So whether you make these registry changes depends on what exactly you need. They are not mandatory to enable TLS 1.2, but they do guarantee that only TLS 1.2 is used.

You need to go to the following registry location on the SSRS Server.

3a. Protocol Section

[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\SecurityProviders\SCHANNEL\Protocols]

and make the following changes.

SSL 2.0

[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\SecurityProviders\SCHANNEL\Protocols\SSL 2.0\Client]
"DisabledByDefault"=dword:00000001
"Enabled"=dword:00000000

[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\SecurityProviders\SCHANNEL\Protocols\SSL 2.0\Server]
"DisabledByDefault"=dword:00000001
"Enabled"=dword:00000000

SSL 3.0

[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\SecurityProviders\SCHANNEL\Protocols\SSL 3.0\Client]
"DisabledByDefault"=dword:00000001
"Enabled"=dword:00000000

[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\SecurityProviders\SCHANNEL\Protocols\SSL 3.0\Server]
"DisabledByDefault"=dword:00000001
"Enabled"=dword:00000000

TLS 1.0

[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\SecurityProviders\SCHANNEL\Protocols\TLS 1.0\Client]
"DisabledByDefault"=dword:00000001
"Enabled"=dword:00000000

[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\SecurityProviders\SCHANNEL\Protocols\TLS 1.0\Server]
"DisabledByDefault"=dword:00000001
"Enabled"=dword:00000000

TLS 1.2

[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\SecurityProviders\SCHANNEL\Protocols\TLS 1.2\Client]
"DisabledByDefault"=dword:00000000
"Enabled"=dword:00000001

[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\SecurityProviders\SCHANNEL\Protocols\TLS 1.2\Server]
"DisabledByDefault"=dword:00000000
"Enabled"=dword:00000001

 

NOTE: 

  • Notice that we have disabled all the protocols apart from TLS 1.2, and this is mandatory.
  • You may want to export the existing registry key as a backup before you start making the changes.
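The registry edits above are repetitive and easy to mistype. As an illustrative aid only (Python is used here just to generate the text; the actual change is still the registry import), the following sketch emits a .reg file covering the same protocols and values, which you can review and then import with reg.exe:

```python
# Illustrative sketch: generates the .reg text for the SCHANNEL changes above
# (disable SSL 2.0/3.0 and TLS 1.0; enable TLS 1.2 for both Client and Server).
# The paths and dword values mirror the article; review before importing.

BASE = (r"HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control"
        r"\SecurityProviders\SCHANNEL\Protocols")

def schannel_reg(disabled=("SSL 2.0", "SSL 3.0", "TLS 1.0"),
                 enabled=("TLS 1.2",)):
    lines = ["Windows Registry Editor Version 5.00", ""]
    for proto, on in [(p, False) for p in disabled] + [(p, True) for p in enabled]:
        for side in ("Client", "Server"):
            lines.append(f"[{BASE}\\{proto}\\{side}]")
            # Enabled protocol: DisabledByDefault=0, Enabled=1; disabled: inverted.
            lines.append('"DisabledByDefault"=dword:0000000%d' % (0 if on else 1))
            lines.append('"Enabled"=dword:0000000%d' % (1 if on else 0))
            lines.append("")
    return "\n".join(lines)

print(schannel_reg())
```

Save the output as a .reg file and import it on the SSRS server only after exporting the existing key as a backup.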


 

3b. .NET Framework Section

Also, you need to make sure that you have the following registry keys added.

For 32 bit:

[HKEY_LOCAL_MACHINE\SOFTWARE\Wow6432Node\Microsoft\.NETFramework\v2.0.50727]

“SystemDefaultTlsVersions”=dword:00000001

For 64 bit:

[HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\.NETFramework\v2.0.50727]

“SystemDefaultTlsVersions”=dword:00000001

 

4. Reboot

Once all these changes are done, you need to restart the machine.

 

 

SQL Server Changes (SERVER 3)

If you think about this conceptually, you may realize that we don’t need to make any changes on the SQL Server side (this is the server where your ReportServer and ReportServerTempDB databases are hosted), because the communication is always initiated from the client. But you should still leave only TLS 1.2 enabled and disable the other protocols.

  1. To make the registry changes, replicate the same changes that have already been articulated in section 3a. Protocol Section under Reporting Services Related configuration (SERVER 2).
  2. You also need to make sure that you are applying the right set of updates on the SQL Server database instance. You can find the reference in this article – https://support.microsoft.com/en-in/kb/3135244
  3. After making all the above changes, restart the machine.

 

Now let us test the TLS 1.2 communication for the changes that we have made so far. At this point, we have completed the modifications for configuring SSRS and SQL Server for TLS 1.2, so you can test whether the communication is going over TLS 1.2.

You can run a Fiddler trace, look into the HTTPS traffic, and find the TLS 1.2 communication.


 

 

Application Level Changes (SERVER 1)

Considering our initial scenario, you have now configured Reporting Services to accept TLS 1.2 communication. Next, we need to make sure that your application is also configured for TLS 1.2.

 

1. Windows Level Patch

If your application is not hosted on the same server where Reporting Services is installed, you need to install the patch https://support.microsoft.com/en-us/kb/3154520 on your application server.

 

2. Registry Changes

If your application is not hosted on the same server where Reporting Services is installed, you need to make the same protocol-level registry changes. These have already been described in 3a. Protocol Section under Reporting Services Related configuration (SERVER 2).

 

3. Application Code Changes

If you are calling Reporting Services from your custom application, you need to make sure that your application sends the communication over TLS 1.2.

  • Applications that use ServicePointManager-based APIs can set the protocol as follows:
    System.Net.ServicePointManager.SecurityProtocol = SecurityProtocolTypeExtensions.Tls12;
  • Applications that use the SslStream AuthenticateAsClient(String, X509CertificateCollection, SslProtocols, Boolean) overload can set the SslProtocols value to Tls12.

 

You need to include the following .cs files in your solution.

SslProtocolsExtensions.cs

———————————————————————————————————-

namespace System.Security.Authentication
{
    public static class SslProtocolsExtensions
    {
        public const SslProtocols Tls12 = (SslProtocols)0x00000C00;
        public const SslProtocols Tls11 = (SslProtocols)0x00000300;
    }
}

———————————————————————————————————-

SecurityProtocolTypeExtensions.cs

———————————————————————————————————-

namespace System.Net
{
    using System.Security.Authentication;

    public static class SecurityProtocolTypeExtensions
    {
        public const SecurityProtocolType Tls12 = (SecurityProtocolType)SslProtocolsExtensions.Tls12;
        public const SecurityProtocolType Tls11 = (SecurityProtocolType)SslProtocolsExtensions.Tls11;
        public const SecurityProtocolType SystemDefault = (SecurityProtocolType)0;
    }
}

———————————————————————————————————-

With these code changes, your application will start initiating the communication over TLS 1.2.
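The same idea, pinning a minimum protocol version on the client side, exists on most platforms. As an analogy only (this is not part of the article's .NET solution), here is how a Python client would pin TLS 1.2 as the floor:

```python
# Analogy only, not the article's .NET code: pin TLS 1.2 as the minimum
# protocol version for outgoing connections, the way the ServicePointManager
# assignment does for a .NET application.
import ssl

ctx = ssl.create_default_context()
ctx.minimum_version = ssl.TLSVersion.TLSv1_2  # refuse TLS 1.1 and older

print(ctx.minimum_version.name)  # TLSv1_2
```

Like the ServicePointManager change, this affects only connections the client initiates; the server's SCHANNEL settings still decide what the server will accept.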

 

NOTE: 

  • You can also verify that the communication is over TLS 1.2 by collecting a Fiddler trace.
  • If you are making server-level changes and your application is hosted on a different machine, you need to reboot that server as well.

 

FAQ

Q –   Can SSRS still use TLS 1.0?

TLS 1.0 is not deprecated; it is actually still safe if you have applied all security fixes. Let us consider the scenario below.

In the above scenario, TLS 1.0 will not work. The connection between SERVER 1 and SERVER 2 fails because on SERVER 2 we have disabled all the protocols apart from TLS 1.2, while SERVER 1 is still sending the request over TLS 1.0.

 

Q –   Do I definitely need TLS 1.2?

It is not mandatory; by default, client and SQL Server communication happens over TLS 1.0. But if you want more secure communication, you can enable TLS 1.2 on both the client and the server.

 

Q –   I don’t have any custom application; how do I enable the TLS 1.2 protocol for Report Manager and Report Server?

The above action plan is still applicable to a Reporting Services native mode configuration without any custom application integration. You need to make the changes that we have specified for SERVER 2 and SERVER 3 in the above sections.

 

 Q –   Do I need to make “NETFramework\v2.0.50727” registry changes in SQL Server as well?

No, these registry changes are only for the client, so they are not required on SQL Server. However, if your SQL Server also acts as a client and that client should communicate over TLS 1.2 as well, then you need to install the Windows patch mentioned above and include these registry changes (under the SSRS Server section) along with the “Protocol” section changes.

 

Q –   Can I host the application on a server other than the SSRS server?

Yes, that is possible. In that scenario, you need to make sure that you replicate on that server the same changes performed on the SSRS server to have TLS 1.2 in place.

 

Q –   Do we have to change the code if web application and SSRS are hosted on the same server?

Not necessarily. If the app does not explicitly set the protocol, and the host has set “SystemDefaultTlsVersions”=dword:00000001, then no code change is needed.

 

Further References

  1. https://technet.microsoft.com/en-us/library/dn786418(v=ws.11).aspx
  2. https://support.microsoft.com/en-us/kb/3154519
  3. https://support.microsoft.com/en-us/kb/3154520
  4. https://support.microsoft.com/en-us/kb/3135244

 

Author:        Sumit Ghosh – SQL Server BI Developer team, Microsoft

Reviewer:    Krishnakumar Rukmangathan, Support Escalation Engineer, SQL Server BI Developer team, Microsoft

 

 

 


SSAS (ROLAP) with Azure SQL Data Warehouse


 

Let’s consider you are using SQL Server Analysis Services in MOLAP (Multi-Dimensional Online Analytical Processing) Storage Mode with Azure SQL Data Warehouse and everything has been working as expected. Due to a requirement, the Storage Mode was changed to ROLAP (Relational Online Analytical Processing) Mode.

 

Now, when you try to execute a simple MDX query (in our case, against Adventure Works):

 

Query:

SELECT [Measures].[Internet Sales Amount] ON 0,

NON EMPTY {[Date].[Calendar Year].members} ON 1

FROM [Adventure Works]

 

You would receive the following error:

 

Executing the query …

Errors in the high-level relational engine. The following exception occurred while the managed IDbCommand interface was being used: Parse error at line: 1, column: 51: Incorrect syntax near ‘@ResultCode’..

Run complete

 

You would receive the same error when you browse the Cube from SSMS:


 

To understand the type of query causing this issue, we tried to capture a SQL Profiler trace, but since this is Azure SQL Data Warehouse, that option is not available. The only other option was to capture a memory dump of the process to check the command text of the query being executed when this error occurred.

 

This is the Exception from the Dump:

 

Exception type:   System.Data.SqlClient.SqlException

Message:         Parse error at line: 1, column: 51: Incorrect syntax near ‘@ResultCode’.

StackTrace:

System_Data!System.Data.SqlClient.SqlConnection.OnError(System.Data.SqlClient.SqlException, Boolean, System.Action`1<System.Action>)+0x11a

System_Data!System.Data.SqlClient.TdsParser.ThrowExceptionAndWarning(System.Data.SqlClient.TdsParserStateObject, Boolean, Boolean)+0x327

System_Data!System.Data.SqlClient.TdsParser.TryRun(System.Data.SqlClient.RunBehavior, System.Data.SqlClient.SqlCommand, System.Data.SqlClient.SqlDataReader, System.Data.SqlClient.BulkCopySimpleResultSet, System.Data.SqlClient.TdsParserStateObject, Boolean ByRef)+0x13e1

System_Data!System.Data.SqlClient.SqlDataReader.TryConsumeMetaData()+0x5c

System_Data!System.Data.SqlClient.SqlDataReader.get_MetaData()+0x6a

System_Data!System.Data.SqlClient.SqlCommand.FinishExecuteReader(System.Data.SqlClient.SqlDataReader, System.Data.SqlClient.RunBehavior, System.String)+0x12b

 

This is the SQL Query:

 

DECLARE @ResultCode INT;

DECLARE @TraceId INT;

EXEC @ResultCode = sp_trace_create @TraceId OUTPUT, 1;

SELECT @ResultCode as ResultCode, @TraceId as TraceId;

 

 

As you can see above, the query being executed is sp_trace_create which is the Stored Procedure used to create a SQL profiler Trace. As mentioned above, Azure SQL Data Warehouse doesn’t support creating SQL Traces which is the cause of the above error.

 

But the question here is, why is SSAS executing a Stored Procedure to create a SQL Trace?

 

The answer is simple, SSAS in ROLAP Mode doesn’t have pre-processed data and therefore dynamically generates and executes queries against the data source to retrieve the required data which is cached locally. Now that the data is cached, Analysis Services will continue to use the cached data until it gets notified that the backend data has changed.

 

This notification is handled by creating a SQL Trace, to check if any DML statements have been executed against the SQL Table.
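The caching behavior described above can be reduced to a small conceptual sketch (Python used purely as an illustration; SSAS's actual implementation uses the trace or polling mechanisms discussed here): serve from the cache until a change notification marks it stale, then re-query the relational source.

```python
# Conceptual sketch only: ROLAP-style caching. Results are served from the
# cache until a change notification invalidates it, after which the
# relational source is queried again.
class RolapCache:
    def __init__(self, query_source):
        self.query_source = query_source  # callable that hits the relational source
        self.cache = None
        self.dirty = True                 # nothing cached yet

    def notify_change(self):
        """Called when backend data changes (SSAS learns this via trace/polling)."""
        self.dirty = True

    def get(self):
        if self.dirty:
            self.cache = self.query_source()
            self.dirty = False
        return self.cache

calls = []
def source():
    calls.append(1)          # count how many times the source is really queried
    return len(calls)

c = RolapCache(source)
print(c.get(), c.get())      # source queried once; second call served from cache
c.notify_change()
print(c.get())               # re-queried after the change notification
```

The design point is that without a working notification channel (the failing sp_trace_create call here), the cache would never be marked dirty, which is why SSAS insists on setting one up.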

 

By default, when you enable ROLAP Storage Mode, proactive caching is enabled and the notification is set to the SQL Server option. If you open this property you would see the following warning:

 

The current notification method is set to SQL Server. This option requires that either the data source or the Analysis Services service account be configured to use an account with either System Administrator or ALTER TRACE privileges on the SQL Server.

 

 

From our MSDN Documentation: https://msdn.microsoft.com/en-us/library/ms183681.aspx

SQL Server: Uses a specialized trace mechanism on Microsoft SQL Server to identify changes to underlying tables for the object.

 

Therefore to work around not having SQL Trace permissions, you would need to use either Client Initiated or Scheduled Polling notification.

 

Please follow our MSDN documentation to choose the preferred option for your business requirements.

 

 

Author:        Kane Conway – Support Escalation Engineer, SQL Server BI Developer team, Microsoft

Reviewer:    Sarath Babu Chidipothu – Support Escalation Engineer, SQL Server BI Developer team, Microsoft

Migrating Oracle to SQL Server using SSMA – Error: Cannot find either column “ssma_oracle” or the user-defined function or aggregate “ssma_oracle.rpad_varchar”


Issue:

You are trying to migrate server objects and data from Oracle to SQL Server using SSMA for Oracle, and for a particular view that uses the RPAD function you get the following error while performing database synchronization.

Errors: Cannot find either column “ssma_oracle” or the user-defined function or aggregate “ssma_oracle.rpad_varchar”, or the name is ambiguous.
Synchronization error: Cannot find either column “ssma_oracle” or the user-defined function or aggregate “ssma_oracle.rpad_varchar”, or the name is ambiguous. On: <source_object>

 

Cause:

When you try to synchronize a database object on the target SQL Server, either the ssma_oracle schema has not been synchronized initially or it was unchecked during the process of database synchronization.

 

Solution:

During server object synchronization, we need to ensure that the ssma_oracle schema is selected and synchronized first, before synchronizing and migrating any other database objects.

 

A bit more:

For any database migration in SSMA, the schema mapping has to be done in the initial phase. The tool takes care of this with the ‘Convert Schema’ option. When we initiate the ‘Convert Schema’ command, we see a new schema called ‘ssma_oracle’ being generated on the target SQL Server database. This new schema contains Oracle-specific database objects, namely the built-in functions, stored procedures and views (find more on ssma_oracle objects here). It has to be synchronized before the user-defined schema objects that we need to migrate; otherwise the synchronization errors out on the unidentified schema objects. On choosing the target SQL Server, the schema ‘ssma_oracle’ will be selected by default. If not, we need to explicitly select and synchronize it before synchronizing other objects.

 

Following is the flow of how we arrive at this issue and drive toward resolution.

  1. After selecting the Oracle source server, select the target SQL Server.

As we see here, the ssma_oracle schema has not been created initially.

  2. You initiate ‘Convert Schema’ on the Oracle source server objects:

This results in the creation of the ssma_oracle schema, which contains the Oracle-specific database objects.

  3. Now, when you try to synchronize with the database without selecting ssma_oracle, you hit the following error:

When you try to synchronize all objects by selecting the root schema node, SSMA synchronizes the ssma_oracle schema first and then proceeds with the rest of the user-defined schemas.

 


Author:        Chetan KT   – Support Engineer, SQL Server BI Developer team, Microsoft

Reviewer:   Krishnakumar Rukmangathan – Support Escalation Engineer, SQL Server BI Developer team, Microsoft

 

 

 

 

 

 

Loading SSAS Tabular Databases in SSMS fails for Non-admin users with “Failed to retrieve data for this request”


I was working with one of my customers when he came across the following issue. I thought of sharing the information related to this because I didn’t find any articles online that talk about it. The issue is that non-admin users get the below error when they connect to an SSAS Tabular instance using SQL Server Management Studio.

 

Stack trace:

Failed to retrieve data for this request. (Microsoft.SqlServer.Management.Sdk.Sfc)

——————————

For help, click: http://go.microsoft.com/fwlink?ProdName=Microsoft%20SQL%20Server&LinkId=20476

——————————

Program Location:

at Microsoft.SqlServer.Management.Sdk.Sfc.Enumerator.Process(Object connectionInfo, Request request)

at Microsoft.SqlServer.Management.UI.VSIntegration.ObjectExplorer.NavigableItemBuilder.NavigableItemBuilderDataReader.RunQuery()

at Microsoft.SqlServer.Management.UI.VSIntegration.ObjectExplorer.NavigableItemBuilder.NavigableItemBuilderDataReader.Process()

at Microsoft.SqlServer.Management.UI.VSIntegration.ObjectExplorer.NavigableItemBuilder.NavigableItemBuilderDataReader.get_PropertyNames()

at Microsoft.SqlServer.Management.UI.VSIntegration.ObjectExplorer.NavigableItemBuilder.BuildDynamicItemWithQuery(IList`1 nodes, INodeInformation source, INavigableItem sourceItem, String urnQuery, Boolean registerBuilder, Boolean registerBuiltItems)

at Microsoft.SqlServer.Management.UI.VSIntegration.ObjectExplorer.NavigableItemBuilder.BuildDynamicItem(IList`1 nodes, INodeInformation source, INavigableItem sourceItem, IFilterProvider filter)

at Microsoft.SqlServer.Management.UI.VSIntegration.ObjectExplorer.NavigableItemBuilder.Build(IList`1 nodes, INodeInformation source, INavigableItem sourceItem, IFilterProvider filter)

at Microsoft.SqlServer.Management.UI.VSIntegration.ObjectExplorer.NavigableItem.RequestChildren(IGetChildrenRequest request)

at Microsoft.SqlServer.Management.UI.VSIntegration.ObjectExplorer.ExplorerHierarchyNode.BuildChildren(WaitHandle quitEvent)

==================================

Object ID property was not found (Microsoft.SqlServer.OlapEnum)

——————————

Program Location:

at Microsoft.SqlServer.Management.Smo.Olap.OlapEnumObject.PopulateDataSet(EnumResult erParent, String sResponse)

at Microsoft.SqlServer.Management.Smo.Olap.OlapEnumObject.GetData(EnumResult erParent)

at Microsoft.SqlServer.Management.Sdk.Sfc.Environment.GetData()

at Microsoft.SqlServer.Management.Sdk.Sfc.Environment.GetData(Request req, Object ci)

at Microsoft.SqlServer.Management.Sdk.Sfc.Enumerator.GetData(Object connectionInfo, Request request)

at Microsoft.SqlServer.Management.Sdk.Sfc.Enumerator.Process(Object connectionInfo, Request request)

 

However, when he (an SSAS administrator) tries to connect to the SSAS Tabular instance, he is able to see the databases without any issues. Even applications like Power BI are able to connect to the SSAS Tabular instance and pull the data successfully. The issue happens only when non-admin users connect to the SSAS Tabular instance from SQL Server Management Studio.

We looked at the SSAS logs and found the following error being logged: “Error: An error occurred when loading the Database Permissions.”

We decided to capture an SSAS Profiler trace to check if there were any issues. In the trace, we figured out that SSAS was trying to load the database permissions file of a particular database. When we looked at that database, we found that it was in SUSPECT mode.

After removing the corrupted SSAS database, the users were able to connect to SSAS and list the databases in SQL Server Management Studio.

 

Author:        Sarath Babu Chidipothu – Support Escalation Engineer, SQL Server BI Developer team, Microsoft

Reviewer:    Kane Conway – Support Escalation Engineer, SQL Server BI Developer team, Microsoft

Date format Issues in SSIS Packages between MM/dd/yyyy & dd/MM/yyyy


In this blog, I would like to address one of the interesting issues that I came across recently with SQL Server Integration Services packages, related to inconsistencies noticed in the date format.

In this scenario, the SSIS packages were using the system variable @[System::StartTime], which was giving the value in a format like ’18/08/2016 13:33:11′, i.e. the dd/MM/yyyy format, on the development machines. But when the SSIS packages were moved to the production server and configured under a SQL Agent job, the SQL Agent jobs were failing with the following error message.

Error:

Msg 242, Level 16, State 3, Line 6

The conversion of a varchar data type to a datetime data type resulted in an out-of-range value.

 

Now, in order to troubleshoot this issue, we opened this SSIS package using SQL Server Data Tools (SSDT) / Business Intelligence Development Studio (BIDS) in the production environment. When we opened the package and looked at the date format for @[System::StartTime], it was showing as ’08/18/2016 13:33:11′, i.e. the MM/dd/yyyy format. We collected SQL Server Profiler traces while reproducing the issue and could clearly see that SQL Server was throwing this error as a result of a mismatch between the value provided and the format SQL Server was expecting.

Exception: The conversion of a varchar data type to a datetime data type resulted in an out-of-range value. Microsoft SQL Server

Now, why do we see this inconsistency in the date format between two different machines for the same SSIS system date variable @[System::StartTime]?
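The format ambiguity itself is easy to reproduce outside SSIS. As a minimal sketch (Python used purely as an illustration), the same string is a valid date under dd/MM/yyyy but out of range under MM/dd/yyyy, which is exactly the kind of error SQL Server raised:

```python
# Minimal repro of the format mismatch: '18/08/2016' has no valid month 18,
# so parsing it with a MM/dd/yyyy pattern fails the way the SQL Agent job did.
from datetime import datetime

s = "18/08/2016 13:33:11"

print(datetime.strptime(s, "%d/%m/%Y %H:%M:%S"))  # 2016-08-18 13:33:11

try:
    datetime.strptime(s, "%m/%d/%Y %H:%M:%S")
except ValueError as err:
    print("MM/dd parse failed:", err)
```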

The things that would come into picture here are:

Is it the same SSIS package used in both the environments? Yes, it was the exact same package.

But in order to verify this, we opened the SSIS packages using the VS SSDT/ BIDS environment and checked one of the related SSIS package properties.

a. SSIS package LocaleID property:

<DTS:Property DTS:Name=”LocaleID”>2057</DTS:Property>

Check the package property LocaleID and verify that it is set as per your need. In our case, it should be, and it was, set to English (United Kingdom).
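Since the LocaleID is stored as plain XML in the .dtsx file, you can also check it without opening the designer. A small sketch (Python used only as an illustration; the DTS namespace string and the minimal package fragment below are assumptions, not taken from the article):

```python
# Hypothetical helper: pull the LocaleID property out of .dtsx XML.
# 2057 (0x0809) is the Windows LCID for English (United Kingdom).
import xml.etree.ElementTree as ET

DTS = "www.microsoft.com/SqlServer/Dts"  # namespace used in .dtsx files

sample = (
    f'<DTS:Executable xmlns:DTS="{DTS}">'
    f'<DTS:Property DTS:Name="LocaleID">2057</DTS:Property>'
    f'</DTS:Executable>'
)

def locale_id(dtsx_xml):
    root = ET.fromstring(dtsx_xml)
    for prop in root.iter(f"{{{DTS}}}Property"):
        if prop.get(f"{{{DTS}}}Name") == "LocaleID":
            return int(prop.text)
    return None

print(locale_id(sample))  # 2057
```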

b. Region Format setting:

In our development machine, the region settings (intl.cpl) [Control Panel –> Region –> Format] were set as below.

In our production machine, the region settings were set as below.

 

This was causing the SSIS package variables to show the date in the MM/dd/yyyy format in our production environment.

 

Now, after we changed this setting back to English (United Kingdom), the SSIS packages in the production environment showed the variables in the expected dd/MM/yyyy format in the SSDT/BIDS development tools, and they also started to execute fine without any issues from SSDT/BIDS.

 

However, when we scheduled the execution of the SSIS packages using SQL Agent jobs, they still failed with the following error message:

Error:

Msg 242, Level 16, State 3, Line 6

The conversion of a varchar data type to a datetime data type resulted in an out-of-range value.

 

Now, this seems interesting. Upon further investigation, we figured out that the SQL Server Agent service account under which the scheduled SQL Agent jobs are executed still had its region settings as English (United States). The changes that we had made under the Region settings hold good only for the logged-in account that made them; they are not universally applied to all user profiles. That is, the Region settings are based on, and specific to, the user profile alone.

 

Having understood this, we dropped and recreated the user profile for the SQL Server Agent service account on this machine; this way we are not changing the SID associated with the account. After validating that the user profile’s region setting was set to English (United Kingdom), we ran the SQL Agent job and it executed without any issues.

 

NOTE: You can also create a SQL Agent proxy for a user account that has the region setting English (United Kingdom) and run the SQL Agent jobs/steps using this proxy account.

 

Author:       Krishnakumar Rukmangathan – Support Escalation Engineer, SQL Server BI Developer team, Microsoft

Reviewer:   Sarath Babu Chidipothu – Support Escalation Engineer, SQL Server BI Developer team, Microsoft

 

SSAS cube design errors while creating data source views for Teradata tables with columns having default date constraint


Issue:

While trying to create a data source view (DSV) in an SSAS project connecting to a Teradata source table that has a DateTime column with a default date constraint, we get the below errors:
Error: “The string was not recognized as a valid DateTime. There is an unknown word starting at index 0. (Microsoft Visual Studio)”
Error: “Cannot remove this column, because it is a part of the constraint Constraint1 on the table <table_name>. (Microsoft Visual Studio)”

Following is a Teradata table definition that works fine, since there is no default date constraint:

CREATE MULTISET TABLE TESTTABLE1, NO FALLBACK,
NO BEFORE JOURNAL,
NO AFTER JOURNAL,
CHECKSUM = DEFAULT,
DEFAULT MERGEBLOCKRATIO
(
Date_Column1 TIMESTAMP (0))
NO PRIMARY INDEX;
Scenario 1:

When the table has only one column, which has the default date constraint:

Table Definition:
CREATE MULTISET TABLE TESTTABLE2, NO FALLBACK,
NO BEFORE JOURNAL,
NO AFTER JOURNAL,
CHECKSUM = DEFAULT,
DEFAULT MERGEBLOCKRATIO
(
Date_Column1 TIMESTAMP (0) DEFAULT DATE )
NO PRIMARY INDEX;

Error: TITLE: Microsoft Visual Studio
——————————
The string was not recognized as a valid DateTime. There is an unknown word starting at index 0. (Microsoft Visual Studio)
——————————
Program Location:

at System.DateTimeParse.Parse(String s, DateTimeFormatInfo dtfi, DateTimeStyles styles)
at System.Convert.ToDateTime(String value, IFormatProvider provider)
at System.String.System.IConvertible.ToDateTime(IFormatProvider provider)
at System.Data.Common.SqlConvert.ChangeType2(Object value, StorageType stype, Type type, IFormatProvider formatProvider)
at System.Data.Common.SqlConvert.ChangeTypeForDefaultValue(Object value, Type type, IFormatProvider formatProvider)
at System.Data.DataColumn.set_DefaultValue(Object value)
at System.Data.ProviderBase.SchemaMapping.AddAdditionalProperties(DataColumn targetColumn, DataRow schemaRow)
at System.Data.ProviderBase.SchemaMapping.SetupSchemaWithKeyInfo(MissingMappingAction mappingAction, MissingSchemaAction schemaAction, Boolean gettingData, DataColumn parentChapterColumn, Object chapterValue)
at System.Data.ProviderBase.SchemaMapping..ctor(DataAdapter adapter, DataSet dataset, DataTable datatable, DataReaderContainer dataReader, Boolean keyInfo, SchemaType schemaType, String sourceTableName, Boolean gettingData, DataColumn parentChapterColumn, Object parentChapterValue)
at System.Data.Common.DataAdapter.FillMapping(DataSet dataset, DataTable datatable, String srcTable, DataReaderContainer dataReader, Int32 schemaCount, DataColumn parentChapterColumn, Object parentChapterValue)
at System.Data.Common.DataAdapter.FillFromReader(DataSet dataset, DataTable datatable, String srcTable, DataReaderContainer dataReader, Int32 startRecord, Int32 maxRecords, DataColumn parentChapterColumn, Object parentChapterValue)
at System.Data.Common.DataAdapter.Fill(DataTable[] dataTables, IDataReader dataReader, Int32 startRecord, Int32 maxRecords)
at System.Data.Common.LoadAdapter.FillFromReader(DataTable[] dataTables, IDataReader dataReader, Int32 startRecord, Int32 maxRecords)
at System.Data.DataTable.Load(IDataReader reader, LoadOption loadOption, FillErrorEventHandler errorHandler)
at Microsoft.DataWarehouse.Design.DataSourceConnection.FillDataSet(DataSet dataSet, String schemaName, String tableName, String tableType)
at Microsoft.AnalysisServices.Design.DataSourceDesigner.AddRemoveObjectsFromDSV()
Scenario 2:

When the table has one column with the default date constraint and another column with a different data type:

Table Definition:
CREATE MULTISET TABLE TESTTABLE2, NO FALLBACK,
NO BEFORE JOURNAL,
NO AFTER JOURNAL,
CHECKSUM = DEFAULT,
DEFAULT MERGEBLOCKRATIO
(
NHTSA_COMPNT_SE_CD VARCHAR(10),
INS_DATETM TIMESTAMP(0) DEFAULT DATE )
NO PRIMARY INDEX;

Error: TITLE: Microsoft Visual Studio
———————————————————————
Error: Cannot remove this column, because it is a part of the constraint Constraint1 on the table TESTTABLE2. (Microsoft Visual Studio)
——————————
Program Location:

at System.Data.DataColumnCollection.CanRemove(DataColumn column, Boolean fThrowException)
at System.Data.DataColumnCollection.BaseRemove(DataColumn column)
at System.Data.DataColumnCollection.Remove(DataColumn column)
at System.Data.ProviderBase.SchemaMapping.RollbackAddedItems(List`1 items)
at System.Data.ProviderBase.SchemaMapping.SetupSchemaWithKeyInfo(MissingMappingAction mappingAction, MissingSchemaAction schemaAction, Boolean gettingData, DataColumn parentChapterColumn, Object chapterValue)
at System.Data.ProviderBase.SchemaMapping..ctor(DataAdapter adapter, DataSet dataset, DataTable datatable, DataReaderContainer dataReader, Boolean keyInfo, SchemaType schemaType, String sourceTableName, Boolean gettingData, DataColumn parentChapterColumn, Object parentChapterValue)
at System.Data.Common.DataAdapter.FillMapping(DataSet dataset, DataTable datatable, String srcTable, DataReaderContainer dataReader, Int32 schemaCount, DataColumn parentChapterColumn, Object parentChapterValue)
at System.Data.Common.DataAdapter.FillFromReader(DataSet dataset, DataTable datatable, String srcTable, DataReaderContainer dataReader, Int32 startRecord, Int32 maxRecords, DataColumn parentChapterColumn, Object parentChapterValue)
at System.Data.Common.DataAdapter.Fill(DataTable[] dataTables, IDataReader dataReader, Int32 startRecord, Int32 maxRecords)
at System.Data.Common.LoadAdapter.FillFromReader(DataTable[] dataTables, IDataReader dataReader, Int32 startRecord, Int32 maxRecords)
at System.Data.DataTable.Load(IDataReader reader, LoadOption loadOption, FillErrorEventHandler errorHandler)
at Microsoft.DataWarehouse.Design.DataSourceConnection.FillDataSet(DataSet dataSet, String schemaName, String tableName, String tableType)
at Microsoft.AnalysisServices.Design.DataSourceDesigner.AddRemoveObjectsFromDSV()

Cause:
While creating a DSV, SSAS cannot map and cast a Teradata table column of data type TIMESTAMP that carries a default date constraint. Even if we use the Teradata Database Type Date, we still hit this issue because the Base Class Library type maps to the ‘DateTime’ type.

Workaround:
Create a Teradata view in which the date column is defined as a plain TIMESTAMP without any default date constraint, while the data is sourced from the table that has the constraint. The view therefore exposes the data with the expected default date values.

On the SSAS project, create a DSV pointing to this view. The Teradata view acts as a bridge between the SSAS DSV connection and the Teradata table with the default date constraint.
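For illustration, a sketch of such a bridging view for the hypothetical TESTTABLE2 from Scenario 2 above (the view name is made up; the CAST gives the view column a plain TIMESTAMP definition while the data still comes from the table that has the default date constraint):

```sql
REPLACE VIEW TESTVIEW2 AS
SELECT NHTSA_COMPNT_SE_CD,
       CAST(INS_DATETM AS TIMESTAMP(0)) AS INS_DATETM
FROM TESTTABLE2;
```

Point the SSAS DSV at TESTVIEW2 instead of TESTTABLE2.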

Reference:

.NET Data Provider for Teradata Date and Time Data Types Overview

 

Author:        Chetan KT   – Support Engineer, SQL Server BI Developer team, Microsoft

Reviewer:    Kane Conway – Support Escalation Engineer, SQL Server BI Developer team, Microsoft

Running SSIS packages outside the developer tools using DTEXEC.exe without installing Integration Services


In this blog, I would like to address an important point about SQL Server Integration Services licensing.

Say that you have installed Visual Studio – SQL Server Data Tools (SSDT) / Business Intelligence Development Studio (BIDS) and developed your SSIS packages on your development machines. The SSIS packages contain some Script Tasks or Script Components internally, and they execute perfectly fine from the SSDT/BIDS environment (design time) without any issues.

You would like to execute these packages outside the developer tools (SSDT/BIDS) using the DTEXEC.exe (Microsoft SQL Server Execute Package Utility) tool. In this attempt, you went ahead and installed SQL Server Express Edition, which in turn installs these components under the default location, viz. C:\Program Files\Microsoft SQL Server\120\DTS\Binn (for SQL Server 2014). Now when you use DTEXEC.exe to execute these packages, your SSIS packages fail with the following error message.

Error Message:


C:\Windows\system32>dtexec /f “c:\users\admin\documents\visual studio 2013\projects\Integration Services Project1\Integration Services Project1\Package.dtsx”

Microsoft (R) SQL Server Execute Package Utility

Version 12.0.2000.8 for 64-bit

Copyright (C) Microsoft Corporation. All rights reserved.

Started: 11:20:30 AM

Error: 2016-10-25 11:20:36.22

Code: 0xC000F427

Source: Script Task

Description: To run a SSIS package outside of SQL Server Data Tools you must install Script Task of Integration Services or higher.

End Error

Warning: 2016-10-25 11:20:36.24

Code: 0x80019002

Source: Package

Description: SSIS Warning Code DTS_W_MAXIMUMERRORCOUNTREACHED. The Execution method succeeded, but the number of errors raised (2) reached the maximum allowed (1); resulting in failure. This occurs when the number of errors reaches the number specified in MaximumErrorCount. Change the MaximumErrorCount or fix the errors.

End Warning

DTExec: The package execution returned DTSER_FAILURE (1).

Started: 11:20:30 AM

Finished: 11:20:36 AM

Elapsed: 5.328 seconds

You may wonder why the same SSIS packages that were working fine in the Visual Studio SSDT/BIDS developer tools (design time) are now failing when executed outside the developer tools using DTEXEC.exe (run time), and why they complain about installing the Script Task of Integration Services.

The answer is pretty simple. SQL Server Integration Services is not a free product; it comes with the licensed SQL Server installer. You cannot execute complex SSIS packages that contain tasks/components such as Script Tasks, Script Components, or the Sort/Aggregate/Fuzzy Lookup transformations using the DTEXEC.exe runtime outside your developer tools. SSIS packages aren’t meant to be executed this way.

  1. So if I cannot execute SSIS packages outside the developer tools using DTEXEC.exe without purchasing SQL Server, why is DTEXEC.exe provided with the SQL Express installation?

The SQL Server Import/Export wizard is one of the free tools available with SQL Server Express Edition, which you can use to copy data from a source to a destination. The wizard internally creates simple SSIS packages to perform this task. DTEXEC.exe is included with the SQL Server Express installation solely to execute the SSIS packages created through the SQL Server Import/Export wizard.

Also note that you would still be able to run custom SSIS packages with the DTEXEC.exe that comes with the SQL Server Express installation if they contain only simpler tasks/components, such as source/destination components, the Data Conversion transformation, or the Execute SQL Task. However, executing custom SSIS packages using the DTExec.exe that ships with SQL Express is neither a recommended nor a supported way of running SSIS packages, and packages are expected to fail with the above errors when Integration Services is not installed.

 

Hope you find this blog helpful to understand the licensing terms for Integration services.

 

Author:       Krishnakumar Rukmangathan – Support Escalation Engineer, SQL Server BI Developer team, Microsoft

Reviewer:   Sarath Babu Chidipothu – Support Escalation Engineer, SQL Server BI Developer team, Microsoft

Intermittent JDBC Connectivity issue – The driver could not establish a secure connection to SQL Server by using Secure Sockets Layer (SSL) encryption. Error: SQL Server returned an incomplete response. The connection has been closed


Symptoms:

When connections are repeatedly made to SQL Server, about ~1% of the connection attempts fail, and the user sees the following error message:

Error:

The driver could not establish a secure connection to SQL Server by using Secure Sockets Layer (SSL) encryption. Error: “SQL Server returned an incomplete response. The connection has been closed. ”

 
Background:

Due to some recent Microsoft updates, customers may experience connectivity issues to trading partners when using SSL/TLS to secure the connection. Microsoft recently added two new cipher suites to Windows, which use a different key algorithm. The minimum key length allowed by these ciphers is 1024 bits. If the key length used by the trading partners is less than this, the SSL/TLS handshake will fail.

The issue is only witnessed when using older JDBC drivers below version 4.2. Other drivers work fine. We were unable to reproduce the issue with the SqlClient/ADO.NET stack (even when we forced a DHE suite).

 

Resolution/Workaround:

Please follow the below options in order.

Option-1

  • Update the JDBC Driver to version 4.2 or later:

We implemented a workaround in the driver so that it retries the connection in the specific case where the SSL handshake receives an incomplete response from the server. Version 4.2 and later include this retry logic. Make sure you have a supported JVM/JRE on that machine.

https://www.microsoft.com/en-in/download/details.aspx?id=11774

 

Option-2

  • Disable DHE cipher suites:

Warning: If you use Registry Editor incorrectly, you may cause serious problems that may require you to reinstall your operating system. Microsoft cannot guarantee that you can solve problems that result from using Registry Editor incorrectly. Use Registry Editor at your own risk.

  1. Open Registry Editor.
  2. Access the key exchange algorithm settings by navigating to the following registry location: HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\SecurityProviders\SCHANNEL\KeyExchangeAlgorithms
  3. Select the Diffie-Hellman sub key (if it does not exist, then create it).
  4. Set the Enabled DWORD registry value to 0 (if it does not exist, then create it).
  5. Exit Registry Editor.
  • Impact of the workaround: Encrypted TLS sessions that rely on DHE keys will no longer function unless alternative failover options have been implemented.

https://technet.microsoft.com/en-us/library/security/ms15-055.aspx
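For reference, the registry steps above expressed as a .reg script (a sketch only; as with any registry edit, apply at your own risk and back up the registry first):

```reg
Windows Registry Editor Version 5.00

[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\SecurityProviders\SCHANNEL\KeyExchangeAlgorithms\Diffie-Hellman]
"Enabled"=dword:00000000
```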


 

 

If the above options (disabling the Diffie-Hellman key exchange) do not work, you can follow the below options.

 

Option-3

  • DHE suites can be disabled in the JVM by opening <JAVA_PATH>\jre\lib\security\java.security with admin privileges and adding DHE to jdk.tls.disabledAlgorithms.
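For illustration, the resulting line in java.security might look like the following (the exact set of algorithms already listed varies by JRE version; DHE is simply appended to whatever is there):

```properties
jdk.tls.disabledAlgorithms=SSLv3, DHE
```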

 

Option-4

  • Change the SChannel cipher suite priorities to demote, disable, or delete the DHE suites (as opposed to suites such as ECDHE, which work when present on both client and server), for example:

TLS_DHE_RSA_WITH_AES_128_CBC_SHA

TLS_RSA_WITH_AES_128_CBC_SHA

 

If you have performed the above action plans and you are still experiencing the issue, then collect a network capture on the client and server reproducing the issue and contact the Microsoft CSS team for further investigation.

 

Author:       Ranjit Mondal – Support Engineer, SQL Server BI Developer team, Microsoft

Reviewer:   Krishnakumar Rukmangathan – Support Escalation Engineer, SQL Server BI Developer team, Microsoft

 


A connection was successfully established with the server, but then an error occurred during the login process. (provider: SSL Provider, error: 0 – The Local Security Authority cannot be contacted)


Symptoms:

Connections to SQL Server fail after certain Windows updates, with the below error:

Error:

A connection was successfully established with the server, but then an error occurred during the login process. (provider: SSL Provider, error: 0 – The Local Security Authority cannot be contacted) —> System.ComponentModel.Win32Exception (0x80004005): The Local Security Authority cannot be contacted

at System.Data.ProviderBase.DbConnectionPool.TryGetConnection(DbConnection owningObject, UInt32 waitForMultipleObjectsTimeout, Boolean allowCreate, Boolean onlyOneCheckConnection, DbConnectionOptions userOptions, DbConnectionInternal& connection)

at System.Data.ProviderBase.DbConnectionPool.TryGetConnection(DbConnection owningObject, TaskCompletionSource`1 retry, DbConnectionOptions userOptions, DbConnectionInternal& connection)

 

Resolution/Workaround:

Warning: If you use Registry Editor incorrectly, you may cause serious problems that may require you to reinstall your operating system. Microsoft cannot guarantee that you can solve problems that result from using Registry Editor incorrectly. Use Registry Editor at your own risk.
To make these registry changes, follow these steps:

 

  1. Click Start, click Run, type regedit in the Open box, and then click OK.
  2. Locate and then click the following subkey in the registry:
    HKLM\System\CurrentControlSet\Control\SecurityProviders\Schannel
  3. On the Edit menu, point to New, and then click DWORD Value.
  • For the computer that is receiving the connection request, type DisableServerExtendedMasterSecret for the name of the DWORD, and then press ENTER.
  • For the computer that is initiating the connection request, type DisableClientExtendedMasterSecret for the name of the DWORD, and then press ENTER.
  4. Right-click the new DWORD entry, and then click Modify.
  5. Type 1 (or any non-zero value) in the Value data box to disable the TLS extension.
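For reference, the same steps expressed as a .reg script (a sketch only; the script below sets both values, but include only the line that applies to your machine, server-side or client-side, and apply registry edits at your own risk):

```reg
Windows Registry Editor Version 5.00

[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\SecurityProviders\Schannel]
"DisableServerExtendedMasterSecret"=dword:00000001
"DisableClientExtendedMasterSecret"=dword:00000001
```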

https://support.microsoft.com/en-in/kb/3081320

 

If you have performed the above action plan and you are still experiencing the issue, then please contact the Microsoft CSS team for further investigation.

 

 

Author:      Ranjit Mondal – Support Engineer, SQL Server BI Developer team, Microsoft

Reviewer:  Krishnakumar Rukmangathan – Support Escalation Engineer, SQL Server BI Developer team, Microsoft

 

Connecting to SQL Server Using PowerShell


In this blog, I would like to show you how we can connect to a SQL Server Instance using PowerShell cmdlets.

Let us consider a situation where you are on a remote machine without the SQL Server Management Studio (SSMS) client tool and you would like to query your SQL Server. In this case, PowerShell is one of the best ways to query the data.

Using a SQL Server Provider Path:

SQL Server: HEARTTHROB

Instance Name: SQL16

Database Name: msdb

Schema Name: dbo

Table Name: sysjobs

 

These commands need to be run from PowerShell (Run as Administrator):

Import-Module SQLPS -DisableNameChecking

# Enter the SQL Server provider path
cd SQLSERVER:\SQL

# Navigate to the SQL Server machine. In my case it is HEARTTHROB
cd HEARTTHROB

# Select the instance: the instance name for a named instance (SQL16 in my case), or DEFAULT for the default instance
cd SQL16

# Navigate to the databases
cd Databases

# Select the particular database
cd msdb

# Navigate to the tables
cd Tables

# Navigate to the particular table
cd dbo.sysjobs

Invoke-Sqlcmd -Query "SELECT TOP 2 * FROM dbo.sysjobs;" -QueryTimeout 3

Invoke-Sqlcmd -Query "SELECT @@VERSION;" -QueryTimeout 3


 

You can get the reference of this from this MSDN article:

A. Specify Instances in the SQL Server PowerShell Provider- https://msdn.microsoft.com/en-us/library/hh245280.aspx

B. SQL Server Identifiers in PowerShell – https://msdn.microsoft.com/en-us/library/cc281841.aspx


 

Creating a Connection Object:

Query:

[string] $Server = "HEARTTHROB"

[string] $Database = "USERDB"

[string] $SqlQuery = "SELECT COUNT(*) FROM [Sales]"

 

$SqlConnection = New-Object System.Data.SqlClient.SqlConnection

$SqlConnection.ConnectionString = "Server = $Server; Database = $Database; Integrated Security = True;"

 

$SqlCmd = New-Object System.Data.SqlClient.SqlCommand

$SqlCmd.CommandText = $SqlQuery

$SqlCmd.Connection = $SqlConnection

$SqlAdapter = New-Object System.Data.SqlClient.SqlDataAdapter

$SqlAdapter.SelectCommand = $SqlCmd

$DataSet = New-Object System.Data.DataSet

$SqlAdapter.Fill($DataSet)

 

$DataSet.Tables[0] | Out-File "C:\powershell_query_test_result2.csv"

 

You might face the below issues while running the PS Commands

While importing the module with Import-Module SQLPS -DisableNameChecking, you might get the below error:

Import-Module : File C:\Program Files (x86)\Microsoft SQL Server\130\Tools\PowerShell\Modules\SQLPS\Sqlps.ps1 cannot be loaded because running scripts is disabled on this system. For more information, see about_Execution_Policies at http://go.microsoft.com/fwlink/?LinkID=135170.

At line:1 char:1

+ Import-Module SQLPS -DisableNameChecking

+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

+ CategoryInfo          : SecurityError: (:) [Import-Module], PSSecurityException

+ FullyQualifiedErrorId : UnauthorizedAccess,Microsoft.PowerShell.Commands.ImportModuleCommand

 

To fix it, run the command Set-ExecutionPolicy RemoteSigned and click Yes to All at the prompt. While importing the module with Import-Module SQLPS -DisableNameChecking, you might also get the below warning:

  • WARNING: Failed to load the ‘SQLAS’ extension: SQL Server WMI provider is not available on PANDA1. –> Invalid namespace

You can run the below commands to fix it:

# Unrestricted execution policy

Set-ExecutionPolicy Unrestricted

# Import the SQLPS module

Import-Module SQLPS -DisableNameChecking

# Import the SQLAS cmdlets

Import-Module SQLASCmdlets

# List the SQLAS commands

Get-Command -Module SQLASCmdlets

 

Author:      Samarendra Panda – Support Engineer, SQL Server BI Developer team, Microsoft

Reviewer:  Kane Conway – Support Escalation Engineer, SQL Server BI Developer team, Microsoft

How non-admin users of SSIS 2012/2014 can view SSIS Execution Reports


 

There can be a few scenarios where the requirement demands that developers have full access to the SSIS execution reports. However, by design, SSIS 2012 and SSIS 2014 do not support this. By default, non-admin users can see only the reports for executions they started themselves; they cannot see the reports for executions started by other users. Non-admin here means users who have only public access to the databases (master, msdb, SSISDB, etc.).

Admin users [either members of the ‘sysadmin’ server role or of the ssis_admin database role in SSISDB] can see the SSIS execution reports for all users. The SSIS execution reports internally call the view [SSISDB].[catalog].[executions]. If we look at its code, we can see a filter condition that restricts what a non-admin user can see.

WHERE      opers.[operation_id] in (SELECT id FROM [internal].[current_user_readable_operations])

           OR (IS_MEMBER('ssis_admin') = 1)

           OR (IS_SRVROLEMEMBER('sysadmin') = 1)

 

Resolution / Workarounds:

  1. Upgrading to SSIS 2016 can be an option here. SSIS 2016 introduced a new ssis_logreader database-level role in SSISDB that you can use to grant users who aren’t administrators permission to access the views that contain logging output.

          Ref: https://msdn.microsoft.com/en-us/library/bb522534.aspx#LogReader

 

  2. If upgrading to SSIS 2016 is not an option, you can use a SQL-authenticated login to view the reports after granting it the ssis_admin permission. In that case, the SQL-authenticated login won’t be able to execute packages, but it will be able to see all the reports. The moment it tries to start an execution, it will get the below error:

         The operation cannot be started by an account that uses SQL Server Authentication. Start the operation with an account that uses Windows Authentication. (.Net SqlClient Data Provider).

I believe this option is risky because we are giving admin permission to non-admin users. Though they won’t be able to start executions, they would be able to change configurations since they have the ssis_admin permission.

 

  3. There is one more option: changing the code of the view [SSISDB].[catalog].[executions].

[Please note that Microsoft does not support this solution, as it involves changing the code of the SSISDB views. Also, this change can be overwritten if we apply any patches/fixes.]

a. Create a SQL-authenticated login with minimal permissions (testSSIS in my case): SQL Server Instance -> Security -> Logins -> New

[image]

b. Go to the login->User Mapping under the same login and check the SSISDB database. You can give the read permission as shown below.

[image]

c. Create an SSISDB database role (SSISTestSSISDBRole in my case) and add the testSSIS user.

d. You can also add other Windows accounts as members of this role.

[image]

[image]
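For reference, steps c and d can also be scripted in T-SQL. A sketch using the hypothetical names above (the ALTER ROLE ... ADD MEMBER syntax applies to SQL Server 2012 and later; the Windows account name is made up):

```sql
USE SSISDB;
GO
-- Step c: create the role and add the SQL-authenticated user
CREATE ROLE SSISTestSSISDBRole;
ALTER ROLE SSISTestSSISDBRole ADD MEMBER testSSIS;
-- Step d: optionally add a Windows account (hypothetical name) as well
ALTER ROLE SSISTestSSISDBRole ADD MEMBER [CONTOSO\ReportViewer];
```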

e. Alter the view by adding one more filter condition at the end. You need to go to [SSISDB].[catalog].[executions] and alter the script.

[image]

 

         Change the filter condition at the end as below:

WHERE      opers.[operation_id] in (SELECT id FROM [internal].[current_user_readable_operations])

           OR (IS_MEMBER('ssis_admin') = 1)

           OR (IS_MEMBER('SSISTestSSISDBRole') = 1) -- Extra filter condition

           OR (IS_SRVROLEMEMBER('sysadmin') = 1)

All the non-admin users would now be able to see the reports for all executions. Please note that only the basic reports are visible; the drill-through reports will not work in this case.

Testing:

Go to:

[image]

NOTE:  Microsoft CSS does not support the above workaround. We recommend that you move to SQL Server 2016 and make use of the new ssis_logreader database-level role.

 

 

Author:      Samarendra Panda – Support Engineer, SQL Server BI Developer team, Microsoft

Reviewer:  Krishnakumar Rukmangathan – Support Escalation Engineer, SQL Server BI Developer team, Microsoft

 

Creating dynamic SSIS package [Object model] and using OleDBSource & OleDBDestination internally fails in SSIS 2016


 

Issue:

While dynamically creating SSIS packages using the object model and referencing the following SSIS libraries, the following exception may be thrown.

SSIS Libraries referenced:

C:\Program Files (x86)\Microsoft SQL Server\130\SDK\Assemblies\

  1. Microsoft.SqlServer.DTSPipelineWrap.dll
  2. Microsoft.SqlServer.ManagedDTS.dll
  3. Microsoft.SqlServer.DTSRuntimeWrap.dll

 

 Error Message:

System.Runtime.InteropServices.COMException’ occurred in ConsoleApplication1.exe

Additional information: Exception from HRESULT: 0xC0048021

{“Exception from HRESULT: 0xC0048021”}

   at Microsoft.SqlServer.Dts.Pipeline.Wrapper.IDTSDesigntimeComponent100.ProvideComponentProperties()

   at ConsoleApplication1.Program.Main(String[] args) in c:\Users\Administrator\Documents\Visual Studio 2013\Projects\ConsoleApplication1\ConsoleApplication1\Program.cs:line 27

   at System.AppDomain._nExecuteAssembly(RuntimeAssembly assembly, String[] args)

   at System.AppDomain.ExecuteAssembly(String assemblyFile, Evidence assemblySecurity, String[] args)

   at Microsoft.VisualStudio.HostingProcess.HostProc.RunUsersAssembly()

   at System.Threading.ThreadHelper.ThreadStart_Context(Object state)

   at System.Threading.ExecutionContext.RunInternal(ExecutionContext executionContext, ContextCallback callback, Object state, Boolean preserveSyncCtx)

   at System.Threading.ExecutionContext.Run(ExecutionContext executionContext, ContextCallback callback, Object state, Boolean preserveSyncCtx)

   at System.Threading.ExecutionContext.Run(ExecutionContext executionContext, ContextCallback callback, Object state)

   at System.Threading.ThreadHelper.ThreadStart()

 

Steps to reproduce the issue:

  1. Use the below C# code in a Console Application:

—————————————————————————————————————————————————

using System; 

using Microsoft.SqlServer.Dts.Runtime; 

using Microsoft.SqlServer.Dts.Pipeline; 

using Microsoft.SqlServer.Dts.Pipeline.Wrapper;

 namespace ConsoleApplication1

{

    class Program

    {

        static void Main(string[] args)

        {

            Package package = new Package();

            Executable e = package.Executables.Add("STOCK:PipelineTask");

            TaskHost thMainPipe = e as TaskHost;

            MainPipe dataFlowTask = thMainPipe.InnerObject as MainPipe;

 

            // Create the source component.   

            IDTSComponentMetaData100 source =

              dataFlowTask.ComponentMetaDataCollection.New();

            source.ComponentClassID = "DTSAdapter.OleDbSource";

           CManagedComponentWrapper srcDesignTime = source.Instantiate();

            srcDesignTime.ProvideComponentProperties();

 

            // Create the destination component. 

            IDTSComponentMetaData100 destination =

              dataFlowTask.ComponentMetaDataCollection.New();

            destination.ComponentClassID = "DTSAdapter.OleDbDestination";

            CManagedComponentWrapper destDesignTime = destination.Instantiate();

            destDesignTime.ProvideComponentProperties();

 

            // Create the path. 

            IDTSPath100 path = dataFlowTask.PathCollection.New();

            path.AttachPathAndPropagateNotifications(source.OutputCollection[0],

              destination.InputCollection[0]);

        }

    }

}

—————————————————————————————————————————————————

  2. Add the reference from:
  •           C:\Program Files (x86)\Microsoft SQL Server\130\SDK\Assemblies\Microsoft.SQLServer.ManagedDTS.dll
  •           C:\Program Files (x86)\Microsoft SQL Server\130\SDK\Assemblies\Microsoft.SQLServer.DTSRuntimeWrap.dll
  •          C:\Program Files (x86)\Microsoft SQL Server\130\SDK\Assemblies\Microsoft.SQLServer.DTSPipelineWrap.dll
  3. Debug the code and you may receive the above exception in the function: srcDesignTime.ProvideComponentProperties();

 

Cause:

The reason for the exception is that the version-independent COM ProgIDs were not registered to point to the latest version, so loading the OLE DB Source component threw the above error. The code uses the version-independent ProgIDs "DTSAdapter.OleDbSource" and "DTSAdapter.OleDbDestination". Per the COM specification, a version-independent ProgID should always load the latest version, but these ProgIDs are not registered.

 

Resolution/Workaround:

As a workaround, use the version-specific ProgIDs for SSIS 2016, viz. DTSAdapter.OleDbSource.5 and DTSAdapter.OleDbDestination.5, rather than DTSAdapter.OleDbSource and DTSAdapter.OleDbDestination, in the above code sample.

 

You may find the information about these ProgIDs in the system registry.

For example, the ProgID "DTSAdapter.OleDbSource.5" is registered to point to the SSIS 2016 OLE DB Source under HKEY_CLASSES_ROOT\CLSID\{657B7EBE-0A54-4C0E-A80E-7A5BD9886C25}.

Similarly, the ProgID "DTSAdapter.OLEDBDestination.5" is registered to point to the SSIS 2016 OLE DB Destination under HKEY_CLASSES_ROOT\CLSID\{7B729B0A-4EA5-4A0D-871A-B6E7618E9CFB}.

 

If you still have the issues, then please contact Microsoft CSS team for further assistance.

 

DISCLAIMER:

Any Sample code is provided for the purpose of illustration only and is not intended to be used in a production environment.  ANY SAMPLE CODE AND ANY RELATED INFORMATION ARE PROVIDED “AS IS” WITHOUT WARRANTY OF ANY KIND, EITHER EXPRESSED OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE IMPLIED WARRANTIES OF MERCHANTABILITY AND/OR FITNESS FOR A PARTICULAR PURPOSE.

 

 

Author:       Ranjit Mondal – Support Engineer, SQL Server BI Developer team, Microsoft

Reviewer:   Krishnakumar Rukmangathan – Support Escalation Engineer, SQL Server BI Developer team, Microsoft

Accessing SSRS URL hosted on Azure VM over the Internet


There might be a requirement where you would like a SQL Server Reporting Services URL hosted in your on-premises environment to be accessible over the Internet (outside of your domain). This can be achieved by hosting Reporting Services on an Internet-facing web server to distribute general information to the public at large, or private corporate data to authorised and authenticated users.
More details: https://technet.microsoft.com/en-us/library/ms159272(v=sql.105).aspx .

Another requirement could be where SQL Server Reporting Services is hosted on an Azure VM and you would like to access it over the Internet. This can be done by using the public IP of the Azure VM.

In this blog, I will concentrate on the configuration from the Azure standpoint, assuming that the reader has used Reporting Services in an on-premises environment and has basic knowledge of SSRS configuration.

1. Create a new Azure VM with SQL Server 2016 installed.

You can use an existing Azure VM that has SSRS installed, or you can build a new Azure VM with a SQL Server installation that includes SSRS.

  • To create a new Azure VM, go to http://portal.azure.com.
  • Search for SQL Server 2016 as below and create a new VM.

[image]

  • Once the VM is created, you are all set to use SSRS from the Azure VM.
  • RDP into the VM and launch the “Reporting Services Configuration Manager”.

[image]

  • Configure Reporting Services in native mode. Now you can access the web portal URL inside the server. However, the same URL can’t be accessed from a remote machine, for the obvious reason that the local URL is not exposed externally.
  • Once the VM is set up, it has a private IP, which you can see by running ipconfig inside the VM, and a public IP, which is used to connect to the VM from the Internet and is visible in the Azure portal.

[image]

2. Allow Communication on the Server Port.

  • In general, Reporting Services uses port 80 by default for HTTP communication. We can also assign a different port in the Reporting Services Configuration Manager. For the time being, let’s assume Reporting Services is using port 80.

To allow communication from the outside world, we can create an inbound firewall rule on the server to allow HTTP communication on port 80.

To do that, go to the Windows Firewall settings as below:

Step-1:

[image]

Step-2

[image]

Step 3:

Click on Next-> Choose “Allow the Connection” property -> Next -> Give a name to the rule-> Finish.

At this point, if we telnet from the Internet using the public IP, the connection succeeds.

 

3. Creating DNS for the Azure VM from the Azure portal.

To access the SSRS site from the Internet, we need to create a DNS name for the Azure VM. We can do this from the Azure portal.

  • Open the VM details.
  • Click on the Public IP section, enter details like the following, and save it.
  • Once we are done with this, we will be able to RDP to the VM using the DNS name. In my case, it is ssrsinternet.southeastasia.cloudapp.azure.com.
  • Edit the VM’s Network Security Group to allow traffic on port 80. The name would be <Your VM Name>.nsg.
  • Once this is done, redeploy the VM.

[image]

 

  • After that, you will be able to access the SSRS web portal URL from the Internet using the URL http://<dns-name>/reports . Use the user name and password you used to create the VM.

[image]

 

 

Reference:

https://docs.microsoft.com/en-us/azure/virtual-machines/windows/sqlclassic/virtual-machines-windows-classic-ps-sql-bi

 

Author:      Samarendra Panda – Support Engineer, SQL Server BI Developer team, Microsoft

Reviewer:  Khushboo Gupta – Technical Advisor, SQL Server BI Developer team, Microsoft

Issue in getting cached data for a data driven subscription when created with an expiry schedule


Applies To: Tested on SQL Server Reporting Services 2008 and 2008 R2.

 

In this blog, I would like to discuss one of the known issues you might encounter in Reporting Services 2008 R2. If you have created a data-driven subscription to populate the cache, and you have also set an expiry schedule, you will notice that even though the data-driven subscription runs fine, the report data is not served from the cache.

 

Scenario:

Let’s say you have created a data-driven subscription for a report and selected the NULL delivery provider to populate the cache. Once the data-driven subscription is set up, under Manage Report -> Processing Options you can set the time or a schedule for which the cached report will last. Use the image below for reference:

[Image 1]

If you select the second option for report caching, you can assign a time (in minutes) for the cache expiry. If the third option is chosen, you need to create a schedule, or use a shared schedule, for the cache expiry. A shared schedule can be created from Site Settings -> Schedules -> New Schedule.

Use the image below for reference:

[Image 2]

The image below is a schedule that is configured to run once.

[Image 3]

Once the schedule is created you can use the shared schedule in the Subscription.

 

Upon executing the subscription, you will notice that the data rendered in the report is live, not cached. To be precise, the ExecutionLog3 view in the report server database will show the Source field as Live instead of Cache. Use the image below for reference:

[Image 4]
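If you have access to the report server database, the same check can be run with a simple query against the ExecutionLog3 view (the database name ReportServer below is the default and may differ on a named instance):

```sql
-- Most recent executions: Source = 'Live' means fresh data, 'Cache' means the cached copy was used
SELECT TOP 10 ItemPath, TimeStart, Source, Status
FROM ReportServer.dbo.ExecutionLog3
ORDER BY TimeStart DESC;
```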

 

Issue:

Whenever a shared expiry schedule is configured to run once, it should be deactivated after the scheduled time passes. The issue here is that if that same expired schedule is still mapped to the subscription, subsequent runs of the subscription will not populate the cache, because the server still treats the expiry schedule as active.

Resolution:

Well, the resolution is quite simple. If you create a schedule with a one-time execution, then once the schedule has run, either delete it or recreate the same schedule with a newer date.

Takeaway from this Blog:

Whenever you create a data-driven subscription to populate a cached report, make sure the expiry schedule is not set to run only once. Instead, schedule the expiry to run daily, before the data-driven subscription runs. That way, the older cached data is deleted before the subscription runs, and you get a fresh set of cached data after it runs.

 

Author:      Jaideep Saha Roy – Support Engineer, SQL Server BI Developer team, Microsoft 

Reviewer:  Kane Conway – Support Escalation Engineer, SQL Server BI Developer team, Microsoft

 

 

Analysis Services 2016 model gets corrupted if you add an unsupported parameter while configuring the data source


Applies To: Tested on SQL Server Analysis Services 2016

 

Hello everyone, today I am going to discuss a known issue in Analysis Services 2016 tabular models. If you have built an Analysis Services tabular project and added some unsupported content to the data source, then you will neither be able to open the model from Visual Studio nor browse it from SQL Server Management Studio.

 

Scenario and Issue:

Let’s say we have created an Analysis Services tabular project using SQL Server Data Tools (SSDT). In the project, we create a data source to pull the data. If the data source you are using is a third-party one, there may be additional parameters you need to specify for the connection. Such parameters are entered under Extended Properties on the Advanced tab of the Table Import Wizard. (Refer to the image below.)

[Image 1]

Now the catch here is that if you add a parameter that is invalid or contains unsupported content and save the data source, it might save fine; but if you then save the project and open it again, the project won’t open, and it will throw an error message as follows:

 

Error: An error occurred while scripting the catalog. The Extended Properties property is set to a value that is not supported.

 

The worst part is that you can’t open the data source from SSDT, nor from SQL Server Management Studio (SSMS) after connecting to the Analysis Services instance.

 

Resolution:

Well, the resolution involves a concept you need to know about Analysis Services 2016. In Analysis Services 2016, a SQLite database was introduced to store all the metadata of a project: in every project folder you will see a metadata.sqlitedb file.

This file holds all the connection strings, data source details, and the metadata of the project.

In order to open the file, you can download a tool called DB Browser for SQLite (available at http://sqlitebrowser.org/).

Once downloaded, open the tool and open the metadata.sqlitedb file in it. Then go to the Browse Data tab and, from the Table dropdown, select DataSource.

In the DataSource table you will see a column named ConnectionString. (Refer to the image below.)

Click on the connection string value where the parameter was entered incorrectly.

On the right-hand side, you will see the value of the connection string in encrypted form.

 

All you need to do is delete the connection string value and then click Apply. (Refer to the image below.)

[Image 2]

Once the change is done, save the metadata.sqlitedb file and replace the project’s existing file with it.

Now you will be able to open the project in SSDT or SSMS, and you can configure the data source and the connection string again.
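If you prefer to script the cleanup instead of using DB Browser, the same edit can be done with a few lines of Python, assuming the DataSource table and ConnectionString column are laid out as described above. This is a hedged sketch, not a supported tool; it keeps a backup copy of the file before touching it:

```python
import shutil
import sqlite3

def clear_connection_strings(db_path):
    """Blank out every ConnectionString in the DataSource table of a
    metadata.sqlitedb file (table/column names as observed above)."""
    shutil.copy(db_path, db_path + ".bak")  # back up before editing metadata
    con = sqlite3.connect(db_path)
    try:
        con.execute("UPDATE DataSource SET ConnectionString = NULL")
        con.commit()
    finally:
        con.close()
```

After running this against a copy of metadata.sqlitedb, replace the project’s file with the cleaned copy, exactly as in the DB Browser steps.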

This was the only option left for me to remove the unsupported content from the connection string, short of recreating the entire project from scratch.

 

Hope this helps for you as well.

 

Note: This workaround is provided AS-IS, without any warranty from Microsoft. Any modification to SSAS Metadata isn’t Supported by Microsoft.

 

Author:      Jaideep Saha Roy – Support Engineer, SQL Server BI Developer team, Microsoft

Reviewer:  Kane Conway – Support Escalation Engineer, SQL Server BI Developer team, Microsoft


Error while impersonating a user in SQL Server Analysis Services


 

Hi folks, recently I came across an interesting issue related to impersonation in SSAS, and I thought of sharing it with you.

Scenario and Issue:

Let’s say we are trying to impersonate a user while browsing a cube in SSAS (Fig 2). You might encounter the error below (Fig 1). At first it might look like a login failure, since the error reads as follows, but it is not.

Error:  The following system error occurred: The user name or password is incorrect.

You might see the below error in the Event Viewer log as well:

 

An account failed to log on.

Subject:
    Security ID:       Domain\UserName
    Account Name:      [SA account]
    Account Domain:    [DOMAIN]
    Logon ID:          0x192D1

Logon Type: 3

Account For Which Logon Failed:
    Security ID:       NULL SID
    Account Name:
    Account Domain:

Failure Information:
    Failure Reason:    Unknown user name or bad password.
    Status:            0xC000006D
    Sub Status:        0xC0000064

Process Information:
    Caller Process ID:   0x7dc
    Caller Process Name: C:\Program Files\Microsoft SQL Server\MSAS13.MSSQLSERVER\OLAP\bin\msmdsrv.exe

Network Information:
    Workstation Name:        [Server]
    Source Network Address:
    Source Port:

Detailed Authentication Information:
    Logon Process:            OLAPSvc
    Authentication Package:   Kerberos
    Transited Services:
    Package Name (NTLM only):
    Key Length:               0

 

Initially I thought it might be related to Kerberos authentication, but we could reproduce the issue locally on the SSAS server, so it is not related to Kerberos.

We tried the EffectiveUserName property in the connection string, but we saw the same error there as well. For more info about EffectiveUserName, refer to https://docs.microsoft.com/en-us/sql/analysis-services/instances/connection-string-properties-analysis-services.

Note: EffectiveUserName is not case sensitive
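For reference, here is a minimal ADOMD.NET sketch of EffectiveUserName in a connection string. The server, database, and user names are placeholders, and note that the connecting account must be an Analysis Services administrator for the property to take effect:

```csharp
using Microsoft.AnalysisServices.AdomdClient;

// Connect as the application's own identity, but evaluate roles and
// security as CONTOSO\jdoe (placeholder names).
using (var conn = new AdomdConnection(
    "Data Source=SSASServer;Catalog=SalesCube;EffectiveUserName=CONTOSO\\jdoe"))
{
    conn.Open();
    // Queries executed here run with jdoe's effective permissions.
}
```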

While doing further research, we found that the issue was with the SSAS service account. In our case, we were using a domain account as the SSAS service account, and it turned out a few permissions were missing.

We found that the SSAS service account was not part of the “Windows Authorization Access Group” Active Directory group. For more info about this AD group, refer to https://technet.microsoft.com/en-us/library/dn579255(v=ws.11).aspx#BKMK_WinAuthAccess.

We followed the steps below to add the SSAS service domain account to this AD group (Windows Authorization Access Group).

Step 1:

Went to Active Directory Users and Computers

 

Step 2:

Double-clicked on Windows Authorization Access Group => Members

Step 3:

Added the required SSAS domain service account and clicked Apply

Step 4:

We restarted the SSAS service, and the issue was fixed. Now we are able to impersonate another user with a different account while browsing the cube.

 

Note: The problem does not exist when the SSAS service account is the local system account, but only happens when using a domain user account.

Hope this helps you as well.

 

Author:      Vikas Kumar – Support Engineer, SQL Server BI Developer team, Microsoft

Reviewer:  Sarath Babu Chidipothu  – Support Escalation Engineer, SQL Server BI Developer team, Microsoft

KPI Goal format not working in SQL Server Analysis Service Tabular project


While working on a customer issue recently, I came across the following scenario, where there was a problem with the format of a KPI goal in a tabular model. When we set the format for a measure and use it as the target measure of a KPI, the goal value doesn’t show up in the measure’s format when we browse the goal from client tools like Excel or Power BI. If we browse the measure directly, it shows up in the correct format. We do not see this behavior in multidimensional cubes, where the format for the goal works fine.

Steps to reproduce the issue:

  • Create a table with two columns: one for the value measure and one for the goal measure

  • Create a tabular model based on this table
  • Create 2 measures based on the 2 columns and change their format to Percentage

ValueMeasure:=SUM(TestTable[ValueSource])

GoalMeasure:=SUM(TestTable[GoalSource])

  • Create a KPI based on the Value measure with the Goal measure as the target measure

  • Deploy the solution to a tabular model analysis server 2016
  • Connect to the model with a client tool (for example, Excel) and select Value and Goal from the ValueMeasure KPI.

  • Now you see that Value is formatted as a percentage, but ValueMeasure Goal is not formatted as a percentage

We observed that the goal is returned in percentage format when the following points are met:

  • The format of the goal measure, as well as the value measure, is set before the KPI is created
  • The compatibility level of the project is set to a 2016 level (that is, 1200 or above)

With both in place, the goal shows up in the correct format.

 

Author:     Chaitra Hegde – Support Engineer, SQL Server BI Developer team, Microsoft

Reviewer:  Sarath Babu Chidipothu  – Support Escalation Engineer, SQL Server BI Developer team, Microsoft

Dynamically enable or disable the report printing option on the client side


Hello everyone, in this blog I am going to discuss an interesting scenario from a recent customer issue related to client-side printing of SSRS reports.

Scenario:

Let’s say you want to disable client-side printing for Reporting Services when you open your reports through your application in Internet Explorer. You use the Report Viewer control to view the reports in the client application. There is a setting in SQL Server Management Studio where you can toggle the EnableClientPrinting parameter: connect to the report server through SSMS -> right-click the report server name -> Advanced -> EnableClientPrinting -> set it to True or False. I have given a screenshot of this below –

But the above setting applies only to the SSRS / Report Manager application and does not affect the Report Viewer control, which is a different entity altogether. Suppose you have disabled this setting in the properties above and expect the same setting to apply to Report Viewer, without having to update your application code every time you enable or disable it.

 

Resolution:

To change the setting of the Print Icon in Report Viewer automatically, you can use this code snippet in your application code.

------------------------------------

protected void Page_Load(object sender, EventArgs e)
{
    if (!IsPostBack)
    {
        // Your report server URL and report path
        String reportServerURL = "http://bi-sql-vm/ReportServer14";
        String reportPath = "/HelloWorld";

        // Report Viewer configuration
        ReportViewer1.ServerReport.ReportServerUrl = new Uri(reportServerURL);
        ReportViewer1.ServerReport.ReportPath = reportPath;

        // Reporting Services web service configuration
        ReportingService2010 rs = new ReportingService2010();
        rs.Url = reportServerURL + "/ReportService2010.asmx";
        rs.UseDefaultCredentials = true;

        // Create a Property array to filter the system properties
        Property[] reportServerProperties = new Property[1];
        reportServerProperties[0] = new Property();
        reportServerProperties[0].Name = "EnableClientPrinting";

        // Retrieve the EnableClientPrinting system property
        Property[] newReportServerProperties = rs.GetSystemProperties(reportServerProperties);

        // Show or hide the Print button depending on the value configured at the server level
        if (newReportServerProperties[0].Value == "False")
            ReportViewer1.ShowPrintButton = false;
        else
            ReportViewer1.ShowPrintButton = true;

        // Refresh the report
        ReportViewer1.ServerReport.Refresh();
    }
}

----------------------------------------------

This is the easiest way to dynamically enable or disable the report printing option on the client side as set in the report server property.

DISCLAIMER:

Any Sample code is provided for the purpose of illustration only and is not intended to be used in a production environment.  ANY SAMPLE CODE AND ANY RELATED INFORMATION ARE PROVIDED “AS IS” WITHOUT WARRANTY OF ANY KIND, EITHER EXPRESSED OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE IMPLIED WARRANTIES OF MERCHANT ABILITY AND/OR FITNESS FOR A PARTICULAR PURPOSE.

 

Author:     Anshu Jana – Support Engineer, SQL Server BI Developer team, Microsoft 

Reviewer:  Selvakumar Rajakumar - Escalation Engineer, SQL Server BI Developer team, Microsoft

 

Invalid Column __$command_id issue during CDC implementation in SSIS package


 

In this blog, we will address a problem with CDC implementation in an SSIS package that fails with the error ‘Invalid column name __$command_id’.

 

CDC stands for change data capture. The CDC components for SSIS were introduced in SQL Server 2012: a CDC Control task in the control flow, plus CDC source and related components in the data flow. CDC processing logic is split into two packages – an initial-load package that reads all of the data in the source table, and an incremental-load package that processes change data on subsequent runs.

 

Issue-:

While previewing the CDC source in an incremental-load package, we get the error: ‘Invalid column name __$command_id’.

 

Workaround-:

A profiler trace reveals calls to the function [cdc].[fn_cdc_get_all_changes_DimCustomer_CDC], where DimCustomer_CDC is the CDC-enabled capture instance we selected in the CDC source. One catch: when setting up the CDC source we select a processing mode, which can be Net or All, and the function name depends on it. So the function name could be [cdc].[fn_cdc_get_all_changes_DimCustomer_CDC] or [cdc].[fn_cdc_get_net_changes_DimCustomer_CDC].
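The naming rule can be illustrated with a tiny sketch (the capture instance DimCustomer_CDC is the one from this example):

```python
def cdc_query_function(capture_instance, mode):
    """Return the CDC query function name the CDC source calls,
    based on the processing mode ('all' or 'net')."""
    if mode not in ("all", "net"):
        raise ValueError("mode must be 'all' or 'net'")
    return "[cdc].[fn_cdc_get_{0}_changes_{1}]".format(mode, capture_instance)
```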

 

Adding the column __$command_id to the function definition should resolve this issue. The following T-SQL script updates all the functions related to the CDC instances of every table. Run the query below on the CDC-enabled database.

 

sp_msforeachtable N'declare @tsql nvarchar(max), @fn_all sysname, @fn_all_bk sysname, @fn_net sysname, @fn_net_bk sysname, @supports_net_changes bit

if object_id(N''cdc.change_tables'') is null return

print N''Verifying object ?...''

SELECT @fn_all = N''fn_cdc_get_all_changes_'' + ct.capture_instance,

@fn_all_bk = N''[cdc].[fn_cdc_get_all_changes_'' + ct.capture_instance + N''_original]'',

@fn_net = N''fn_cdc_get_net_changes_'' + ct.capture_instance,

@fn_net_bk = N''[cdc].[fn_cdc_get_net_changes_'' + ct.capture_instance + N''_original]'',

@supports_net_changes = ct.supports_net_changes FROM cdc.change_tables ct WHERE ct.source_object_id = object_id(N''?'')

if @fn_all is null or object_id(N''[cdc].[''+@fn_all+N'']'') is null begin

print N''Workaround was not required for this table (is not enabled for CDC).''

end else if object_id(@fn_all_bk) is null begin

print N''Workaround was not applied for this table.''

end else begin

if object_id(N''[cdc].[''+@fn_all+N'']'') is not null begin

print N''Removing proxy function ''+@fn_all

set @tsql = N''drop function [cdc].[''+@fn_all+N'']''

exec (@tsql)

end

print N''Renaming function populating all changes back to original name''

exec sp_rename @fn_all_bk, @fn_all

if @supports_net_changes = 1 and object_id(@fn_net_bk) is not null begin

if @fn_net is not null and object_id(N''[cdc].[''+@fn_net+N'']'') is not null begin

print N''Removing proxy function ''+@fn_net

set @tsql = N''drop function [cdc].[''+@fn_net+N'']''

exec (@tsql)

end

print N''Renaming function populating all changes back to original name''

exec sp_rename @fn_net_bk, @fn_net

end

end'

go

 

Please reach out to the SQL Server Integration Services support team if you are still facing any issues.

 

DISCLAIMER:

Any sample code or query is provided for the purpose of illustration only and is not intended to be used in a production environment. ANY SAMPLE CODE AND ANY RELATED INFORMATION ARE PROVIDED “AS IS” WITHOUT WARRANTY OF ANY KIND, EITHER EXPRESSED OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE IMPLIED WARRANTIES OF MERCHANTABILITY AND/OR FITNESS FOR A PARTICULAR PURPOSE.

 

Author:      Umakant Khandekar  – Support Engineer, SQL Server BI Developer team, Microsoft

Reviewer:   Chetan KT - Technical Advisor, SQL Server BI Developer team, Microsoft

 

OData source connection manager to PWA (Project Web App) – Project Online URL fails with “(401) Unauthorized”


In this blog, I would like to discuss an interesting issue we worked on recently. Here we have an SSIS package that uses the OData source connection manager to connect to a PWA (Project Web App) – Project Online URL, pull the data, and write it back to a SQL Server table. This SSIS package was scheduled to run under a SQL Agent job, and while executing the job, it failed with the following error messages.

 

Error message from the SSIS package execution report:

 

While testing the connection for the OData connection manager of the same package from Visual Studio -SQL Server Data Tools (SSDT), we received the following error message,

 

===================================
Test connection failed
===================================
The remote server returned an error: (401) Unauthorized. (System)
------------------------------
Program Location:
at System.Net.HttpWebRequest.GetResponse()
at Microsoft.SqlServer.IntegrationServices.OData.UI.ODataConnectionManagerForm.TestConnectionMiddle(Object callback)
------------------------------

 

As part of our troubleshooting, we collected a Fiddler trace while reproducing the issue, and we were able to see the following.

 

 

After some internal research, we were able to identify the cause of the issue as below.

 

Cause:

Most current Office mobile and desktop applications use modern authentication (an implementation of OAuth 2.0). However, there are third-party apps, older Office apps, and Visual Studio applications (SSDT-SSIS) that use other authentication methods, such as basic authentication and forms-based authentication. Often, ADFS claims rules are set up to block non-modern authentication protocols, which blocks all access to O365 except for browser-based applications.

This is controlled by the following setting:
SharePoint Admin Center -> Device access -> Control access from apps that don’t use modern authentication -> Block

 

Resolution:

We need to allow access for apps that don’t use modern authentication:
SharePoint Admin Center -> Device access -> Control access from apps that don’t use modern authentication -> Allow

Post this change, we were able to access the PWA – Project Online URL without any issues.

NOTE: There can be multiple other reasons for the “(401) Unauthorized” error; in this blog, we have addressed one such reason for this error to be thrown.

 

Additional References:

1. OData Connection Manager: https://msdn.microsoft.com/en-us/library/dn584129(v=sql.110).aspx
2. Using the SSIS OData Source Connector With SharePoint Online Authentication: https://whitepages.unlimitedviz.com/2014/03/using-the-odata-source-connector-with-sharepoint-online-authentication/
3. Control access based on network location or app: https://support.office.com/en-us/article/Control-access-based-on-network-location-or-app-59b83701-cefd-4bf8-b4d1-d4659b60da08
4. Applications and browsers that use conditional access rules in Azure Active Directory: https://docs.microsoft.com/en-us/azure/active-directory/active-directory-conditional-access-supported-apps
5. Block apps that do not use modern authentication (ADAL): https://docs.microsoft.com/en-us/intune-classic/deploy-use/block-apps-with-no-modern-authentication

 

Author:       Krishnakumar Rukmangathan – Support Escalation Engineer, SQL Server BI Developer team, Microsoft

Reviewer:   Sarath Babu Chidipothu – Support Escalation Engineer, SQL Server BI Developer team, Microsoft


