SQL BI / Data Access Technologies

SSRS – Bing Maps seem to be broken with latest changes in Bing Map Service


Reporting Services reports that contain Bing Maps appear to be broken due to a change in the Bing Maps APIs. From our initial analysis, the Bing Maps SOAP web service appears to have been shut down after July 31, 2017.

When we access the Bing Maps API from SSRS we are calling into this method http://dev.virtualearth.net/webservices/v1/imagery/contracts/IImageryService/GetImageryMetadata, and are receiving this message: “The service method is not found”.

 

Please note that the SSRS product team is aware of this issue and is working with the Bing Maps API team.


Automating Azure Analysis Service Processing using Azure Automation Account


 

Azure Analysis Services keeps adding exciting new features, and a common request from users is to automate Azure Analysis Services processing. There are a few ways in which we can automate the processing:

  • Using the conventional approaches we have on-premises: a SQL Agent job, AMO objects, or PowerShell.

Since we are dealing with Azure, we need to think about automation that doesn't depend on an on-premises VM to execute a PowerShell script, or on an on-premises SQL Server instance to run a SQL Server Agent job.

Also, there are scenarios where we need to deal with two-factor authentication, where we either get prompted for phone authentication or need to re-enter the credential while connecting to Azure Analysis Services. Now think about a job scheduled to run unattended with an on-premises scheduler: it might prompt for authentication if the Azure AD token expires. There are ways to tackle that, but we will not discuss them here.

Azure Automation lets us schedule a PowerShell script to automate work against Azure Analysis Services. Here are the steps we need to follow:

 

Objective: We will create a partition for a fact table in the TabDemo database and process it, on my Azure Analysis Services instance: asazure://southeastasia.asazure.windows.net/azureasdemo

 

Steps:

1. Creating Azure Automation Account and adding the SQL PowerShell Module

a. Log in to http://portal.azure.com

b. Search for "Automation Account".

c. Create an automation account.

d. You should now see the automation account which you just created. The name here is "samasutomationaccount".

e. You need to import the SqlServer PowerShell module first.

f. Click on "Browse Gallery" and search for "SqlServer".

g. Click on Import and then the OK button at the bottom-right corner of your screen.

h. You should now see that the SqlServer module has been imported into your automation account's module gallery.

SqlServer module:

You can download the SqlServer module from this link if you want to use it on-premises: https://www.powershellgallery.com/packages/SqlServer/21.0.17178
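If you want to install the module on an on-premises machine, here is a minimal sketch (assuming PowerShell 5.0 or later with access to the PowerShell Gallery):

## Install the SqlServer module from the PowerShell Gallery for the current user
Install-Module -Name SqlServer -Scope CurrentUser

## Confirm the Analysis Services cmdlets are available
Get-Command -Module SqlServer -Name *AS*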

Here are the cmdlets you can use for Analysis Services:

https://docs.microsoft.com/en-us/sql/analysis-services/powershell/analysis-services-powershell-reference

Add-RoleMember cmdlet: Add a member to a database role. (Add)
Backup-ASDatabase cmdlet: Backup an Analysis Services database. (Database.Backup)
Invoke-ASCmd cmdlet: Execute a query or script in XMLA or TMSL (JSON) format. (Execute)
Invoke-ProcessASDatabase: Process a database. (Process)
Invoke-ProcessCube cmdlet: Process a cube. (Process)
Invoke-ProcessDimension cmdlet: Process a dimension. (Process)
Invoke-ProcessPartition cmdlet: Process a partition. (Process)
Invoke-ProcessTable cmdlet: Process a table in a Tabular model, compatibility level 1200 or higher. (Process)
Merge-Partition cmdlet: Merge a partition. (Merge)
New-RestoreFolder cmdlet: Create a folder to contain a database backup. (RestoreFolder)
New-RestoreLocation cmdlet: Specify one or more remote servers on which to restore the database. (RestoreLocation)
Remove-RoleMember cmdlet: Remove a member from a database role. (Remove)
Restore-ASDatabase cmdlet: Restore a database on a server instance. (Restore)


2. Creating Credential:

a. We need to define a credential here, which we will use in the PowerShell code later.

b. You need to specify a credential that has admin access to the Azure AS instance and then click on Create. The name of the credential I created is "SamCred".

 

3. Creating Runbook:

a. Select the Runbook

b. Click on "Add a runbook".

c. Enter the below details:

Choose PowerShell as the runbook type and then click on Create.
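As an alternative to the portal steps above, the runbook can also be created with the AzureRM.Automation cmdlets. A minimal sketch; the resource group and runbook names below are examples, not values from this article:

## Sign in and create an empty PowerShell runbook in the automation account
Login-AzureRmAccount
New-AzureRmAutomationRunbook -ResourceGroupName "MyResourceGroup" `
    -AutomationAccountName "samasutomationaccount" `
    -Name "ProcessAzureAS" -Type PowerShell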

4. Create the PowerShell script to automate partition creation and processing:

a. The main objective of the code is to automate creating the partition for the current month and deleting the partition that is 36 months old.

Go to the runbook we created earlier and click on Edit.

b. Enter the below PowerShell script:

##Getting the credential which we stored earlier.
$cred = Get-AutomationPSCredential -Name 'SamCred'

## Providing the Server Details
$ServerName = "asazure://southeastasia.asazure.windows.net/azureasdemo"
$DatabaseName = "TABDEMO"
$TableName ="FactInternetSales"

## Getting the current month and year
$a = Get-Date
$startMonth = $a.Month   ## hard-code a month number here (e.g. 10) when testing
$startYear = $a.Year
if ($startMonth -eq 12)
{
    $endMonth = 1
    $endYear = $startYear + 1
}
else
{
    $endMonth = $startMonth + 1
    $endYear = $startYear
}
## Pad a leading 0 when the month is a single digit
$startMonth = ([int]$startMonth).ToString("00")
$endMonth = ([int]$endMonth).ToString("00")
$startMonth
$endMonth

##creating the partition for the current month and current year ( You can script out the JSON code from SSMS)

$Query = "{
`"createOrReplace`": {
`"object`": {
`"database`": `"TABDemo`",
`"table`": `"FactInternetSales`",
`"partition`": `"FactInternetSales_"+ $startMonth+$startYear+"`"
},
`"partition`": {
`"name`": `"FactInternetSales_"+$startMonth+$startYear+"`",
`"source`": {
`"query`": [
`"SELECT * FROM [dbo].[FactInternetSales] WHERE ORDERDATEKEY >= N'"+ $startYear+$startMonth+"01"+ "' AND ORDERDATEKEY < N'"+ $endYear+$endMonth+"01" +"'`"
],
`"dataSource`": `"SqlServer localhost AdventureWorksDW2014`"
}
}
}
}
"
#$Query
##Creating the partition

Invoke-ASCmd -Server $ServerName -Credential $cred -Query $Query
##Processing the partition

$PartName= "FactInternetSales_"+$startMonth+$startYear
$PartName
$result=Invoke-ProcessPartition -Server $ServerName -Database $DatabaseName -TableName $TableName -PartitionName $PartName -RefreshType Full -Credential $cred

##Deleting the Old partition

if ([int]$startMonth -eq 1)
{
    $prevMonth = 12
    $prevYear = $startYear - 2
}
else
{
    $prevMonth = [int]$startMonth - 1
    $prevYear = $startYear - 3
}
## Pad a leading 0 when the month is a single digit
$prevMonth = ([int]$prevMonth).ToString("00")
$prevMonth

$delQuery="
{
`"delete`": {
`"object`": {
`"database`": `"TABDemo`",
`"table`": `"FactInternetSales`",
`"partition`": `"FactInternetSales_"+$prevMonth + $prevYear +"`"
}
}
}
"

#$delQuery

Invoke-ASCmd -Server $ServerName -Credential $cred -Query $delQuery

$error[0].Exception.Message
$error[0].Exception.StackTrace

 

c. Click on the Test pane, and then hit Start to test.

d. Schedule the runbook.

e. Click on "Add a schedule" and enter the details (a PowerShell alternative is sketched below):
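If you prefer to create the schedule and link it to the runbook from PowerShell instead of the portal, here is a hedged sketch using the AzureRM.Automation cmdlets (resource group, runbook, and schedule names are examples):

## Create a daily schedule starting an hour from now
New-AzureRmAutomationSchedule -ResourceGroupName "MyResourceGroup" `
    -AutomationAccountName "samasutomationaccount" `
    -Name "DailyASProcessing" -StartTime (Get-Date).AddHours(1) -DayInterval 1

## Link the schedule to the runbook created earlier
Register-AzureRmAutomationScheduledRunbook -ResourceGroupName "MyResourceGroup" `
    -AutomationAccountName "samasutomationaccount" `
    -RunbookName "ProcessAzureAS" -ScheduleName "DailyASProcessing"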

 

This is how you can schedule the processing. To know more about Azure Automation, please refer to the links below:

https://docs.microsoft.com/en-us/azure/automation/automation-intro

https://azure.microsoft.com/en-in/pricing/details/automation/

 

 

Author:     Samarendra Panda - Support Engineer, SQL Server BI Developer team, Microsoft

Reviewer:  Kane Conway – Support Escalation Engineer, SQL Server BI Developer team, Microsoft

 

SQL Server Reporting Services 2016 Integration with an Application


There are multiple ways to integrate or embed a SQL Server Reporting Services report in an application, for example through URL access, the SOAP APIs, or the ReportViewer controls.

A couple of common issues that we face while calling reports from an application are:

  • How to pass credentials from the application to SSRS.
  • How to avoid a login prompt while accessing a report from an application.

In this article, we will mainly focus on the URL access method and the SOAP APIs.

Scenario 1

Let's take an example of a Java web application that is configured to use forms authentication, and the user wants to embed SSRS reports into it. What authentication should be used to ensure the application can access the reports without any login prompt?

To achieve the above requirement, a couple of methods can be used:

  1. SQL Server Reporting Services Configured for Forms/Custom Authentication: Forms authentication lets you authenticate users by using your own code and then maintain an authentication token in a cookie or in the page URL.

Sample : Reporting Services Custom Security Sample

In this case, the user credentials being used by the application will be passed to SSRS. As SSRS is configured for forms authentication, it will be able to understand those credentials, and the user will be able to access the report based on the permissions defined for that account in the web portal.

  2. SQL Server Reporting Services Configured for Windows Authentication (NTLM): A dedicated domain account (domain\username) can be used in the application to access a report. This account can be set in the code.

Note: Domain user account being used in an application should have access to the report and the web portal.

Scenario 2

Let's take another example where we have a .NET web application configured for forms authentication, and we want to call SSRS reports in the application while SSRS is configured for Windows authentication (NTLM).

This can be achieved in the following way:

Using the URL access method: You can set identity impersonation along with the username and password in the web.config file of the web application. These credentials will be used by the application while making a call to the report.

Example: <identity impersonate="true" userName="domain\username" password="password" />

Note: Replace "domain\username" and "password" with a valid user credential that has access to the report.

Related Links:
Integrating Reporting Services Using URL Access - Web Application
Integrating Reporting Services Using URL Access - Windows Application

Scenario 3

Both the web application and SSRS are configured to use Windows authentication (Kerberos).

If they are configured to use Kerberos, then the logged-in user's credentials will be delegated from the application to SSRS.

In this case, HTTP SPNs need to be registered for the Reporting Services service account and the website's application pool account.

Delegation needs to be enabled for both accounts.

For setting up SPN please check: KERBEROS – Inside OUT
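For reference, SPN registration is typically done with the setspn tool from an elevated prompt. A hedged example; the server and account names below are placeholders, not values from this article:

# Register HTTP SPNs for the Reporting Services service account (names are placeholders)
setspn -S HTTP/rsserver.domain.com DOMAIN\RSServiceAccount
setspn -S HTTP/rsserver DOMAIN\RSServiceAccount

# Register HTTP SPNs for the web application's app pool account
setspn -S HTTP/webserver.domain.com DOMAIN\AppPoolAccount
setspn -S HTTP/webserver DOMAIN\AppPoolAccount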

Scenario 4

A web application is configured for Windows NTLM authentication, and SSRS is also configured for Windows NTLM.

In this case, as mentioned in Scenario 2, a dedicated account can be passed with identity impersonation set in the web.config file, or credentials can be passed via code as shown below.
Windows Authentication: The following code passes Windows credentials to the Web service.
C# Code

private void Page_Load(object sender, System.EventArgs e)
 {
 // Create a Web service proxy object and set credentials
 ReportingService rs = new ReportingService();
 rs.Credentials = System.Net.CredentialCache.DefaultCredentials;
 }

 

Basic Authentication: The following code passes Basic credentials to the Web service.
C# Code

private void Page_Load(object sender, System.EventArgs e)
 {
 ReportingService rs = new ReportingService();
 rs.Credentials = new System.Net.NetworkCredential("username", "password", "domain");
 }

 

The credentials must be set before you call any of the methods of the Report Server Web service. If you do not set the credentials, you receive an HTTP 401 error: Access Denied. You must authenticate with the service before you use it, but after you have set the credentials, you do not need to set them again as long as you continue to use the same service variable (such as rs).

 

Related Links

Integrating Reporting Services Using SOAP - Web Application

Integrating Reporting Services Using SOAP - Windows Application

Using the WebForms ReportViewer Control

Using the WinForms ReportViewer Control

 

Author:    Khushboo Gupta - Technical Advisor, SQL Server BI Developer team, Microsoft

Reviewer:  Selvakumar Rajakumar – Escalation Engineer, SQL Server BI Developer team, Microsoft

Reporting Services Web Services with .NET CORE 2


 

Any application built using the .NET Core SDK can be executed on any platform (Windows, Linux, and macOS). Because of this, a lot of APIs available in the .NET Framework are no longer available in .NET Core. One of the missing APIs is Web Service (SOAP) clients. The way forward is to use WCF Connected Services and create a BasicHttpBinding against the Reporting Services web services.

 

In this blog, we will look at accessing the SSRS Web Services using .NET CORE and WCF Connected Services.

Reporting Services (ReportService2010.asmx):

1. Create a New Project - .NET Core (Console App)

 

2. To add a Connected Service reference, the extension needs to be added to Visual Studio. This isn't installed by default.

  • Open Tools -> Extension and Updates
  • Search for “Microsoft WCF Web Service Reference Provider”
  • Download and Install – “Microsoft WCF Web Service Reference Provider”
  • Restart Visual Studio and Reopen the Project

3. Add a Connected Service and Choose “Microsoft WCF Web Service Reference Provider - Preview” :

 

4. Provide the Reporting Services Web Service URL: http://servername/Reportserver/ReportService2010.asmx

 

5. Enter the Namespace and click Finish

6. Update Program.cs with the following Code:

using System;
using System.ServiceModel;
using System.Threading.Tasks;
using RSService;

namespace RSWcf
{
    class Program
    {
        static ReportingService2010SoapClient rsclient = null;

        static void Main(string[] args)
        {
            BasicHttpBinding rsBinding = new BasicHttpBinding();
            rsBinding.Security.Mode = BasicHttpSecurityMode.TransportCredentialOnly;
            rsBinding.Security.Transport.ClientCredentialType = HttpClientCredentialType.Ntlm;

            EndpointAddress rsEndpointAddress = new EndpointAddress("http://servername/Reportserver/ReportService2010.asmx");

            rsclient = new ReportingService2010SoapClient(rsBinding, rsEndpointAddress);

            var output = rsListChildren("/");
            output.Wait();

            if(output.Status == TaskStatus.RanToCompletion && output.Result.Length > 0)
            {
                foreach(CatalogItem item in output.Result)
                {
                    Console.WriteLine(String.Format("Item Path: {0}", item.Path));
                }
            }

            Console.WriteLine("Completed!");
            Console.ReadLine();
        }

        private static async Task<CatalogItem[]> rsListChildren(String ItemPath)
        {
            TrustedUserHeader trustedUserHeader = new TrustedUserHeader();
            ListChildrenResponse listChildrenResponse = null;
            try
            {
                listChildrenResponse = await rsclient.ListChildrenAsync(trustedUserHeader, ItemPath, false);
            }
            catch(Exception exception)
            {
                Console.WriteLine(exception.Message + exception.StackTrace);
                return new CatalogItem[0];
            }
           return listChildrenResponse.CatalogItems;
        }
    }
}

7. Execute the project; you should see output like this:

 

8. To Publish the Project for all operating systems, execute this command:

dotnet publish "C:\Projects\RSWcf\RSWcf.sln"

 

9. To Run the Application after publishing, execute this command:

dotnet "C:\Projects\RSWcf\RSWcf\bin\Debug\netcoreapp2.0\RSWcf.dll"

 

 

Author:    Kane Conway – Support Escalation Engineer, SQL Server BI Developer team, Microsoft

Reviewer:  Krishnakumar Rukmangathan – Support Escalation Engineer, SQL Server BI Developer team, Microsoft

How to display report parameter Label field in RDLC reports


While working on a customer issue recently, I came across the following scenario where there was a problem displaying the report parameter label field in RDLC reports.

Scenario and issue:

Let's say we have a report parameter called "Customer" in the RDLC report which gets its values from "DataSet1", as shown below.

We have a text box with an expression in the report to display the "Customer" parameter's value field and label field, as shown below.

If we run the report, we see only the value field instead of the label field, as shown below.

We can clearly see that Parameters!Customer.Label is not holding the label value and only holds the parameter value. We see this behavior only in RDLC reports because we don't bind the dataset to the label field within the RDLC report.

 

To display the parameter label field in the RDLC report we can use the Built-In Lookup function.

We need to use Lookup to retrieve the values from the specified dataset for a name/value pair where there is a 1-to-1 relationship. For example, for an ID field in a table, you can use Lookup to retrieve the corresponding Name field from a dataset that is not bound to the data region.

Syntax

Lookup(source_expression, destination_expression, result_expression, dataset)

Example:

=Lookup(Fields!SaleProdId.Value, Fields!ProductID.Value,  Fields!Name.Value, "Product")

Replace the "Parameters!Customer.Label" expression with the Lookup function as shown below.

Note: If the report parameter datatype does not match the dataset field datatype, a conversion is required.

Expression Value: =Parameters!p.Value & " | " & Lookup(CInt(Parameters!p.Value), Fields!CustomerKey.Value, Fields!FirstName.Value, "DataSet1")

If we run the report, we see the value field and label field as shown below.

 

Author:    SatyaSai K – Support Engineer, SQL Server BI Developer team, Microsoft

Reviewer:  Kane Conway – Support Escalation Engineer, SQL Server BI Developer team, Microsoft

ARM Resource Manager Template Deployment with Azure Analysis Services


An Analysis Services model can be deployed to Azure in different ways, such as from SSDT or by running the JSON create script from SSMS.

We can leverage the richness of template deployment in our Azure Analysis Services as well.

What is Template Deployment in Azure?

The Resource Manager template you deploy can either be a local file on your machine or an external file in a repository like GitHub, deployed through the Azure portal. The same template can be deployed to multiple environments just by changing the environment parameters.

https://docs.microsoft.com/en-us/azure/azure-resource-manager/resource-group-template-deploy-portal

https://docs.microsoft.com/en-us/azure/azure-resource-manager/resource-group-template-deploy

 

Objective

We will focus only on creating the Azure Analysis Services instance using a template. We will not use the Azure portal for the deployment; instead, we will use PowerShell commands to deploy the instance and configure the backup storage.

Also, we could use a parameter file explicitly to define the parameter values; here we will pass the parameters directly from PowerShell.

Prerequisite

  1. The storage account and a resource group must have been created before executing the template.
  2. You should be an admin of the subscription, with access to create resources in the subscription.

Implementation:

  1. You need to get the storage account access key. You can follow the below screenshot:

 

PowerShell Command:

## Installing the AzureRm.Resources module (uncomment the line below if it is not already installed).
#Install-Module AzureRm.Resources
##Install-Module Azure
Get-Module -ListAvailable -Name AzureRm.Resources | Select Version

Login-AzureRmAccount

# replace with your information
$serverLocation = "West Europe"
$serverName = "backuptestserver1"
$storageAccount = "samtestblob"
$storageKey = "n1P9xnk/3x4HkaybaLYmmtOVvLHd#####################################"
$containerName="azureasbackup"
$RGName = "RGSamSSAS"
$TemplateFile = "C:\temp\AzureASwBackup.json"
$skuTier="Development"
$skuName="D1"

$asAdmin ="xxxx@microsoft.com"

##Adding 99 years. Please note that if we don’t specify any expiry time, it will by
##default take one hour. After one hour you might not be able to take the backup.
$starttime = Get-Date
$endtime= $starttime.AddYears(99)

$storageaccountContext = New-AzureStorageContext -StorageAccountName $storageAccount -StorageAccountKey $storageKey
$containerSASURI = New-AzureStorageContainerSASToken -Name $containerName -Permission rwdl -FullUri -Context $storageaccountContext -StartTime $starttime -ExpiryTime $endtime
$parameters = @{}
$parameters.Add("serverName", $serverName)
$parameters.Add("serverLocation",$serverLocation)
$parameters.Add("asAdmin", $asAdmin)
$parameters.Add("skuName", $skuName)
$parameters.Add("skuTier",$skuTier)

# using SAS token
$parameters.Add("storageContainerURI", $containerSASURI)

New-AzureRmResourceGroupDeployment -ResourceGroupName $RGName -TemplateFile $TemplateFile -TemplateParameterObject $parameters -Verbose

 

AzureASwBackup.JSON.

Reference Document: https://docs.microsoft.com/en-us/azure/templates/microsoft.analysisservices/servers

{
  "$schema": "http://schema.management.azure.com/schemas/2015-01-01/deploymentTemplate.json#",
  "contentVersion": "1.0.0.0",
  "parameters": {
    "serverName": {
      "type": "string",
      "defaultValue": "BackupTestServer"
    },
    "serverLocation": {
      "type": "string",
      "defaultValue": "West Europe"
    },
    "storageContainerURI": {
      "type": "string",
      "defaultValue": ""
    },
    "asAdmin": {
      "type": "string",
      "defaultValue": "sapa@microsoft.com"
    },
    "skuName": {
      "type": "string",
      "defaultValue": ""
    },
    "skuTier": {
      "type": "string",
      "defaultValue": " "
    }
  },

  "resources": [
    {
      "name": "[parameters('serverName')]",
      "type": "Microsoft.AnalysisServices/servers",
      "apiVersion": "2016-05-16",
      "location": "[parameters('serverLocation')]",
      "sku": {
        "name": "[parameters('skuName')]",
        "tier": "[parameters('skuTier')]"
      },
      "tags": {},
      "properties": {
        "asAdministrators": {
          "members": [
            "[parameters('asAdmin')]"
          ]
        },
        "backupBlobContainerUri": "[parameters('storageContainerURI')]"
      }
    }
  ],
  "outputs": {
  }
}

 

Author:     Samarendra Panda - Support Engineer, SQL Server BI Developer team, Microsoft

Reviewer:  Orsolya Gal – Support Escalation Engineer, SQL Server BI Developer team, Microsoft

“Unexpected error from external database driver (1). (Microsoft JET Database Engine)” after applying October security updates.


UPDATE: As informed, the security patches with the fix were rolled out on 11/14. We would request you to test this and verify that the issue is addressed. If not, please let us know. Please find the information below:

Windows 2008

https://support.microsoft.com/en-us/help/4050795/unexpected-error-from-external-database-driver-error-when-you-create-o

Windows 7/2008 R2

-Monthly Roll-up - https://support.microsoft.com/en-us/help/4048957/windows-7-update-kb4048957

-Security-only update - https://support.microsoft.com/en-us/help/4048960/windows-7-update-kb4048960

Windows 2012

-Monthly Roll-up - https://support.microsoft.com/en-us/help/4048959/windows-server-2012-update-kb4048959

-Security-only update - https://support.microsoft.com/en-us/help/4048962/windows-server-2012-update-kb4048962

Windows 8.1 and 2012 R2

-Monthly Roll-up - https://support.microsoft.com/en-us/help/4048958/windows-81-update-kb4048958

-Security-only update - https://support.microsoft.com/en-us/help/4048961/windows-81-update-kb4048961

Windows 10 Fall (“November”) Update, version 1511

-https://support.microsoft.com/en-us/help/4048952

Windows 10 Anniversary Update, version 1607, and Server 2016

-https://support.microsoft.com/en-us/help/4048953

Windows 10 , version 1703

-https://support.microsoft.com/en-us/help/4048954/windows-10-update-kb4048954

 

 

 

We have been seeing a recent influx of cases where the JET provider is no longer able to connect after the October update. This update (released October 10, 2017) includes a security update that inadvertently affects the JET provider. The update was KB4041678 and was included in the patch KB4041681. These patches affect the operating system and adversely impact the following technologies: Microsoft Windows Search component, Windows kernel-mode drivers, Microsoft Graphics component, Internet Explorer, Windows kernel, Windows Wireless Networking, Microsoft JET Database Engine, and the Windows SMB Server. It is important to note that the changes were not made to these technologies themselves.

 

Types of errors witnessed:

 Unexpected error from external database driver (1). (Microsoft JET Database Engine)

 [Microsoft][Driver ODBC Excel] Reserved error (-5016).

 [Microsoft][ODBC Excel Driver]General Warning Unable to open registry key 'Temporary (volatile) Jet DSN for process

 

WORKAROUNDS & SOLUTION:

 

Approach 1:

Use Microsoft.ACE.OLEDB.12.0 or Microsoft.ACE.OLEDB.16.0: (Recommended)

The following updates were not intended to cause any issue with the Microsoft Jet Database Engine 4.0; at the same time, the product group was not verifying that these updates would be compatible with the Microsoft Jet Database Engine 4.0 data provider, as it had been deprecated back in 2002:

https://support.microsoft.com/en-us/help/4041678/windows-7-update-kb4041678

https://support.microsoft.com/en-us/help/4041681/windows-7-update-kb4041681

Both articles suggest the workaround below.

 

In all currently known cases, using the ACE provider works to connect to the Excel files in lieu of the JET provider. The following download is the most up-to-date version of the ACE provider:

Microsoft Access Database Engine 2016 Redistributable

https://www.microsoft.com/en-us/download/details.aspx?id=54920

 

When looking into this issue, the most important thing to note is that the JET provider has been deprecated since 2002; the last changes were made to it in 2000. See the following article for more details.

Data Access Technologies Road Map

https://msdn.microsoft.com/en-us/library/ms810810.aspx

Excerpt:

Microsoft Jet Database Engine 4.0: Starting with version 2.6, MDAC no longer contains Jet components.
In other words, MDAC 2.6, 2.7, 2.8, and all future MDAC/WDAC releases do not contain Microsoft Jet, the Microsoft Jet OLE DB Provider, the ODBC Desktop Database Drivers, or Jet Data Access Objects (DAO).
The Microsoft Jet Database Engine 4.0 components entered a state of functional deprecation and sustained engineering and have not received feature level enhancements since becoming a part of Microsoft Windows in Windows 2000.”

 

So, in short, the JET provider kept working for a good 15 years after deprecation, but this most recent update caused a change that requires updating how you connect to the Excel file. For SSIS packages, we recommend pointing to the Excel file with our Excel connector instead of using OLE DB.

You can locate the Excel Connector by opening up an SSIS package within SQL Server Data Tools. Create or go to an existing Data Flow task. You can see the Excel Source in the “Other Sources” Section:






When using JET, this is done through an OLE DB/ODBC source. You can use the same method for the ACE provider. The ACE provider will work; however, it is not supported for use with programs such as SSIS, Management Studio, or other applications. Although this doesn't tend to cause issues, it is important to note. That notwithstanding, the ACE driver provides the same functionality as the JET provider did. The only limitation I am aware of is the same one you were encountering with JET, which is that you can't have multiple users connecting to and modifying the Excel file.

Microsoft Access Database Engine 2016 Redistributable

https://www.microsoft.com/en-us/download/details.aspx?id=54920

Excerpt:

The Access Database Engine 2016 Redistributable is not intended:
 4. To be used by a system service or server-side program where the code will run under a system account, or will deal with multiple users identities concurrently, or is highly reentrant and expects stateless behavior.
Examples would include a program that is run from task scheduler when no user is logged in, or a program called from server-side web application such as ASP.NET, or a distributed component running under COM+ services.

 

I understand that there are many users that don’t know how many packages might be experiencing this issue.  It is possible to look through your dtsx packages using the following method.

If you run a command prompt as an administrator, you can use the find command to look through the packages for the keyword JET.  This will state all the files it looks through, but you will see a connection manager result for the files that have this in them. If you have already updated the package from JET to ACE, then this will not show the connection manager and is not affected by this security update.

In a command prompt:

Find /I "Jet" "C:\SampleFolder\SamplePackage.dtsx"

Alternatively, you can search the whole folder at once:

Find /C /I "Jet" "C:\SampleFolder\*.dtsx"

For more information on the Find command:

Find:  https://technet.microsoft.com/en-us/library/bb490906.aspx?f=255&MSPPError=-2147217396

 

Additionally, after installing and upgrading to the ACE providers, you may run into the following error messages:

The 'Microsoft.ACE.OLEDB.16.0' provider is not registered on the local machine
Or
The 'Microsoft.ACE.OLEDB.12.0' provider is not registered on the local machine

If you are in a scenario where both 32-bit and 64-bit processing need to use the ACE data provider on the same server, from the Microsoft standpoint it is recommended to have two different servers (one for 32-bit processing and the other for 64-bit).

But there is an option (workaround) where you can have both versions installed on the same machine by performing a "/quiet" install of these components from the command line. To do so, download the desired AccessDatabaseEngine.exe or AccessDatabaseEngine_x64.exe to your PC, open an administrative command prompt, and provide the installation path and the /quiet switch. Ex: C:\Files\AccessDatabaseEngine.exe /quiet. Please note that this approach is not recommended and has not worked in a few instances, so we would request you to move to the two-server approach if possible.

 

Approach 2:

Uninstall the security patch (Not recommended):

This patch seems to update Excel Jet connector from V4.00.9801.0 to v4.00.9801.1.

One of the workarounds is to uninstall the KB, which may fix the issue, although in some instances it may not help resolve the issue.

 

Approach 3:

Registry change (Not recommended): [Request you to take a backup of the registry key before making any changes]

Another workaround would be to update the below registry key to point to an old copy of the DLL file:

[HKEY_LOCAL_MACHINE\SOFTWARE\Wow6432Node\Microsoft\Jet\4.0\Engines\Excel\Win32]

To get an old copy of the DLL, uninstall the patch KB4041681 and copy the DLL "msexcl40.dll" from C:\Windows\SysWOW64\msexcl40.dll to a new location, say "C:\msexcl\msexcl40.dll".

You can now modify the registry key [HKEY_LOCAL_MACHINE\SOFTWARE\Wow6432Node\Microsoft\Jet\4.0\Engines\Excel\Win32] to point to the new DLL location "C:\msexcl\msexcl40.dll" (by default it points to C:\Windows\SysWOW64\msexcl40.dll).
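A hedged PowerShell sketch of backing up and repointing the key; the value name win32 and the paths below are assumptions based on the description above, so verify them on your machine before making changes:

# Back up the Excel engine key before touching it (C:\Temp must exist)
reg export "HKLM\SOFTWARE\Wow6432Node\Microsoft\Jet\4.0\Engines\Excel" C:\Temp\JetExcelKeyBackup.reg /y

# Inspect the current driver path (value name 'win32' is an assumption)
Get-ItemProperty -Path "HKLM:\SOFTWARE\Wow6432Node\Microsoft\Jet\4.0\Engines\Excel" -Name win32

# Point it at the preserved copy of the old DLL
Set-ItemProperty -Path "HKLM:\SOFTWARE\Wow6432Node\Microsoft\Jet\4.0\Engines\Excel" -Name win32 -Value "C:\msexcl\msexcl40.dll"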

 

Other workarounds discussed online:

There is a public forum discussion where many customers found various ways to work around this issue.

ODBC Excel Driver Stopped Working with "Unexpected error from external database driver (1). (Microsoft JET Database Engine)"

https://social.msdn.microsoft.com/Forums/en-US/2feac7ff-3fbd-4d46-afdc-65341762f753/odbc-excel-driver-stopped-working-with-unexpected-error-from-external-database-driver-1?forum=sqldataaccess

 

Solution:

The best recommended solution is to move to Microsoft ACE OLE DB provider.

Apart from this, Microsoft is working on a resolution and will provide an update in an upcoming release of the security patch. This is expected to be available in another 2-3 weeks or earlier. (The blog has been updated with this information above.) If this is business critical and you are still encountering any related issues, please reach out to the Microsoft Customer Services & Support (CSS) team.

 

DISCLAIMER:

THE ABOVE INFORMATION IS PROVIDED "AS IS" WITHOUT WARRANTY OF ANY KIND, EITHER EXPRESSED OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE IMPLIED WARRANTIES OF MERCHANTABILITY AND/OR FITNESS FOR A PARTICULAR PURPOSE.

 

Authors:   

Chrone Meade - Support Engineer, SQL Server BI Developer team, Microsoft

Jon Herman - Sr Support Engineer, SQL Server BI Developer team, Microsoft

Krishnakumar Rukmangathan – Support Escalation Engineer, SQL Server BI Developer team, Microsoft

SSIS Scale Out worker is not showing up in Worker Agents


 

Understanding of the issue:

Adding the Scale Out Worker to the master shows as successful, but the worker does not show up in SQL Server Integration Services – Manage Scale Out (IS Manager). Also, it is not added to [SSISDB].[internal].[worker_agents].

Adding the SSIS worker completes in the wizard as shown below.

After this finishes, if we check IS Manager or worker_agents, the worker is not showing up.

 

We can find the logs for both master and worker in below locations

Master: <path>:\Users\[account]\AppData\Local\SSIS\ScaleOut\Master

(Example: C:\Users\SSISScaleOutMaster140\AppData\Local\SSIS\ScaleOut\Master)

 

Worker: <path>:\Users\[account]\AppData\Local\SSIS\ScaleOut\Agent

(Example: C:\Users\SSISScaleOutWorker140\AppData\Local\SSIS\ScaleOut\Agent)

 

Things to verify for this issue:

  • The port is open

We need the port used by the master service to be open in the firewall on the master machine. By default, the port is 8391.

Possible error in worker error log if the port is not open:

System.ServiceModel.EndpointNotFoundException: There was no endpoint listening at https://Master.domain.com:8391/ClusterManagement/ that could accept the message. This is often caused by an incorrect address or SOAP action. See InnerException, if present, for more details. ---> System.Net.WebException: Unable to connect to the remote server ---> System.Net.Sockets.SocketException: A connection attempt failed because the connected party did not properly respond after a period of time, or established connection failed because connected host has failed to respond <MasterIP>:8391

   at System.Net.Sockets.Socket.DoConnect(EndPoint endPointSnapshot, SocketAddress socketAddress)

   at System.Net.ServicePoint.ConnectSocketInternal(Boolean connectFailure, Socket s4, Socket s6, Socket& socket, IPAddress& address, ConnectSocketState state, IAsyncResult asyncResult, Exception& exception)

   --- End of inner exception stack trace ---

   at System.Net.HttpWebRequest.GetRequestStream(TransportContext& context)

   at System.Net.HttpWebRequest.GetRequestStream()

   at System.ServiceModel.Channels.HttpOutput.WebRequestHttpOutput.GetOutputStream()

 

  • Validate certificates in certificate store

Worker server: The worker machine needs to have the master certificate in its Trusted Root store and the local worker certificate in its Personal store.

If the master certificate is not present, we can find a copy of the certificate on the master machine in the DTS\Binn location (Example: C:\Program Files\Microsoft SQL Server\140\DTS\Binn\SSISScaleOutMaster.cer).

We need to copy it from this location and install the certificate on the worker.

 

Master server: The master server needs to have the master certificate as well as the worker certificate in its Trusted Root store.

If the worker certificate is not present, we can find it in the DTS\Binn location of the worker server (Example: "C:\Program Files\Microsoft SQL Server\140\DTS\Binn\SSISScaleOutWorker.cer").

Note: By default, when we try to add the worker, the wizard will try to make the appropriate changes to the worker config file as well as install the certificates.
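If the certificates need to be installed manually, a minimal PowerShell sketch (the paths are the default examples above; adjust them to your installation):

# On the worker: import the master certificate into the Trusted Root store
Import-Certificate -FilePath "C:\Program Files\Microsoft SQL Server\140\DTS\Binn\SSISScaleOutMaster.cer" -CertStoreLocation Cert:\LocalMachine\Root

# On the master: import the worker certificate into the Trusted Root store
Import-Certificate -FilePath "C:\Program Files\Microsoft SQL Server\140\DTS\Binn\SSISScaleOutWorker.cer" -CertStoreLocation Cert:\LocalMachine\Root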

 

  • Validate worker configuration

Worker config file location: <path>\DTS\Binn\WorkerSettings.config

Example: C:\Program Files\Microsoft SQL Server\140\DTS\Binn\WorkerSettings.config

 

Possible error in worker log if configured to incorrect master name:

Error when sending agent heartbeat.

System.ServiceModel.EndpointNotFoundException:

There was no endpoint listening at https://wrongmaster.domain.com:8391/ClusterManagement/ that could accept the message. This is often caused by an incorrect address or SOAP action. See InnerException, if present, for more details. ---> System.Net.WebException: The remote name could not be resolved: 'wrongmaster.domain.com'

at System.Net.HttpWebRequest.GetRequestStream(TransportContext& context)

at System.Net.HttpWebRequest.GetRequestStream()

at System.ServiceModel.Channels.HttpOutput.WebRequestHttpOutput.GetOutputStream()

 

  • Verify if the Master end point and certificate thumbprints are mapped properly
"MasterEndpoint": "https://masterserver.domain.com:8391",
"MasterHttpsCertThumbprint": "<Master Certificate Thumbprint here> ",
"WorkerHttpsCertThumbprint": "<Worker Certificate Thumbprint here>",

 

To get the certificate thumbprint,

Double-click on the certificate -> Details -> Thumbprint.

Make sure the config file is corrected and restart the worker service.
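A quick, hedged way to read the thumbprints and restart the worker from PowerShell; the certificate subject filter and the service name SSISScaleOutWorker140 are assumptions for SSIS 2017, so adjust them to your environment:

# List the Scale Out certificates and their thumbprints
Get-ChildItem Cert:\LocalMachine\My, Cert:\LocalMachine\Root |
    Where-Object { $_.Subject -like "*SSISScaleOut*" } |
    Select-Object Subject, Thumbprint

# Restart the worker service after correcting WorkerSettings.config
Restart-Service -Name SSISScaleOutWorker140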

After all the above settings are verified, we have to add the worker from ISManager again.

If the issue is still present, check whether the worker service is running under a domain account; in that case, we can try changing it to Local System and see if that works. After adding the worker, we can change the account back to the domain account and restart the worker service.

 

Additional information:

If we are having an issue with the master service itself not showing online, or we get an error saying the master service is not installed on this server, we can check the master configuration to map it to the correct instance.

Master config file location:

<path>:\Program Files\Microsoft SQL Server\140\DTS\Binn\MasterSettings.config

"PortNumber": 8391,
"SSLCertThumbprint": "<Master certificate thumbprint>",
"SqlServerName": "<masterserver\\instancename >",
"CleanupCompletedJobsIntervalInMs":43200000,
"DealWithExpiredTasksIntervalInMs":300000,
"MasterHeartbeatIntervalInMs":30000,
"SqlConnectionTimeoutInSecs":15

By changing the server name here, we can map the master to a different instance.

 

Author:    Chaitra Hegde – Support Engineer, SQL Server BI Developer team, Microsoft

Reviewer:  Krishnakumar Rukmangathan – Support Escalation Engineer, SQL Server BI Developer team, Microsoft

 


SSIS packages executed through the SQL Agent job scheduler fail while connecting to an Azure File Storage path


Recently we ran into an interesting scenario where we had an SSIS package that contained a Script Task, and inside the Script Task we were making connections to Azure File Storage and pulling data from a file.

When running this SSIS package from Visual Studio – SQL Server Data Tools (SSDT), and after deploying the SSIS package to the SSISDB catalog and running it from there, the package executes as expected, accessing the files and pulling the data without any issues. But when we configure this SSIS package to run under a SQL Agent job, it doesn't pull the expected data from Azure File Storage.

By Azure File Storage, we are referring to the File service available under your Azure Storage account, which you connect to using https://xxxx.file.core.windows.net/. Here you can upload files and access them from your machines by mapping the share to a local drive, or access the path directly by providing the storage credentials.

 

 

 

 

 

 

 

 

Fig 1: Azure File Storage – File Services

 

Fig 2: Network Mapped Drive – From My PC

Fig 3: Network Mapped Drive – Accessing the share

We noticed that the issue happens only when we run the SSIS package through a SQL Agent job. The same package runs fine from the command prompt using the DTEXEC.exe utility, from SSISDB, or from SSDT, without any issues.

To narrow down the issue further, we collected Process Monitor traces for both the working and failing scenarios and compared the logs.

 

In the success scenario: [Running the package from the Visual Studio – SSDT]

We can see that the access result for the Azure File Storage path, viz. \\XXXX.file.core.windows.net\testssis\, is returned as SUCCESS.

 

In the Failure Scenario: [Running the package from SQL Agent Job]

We can see that the access result for the Azure File Storage path, viz. \\XXXX.file.core.windows.net\testssis\, is returned as LOGON FAILURE.

 

After spending significant time, we understood that we need to take care of the following two things to resolve this issue.

  1. We need to configure and store the Azure File Storage access credentials in the Windows Credential Manager in advance.

You can do this using Windows Control Panel -> Credential Manager -> Windows Credentials and adding a Windows credential, or use the below command in an elevated command prompt window to store the Windows credential.

Syntax:                 cmdkey /add:<yourstorageaccountname>.file.core.windows.net /user:AZURE\<yourstorageaccountname> /pass:<YourStorageAccountKeyWhichEndsIn>

Example:             cmdkey /add:XXXXX.file.core.windows.net /user:AZURE\XXXX /pass:mwpqXXXXXXXXXwaFXXXX7wnXXXXXXXXXXX==

This way the credential persists across reboots and is visible for the SQL Server Agent job to use.

and

  2. Instead of using the local path [mapped network drive name], viz. Z:, for accessing Azure File Storage in our package, we need to stick to using the UNC (Universal Naming Convention) path (like \\xxxx.file.core.windows.net\testssis). [Recommended]

Once we added this credential under Credential Manager and modified our SSIS package to use the UNC path, we were able to successfully establish the connection to Azure File Storage while running the SSIS package through the SQL Agent job scheduler.
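A small hedged check you can run, under the same account that the SQL Agent job step uses, to confirm the UNC path is reachable after adding the credential; the share name below is the example used in this article:

# Verify the Azure File Storage UNC path resolves for the current account
Test-Path "\\xxxx.file.core.windows.net\testssis"

# Optionally list the files the package is expected to read
Get-ChildItem "\\xxxx.file.core.windows.net\testssis"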

 

Author:    Krishnakumar Rukmangathan – Support Escalation Engineer, SQL Server BI Developer team, Microsoft

Reviewer:  Sarath Babu Chidipothu  – Support Escalation Engineer, SQL Server BI Developer team, Microsoft

Flat file connection manager doesn’t work for rows containing “Embedded Qualifiers”


I came across an interesting issue where the Flat File connection manager doesn't work for rows containing embedded qualifiers. I have worked with two different partners, and both were having the same issue, so I thought of sharing my experience with you all.

Scenario and Issue:

Consider a source CSV file which contains a row with embedded qualifiers, like row number 4, {"002042","OBEE Blue ("ABC" prin)",""}, in the below example. With the text qualifier for the Flat File connection manager set to '"', and irrespective of whether the property AlwaysCheckForRowDelimiter is set to True or False, the package fails during execution.

Source CSV File content:
 CODE,NAME_OF,DEFINITION
 "002090","Grey",""
 "002091","Grey, Red",""
 "002092","White/Teal",""
 "002042","OBEE Blue ("ABC" prin)",""
 "002093","GPf Grey",""
 "002094","BMWand Blue",""

Error:

[Flat File Source [2]] Error: The column delimiter for column "<ColumnName>" was not found.

[Flat File Source [2]] Error: An error occurred while processing file "<Filepath>" on data row 5.

[SSIS.Pipeline] Error: SSIS Error Code DTS_E_PRIMEOUTPUTFAILED.  The PrimeOutput method on Flat File Source returned error code 0xC0202092.  The component returned a failure code when the pipeline engine called PrimeOutput(). The meaning of the failure code is defined by the component, but the error is fatal and the pipeline stopped executing.  There may be error messages posted before this with more information about the failure.

This same CSV file works flawlessly in SQL Server 2008 R2 Business Intelligence Development Studio (BIDS) environment but fails with the above error when using SQL Server Data Tools (SSDT) for SQL Server 2012 and above.

You can find related or similar issues described in the below URLs.

https://connect.microsoft.com/SQLServer/feedback/details/560592/flat-file-connection-manager-not-handling-text-delimiters-in-csv-files

https://connect.microsoft.com/SQLServer/feedback/details/282396/ssis-flat-file-parser-does-not-read-column-delimiters-embedded-in-text-data

 

You can find a detailed description of both the working and non-working scenarios below.

Working Scenario:

BIDS/ SQL Server 2008-2008R2

1. Create a Package and select Data Flow task.

2. Create a Flat File Connection Manager with data as similar mentioned above.

3. Create a OLE DB destination as below.

4. At the end Data Flow task will look like as below.

5. When you execute the package, it will complete successfully.

6. We'll get the desired result in the SQL destination table.

 

Non-Working Scenario:

SSDT/ SQL Server 2012 or above

1. Create a Package and select Data Flow task

2. Create a Flat File Connection Manager

3. Create an OLEDB destination as below.

4. At the end Data Flow task will look like as below.

5. When you execute the package, it will fail with an error [see screenshot].

So, this behavior is not a bug; it is by design, and we should make the below change in the flat file to fix it. I have used a sample row for demo purposes.

Solution:

To include the text qualifier character itself inside a value, e.g. " as the text qualifier, we should use "" to represent a literal ".

So "OBEE Blue ("ABC" prin)" should be modified to "OBEE Blue (""ABC"" prin)".

Starting with SQL Server 2012, the Flat File connection manager was greatly improved to accommodate embedded qualifiers. The following articles talk about the Flat File connection manager improvements in detail.

https://www.microsoftpressstore.com/articles/article.aspx?p=2201315&seqNum=5

https://blogs.msdn.microsoft.com/mattm/2011/07/17/flat-file-source-changes-in-denali/

Hope this helps you as well.

 

Author:  Vikas Kumar – Support Engineer, SQL Server BI Developer team, Microsoft

Reviewer:  Chaitra Hegde  – Support Engineer, SQL Server BI Developer team, Microsoft

TLS Issue with SSIS package while accessing OData Source like Dynamics AX Online


 

I came across an interesting scenario recently while working on one of our customer issues. The SSIS package fails during execution, even though Test Connection and Preview Data in the OData Source task succeed while connecting to the Dynamics AX OData source.

Here is the error message we got:

[OData Source [2]] Error: Cannot acquire a managed connection from the run-time connection manager.

[SSIS.Pipeline] Error: OData Source failed validation and returned error code 0xC020801F.

 

Let me give you a quick background of the issue. The SSIS 2014 package fails during execution while connecting to the Dynamics AX Online OData source with the below error:

The Fiddler trace looks like the below:

You can clearly see that the HTTPS handshake failed while trying to access the OData source URL.

 

We can resolve this using the below steps:

  1. Install Microsoft .NET 4.6 and above on your computer (https://technet.microsoft.com/en-us/library/security/2960358.aspx).
  2. Enforce TLS 1.2 on your machine through registry settings. In an elevated command prompt run the following commands:
  • reg add HKLM\SOFTWARE\Microsoft\.NETFramework\v4.0.30319 /v SchUseStrongCrypto /t REG_DWORD /d 1 /reg:64
  • reg add HKLM\SOFTWARE\Microsoft\.NETFramework\v4.0.30319 /v SchUseStrongCrypto /t REG_DWORD /d 1 /reg:32
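To confirm the values were written, a small hedged PowerShell check of both the 64-bit and 32-bit .NET Framework registry hives:

# SchUseStrongCrypto should be 1 in both hives after the reg add commands above
Get-ItemProperty -Path "HKLM:\SOFTWARE\Microsoft\.NETFramework\v4.0.30319" -Name SchUseStrongCrypto
Get-ItemProperty -Path "HKLM:\SOFTWARE\Wow6432Node\Microsoft\.NETFramework\v4.0.30319" -Name SchUseStrongCrypto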

 

If you have performed the above action plan and you are still experiencing the issue, then please contact the Microsoft CSS team for further investigation.

 

Author:      Mounika Narayana Reddy – Support Engineer, SQL Server BI Developer team, Microsoft

Reviewer:  Krishnakumar Rukmangathan – Support Escalation Engineer, SQL Server BI Developer team, Microsoft

Using Amazon RedShift with Power BI


 

In this blog, I would like to address how to install the Amazon Redshift driver on the Windows operating system and how to configure the Data Source Name (DSN) to be used with Power BI.

 

Creating a System DSN Entry for an ODBC Connection on Microsoft Windows.

 

After you download and install the ODBC driver, you need to add a data source name (DSN) entry to the client machine. SQL client tools use this data source to connect to the Amazon Redshift database.

Note: If you have installed Power BI (64-bit), make sure to install the Amazon Redshift ODBC Driver (64-bit); for Power BI (32-bit), install the Amazon Redshift ODBC Driver (32-bit).

 

  • To create a system DSN entry

1. In the Start menu, in your list of programs, locate the driver folder or folders.

Note: If you installed the 32-bit driver, the folder is named Amazon Redshift ODBC Driver (32-bit). If you installed the 64-bit driver, the folder is named Amazon Redshift ODBC Driver (64-bit). If you installed both drivers, you'll have a folder for each driver.

2. Click ODBC Administrator, and then type your administrator credentials if you are prompted to do so.

3. Select the System DSN tab if you want to configure the driver for all users on the computer, or the User DSN tab if you want to configure the driver for your user account only.

4. Click Add. The Create New Data Source window opens.


    

 

5. Select the Amazon Redshift ODBC driver, and then click Finish. The Amazon Redshift ODBC Driver DSN Setup window opens.

 

 

6. Under Connection Settings, enter the following information:

 

Data Source Name: Type a name for the data source. You can use any name that you want to identify the data source later when you create the connection to the cluster.

Server: Specify the endpoint for your Amazon Redshift cluster. You can find this information in the Amazon Redshift console on the cluster’s details page.

 

Port: Type the port number that the database uses. By default, Amazon Redshift uses 5439, but you should use the port that the cluster was configured to use when it was launched.

Database: Type the name of the Amazon Redshift database.

Under Credentials, enter the following information:

User: Type the user name for the database user account that you want to use to access the database.

Password: Type the password that corresponds to the database user account.

Click Test. If the client computer can connect to the Amazon Redshift database, you will see the following message: Connection successful.

 

 

Extracting the data from the Amazon Redshift Database for Power BI Desktop.

  • Click on Get Data from the Power BI Desktop console. Then click on Other and further click on ODBC.

 

Click on connect.

Click on the drop down under From ODBC.

 

Select the DSN name which you have created and click on OK.

 

A prompt asking for credentials will pop up. Enter the database credentials and click on Connect.

 

After connecting, load your data and publish it from Power BI Desktop.

 

 

Configuring the data source for the Power BI Service for scheduling a refresh.

  • Click on Settings icon and choose the Manage gateways

 

Click on ADD DATA SOURCE for configuring a data source for your dataset.

 

Fill in the required information and click on Add.

Note: The data source name should be the same as the DSN name, and the connection string should be in the format: dsn=<dsnname>

 

  • Once you have configured the data source add the gateway to your dataset and you are ready to go.

 

Author:  Aishwarya Jaiswal – Support Engineer, SQL Server BI Developer team, Microsoft

Reviewer:  Kane Conway  – Support Escalation Engineer, SQL Server BI Developer team, Microsoft

 

“Pivot Table Operation failed. We cannot locate a server to load the workbook Data Model” error when applying filter in Excel workbook accessed through Power BI Report server


 

Scenario and Issue: -

After configuring Power BI Report Server to use Office Online Server (OOS) to host Excel PowerPivot models (by following the https://docs.microsoft.com/en-us/power-bi/report-server/excel-oos article), when you open an Excel workbook which has a PowerPivot model and try to apply a filter or slicer, you get the below-mentioned error.

 

Error: Pivot Table Operation failed. We cannot locate a server to load the workbook Data Model.

 

Tested Environment: -

  1. Edition: Power BI Report Server - SQL Server Enterprise with Software Assurance
  2. Product Build:0.600.434
  3. SSAS Version: SQL Server 2017
  4. PBIRS Version: Power BI Report Server 2017 Microsoft 1.1.6514.9163 (October 2017)

 

Cause: -

We looked at the ULS logs of OOS at the "C:\ProgramData\Microsoft\OfficeWebApps\Data\Logs\ULS" path and found that the "Check Administrator Access (Server Name\POWERPIVOT): Fail" call failed. This indicates the Network Service account doesn't have admin privileges on the PowerPivot instance.

 

In my case, I was able to resolve the issue by providing administrator permission to NT AUTHORITY\NETWORK SERVICE account on my SSAS PowerPivot instance.

 

To do this: -

  1. Connect to Power Pivot instance of SSAS from SSMS.
  2. Right click on it and then click on properties.

 

  3. Select the Security tab and then click on Add.

 

  4. Provide the NT AUTHORITY\NETWORK SERVICE account and then click OK.

 

  5. Verify that the Network Service account is added successfully, as shown below.

 

 

Author:  Varun Kulkarni – Support Engineer, SQL Server BI Developer team, Microsoft

Reviewer: Sarath Babu Chidipothu – Support Escalation Engineer, SQL Server BI Developer team, Microsoft

 

You cannot publish more than 200 datasets/reports together within a workspace in the Power BI service


 

We have come across an interesting scenario recently while working on one of the customer issues. The issue was that a PBIX file could not be published to the Power BI service in one particular workspace.

 

Power BI Desktop shows the below error while publishing to the service:

 

Error message:

 

To give you more insight on the dataset: it was not huge; it was only a few MBs in size.

We focused on the workspace for more details. The workspace already had a number of datasets, reports, and dashboards published within it:

-> 198 datasets, 190 reports, and 105 dashboards.

 

The dataset was already published to the service; a few report enhancements were made in Power BI Desktop and published back to the service. (Please note: it is not necessary that the same dataset already be present in the same workspace.)

 

We enabled Power BI Desktop tracing:

   ->     Screenshot for enabling tracing in Power BI Desktop:

 

 

The error message found in the trace file:

Look for the error stating the below:

\"httpStatusCode\":\"500\",\"exceptionType\":\"Microsoft.PowerBI.Client.Windows.PowerBIService.PowerBIServiceException\",\"errorCode\":\"ResourceLimitsPackageCountExceeded\",\"errorDetails\":\"{\\\"code\\\":\\\"ResourceLimitsPackageCountExceeded\\\",\\\"pbi.error\\\":{\\\"code\\\":\\\"ResourceLimitsPackageCountExceeded

 

If you see the same error in the trace, it confirms that the workspace to which we are trying to publish the dataset has exceeded its default limit.

 

The reason for the error:

A workspace allows a maximum of only 200 datasets to be published within it. This limit of 200 includes the combination of datasets and reports together. So, if you try to publish beyond this limit, you will get the above error in Power BI Desktop.

 

We have a few workarounds to resolve this issue:

 

** Workaround  1:

-------------------------

-> Flush out the unwanted/unused datasets from the workspace.

 

** Workaround  2:

-------------------------

-> Create another workspace and use it particularly for large datasets.

 

** Workaround 3 :

-------------------------

-> You can also use the Get Data method to import local files into the Power BI service by keeping the PBIX files in OneDrive, any local folder, or any other supported data source location.

 

Note: The Product team is aware of this limitation within the Power BI service UI and is working to provide a resolution soon.

 

 

Author: Firdous Fathima  – Support Engineer, SQL Server BI Developer team, Microsoft

Reviewer:  Krishnakumar Rukmangathan – Support Escalation Engineer, SQL Server BI Developer team, Microsoft

Configure Oracle data source for SQL Server Reporting Services (SSDT and Report Server)


In general, there are several links available to download the ODAC components from the official Oracle site. The link varies across Oracle versions and the type of installer we need. Most of the time, we look for the .exe installer to install the Oracle-related drivers, as we are more comfortable with Windows installers.

In this blog, we will go through the ODAC driver installation and the configuration of a SQL Server Reporting Services (SSRS) data source to connect to an Oracle data source using the Oracle native driver.

 

Assumption:

  1. We would be using a single system to develop the report using SQL Server Data Tools (SSDT) and host the report in Report Server. It means SSDT and SSRS both are installed in the same system.
  2. SSDT version – Visual Studio 2017
  3. SSRS – SQL Server Reporting Services 2014.

 

Steps:

1. SSDT runs in 32-bit whereas SSRS runs in 64-bit. Since both are installed on the same system, we need to install both bitnesses of the Oracle drivers, one by one.

We need to go to the official Oracle site and search for the drivers for the required version. Please note that we need to check for the installer which has the OLE DB / ODP.NET (Oracle Data Provider for .NET) components.

For 64-bit driver-

http://www.oracle.com/technetwork/database/windows/downloads/index-090165.html

(Download link available during the time the blog had been written)

 

For 32-bit driver- http://www.oracle.com/technetwork/topics/dotnet/utilsoft-086879.html

(Download link available during the time the blog had been written)

2. After uncompressing the downloaded file, you will see the installer. Once you double-click on it, the GUI walks you through the installation and is self-explanatory.

Note down where the Oracle driver is getting installed; you will need to place the tnsnames.ora file in that location, which we will discuss later. For me, the installation folders are:

E:\app\client\xxxx\product\12.2.0\client_1 (32 bits)

E:\app\client\xxxx\product\12.2.0\client_2\ (64 bits)

The client_# number will change according to your installation order.

There is no need to update any environment variables manually; this is done during the installation. You can verify this by going to My Computer -> Properties -> Advanced System Settings -> Environment Variables -> System Variables -> Path.
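
As a quick sanity check, you can also list the Oracle client entries on the PATH from PowerShell (a minimal sketch; the folder pattern below assumes the installation paths shown above):

$env:Path -split ';' | Where-Object { $_ -like '*\12.2.0\client_*' }     # should list both the 32-bit and 64-bit client folders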

 

3. Once the 32-bit and 64-bit driver installations are done, you need to place the tnsnames.ora file in the following locations:

E:\app\client\xxxx\product\12.2.0\client_1\Network\Admin

E:\app\client\xxxx\product\12.2.0\client_2\Network\Admin

 

The format of the tnsnames.ora file is as below - Ref: https://docs.oracle.com/cd/F49540_01/DOC/network.815/a67440/appb.htm

<ServerName> =
  (DESCRIPTION =
    (ADDRESS = (PROTOCOL = TCP)(HOST = ##.###.##.###)(PORT = 1521))
    (CONNECT_DATA =
      (SERVER = DEDICATED)
      (SERVICE_NAME = orcl)
    )
  )

 

 

4. So, we have now successfully installed the Oracle drivers. The next step is to test the connection from Visual Studio and SSRS.

 

5. Open SSDT (I tested with SSDT 2017) and create a Report Server Project: File -> New -> Project -> Reporting Services -> Report Server Project.

 

 

6. After creating the Reporting Services project, test the connection by following the below screenshot. Please note that here we are using the OLE DB driver.

 

7. We will now do the same test connection in SSRS. To do that, we need to register the OraOLEDB driver. Open a command prompt (Run as Administrator) and run the registration command. The path may vary as per your installation directory; we need to register the 64-bit driver, since SSRS runs in 64-bit (E:\app\client\xxxx\product\12.2.0\client_2\bin\OraOLEDB12.dll).
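
The exact command is shown in the screenshot of the original post; a typical registration command, assuming the 64-bit installation path above, looks like the below (run from an elevated command prompt):

regsvr32 "E:\app\client\xxxx\product\12.2.0\client_2\bin\OraOLEDB12.dll"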

 

8. Once this is done, after deploying the project, we can do the test connection in SSRS

 

 

9. We can use ODP.NET to connect from SSRS as well. This is the default driver shown in SSDT. We need to register the ODP.NET DLLs in the GAC. Go to the 64-bit installation folder. For me it is -

E:\app\client\xxxx\product\12.2.0\client_2

Run the following commands using the command Prompt (Run as admin).

E:\app\client\xxxx\product\12.2.0\client_2\odp.net\bin\2.x>oraprovcfg /action:gac /providerpath:E:\app\client\xxxx\product\12.2.0\client_2\odp.net\bin\2.x\Oracle.DataAccess.dll
E:\app\client\xxxx\product\12.2.0\client_2\odp.net\bin\4>oraprovcfg /action:gac /providerpath:E:\app\client\xxxx\product\12.2.0\client_2\odp.net\bin\4\Oracle.DataAccess.dll

 

 

10. Once it is done, we can test the connection from SSRS, and as well as from SSDT.

 

This is how you can test the connectivity from SSRS to the Oracle database. If it is not working, you should first test the Oracle connection outside of SSRS. If that works and only connections made from SSRS fail, then I would recommend you contact the Microsoft Support team.

 

 

Author:     Samarendra Panda - Support Engineer, SQL Server BI Developer team, Microsoft

Reviewer:  Krishnakumar Rukmangathan – Support Escalation Engineer, SQL Server BI Developer team, Microsoft


SSAS translation for client application


Tested on: Excel 2013, Excel 2016, Reporting Services 2012/2014/2016.

 

Hello everyone, in this blog I will discuss how to view SSAS translations in different client applications.

 

Before I begin with the details of this blog, if you don’t know what Translation in SSAS is, I request you to check out the link below:

https://docs.microsoft.com/en-us/sql/analysis-services/translation-support-in-analysis-services

 

Requirement: I am sure everyone wants to view the translation language of an SSAS cube in the client application they use to pull data from SSAS. So, how can we do it?

Well, here I will talk about three client applications - Excel, Reporting Services and Power BI.

 

For Excel:

 

  1. Open an Excel sheet, go to the Data section -> Get Data and choose From Analysis Services.
  2. Enter the server name and credentials, choose the database and cube that you want, and click Next.
  3. On the next page, note down the File Name where the connection file will be saved (it will have an .odc extension), click Finish and then import the data.
  4. Now close Excel and go to the folder where the .odc file is saved.
  5. Open the .odc file in Notepad.
  6. In the Connection String section, you will have to add a parameter called “Locale Identifier” (a sample connection string is shown after this list).
  7. The default is 1033, which is English (e.g.: Locale Identifier=1033).
  8. You need to set the value as per your translation language in SSAS. Please see the link below for the Locale Identifier code for each language.

https://msdn.microsoft.com/en-us/library/ms912047(v=winembedded.10).aspx

  9. Once the Locale Identifier is set, save the file.
  10. Now open a new Excel file and go to the Data section.
  11. Click on Existing Connections and choose the connection which you created recently.
  12. Once done, import the data and you will see that the language shows as per the SSAS translation language.
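
For reference, the relevant part of the .odc connection string would look something like the below; the server and database names are placeholders, and 1034 corresponds to Spanish (Spain), the example used in the SSRS section further down:

Provider=MSOLAP;Integrated Security=SSPI;Data Source=MySSASServer;Initial Catalog=AdventureWorks;Locale Identifier=1034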

 

For Reporting Services:

 

  1. Go to create Data Source section.
  2. You can either choose to write the connection string or you can also choose to Build the connection string.
  3. You can do this for both shared Data source and embedded data source.
  4. While writing a connection string, apart from the Data Source and Initial Catalog, add the parameter “Locale Identifier=1034” (in my case I have used Spanish (Spain)); a sample connection string is shown after this list.
  5. If you choose to build the connection string, then click on Advanced button and look for Locale Identifier parameter.
  6. Set the Locale Identifier value as per your translation language in SSAS by referring the link above.
  7. Once done, save the data source.
  8. Now while creating the data set you will see that the Language will show as per the SSAS translation language.
  9. Please note that the report must be created based on the SSAS translation and the translation cannot be changed while viewing the report.
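
For example, a data source connection string with the translation applied might look like the below (server and database names are placeholders):

Data Source=MySSASServer;Initial Catalog=AdventureWorks;Locale Identifier=1034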

 

For Power BI:

 

Well, for Power BI we don’t have official support for SSAS translations yet from Microsoft, and there is no ETA on when it will be supported.

HTH

 

Author:      Jaideep Saha Roy – Support Engineer, SQL Server BI Developer team, Microsoft

Reviewer:  Kane Conway – Support Escalation Engineer, SQL Server BI Developer team, Microsoft

Dynamic RLS support for Analysis Service Tabular Model based on multiple roles for each user


Tested on: SQL Server Analysis Service 2016 Tabular, Azure Analysis Service.

 

Hello everyone, I am sure that whenever you wanted to implement Row Level Security (RLS) for Analysis Services Tabular mode, you might have wondered how to implement RLS when some of your users have multiple roles assigned. Well, here is the solution.

Now, before going into the details of this blog, if Row Level Security in Tabular mode is still new to you, I would recommend the Microsoft document below, which gives a clear picture of how to implement Row Level Security.

https://docs.microsoft.com/en-us/power-bi/desktop-tutorial-row-level-security-onprem-ssas-tabular

Coming back to my question, let's assume that you have two fact tables, namely FactInternetSales and FactResellerSales, with a dimension table named DimSalesTerritory.

 

Requirement: A user will have access to FactInternetSales for one territory but FactResellerSales for a different territory.

Using the conventional RLS setup, this requirement won't be possible, so here is a different way to set up RLS.

 

Setup: On the SQL Server side, to support this setup we have created a dimension table called DimUserSecurity as below:

Here I have used two columns, one for the SalesTerritory region and one for the ReSalesTerritory region, to assign each user the territory according to the respective fact table.

 

Project Creation: From the Visual Studio we have created Tabular Project and imported the tables as shown below:

 

Please note the relationships that I have built.

DimUserSecurity has two relationships with DimSalesTerritory:

DimUserSecurity.SalesTerritoryID --> DimSalesTerritory.SalesTerritoryKey

DimUserSecurity.ReSalesTerritoryID --> DimSalesTerritory.SalesTerritoryKey

 

Here, one of the relationships is active and the other is inactive.

Now, I have created a role named SalesTerritoryUsers, given it read permission on the model, and added all the members that are part of the DimUserSecurity table.

Now, for the Row Filters, I have added a DAX filter to each of the fact tables:

 

=FactInternetSales[SalesTerritoryKey]=LOOKUPVALUE(DimUserSecurity[SalesTerritoryID], DimUserSecurity[UserName], USERNAME(), DimUserSecurity[SalesTerritoryID], FactInternetSales[SalesTerritoryKey])

=FactResellerSales[SalesTerritoryKey]=LOOKUPVALUE(DimUserSecurity[ResalesTerritoryID], DimUserSecurity[UserName], USERNAME(), DimUserSecurity[ReSalesTerritoryID], FactResellerSales[SalesTerritoryKey])

 

This DAX filter matches the logged-in user against the DimUserSecurity table, picks the SalesTerritoryID or ReSalesTerritoryID from that table, matches it with FactInternetSales or FactResellerSales, and returns only the data for the territory that is assigned to the user.

Once everything is set, I have saved the model and deployed it to my Analysis Services instance.

 

Result:

Now, to test it, I have browsed the model from Management Studio as the user Harpreet (xyz\harpsi), who has access to FactInternetSales for the Australia region and FactResellerSales for the Germany region.

Upon browsing the fact tables by SalesTerritory region, it worked completely fine for me. Please refer to the screenshot below.

 

Additional Requirement: Let's say you have an additional requirement like mine, where you want to give one user permission for more than one territory in a fact table. This can also be done with the above approach. All you have to do is add another row with the territory ID and the user details in the DimUserSecurity table. Please refer to the table below.

 

EmployeeID | SalesTerritoryID | ReSalesTerritoryID | FirstName | LastName | UserName
1          | 1                | 6                  | Mani      | Jacob    | xyz\majac
2          | 2                | 7                  | Kane      | Conway   | xyz\kaneco
3          | 9                | 8                  | Harpreet  | Singh    | xyz\harpsi
3          | 3                | NULL               | Harpreet  | Singh    | xyz\harpsi
2          | NULL             | 6                  | Kane      | Conway   | xyz\kaneco

 

Here my user Harpreet has access to FactInternetSales for two territories, 9 and 3 (Australia and Central), whereas he has access to only one territory for FactResellerSales.

This option is very helpful for such out-of-the-box requirements, where a user is assigned to different roles for different departments.

 

Hope this helps you as well.

 

Author:      Jaideep Saha Roy – Support Engineer, SQL Server BI Developer team, Microsoft

Reviewer:  Kane Conway – Support Escalation Engineer, SQL Server BI Developer team, Microsoft

Proactive caching with in-Memory tables as notification


 

Recently we encountered an issue using a Multidimensional model where ROLAP and proactive caching were enabled on one of the partitions. The notification was set to a SQL Server table to track changes and refresh the cache.

 

The behavior we noticed was that if the SQL Server table was a memory-optimized table, Analysis Services did not receive a notification for any changes within the table. But if the SQL Server table was not memory-optimized, the notifications were sent to Analysis Services and the cache was refreshed.

 

  • We used AdventureWorks to understand this behavior, with the partition Customers_2005 (Measure group: Internet Customer) configured as ROLAP with proactive caching enabled.

 

 

  • After deploying the model to an SSAS 2016 instance and making changes within SQL Server ([FactSalesQuota]), we could see this notification in the SSAS Profiler trace: “A notification was received from an instance of SQL Server for the '[dbo].[FactSalesQuota]' table”.
  • The data was updated on the SSAS side as well:

 

Before the changes SSAS DB:

Sales Amount Quota : 154088000

After the changes SSAS DB (Sales Quota partition converted to ROLAP):

Sales Amount Quota : 154088004.7

 

  • We then converted the FactSalesQuota table to an in-memory table and tested the behavior. This time, we didn’t see any notification in the Analysis Services Profiler trace.
  • We took Profiler traces on SQL Server and SSAS simultaneously and saw the below query getting triggered (in the SQL-side Profiler) whenever there was a change in the table:

 

DECLARE @OlapEvent BIGINT;
SELECT @OlapEvent = ObjIdUpdate(2);
SELECT (@OlapEvent & convert(bigint, 0xffff000000000000)) / 0x0001000000000000 AS Status,
       (@OlapEvent & convert(bigint, 0x0000ffff00000000)) / 0x0000000100000000 AS DbId,
       @OlapEvent & convert(bigint, 0xffffffff) AS ObjId;

 

  • But when we converted the table to an in-memory table, the query was not getting triggered and we did not see any notification back on SSAS either.
  • After more research, it seems this query keeps running in a suspended state all this while.
  • For the normal [FactSalesQuota] table (not in-memory), we see the below:

 

SQL Profiler: I could see the notification query getting triggered:

 

SSAS Profiler: SSAS is receiving a notification:

 

 

  • Once we converted the FactSalesQuota table to an in-memory table, I still see the notification query running in a suspended state.

 

 

 

 

  • But after we make the change to the table, the notification query is not triggered.

 

 

No notification seen in SSAS:

 

 

  • We verified this behavior with our PG team and understood that SSAS relies on the SQL Server notification to know if and when any changes have been made to the SQL table, and only then does it initiate a cache refresh.

 

 

 

 

Conclusion:

 

Proactive caching with SQL Server notifications does not work for in-memory (memory-optimized) tables in SQL Server. This is a limitation on the SQL Server side itself.

 

 

 

Author:      Chandan Kumar – Support Engineer, SQL Server BI Developer team, Microsoft

Reviewer:  Kane Conway – Support Escalation Engineer, SQL Server BI Developer team, Microsoft

Invalid Column __$command_id issue during CDC implementation in SSIS package


 

In this blog, we will be addressing a problem with the CDC implementation in an SSIS package, which fails with the error ‘invalid column name __$command_id’.

 

CDC stands for Change Data Capture. The CDC components for SSIS were introduced with SQL Server 2012: the CDC Control Task in the control flow, and the CDC Source and CDC Splitter in the data flow. The CDC processing logic is split into two packages – an Initial Load package that reads all the data in the source table, and an Incremental Load package that processes change data on subsequent runs.

 

Issue:

While previewing the CDC Source component in the Incremental Load package, we get the error: ‘invalid column name __$command_id’.

 

 

 

Solution:

  1. Ensure we are on the latest version of SQL Server.
  2. If the issue persists:
  • Download the Attunity DLLs from the following site, based on your SQL Server/SSIS version.
  • Place the DLLs in the right locations and GAC them.
    • We can find 4 DLLs in $\bin of the downloaded folder:
      • Attunity.SqlServer.CDCControlTask.dll
      • Attunity.SqlServer.CDCDesign.dll
      • Attunity.SqlServer.CDCSplit.dll
      • Attunity.SqlServer.CDCSrc.dll
  • You need to copy each of them to:
    • the x86 folder
    • the x64 folder
  • Further GAC them so that they appear in the GAC_MSIL folder (see the example after this list).
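
For the GAC step, here is a minimal sketch using gacutil.exe (it ships with the Windows SDK / Visual Studio, so its path varies by version); run the commands from an elevated prompt in the folder containing the DLLs:

gacutil.exe /i Attunity.SqlServer.CDCControlTask.dll
gacutil.exe /i Attunity.SqlServer.CDCDesign.dll
gacutil.exe /i Attunity.SqlServer.CDCSplit.dll
gacutil.exe /i Attunity.SqlServer.CDCSrc.dll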

 

 

Detailed locations are here:

 

 

Please reach out to the SQL Server Integration Services support team if you are still facing any issues.

 

 

 

Author:      Chetan KT - Technical Advisor, SQL Server BI Developer team, Microsoft

Reviewer:  Krishnakumar Rukmangathan – Support Escalation Engineer, SQL Server BI Developer team, Microsoft

 

Connecting from SSRS 2016 to SSAS using HTTP/MSMDPump and Basic Authentication – “Object reference not set to an instance of an object”
