Wednesday, March 20, 2024

Power BI deployment - Premium backend services error

Issue: The deployment pipeline fails from UAT to Production for a semantic model with the below error:

An error occurred in Power BI Premium backend services. Other As Persistence error. 

The artifact couldn't be deployed to the target workspace. Try deploying the content again.



Cause - Neither the Power BI forums nor any other resource seems to have an explanation for this, since the backend services are handled by Microsoft. I read through multiple forums to see what is happening here, with little luck. The only thing I could infer is that it seems to happen for data models with two or more sources, which is the case for us.

My guess is that Microsoft is already aware of this, as it has been happening since 2023 according to some posts I saw on the community.

Resolution: This is more of a workaround than a solution, suggested by one of the users on the community forum page linked below, and it worked for us too.

  • Deploy the Power BI dataset to the workspace through the pipeline and let it error out.
  • Once the deployment pipeline has been triggered, take the original Power BI semantic model/dataset PBIX file and publish it directly to another workspace; in my case I published it to "My workspace".
  • After manually publishing the semantic model/dataset PBIX to a separate workspace, go back to the original deployment pipeline and repeat the deployment from the lower environment to the environment where it failed last time. This should work; it did for us.

Weird as it may sound, it seems to be working for us and a few other people.
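
If you find yourself repeating this workaround often, the manual publish and the pipeline retry can in principle be scripted against the Power BI REST API. Below is a rough sketch, not a tested implementation; the token, pipeline id, dataset name, PBIX path and stage order are all placeholders you would replace with your own values.

import requests

TOKEN = "<AAD access token with Power BI scopes>"   # placeholder
PIPELINE_ID = "<deployment pipeline id>"            # placeholder
PBIX_PATH = "semantic_model.pbix"                   # placeholder file name

headers = {"Authorization": f"Bearer {TOKEN}"}

# Workaround step 2: publish the PBIX directly to My workspace
# via the Imports API (multipart upload).
with open(PBIX_PATH, "rb") as f:
    resp = requests.post(
        "https://api.powerbi.com/v1.0/myorg/imports",
        headers=headers,
        params={"datasetDisplayName": "semantic_model",
                "nameConflict": "CreateOrOverwrite"},
        files={"file": f},
    )
resp.raise_for_status()

# Workaround step 3: retry the pipeline deployment from the lower stage.
# Stage order 0 = Development, 1 = Test/UAT - adjust to your pipeline.
resp = requests.post(
    f"https://api.powerbi.com/v1.0/myorg/pipelines/{PIPELINE_ID}/deployAll",
    headers=headers,
    json={"sourceStageOrder": 1,
          "options": {"allowCreateArtifact": True,
                      "allowOverwriteArtifact": True}},
)
resp.raise_for_status()
print("Deployment triggered:", resp.status_code)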

References - https://community.fabric.microsoft.com/t5/Service/An-error-occurred-in-Power-BI-Premium-backend-services/m-p/3074121 

Tuesday, February 13, 2024

Synapse DevOps deployment error - service principal expiration

Issue: Deploying code from the development environment in Synapse to higher environments using Azure DevOps failed with the below error:

Encountered with exception:Error: Get workspace location error: Could not fetch access token for Azure. Verify if the Service Principal used is valid and not expired. For more information refer https://aka.ms/azureappservicedeploytsg

An error occurred during execution: Error: Get workspace location error: Could not fetch access token for Azure. Verify if the Service Principal used is valid and not expired.


Cause: If you are using Azure DevOps, you should know that it needs access to all the environments you are moving code between, and it uses a service principal to get access to those resources. When your IT admin creates the service principal used for Azure DevOps, they normally set a period after which its secret expires. We went past that threshold, hence the error.
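
If you want to catch this before it bites again, you can check the expiry dates of the secrets on the app registration. Here is a rough sketch against Microsoft Graph; the token and app id are placeholders, and the token needs the Application.Read.All permission.

import requests
from datetime import datetime, timezone

TOKEN = "<Microsoft Graph access token>"               # placeholder
APP_ID = "<application (client) id of the DevOps SP>"  # placeholder

resp = requests.get(
    "https://graph.microsoft.com/v1.0/applications",
    headers={"Authorization": f"Bearer {TOKEN}"},
    params={"$filter": f"appId eq '{APP_ID}'"},
)
resp.raise_for_status()
app = resp.json()["value"][0]

# Each client secret on the app registration carries an endDateTime.
for cred in app.get("passwordCredentials", []):
    expires = datetime.fromisoformat(cred["endDateTime"].replace("Z", "+00:00"))
    days_left = (expires - datetime.now(timezone.utc)).days
    print(f"Secret '{cred.get('displayName')}' expires {expires:%Y-%m-%d} "
          f"({days_left} days left)")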

Resolution: Ask your Windows AD/IT admin to create a new secret value for the service principal, then use it in your DevOps project by following the steps below.

Go to Azure DevOps.

Navigate to Project Settings > Pipelines > Service Connections, then edit the connection. This opens a prompt where you can update the service principal key you received from your Windows AD/Entra admin: paste it, verify, and save.



Once the new service principal key is in place, you should be ready to start deploying again.
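
As a quick sanity check, you can also verify that the new secret actually works by requesting a token with it yourself. Below is a minimal sketch using the msal Python package; the tenant id, client id and secret are placeholders.

import msal

TENANT_ID = "<your tenant id>"                       # placeholder
CLIENT_ID = "<service principal application id>"     # placeholder
CLIENT_SECRET = "<the new secret from your admin>"   # placeholder

app = msal.ConfidentialClientApplication(
    CLIENT_ID,
    client_credential=CLIENT_SECRET,
    authority=f"https://login.microsoftonline.com/{TENANT_ID}",
)

# Request an ARM token - the same kind of token the DevOps service
# connection fails to fetch when the secret has expired.
result = app.acquire_token_for_client(
    scopes=["https://management.azure.com/.default"])

if "access_token" in result:
    print("Secret is valid - token acquired.")
else:
    print("Token request failed:", result.get("error_description"))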

Wednesday, February 7, 2024

Working with the com.crealytics.spark.excel package for Excel files in Azure Synapse

This is a post to help at least some of you who are trying to get the com.crealytics.spark.excel package up and running in your Synapse workspace and on your Spark pool. I will try to explain it in the simplest of steps.

Step 1 - Go to the MVN repository and download the latest JAR file for the crealytics spark-excel package.



Step 2 - Once the file is downloaded, go to your Synapse workspace, to the Manage tab, and then to the Workspace packages tab.


Step 3 - Upload the JAR file to Workspace packages; it should show up on the list with a provisioning status of Succeeded.



Step 4 - Once the package is uploaded, go to Manage > Spark pool > Packages and select spark-excel_2.12-3.5.0_0.20.3.jar from the list. It is important that session-level packages are allowed and that the Spark pool is restarted after this step.


That's it on the configuration side. Now, in your notebook, you can use a code snippet like the one below to read from an Excel file.

df = (spark.read.format("com.crealytics.spark.excel")
      .option("header", "true")        # first row contains column names
      .option("inferSchema", "true")   # let Spark infer the column types
      .load(ReadPath))

where ReadPath contains the path to the Excel file in your data lake. You can play around with more options on this piece of code. Hope this helps; please let us know in the comments.
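
For example, the package's dataAddress option lets you point the read at a specific sheet or cell range; the sheet name below is just an example.

# Read only Sheet1, starting at cell A1.
df_sheet = (spark.read.format("com.crealytics.spark.excel")
            .option("header", "true")
            .option("inferSchema", "true")
            .option("dataAddress", "'Sheet1'!A1")   # example sheet/range
            .load(ReadPath))

df_sheet.show(5)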


Note: If you have higher environments, make sure you repeat the steps there.

Friday, January 26, 2024

Copy files from Azure Data Lake to a SharePoint Site

Use Case: We have output files from a use case processed using PySpark in Synapse and loaded into a path in Azure Data Lake as CSV and Excel files, but we wanted to put these files on a SharePoint site, which is easier for users in the organization to access.

As soon as we started researching, we hit a limitation (as of 26-01-2024) in achieving this use case with ADF inside Azure Synapse Analytics: a SharePoint site is not yet supported as a sink in ADF/Synapse.

The only other alternative is to do this using a Logic App, and the steps are fairly simple.

At a high level, the steps are as below:

List Blobs from Azure Data Lake - here you define the storage account and the connection method (we used an access key).


In the next step, we have a For Each control loop which iterates over the output of the List Blobs step, then gets the blob content (pass in the path to the file), and this blob content is then passed on to a SharePoint 'Create file' step.

It is important that the path is set up correctly and that the security credentials used have adequate access to the SharePoint site.
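
If a Logic App is not an option for you, the same flow can also be scripted, for example from a Synapse notebook: list the blobs with the azure-storage-blob package and push each file into the SharePoint document library through the Microsoft Graph upload endpoint. This is only a sketch; the connection string, drive id and target folder are placeholders, and the simple PUT upload below only suits small files (larger files need a Graph upload session).

import requests
from azure.storage.blob import ContainerClient

GRAPH_TOKEN = "<Graph token with Sites.ReadWrite.All>"      # placeholder
DRIVE_ID = "<drive id of the SharePoint document library>"  # placeholder

container = ContainerClient.from_connection_string(
    "<storage connection string>", container_name="output")  # placeholders

for blob in container.list_blobs(name_starts_with="usecase/output/"):
    content = container.download_blob(blob.name).readall()
    file_name = blob.name.split("/")[-1]
    # Simple upload: PUT the bytes into a 'Reports' folder in the library.
    resp = requests.put(
        f"https://graph.microsoft.com/v1.0/drives/{DRIVE_ID}"
        f"/root:/Reports/{file_name}:/content",
        headers={"Authorization": f"Bearer {GRAPH_TOKEN}"},
        data=content,
    )
    resp.raise_for_status()
    print("Uploaded", file_name)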



Thing to note: the 'Create file' action comes with its own set of limitations, as all things Microsoft do: it cannot move a file larger than 1 GB. This is something we will have to live with for now (update as of 26th Jan).

Friday, November 24, 2023

Find Log Analytics Key of Azure Log Analytics Instance

Issue: Find the keys to connect to a Log Analytics instance in Azure.

Solution: You can find this under Log Analytics Workspace > Agents > Log Analytics Agent Instructions.
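
If you need the same keys programmatically, the azure-mgmt-loganalytics SDK exposes them as well. A rough sketch; the subscription, resource group and workspace names are placeholders.

from azure.identity import DefaultAzureCredential
from azure.mgmt.loganalytics import LogAnalyticsManagementClient

SUBSCRIPTION_ID = "<subscription id>"             # placeholders
RESOURCE_GROUP = "<resource group name>"
WORKSPACE_NAME = "<log analytics workspace name>"

client = LogAnalyticsManagementClient(DefaultAzureCredential(), SUBSCRIPTION_ID)

# The shared keys are the same values shown under Agents in the portal.
keys = client.shared_keys.get_shared_keys(RESOURCE_GROUP, WORKSPACE_NAME)
print("Primary key:", keys.primary_shared_key)

# The workspace (customer) id comes from the workspace resource itself.
ws = client.workspaces.get(RESOURCE_GROUP, WORKSPACE_NAME)
print("Workspace id:", ws.customer_id)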