Query Office 365 and Azure AD Logs with Azure Sentinel / Log Analytics and KQL

A few months ago I shared a tweet with a few quick links for learning about Kusto Query Language (KQL) and Azure Log Analytics.  Since that time Azure Sentinel (which sits on top of Azure Log Analytics) has been released to general availability (GA).  In this post I’ll build on that tweet and share a number of resources for starting out with Azure Sentinel / Azure Log Analytics and KQL.

Before you continue with this post I highly recommend reading MVP Tobias Zimmergren’s post on Monitoring Office 365 tenants with Azure Sentinel.

Monitoring Office 365 tenants with Azure Sentinel
https://zimmergren.net/use-azure-sentinel-to-digest-office-365-audit-logs/

Background

Most Microsoft cloud services emit logs for audit, operational, and other purposes.  These logs are useful for gaining insight into who is using a service and how they are using it, but it is not always easy to query these services directly.  You might be restricted to only a few thousand records at a time, a limited set of filters, or other constraints.  Azure Log Analytics and KQL make it possible to query a large number of records (in my experience millions to hundreds of millions) in a short time period (seconds in most cases instead of minutes or hours).

Kusto Query Language (KQL)

Over the years I’ve used T-SQL to query SQL Server when needed, but I am by no means an expert in the T-SQL language or concepts.  Having a basic understanding of T-SQL did make it easier for me to pick up the entry-level concepts of KQL such as filtering, ordering, grouping, and more.  Intermediate to advanced concepts like time-based aggregations or self-referential queries took a little more time to understand, but my Data & AI PFE peer Ken Kilty provided a lot of good advice in this space.

Kusto Query Language overview
https://docs.microsoft.com/en-us/azure/kusto/query/

SQL to Kusto query translation
https://docs.microsoft.com/en-us/azure/kusto/query/sqlcheatsheet
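
To get a feel for the syntax, here is a minimal sketch of running a KQL query against a Log Analytics workspace from PowerShell.  It assumes the Az.OperationalInsights module and an existing Azure Sentinel enabled workspace; the workspace ID is a placeholder, and the OfficeActivity table is populated by the Office 365 data connector.

# Minimal sketch: count Office 365 activity records per user per day for the last 7 days.
# The workspace ID below is a hypothetical placeholder.
$workspaceId = "00000000-0000-0000-0000-000000000000"

$query = @"
OfficeActivity
| where TimeGenerated > ago(7d)
| summarize Operations = count() by UserId, bin(TimeGenerated, 1d)
| order by Operations desc
"@

$result = Invoke-AzOperationalInsightsQuery -WorkspaceId $workspaceId -Query $query
$result.Results | Format-Table UserId, TimeGenerated, Operations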

Limiting Costs

Azure Sentinel (and by extension Azure Log Analytics) is billed in two ways:

  • Ingestion of data
    • Reserved capacity
    • -or-
    • Pay-as-you-go per GB
  • Retention of data

Azure Sentinel pricing
https://azure.microsoft.com/en-us/pricing/details/azure-sentinel/

Ingestion of Office 365 data is free.  Azure AD audit logs and sign-in logs are charged according to the reserved capacity or pay-as-you-go per GB model.

Retention of data in an Azure Sentinel enabled workspace is free for the first 90 days.  Beyond the first 90 days, retention is priced per GB per month.

For example, a customer storing Office 365 logs for 9 months would only be charged for 9 months – 3 free months = 6 paid months of retention.

Azure Monitor pricing
https://azure.microsoft.com/en-us/pricing/details/monitor/

If you are looking to test out the service “for free”, it is possible to configure an Azure Sentinel enabled workspace to ingest Office 365 data and limit the retention of data to 90 days or less.  Once you are comfortable with the data schema and writing queries you can increase the retention period.
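
As a rough sketch (assuming the Az.OperationalInsights module; the resource group and workspace names below are placeholders I made up), the retention period can be set from PowerShell:

# Keep retention at the free 90-day window for an existing workspace.
# Resource group and workspace names are hypothetical placeholders.
Set-AzOperationalInsightsWorkspace `
    -ResourceGroupName "rg-sentinel-demo" `
    -Name "la-sentinel-demo" `
    -RetentionInDays 90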

Sample Queries

In the course of working with customers on monitoring their Office 365 environments, my teammates and I have developed a number of KQL queries to find “interesting” data points.  I’ve shared these queries in the following GitHub repo.  Note that these sample queries are provided as-is with no warranty.  If you have any queries of your own that you would like to contribute, feel free to submit a pull request (or open an issue) on the repo for review.

Office 365 and Azure AD sample KQL Queries
https://github.com/BrianTJackett/log-analytics-samples

Conclusion

Last year I had never heard of Log Analytics, Azure Sentinel, or KQL.  This year I am seeing them pop up in so many places inside and outside of Microsoft.  I am very eager to see where these technologies go and to spread the word about them.  There are genuinely interesting problems that can be solved with enough data and the right query.  Hopefully this post will give you a nudge in the right direction to start (or continue) looking at these technologies.

-Frog Out

How To Self Destruct Azure Resource Groups with Alert and Logic App

I was reading a blog post by Mark Heath on This resource group will self destruct in 30 minutes, which leverages an open source Azure CLI extension to automatically delete resource groups.  The extension accomplishes the deletion by scheduling a logic app in the same resource group.  I was curious whether I could achieve the same effect without the Azure CLI (i.e. for any resource group created via the portal, PowerShell, etc.).  In this post I’ll show how to configure a “self destruct” mechanism using an Azure Alert and a Logic App.

Solution

Here are the high level steps to follow.

  1. Create Resource Group – Create a resource group that will host the subsequent resources
  2. Create Logic App – Create a logic app that triggers on an Azure Activity Log HTTP request (details will be filled out later)
  3. Activity Log alert – Create an Azure Activity Log alert that triggers the logic app
  4. Trigger Alert – Create a resource group, which triggers the Activity Log alert and consequently the logic app
  5. Gather JSON schema – Get the JSON schema for the HTTP request sent to the logic app
  6. Finalize Logic App – Add the JSON schema from the alert HTTP request and the action to delete the resource group to the logic app

1) Create Resource Group

If you don’t already have one, create a resource group to host the resources for the self destruct solution.

2) Create Logic App

Create a logic app in the resource group from previous step.  Add a trigger for When a HTTP request is received.  We will not fill out any of the details for the trigger at this time.  Click Save.

AzureRGSelfDestruct8.jpg

3) Activity Log alert

Navigate to Azure Monitor (you can search from the top search bar or go through All Services) for the subscription where the solution will be deployed.

AzureRGSelfDestruct10.jpg

Click Alerts on the left hand navigation.  Click New Alert Rule.

AzureRGSelfDestruct11

Define the resource to be monitored:

  • Resource:
  • Condition:
    • Signal Type: Activity Log
    • Monitor Service: Activity Log – Administrative
    • Signal Name: Create Resource Group (subscriptions/resourceGroups)
    • Alert Logic – Status: Succeeded
  • Action Groups:
    • Action Group Name: SelfDestructResourceGroup
    • Short Name: SelfDestruct
    • Subscription:
    • Resource Group:
    • Action Group Action: LogicApp
      • Subscription:
      • Resource Group:
      • Logic App:
AzureRGSelfDestruct2.jpg
Condition – configure signal logic part 1
AzureRGSelfDestruct3.jpg
Condition – configure signal logic part 2
AzureRGSelfDestruct13
Create new alert action group with a logic app action
AzureRGSelfDestruct4.jpg
Create a new alert action to go in action group
AzureRGSelfDestruct5.jpg
Overview screenshot for rule creation

4) Trigger Alert

Create a new resource group that will then trigger the alert defined in step 3 and consequently fire the HTTP trigger for the logic app defined in step 2.
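
If you prefer to do this from PowerShell, a minimal sketch looks like the following (the resource group name and location are placeholders I made up):

# Creating any new resource group in the monitored subscription should fire the Activity Log alert.
# The name and location below are hypothetical placeholders.
New-AzResourceGroup -Name "rg-selfdestruct-test" -Location "eastus"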

5) Gather JSON Schema

Navigate to the logic app defined in step 2.  Assuming the logic app was successfully triggered, under Run history select the successful logic app execution.  On the “logic app run” screen expand the trigger and click on Show raw outputs.

AzureRGSelfDestruct14

Save this JSON for use in the next step.  If you are unable to collect this JSON schema you can use the sample below, which is already converted to the final format needed.

Note: the resourceGroupName element is buried many levels deep.  We will reference it with an expression when needed.

{
    "properties": {
        "body": {
            "properties": {
                "data": {
                    "properties": {
                        "context": {
                            "properties": {
                                "activityLog": {
                                    "properties": {
                                        "authorization": {
                                            "properties": {
                                                "action": { "type": "string" },
                                                "scope": { "type": "string" }
                                            },
                                            "type": "object"
                                        },
                                        "caller": { "type": "string" },
                                        "channels": { "type": "string" },
                                        "claims": { "type": "string" },
                                        "correlationId": { "type": "string" },
                                        "description": { "type": "string" },
                                        "eventDataId": { "type": "string" },
                                        "eventSource": { "type": "string" },
                                        "eventTimestamp": { "type": "string" },
                                        "httpRequest": { "type": "string" },
                                        "level": { "type": "string" },
                                        "operationId": { "type": "string" },
                                        "operationName": { "type": "string" },
                                        "properties": {
                                            "properties": {
                                                "responseBody": { "type": "string" },
                                                "serviceRequestId": {},
                                                "statusCode": { "type": "string" }
                                            },
                                            "type": "object"
                                        },
                                        "resourceGroupName": { "type": "string" },
                                        "resourceId": { "type": "string" },
                                        "resourceProviderName": { "type": "string" },
                                        "resourceType": { "type": "string" },
                                        "status": { "type": "string" },
                                        "subStatus": { "type": "string" },
                                        "submissionTimestamp": { "type": "string" },
                                        "subscriptionId": { "type": "string" }
                                    },
                                    "type": "object"
                                }
                            },
                            "type": "object"
                        },
                        "properties": {
                            "properties": {},
                            "type": "object"
                        },
                        "status": { "type": "string" }
                    },
                    "type": "object"
                },
                "schemaId": { "type": "string" }
            },
            "type": "object"
        },
        "headers": {
            "properties": {
                "Connection": { "type": "string" },
                "Content-Length": { "type": "string" },
                "Content-Type": { "type": "string" },
                "Expect": { "type": "string" },
                "Host": { "type": "string" },
                "User-Agent": { "type": "string" },
                "X-CorrelationContext": { "type": "string" }
            },
            "type": "object"
        }
    },
    "type": "object"
}

6) Finalize Logic App

Edit the Logic App.

Inside the HTTP request trigger either…

  • click “sample payload to generate schema” and paste in your sample JSON payload from step 5.

-or-

  • paste the sample I provided into the Schema input.

Next add an action for Delete a resource group (currently in preview at time of writing).

AzureRGSelfDestruct12

Fill in the following values:

  • Subscription:
  • Resource Group: Expression = "triggerBody().data.context.activityLog.resourceGroupName"

AzureRGSelfDestruct7.jpg

The final logic app will look similar to the following:

AzureRGSelfDestruct6.jpg

Test Solution

Test out the solution by adding a new resource group to the monitored subscription.  Check the Azure Monitor alerts to see if the Activity Log entry for successful resource group creation triggered the Logic App.

AzureRGSelfDestruct16

Then verify the logic app execution history shows the resource group deletion was successful.

AzureRGSelfDestruct15

Next steps

More than likely you will not want to delete the resource group right after you create it.  In that case you can add a Delay action which pauses the logic app for a specified number of minutes (ex. 60 minutes).  Additionally you could apply conditional logic to check whether the name of the resource group matches (or doesn’t match) a specific pattern.  There are many ways to extend this solution to fit your needs.

Conclusion

In this post I walked through an adaptation of the Azure CLI extension that Mark Heath linked to.  We leveraged an Azure Monitor alert together with a logic app to provide an option to self-destruct any resource group, no matter where or how it was created.  If you have any feedback or additional suggestions feel free to leave them in the comments.

-Frog Out

Sample Deploying Multiple Azure ARM Templates Using External Links

In this post I’ll share a set of linked Azure ARM templates (link) that can be used to deploy a number of Azure resources together.  The sample uses externally linked ARM templates in a parent-child relationship and includes a few “complex” scenarios such as conditional logic and assigning permissions through role assignments.  Note there are also two ways to deploy these templates: 1) at the subscription level (owner) and 2) at the resource group level (contributor).

<Update 2019-04-09>After reading this article on ARM template lifecycle management dos and don’ts, I’m rethinking the use of externally linked ARM templates.  The example below still contains them, but I may update it to combine everything into a single ARM template at a later date once I’ve had a chance to test.</Update>

Background

For a recent customer project we had a need to deploy the following resources.

  • Resource group
    • Azure Storage Account
    • Azure Function App
      • Azure Application Insights
    • Azure Log Analytics workspace
    • Azure Automation Account

A few of these resources have dependencies on each other, such as the Function App requiring a storage account to store solution data in blobs / queues, the Automation Account requiring Log Analytics for pushing diagnostic stream data to a workspace, etc.  While it is possible to deploy all of these resources in one (very) large ARM template, we also wanted to be able to re-use pieces of this solution in future projects that will have a similar (but slightly different) architecture.  We decided to create a set of parent-child ARM templates to mimic the first two levels of the above hierarchy.

Solution

The sample files are stored in my Blog-Samples repo under the ARM-External-Templates folder.  There is a brief README file that walks through the necessary steps to update parameter files, upload the linked templates to a storage account, etc.  This is not a fully documented process at the moment, but if you wish to submit a PR or open an issue I’ll update it as I have time.
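
To illustrate the overall flow (assuming the Az PowerShell modules; the storage account, container, resource group, and template file names below are placeholders I made up, with the actual names living in the repo’s README and parameter files), the linked templates are uploaded to blob storage and then the parent template is deployed:

# 1) Upload the child (linked) templates to a storage account the parent template can reach.
#    Storage account, container, and path names are hypothetical placeholders.
$ctx = (Get-AzStorageAccount -ResourceGroupName "rg-armsamples" -Name "starmsamples").Context
Get-ChildItem -Path .\linkedTemplates\*.json | ForEach-Object {
    Set-AzStorageBlobContent -File $_.FullName -Container "templates" -Blob $_.Name -Context $ctx -Force
}

# 2) Deploy the parent template at the resource group (contributor) scope.
New-AzResourceGroupDeployment -ResourceGroupName "rg-armsamples" `
    -TemplateFile .\azuredeploy.json `
    -TemplateParameterFile .\azuredeploy.parameters.json

# The subscription-level (owner) scenario would instead use New-AzDeployment with a -Location parameter.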

A few key pieces to highlight.

  • The “Contributor” scenario assumes that the deployment account has contributor permissions to a specific resource group or subscription (more common for production or similar environments)
  • The “Owner” scenario assumes that the deployment account has owner permissions to the entire subscription (ex. development environment)
  • The “Owner” scenario optionally deploys role assignments (queue sender, workspace reader, and runbook operator) to various resources only if the corresponding parameter values are not empty (otherwise the assignment is skipped)
  • The Automation Account is pre-populated with a PowerShell runbook and is also configured to send runbook job and stream output to the provisioned Log Analytics workspace

Sample output

After running the deployment script the deployment and resource group should look as follows.

LinkedARMTemplates1.jpg

LinkedARMTemplates2.jpg

 

Conclusion

The linked ARM templates from this sample are meant for illustration purposes only.  Hopefully you’ll find them helpful in terms of what is possible from a deployment and configuration perspective.  If you have any feedback or suggestions please submit a PR on the repo or open an issue on GitHub.  Good luck with automating your Azure deployments.

 

-Frog Out

Azure Functions Calling Azure AD Application with Certificate Authentication

Calling the Microsoft Graph, SharePoint Online, or another resource via an Azure AD application is a fairly straightforward process when you use a client ID + secret for authentication.  You can think of the client ID and secret as a username and password.  Note that anyone who has that client ID + secret can log in as that Azure AD app and perform the actions it has been granted.  In an enterprise or security-sensitive environment, certificate authentication is a more secure option because it requires possession of the certificate, which is only deployed in a private fashion.  In this post I’ll walk through how to deploy and leverage the necessary components to accomplish this.  This example is part of a larger Azure Functions sample that I plan to release at a later date, but the snippets below could be adapted for other hosting platforms.

 

Components Needed

  • Certificate (self-signed or generated from a PKI-type infrastructure)
  • Azure AD Application (using V1 in this example) with Microsoft Graph OAuth permissions
  • Azure Function

 

Solution Overview

  1. Create certificate (self-signed in this example)
  2. Create Azure function
  3. Create Azure AD application registration
  4. Add certificate metadata to Azure AD application
  5. Deploy certificate to Azure Function certificate store
  6. Authenticate to Azure AD application using certificate

 

1) Create Certificate

If you are on Windows 8+ there is a PowerShell cmdlet to create self-signed certificates easily.  If not, you’ll need to leverage MakeCert.exe or another certificate generation mechanism (ex. New-SelfSignedCertificateEx, documentation).  Here is a sample of the PowerShell option.

 

# process for Windows 8+ type OS
$ssc = New-SelfSignedCertificate -CertStoreLocation $CertificateStoreLocation -Provider $ProviderName `
    -Subject "$CertificateSubject" -KeyDescription "$CertificateDescription" `
    -NotBefore (Get-Date).AddDays(-1) -NotAfter (Get-Date).AddYears($CertificateNotAfterYears) `
    -DnsName $CertificateDNSName -KeyExportPolicy Exportable

# Export cert to PFX - uploaded to Azure App Service
Export-PfxCertificate -cert cert:\CurrentUser\My\$($ssc.Thumbprint) -FilePath $certificatePFXPath -Password $CertificatePassword -Force

# Export certificate - imported into the Service Principal
Export-Certificate -Cert cert:\CurrentUser\My\$($ssc.Thumbprint) -FilePath $certificateCRTPath -Force

 

2) Create Azure Function

You can create an Azure Function from the Azure Portal (reference), Azure CLI (reference), or through tools / extensions built into Visual Studio 2017 (reference) / Visual Studio Code.

 

3-4) Create Azure AD Application and Add Certificate to Azure AD Application

Here is a sample for creating an Azure AD application using Azure PowerShell.  In this example the certificate is added (-KeyCredentials) to the Azure AD application at creation time, but it could also be added after the fact through the Azure Portal or PowerShell.

 

# prepare certificate for usage with creating AAD app
$KeyStorageFlags = [System.Security.Cryptography.X509Certificates.X509KeyStorageFlags]::Exportable, `
    [System.Security.Cryptography.X509Certificates.X509KeyStorageFlags]::MachineKeySet, `
    [System.Security.Cryptography.X509Certificates.X509KeyStorageFlags]::PersistKeySet
$certFile = Get-ChildItem -Path $CertificatePFXPath
$x509 = New-Object System.Security.Cryptography.X509Certificates.X509Certificate2
$x509.Import($certFile.FullName, $CertificatePassword, $KeyStorageFlags)
$certValueRaw = $x509.GetRawCertData()

$validFrom = $x509.NotBefore
$validTo = $x509.NotAfter
$keyId = [guid]::NewGuid()

$keyCredential = New-Object -TypeName "Microsoft.Open.AzureAD.Model.KeyCredential"
$keyCredential.StartDate = $validFrom
$keyCredential.EndDate= $validTo
$keyCredential.KeyId = $keyId
$keyCredential.Type = "AsymmetricX509Cert"
$keyCredential.Usage = "Verify"
$keyCredential.Value = $certValueRaw

$aadApp = New-AzureADApplication -DisplayName $AADAppName -Homepage $HomePage -ReplyUrls $ReplyUrls `
    -IdentifierUris $IdentifierUri -KeyCredentials $keyCredential

 

5) Deploy certificate to Azure Function

While there is a native way to upload a certificate to an Azure App Service via the Azure CLI and the Azure Portal, there is not a direct way via PowerShell.  I was able to mimic an option with PowerShell by adding an SSL binding with a certificate and then immediately removing the SSL binding while not deleting the certificate (-DeleteCertificate $false).  Below are examples for both options.

Note: In both examples below the password is passed as cleartext instead of using a SecureString or other encrypted mechanism.  This could pose a security risk, but I haven’t found an alternative as of yet.

PowerShell

$BSTR = [System.Runtime.InteropServices.Marshal]::SecureStringToBSTR($certificatePassword)
$ClearTextPassword = [System.Runtime.InteropServices.Marshal]::PtrToStringAuto($BSTR)

New-AzureRmWebAppSSLBinding -ResourceGroupName $resourceGroupName -WebAppName $webAppName -Name $webAppDNSName -CertificateFilePath (Get-ChildItem .\$certificatePFXPath) -CertificatePassword $ClearTextPassword
Remove-AzureRmWebAppSSLBinding -ResourceGroupName $resourceGroupName -WebAppName $webAppName -Name $webAppDNSName -DeleteCertificate $false -Confirm:$false -Force

 

Azure CLI

az webapp config ssl upload --certificate-file "(certPath)" --certificate-password "(certPassword)" --name "(certName)" --resource-group "(resourceGroup)"
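
One related setting worth calling out (an assumption on my part, since it is not shown in the original sample): for the function code in step 6 to find the certificate in the CurrentUser\My store, the App Service typically needs the WEBSITE_LOAD_CERTIFICATES application setting populated with the certificate thumbprint (or *).  A rough sketch using the same AzureRM cmdlets as above:

# Merge WEBSITE_LOAD_CERTIFICATES into the existing app settings.
# Note: Set-AzureRmWebApp -AppSettings replaces the full collection, so read and merge first.
$webApp = Get-AzureRmWebApp -ResourceGroupName $resourceGroupName -Name $webAppName
$appSettings = @{}
foreach ($setting in $webApp.SiteConfig.AppSettings) { $appSettings[$setting.Name] = $setting.Value }
$appSettings["WEBSITE_LOAD_CERTIFICATES"] = $ssc.Thumbprint   # thumbprint of the certificate from step 1
Set-AzureRmWebApp -ResourceGroupName $resourceGroupName -Name $webAppName -AppSettings $appSettings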

6) Authenticate to Azure AD application using certificate

The Azure Function code can authenticate to the Azure AD application using the certificate that was deployed in step 5.  Below is a sample of the code used to retrieve the certificate.  Since Azure Functions can run locally or in Azure, this works locally if the certificate has been deployed to the local certificate store, and in Azure once the certificate has been deployed to the App Service.


public static X509Certificate2 GetCertificate(string thumbprint)
{
    X509Store store = new X509Store(StoreName.My, StoreLocation.CurrentUser);
    try
    {
        store.Open(OpenFlags.ReadOnly);

        var col = store.Certificates.Find(X509FindType.FindByThumbprint, thumbprint, false);
        if (col == null || col.Count == 0)
        {
            return null;
        }
        return col[0];
    }
    finally
    {
        store.Close();
    }
}

 

Below is a sample of using the certificate to authenticate to SharePoint Online, but this could easily point to a different resource such as Microsoft Graph, Exchange Online, etc.

var url = Environment.GetEnvironmentVariable("tenantRootUrl");
var thumbprint = Environment.GetEnvironmentVariable("certificateThumbprint");
var resourceUri = Environment.GetEnvironmentVariable("resourceUri");
var authorityUri = Environment.GetEnvironmentVariable("authorityUri");
var clientId = Environment.GetEnvironmentVariable("clientId");
var ac = new AuthenticationContext(authorityUri, false);
var cert = GetCertificate(thumbprint);  //this is the utility method called out above
ClientAssertionCertificate cac = new ClientAssertionCertificate(clientId, cert);
var authResult = ac.AcquireTokenAsync(resourceUri, cac).Result;

// the next section makes calls to SharePoint Online but could easily be to another resource
using (ClientContext cc = new ClientContext(url))
{
    cc.ExecutingWebRequest += (s, e) =>
    {
        e.WebRequestExecutor.RequestHeaders["Authorization"] = "Bearer " + authResult.AccessToken;
    };

    // make calls through the client context object
    // …
}

 

Conclusion

This process is part of a much larger solution used to make authenticated calls to an Azure AD application from an Azure Function.  I am working on publishing that solution as a sample for others to reference.  I am hopeful that I’ll have something available within a month.  For the time being feel free to reference the above steps and code snippets for use in your own project.  Feel free to contact me or leave a comment if you have questions or feedback.

 

-Frog Out

Calling Azure AD Secured Azure Function Externally From JavaScript

My customer recently had a need to securely call an HTTP trigger on an Azure Function remotely from an arbitrary client web application.  In this scenario “securely” meant ensuring that the user had logged into Azure Active Directory (AAD), but any number of authentication providers could be used.  The SharePoint Patterns and Practices (PnP) team had posted a video (SharePoint PnP Webcast – Calling external APIs securely from SharePoint Framework) that used the SharePoint Framework, but my team needed to do this from vanilla JavaScript.  Many thanks to the PnP team and my peer Srinivas Varukala for their inspiration and code samples.

Overview

The key components to this solution involve the following:

  • Azure AD app registration (used to enforce authentication on Azure Function)
  • Azure Function configured to enforce Azure AD authentication
  • Client web application with JavaScript code to call the Azure Function

Azure Function

In the Azure Portal create a new Azure Function.  Choose an HTTP trigger and use the language of your choice (I’m using C# script in this example).  The Azure Function will check whether a claims principal exists on the incoming request and then log the name of the user if authenticated.

<Update 2018-05-02>

Note: the Azure portal currently does not support the headers required for CORS (cross-origin resource sharing) requests that contain credentials.  Feedback (source) was provided to the Azure App Service team to support this but was declined.  As such, manually processing CORS requests (the workaround shown in the commented-out code below) is not officially supported at this time.  You will need to determine whether this workaround works for you.

</Update 2018-05-02>

<Update 2019-06-25>

Thanks to a comment from Michael Armstrong, who pointed out that CORS with credential headers is now supported in Azure App Service.  I have validated that this works with the updated configuration now detailed in the post.

</Update 2019-06-25>

using System.Net;
using System.Security.Claims;
using System.Threading;
public static HttpResponseMessage Run(HttpRequestMessage req, TraceWriter log)
{
  log.Info("C# HTTP trigger function processed a request.");
  // check for authenticated user on incoming request
  if (!ClaimsPrincipal.Current.Identity.IsAuthenticated)
  {
    log.Info("Claims: Not authenticated");
  }
  else
  {
    log.Info("Claims: Authenticated as " + ClaimsPrincipal.Current.Identity.Name);
  }

  var resp = req.CreateResponse(HttpStatusCode.OK, "Hello there " + ClaimsPrincipal.Current.Identity.Name);

  // manually process CORS request (no longer needed now that App Service supports CORS requests with credentials)
  //if (req.Headers.Contains("Origin"))
  //{
  //  var origin = req.Headers.GetValues("origin").FirstOrDefault();
  //  resp.Headers.Add("Access-Control-Allow-Credentials", "true");
  //  resp.Headers.Add("Access-Control-Allow-Origin", origin);
  //  resp.Headers.Add("Access-Control-Allow-Methods", "GET, POST, OPTIONS");
  //  resp.Headers.Add("Access-Control-Allow-Headers", "Content-Type, Set-Cookie");
  //}
  return resp;
}

Azure AD App

In order to enforce Azure AD authentication on the Azure Function an Azure AD app registration needs to be created.  Log into the Azure AD admin portal.  Under Azure Active Directory -> App Registrations create a new app registration.

SecureCallAFFromJS1

Take note of the Application ID (also known as client ID) for the application created.  This will be used later in the Azure App Service authentication / authorization configuration.

SecureCallAFFromJS2
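
If you prefer to script this step, here is a minimal sketch (assuming the AzureAD PowerShell module; the display name and function app host name are placeholders).  The reply URL follows the standard App Service authentication callback format:

# Minimal sketch of creating the app registration used by App Service authentication.
# Display name and function app host name are hypothetical placeholders.
Connect-AzureAD
$app = New-AzureADApplication -DisplayName "SecureCallAFFromJS" `
    -ReplyUrls @("https://NAME_OF_AZURE_FUNCTION_GOES_HERE.azurewebsites.net/.auth/login/aad/callback")
$app.AppId   # Application (client) ID used when configuring authentication below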

The default permissions for the Azure AD app registration (delegated: sign in and read user profile) will be sufficient.

SecureCallAFFromJS3

SecureCallAFFromJS4

Enforce authentication

Return to the Azure Function and navigate to the Platform features -> Authentication / Authorization screen.  Turn App Service Authentication to On, set “Action to take…” to “Log in with Azure Active Directory”, then click the Azure Active Directory authentication provider to configure it as follows.

SecureCallAFFromJS5

Fill in the Application ID / Client ID from the previously created Azure AD app registration.  Specify the IssuerUrl of the Azure AD domain (typically https://login.microsoftonline.com/TENANT_NAME_GOES_HERE.onmicrosoft.com).

SecureCallAFFromJS6

Remove CORS configuration

As noted previously, the Azure portal currently (as of writing, May 1, 2018) does not support Azure App Service processing CORS requests that contain credentials.  As such, removing all domains from the CORS configuration in the Azure Portal is unsupported.  Please validate whether this workaround works for you.

<Update 2019-06-25>

Update CORS configuration

Now that Azure App Service supports processing CORS requests that contain credentials, update the CORS settings with the allowed origins that will be calling into your App Service (Azure Function).

Note that the values in the screenshot below come from running my sample app in debug mode.  You’ll want to use the published location for a production environment.

SecureCallAFFromJS16

</Update 2019-06-25>

 

Client code

In this example I started with a .NET Framework MVC project from Visual Studio 2017 v15.6.7, but the code could be hosted on any page with HTML, JavaScript, and a user logged in to Azure AD.  Note that the MVC project allows enforcing Azure AD authentication, which is what I was most interested in.

SecureCallAFFromJS8

SecureCallAFFromJS9
HTML snippet that includes an IFRAME element with its source pointing to the root of the Azure Function.

SecureCallAFFromJS13

<Update 2019-04-11> After the IFRAME has loaded and authenticated you should see a cookie tied to the domain hosting the Azure Function.  See the following screenshot for a successfully set auth cookie.

SecureCallAFFromJS15

If this cookie is not set (ex. cookie size is too large, request / response headers deny the cookie, cookies not allowed by policy, etc.) the outgoing AJAX request will not be able to satisfy the authentication requirement, which will result in an error.  See the following screenshot for sample issues that could be encountered when setting the auth cookie.

SecureCallAFFromJS14

</Update 2019-04-11>

JavaScript snippet to call the HTTP trigger of the Azure Function.

Note that in a production scenario you would want to ensure that the IFRAME has loaded fully (and thus the authentication cookie has been set) before the Azure Function is called.

  $(document).ready(function () {
    $("#btnExternalCall").click(function (e) {

      var serviceURL = "https://NAME_OF_AZURE_FUNCTION_GOES_HERE.azurewebsites.net/api/HttpTriggerCSharp1";

      $.ajax({
        url: serviceURL,
        type: "GET",
        xhrFields: {
          withCredentials: true
        },
        crossDomain: true,
        success: function (data) {
          alert("Success: " + data);
        },
        error: function (ex) {
          alert("Failure getting user token");
        }
      });
    });
  });

Testing

When everything has been configured you can test the scenario.  If you open the F12 developer tools in your browser of choice you should see the authentication cookie for both the client web application and the Azure Function domains.

SecureCallAFFromJS10

After issuing the call to the HTTP Trigger on the Azure Function you should see that the call was indeed authenticated and the ClaimsPrincipal is returned.

SecureCallAFFromJS11
SecureCallAFFromJS12

Conclusion

Being able to ensure that an HTTP trigger on an Azure Function only allows authenticated users to call the endpoint is a very powerful capability.  Hopefully this post helps others who have a similar need.  Please leave any questions or feedback in the comments below.

-Frog Out

Slides from SharePoint Cincy 2018 Conference

This post is a few days late as I’m catching up from being out of town for vacation and at the SharePoint Cincy 2018 Conference.  A big thank you to all who attended my session, the organizers (especially Sean McDonough), sponsors, other speakers, and anyone else who helped put on the conference.  Below are my slides from my session.  Feel free to reach out with any questions or comments.

Dipping Your Toe into Cloud Development with Azure Functions

Slides

-Frog Out