Speaking at SharePoint Cincy 2018

I’m excited to be speaking at SharePoint Cincy 2018.  I believe this is the fifth year I’ve spoken there, and it has always been a good conference for meeting with attendees and hearing other great content.  Below is the abstract for the session I’ll be presenting on cloud development using Azure Functions (a recent area of big interest that I’ve been working on with a customer).  There is still time to register, and feel free to use my promo code Jackett2018 for a discount.  If you’re attending the conference, stop by and say hi.

SpeakerBadgeSPCincy2018Jackett

SharePoint Cincy 2018

Website: http://www.sharepointcincy.com/

Registration: http://events.constantcontact.com/register/event?llr=vwg9epoab&oeidk=a07ef00osd9d81e26f2

Title: Dipping Your Toe into Cloud Development with Azure Functions

Abstract: Those on-prem custom solutions (ex. timer jobs, batch processes, etc.) need to be re-written for SharePoint Online. Where do you host them so that you don’t DoS the proxy? How do you properly secure public endpoints for Azure resources? What authentication will you use against SharePoint Online? In this session we will introduce Azure Functions and related services as an option for replacing on-prem solutions while keeping in mind security, architecture, authentication, scalability, and more. We’ll also walk through a real-world scenario calling Office 365 APIs using an authenticated Azure AD app. Prior experience with Azure is helpful but not required.

Slides and Demo Files from SPTechCon DC 2017

   Big thanks to Stacy and her crew of organizers, all of the attendees, and fellow speakers at SPTechCon DC 2017.  This was one of the best ones I’ve attended from an engagement and networking perspective.  Below are my slides and source code.  Feel free to let me know about any follow up questions or comments.

PowerApps and Microsoft Flow for Developers

GitHub link to demo project files

https://github.com/BrianTJackett/BTJ.PowerApps.AzureDBSample

Slides

Intro to Power BI for Office 365 Developers

Slides

Sample Financial Data file

http://go.microsoft.com/fwlink/?LinkID=521962

Blog posts on SMAT data report

https://aka.ms/SMAT2013BTJpart1

https://aka.ms/SMAT2013BTJpart2

      -Frog Out

SharePoint CSOM to Traverse All Sites in SharePoint Online

[Update 2018-07-31] Special thanks to my peer John Ferringer for pointing out an issue with my prior CSOM snippet.  I’ve updated the CSOM code to handle a large number of sites, as the previous sample would truncate results when a large number of sites were traversed. [/Update]

In the past I’ve written posts for “PowerShell to Enumerate SharePoint 2010 or SharePoint 2013 Permissions” or “PowerShell Script To Traverse All Sites In SharePoint 2010 (or 2007) Farm” to assist with traversing through all sites within a SharePoint on-prem farm.  In this post I’ll share a snippet I recently used for traversing through all site collections and subsites within a SharePoint Online tenant.

 

Background

If you’ve worked with the SharePoint Online Management Shell you may know that originally it was not able to retrieve Personal Sites (also known as My Sites / OneDrive for Business sites) in SharePoint Online.  As far as I’m aware this was primarily a limitation of the underlying client-side libraries (Microsoft.SharePoint.Client.*).  Fast forward to a recent release (I don’t know the specific release where this changed, but I can confirm it works as of version 16.1.6621.1200 of the Microsoft.SharePointOnline.CSOM NuGet package) and retrieving Personal Sites is now supported.  In the management shell this is accomplished by calling the following:

 

Note: If you do not see the below Gist please refer to code at this location: PS-Get_All_Sites_SPO.ps1


Connect-SPOService -Url '<tenantAdminUrl>'
Get-SPOSite -Limit all -IncludePersonalSite $true

 

The problem is that if you want to traverse all site collections in SharePoint Online using the client-side object model (CSOM), you need to know how the SharePoint Online Management Shell implements that inclusion of Personal Sites with the rest of the site collections.  To find this out I used a disassembler (ILSpy in my case, but there are many alternatives available) on the underlying libraries to recreate the process in my own code.

 

Solution

The resulting CSOM that I came up with is below.  Feel free to borrow this and use it in your own code, but note that it is provided as-is with no warranty.

Note: If you do not see the below Gist please refer to code at this location: CSOM_Traverse_All_Sites_SPO.txt

 


List<SiteProperties> list = new List<SiteProperties>();
SPOSitePropertiesEnumerable ssp = null;
SPOSitePropertiesEnumerableFilter sspFilter = new SPOSitePropertiesEnumerableFilter();
SharePointOnlineCredentials creds = new SharePointOnlineCredentials("myUsernameGoesHere", securePassword);
using (ClientContext cc = new ClientContext("myURLGoesHere")) // tenant admin URL
{
    cc.Credentials = creds;
    string nextIndex = null;
    Tenant tenant = new Tenant(cc);

    //loop through all site collections including personal sites (even though not being used)
    //borrowed this code from after decompiling SPO Management Shell assemblies
    sspFilter.IncludePersonalSite = PersonalSiteFilter.Include;
    sspFilter.IncludeDetail = true;
    do
    {
        sspFilter.StartIndex = nextIndex;
        ssp = tenant.GetSitePropertiesFromSharePointByFilters(sspFilter);
        cc.Load(ssp);
        cc.ExecuteQuery();
        list.AddRange(ssp);
        nextIndex = ssp.NextStartIndexFromSharePoint;
    } while (nextIndex != null);

    foreach (SiteProperties sp in list)
    {
        //DO YOUR WORK HERE FOR EACH SITE COLLECTION, such as looping through subwebs
        //open a context against each site collection; the tenant admin context
        //only exposes the admin site's web, not the site collection being processed
        using (ClientContext siteContext = new ClientContext(sp.Url))
        {
            siteContext.Credentials = creds;
            siteContext.Load(siteContext.Web, w => w.NoCrawl,
                w => w.Webs,
                w => w.Url);
            siteContext.ExecuteQuery();

            //check subweb(s)
            foreach (var subweb in siteContext.Web.Webs)
            {
                siteContext.Load(subweb, sw => sw.NoCrawl,
                    sw => sw.Url);
                siteContext.ExecuteQuery();
            }
        }
    }
}

Conclusion

In this post I shared a snippet for traversing all site collections in SharePoint Online with C# CSOM code.  In my daily job I’ve been using this in combination with Azure Functions for a number of interesting interactions with a SharePoint Online tenant.  More to come on those scenarios in future weeks.  For now let me know in the comments if you have any questions or issues implementing the above snippet in your own code.  Happy coding!

 

-Frog Out

How To Analyze SharePoint 2013 Migration Assessment Tool (SMAT) Data in Power BI – Part 2

<Update 2017-02-28> Updated PBIX file with “Top N site collections by risk page” based on feedback from Microsoft peers.  Also minor updates to documentation for report creation. </Update>

In this set of posts I will walk through a process for consuming SMAT output files into a Power BI report.  This post gives a high-level overview of the report pages to create and the Power BI visuals used.  In order to limit the length of this post (I was at 60+ screenshots before even finishing) I am providing a separate, more in-depth Word document containing the steps to create the report pages.

  • Part 1 – importing the data and then querying / modeling the data
  • Part 2 – designing the Power BI Report

In a previous post I talked about the SharePoint Migration Assessment Tool for SharePoint 2013 being released to the web.  Please read that post for context on working with SMAT in general.

Disclaimer: I am by no means a Power BI expert.  In fact this report is the first time I’ve spent any significant time developing a report.  Most of the technical credit goes to my coworkers Ken Kilty and Gene Livshin.

Note: Power BI Desktop is being updated regularly.  For the purposes of this article I am using the January 2017 build (version 2.42.4611.701 64-bit).

 

Sample report and data files (updated 2017-02-28)

In-depth Power BI report creation documentation (document in progress, not finalized).

Sample SMAT source files (CSV, PBIX, walkthrough DOCX)

For the PBIX file be sure to update the source of data file via the Edit Query screen on the SiteAssessmentReport query as follows:

SMATPowerBIReport101

SMATPowerBIReport100

 

Overview

The SMAT tool outputs a summary CSV file along with 1 detailed CSV per risk being analyzed.  Reading and interpreting hundreds, thousands, or tens of thousands of rows in a CSV file is not optimal.  Instead it will be helpful to model and report on that data in an easier-to-consume fashion.  At a high level our process will involve the following:

  1. Import Data [part 1] – Import SMAT output files into Power BI
  2. Query Data [part 1] – Create multiple queries on SMAT output files to filter, aggregate, and enhance into different perspectives
  3. Design Reports [part 2] – Design dashboard pages to consume the query outputs

 

Design Reports

Continuing from the queries that were modeled in part 1 now it is time to create reports to visualize the SMAT data to tell a story.  As mentioned in the previous post I had a number of questions I wanted to answer.

  • Content database statistics
    • Number of sites per database?
    • Size (storage) of databases?
  • Web application statistics
    • Number of site collections and subsites?
    • Size (storage) of site collections?
  • Web application risks
    • Breakdown view of risks per web application?
    • Slicer by web application and risk type
  • Site collection risks
    • Number of risks per site collection?
    • Slicer by risk type

 

These questions can be answered across a number of different pages, each focusing on one area.  Start by creating 5 pages with names such as the following:

  1. Content DB overview
  2. Site collection overview
  3. Risk overview
  4. Risks by web application
  5. Risks by site collection

 

SMATPowerBIReport31

 

**Mea Culpa – PLEASE READ**

Before moving into the Power BI report I have to issue a mea culpa.  As I reviewed the report pages I found that a few of my queries required additional columns or small modifications in order to improve usability or readability of the reports.  These changes involved new calculated columns (convert MB to GB) and a conditional column to group sites by the number of risks relative to each other (small, medium, large, etc.).  Please reference the sample files to see these changes.

 

Content DB overview page

The content DB overview page gives an overview of content database size (storage used) and can be filtered by individual server if multiple database servers are used.  Notice that databases have been re-ordered by their size rather than the default sorting by name.

SMATPowerBIReport41

 

Site collection overview page

The site collection overview page gives an overview of the number of site collections in a given web application compared to the size (storage) of site collections / web applications.  Notice that a pattern is immediately visible when viewing this page.  The majority of site collections are contained within sub0.domain.com while sub1.domain.com contains the bulk of the remainder.  Conversely, sub1.domain.com is where the bulk of the storage is used while sub0.domain.com only uses a fraction of the storage.

SMATPowerBIReport57

 

Risk overview page

The Risk overview page gives a high-level view of risks per site collection and by risk type.  The histogram visual used (more on this below) groups site collections into ranges by how many risks were identified.

SMATPowerBIReport80

Add custom histogram visualization

The Risk overview page makes use of a custom visual called Histogram from the Power BI gallery.  Again thanks to Ken Kilty for this advice.  Knowing the number of risks found per site could be useful, but more likely it would be better to see trends and distributions of risks per grouping of sites and look for anomalies.  How does one group sites together?  Put them into “buckets” based on how many risks the sites have.  Ex. Column one is count of sites with 0-100 risks, column two is 101-200 risks, and so on.  In the reporting world a histogram visual can provide this for us.
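The bucketing the histogram performs can be sketched in a few lines of Python (a minimal sketch; the bucket width and the sample risk counts below are hypothetical, and the exact bucket boundaries the custom visual uses may differ):

```python
from collections import Counter

def bucket_risk_counts(risk_counts_per_site, bucket_size=100):
    """Group sites into buckets by how many risks each has (0-99, 100-199, ...)."""
    buckets = Counter()
    for count in risk_counts_per_site:
        low = (count // bucket_size) * bucket_size
        buckets[(low, low + bucket_size - 1)] += 1
    return dict(buckets)

# Hypothetical risk counts for six site collections
print(bucket_risk_counts([5, 42, 150, 180, 250, 90]))
# {(0, 99): 3, (100, 199): 2, (200, 299): 1}
```

Plotting the bucket ranges on the x-axis and the site counts on the y-axis gives the distribution the histogram visual renders.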

Navigate to “https://app.powerbi.com/visuals/” and search for “Histogram”.  There is a custom visual provided by Microsoft (v0.3.1 as of the time of this writing).  Download that visual file.  Back in Power BI on the visuals gallery click the ellipses (…) in bottom right and then select “Import a custom visual”.  Navigate to the location where you downloaded the histogram custom visual and select it.

SMATPowerBIReport58

 

Risks by web application page

The Risks by web application report page shows a breakdown of risks per web application and, within each web application, a breakdown by risk type.  Filter by individual web application or risk name.  Note that there is a report-level filter on the supporting query so that any risk name with no occurrences is hidden.

SMATPowerBIReport98

 

Risks by site collection page

The Risks by site collection page provides an overview of risk types, number of risks identified for selected risk type, and number of site collections that have the selected risks.  Filter by risk name to find the number of identified risks and sites where that risk was identified.  Note that there is a report level filter for the supporting query so that any risk name with no occurrences is hidden.

SMATPowerBIReport99

 

Top N site collections by risk page

The Top N site collections by risk page provides an overview of the top N (in my example N = 25, more on this below) site collections with the highest number of risks.  Filter by risk name to find identified risks across these top 25 sites or filter by site collection to find risks on selected site collection.  Note that there is a report level filter for the supporting query so that any risk name with no occurrences is hidden.

SMATPowerBIReport130

 

Add query parameter for number of site collections

One of the features of the Top N site collections by risk page is to sort site collections by total number of risks and only keep the top N records.  As such it is preferable to use a query parameter within Power BI desktop to allow for a variable number of site collections to be shown.  Power BI Desktop introduced query parameters with the April 2016 update.  Query parameters allow a report creator to define what is essentially a variable to then be used in data source dialogs (primarily filtering operations).  See below for an example of defining query parameter and usage in filter for top N rows.
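The "keep top N rows" filter that the query parameter drives can be sketched as follows (a sketch only; the site URLs and risk totals are hypothetical, and `n` stands in for the Power BI query parameter):

```python
# Sort (site_url, total_risk_count) pairs by risk count descending
# and keep only the top N, mirroring the Top N rows filter.
def top_n_by_risk(site_totals, n=25):
    return sorted(site_totals, key=lambda pair: pair[1], reverse=True)[:n]

sites = [("https://sub0.domain.com/sites/a", 12),
         ("https://sub1.domain.com/sites/b", 48),
         ("https://sub0.domain.com/sites/c", 30)]
print(top_n_by_risk(sites, n=2))
# [('https://sub1.domain.com/sites/b', 48), ('https://sub0.domain.com/sites/c', 30)]
```

Changing the query parameter in Power BI Desktop is equivalent to changing `n` here, without editing the filter step itself.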

SMATPowerBIReport128

SMATPowerBIReport129

 

Conclusion

I will be one of the first ones to admit that at the start of this project I knew next to nothing about proper dashboard creation and modeling data in different ways.  Speaking as a general developer and not as a Microsoft employee, I was very impressed with the capabilities Power BI has to offer.  Given a sufficiently large and diverse set of data it was a relatively fast process to transform the SMAT output files into a usable (and now re-usable) format which can then show insights into the data.  I know for my customer and many others this means they can make business critical decisions about what, when, and where to migrate their SharePoint data to SharePoint Online.  If you have any feedback on this series of blog posts or suggestions for additional report pages please let me know in the comments.

 

-Frog Out

How To Analyze SharePoint 2013 Migration Assessment Tool (SMAT) Data in Power BI – Part 1

In this set of posts I will walk through a process for consuming SMAT output files into a Power BI report.

  • Part 1 – importing the data and then querying / modeling the data
  • Part 2 – designing the Power BI Report (and also provide sample files to use)

In a previous post I talked about the SharePoint Migration Assessment Tool for SharePoint 2013 being released to web.  Please read that post for context on working with SMAT.

Disclaimer: I am by no means a Power BI expert.  In fact this report is the first time I’ve spent any significant time developing a report.  Most of the technical credit goes to my coworkers Ken Kilty and Gene Livshin.

Note: Power BI Desktop is being updated regularly.  For the purposes of this article I am using the January 2017 build (version 2.42.4611.701 64-bit).

Overview

The SMAT tool outputs a summary CSV file along with 1 detailed CSV per risk being analyzed.  Reading and interpreting hundreds, thousands, or tens of thousands of rows in a CSV file is not optimal.  Instead it will be helpful to model and report on that data in an easier-to-consume fashion.  At a high level our process will involve the following:

  1. Import Data – Import SMAT output files into Power BI
  2. Query Data – Create multiple queries on SMAT output files to filter, aggregate, and enhance into different perspectives
  3. Design Reports – Design dashboard pages to consume the query outputs

 

Import Data

Importing data into Power BI is a fairly simple process.  After launching Power BI Desktop click the Get Data link from the welcome screen (or click the Get Data button in top ribbon menu).

SMATPowerBIReport1

 

Select the CSV data type.

SMATPowerBIReport2

 

Navigate to the folder where the SMAT output files are located.  In this walkthrough use the SiteAssessmentReport.csv file.

SMATPowerBIReport3

 

A preview of the data should appear.  I have run a PowerShell script to clean up the data and remove any customer-sensitive information (ex. server names, database names, URLs, users, etc.).

Note: At a later date I may blog about that PowerShell script and link it back to this article.  If this data scrubbing script is of interest please let me know in the comments or contact me to prioritize it.

SMATPowerBIReport4

 

If everything was successful a list of fields should appear on the right-hand side of the screen.  These fields will correspond to the comma-separated fields in the CSV file.

SMATPowerBIReport5

 

Query data

As mentioned earlier the raw CSV data by itself is not easily readable to gain insights about the farm.  The data needs to be filtered and queried differently to support the report that will be built.  I cannot stress enough that having a set of questions you are trying to answer is extremely important.  It is not sufficient to start designing reports without some initial planning.  Think of this step as modeling the data in different ways to get different perspectives on the data.  Initially my personal goals with this process were to identify the following items:

  • Content database statistics
    • Number of sites per database?
    • Size (storage) of databases?
  • Web application statistics
    • Number of site collections and subsites?
    • Size (storage) of site collections?
  • Web application risks
    • Breakdown view of risks per web application?
    • Slicer by web application and risk type
  • Site collection risks
    • Number of risks per site collection?
    • Slicer by risk type

Having these goals / questions in mind I diagrammed how to split the data into different queries that could then answer those questions.

SMATPowerBIReport6

 

SiteAssessmentReport Query

Power BI has a nice set of features that allows taking a set of data, making some changes, and then creating a reference (shallow copy) or duplicate (deep copy) of that altered set of data.  As such I loaded the CSV data and then referenced that data into 3 separate queries, 1 of which was further split into multiple queries for risk data by web application and by site collection.  The following are example steps to accomplish this.

After the data has been loaded from previous step click on Edit Queries from the ribbon menu.

SMATPowerBIReport7

 

Notice that the first query for the loaded CSV file includes a few modifications to promote the header row and change the datatype of columns that should be numbers.  Leave those steps in place.

SMATPowerBIReport8

 

Create three references from the source data load query (SiteAssessmentSummary) and rename the referenced queries appropriately.  In this example I named them RiskFilteredSource, DatabaseMetrics, and WebApplicationMetrics (respectively the 3 boxes in the 2nd row on my hierarchy diagram above).

SMATPowerBIReport9

SMATPowerBIReport10

 

RiskFilteredSource Query

The next step is to start adding modifications to the queries to shape the data into the different results which will be reported on.  Start with the RiskFilteredSource.  Only the SiteUrl and individual risk columns (Alerts, Apps, etc.) are necessary.  Rather than removing the columns that aren’t needed, it is recommended to select the columns that are needed (see reference for more information on why).  Click Choose Columns from the ribbon menu.  On the following dialog box select the SiteUrl column and all of the individual risk columns (Alerts, Apps, etc.) ending with WorkflowRunning2013.

Note: the set of individual risks in the CSV will depend on which ones were enabled when the SMAT tool was run.  This example uses the out-of-the-box settings as of the SMAT v1.1 release.

SMATPowerBIReport11

 

The query result should look like the following with only the columns selected still showing.  That is all for the RiskFilteredSource query.

SMATPowerBIReport12

 

DatabaseMetrics Query

Next will be modifications to the DatabaseMetrics query.  Similar to the previous query start by selecting only the columns relating to databases (e.g. ContentDbName, ContentDbServerName, and ContentDBSizeInMB) by using the Select Columns button from the ribbon menu.

SMATPowerBIReport13

 

There will likely be duplicate rows in the result because each row in the original CSV corresponds to a single site collection and each content database can contain 1 or more site collections.  In order to clean up the duplicate rows select all three columns and then (with columns still selected) click Remove Rows –> Remove Duplicates from the ribbon menu.
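The Remove Duplicates step is equivalent to keeping the first occurrence of each distinct combination of the selected columns.  A minimal Python sketch (the database names and sizes below are hypothetical sample data, not SMAT output):

```python
# Hypothetical SMAT rows: one row per site collection, so database info repeats
# for every site collection stored in the same content database.
rows = [
    {"ContentDbName": "WSS_Content_01", "ContentDbServerName": "SQL01", "ContentDBSizeInMB": "2048"},
    {"ContentDbName": "WSS_Content_01", "ContentDbServerName": "SQL01", "ContentDBSizeInMB": "2048"},
    {"ContentDbName": "WSS_Content_02", "ContentDbServerName": "SQL01", "ContentDBSizeInMB": "512"},
]

def remove_duplicates(rows, columns):
    """Keep the first occurrence of each distinct combination of the given columns."""
    seen = set()
    result = []
    for row in rows:
        key = tuple(row[c] for c in columns)
        if key not in seen:
            seen.add(key)
            result.append(row)
    return result

unique = remove_duplicates(rows, ["ContentDbName", "ContentDbServerName", "ContentDBSizeInMB"])
print(len(unique))  # 2
```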

SMATPowerBIReport14

 

The DatabaseMetrics query result will contain only a single record per content database with the associated server name and database size.

SMATPowerBIReport15

 

WebApplicationMetrics Query

The last of the original four queries, WebApplicationMetrics, will take extra steps to shape the data as intended.  The SiteUrl column is not in the format needed since it contains full site collection URLs.  For the purposes of this example assume that the authority (ex. subdomain.domain.com) portion of URL represents an individual web application.  Continuing with that assumption the SiteUrl column can be manipulated to show data per web application.  Click Choose Columns from the ribbon menu and select only the columns for SiteUrl, SiteCollectionSizeInMB, SiteCollectionCount, and SubWebCount.

Note: if the farm contains host named site collections (HNSCs) then the assumption regarding “URL authority = web application” may be invalid.

SMATPowerBIReport16

 

Next remove the scheme (ex. http:// or https:// most likely) from the SiteUrl column by clicking the Replace Values button from the ribbon menu.  For Replace With specify an empty string (“”).  In my example all sites are using https:// but a farm may have a mix of both http:// and https:// so an additional Replace Values step may be needed.  I’m not aware of a way to combine both into one operation but if a reader knows please do share in the comments.

SMATPowerBIReport17

 

In order to separate out the authority portion of URL from the rest of the site collection (e.g. the path) click the Split Column –> By Delimiter button from the ribbon menu.  Choose a custom delimiter of forward slash (/) and choose to only split at the left most delimiter.

SMATPowerBIReport18

SMATPowerBIReport19

 

To make it easier to read these newly split columns click “Rename column” from the ribbon menu and rename the first column to SiteUrlDomain and the second column to SiteUrlPath.  Alternately the Rename column functionality can be found by right-clicking the column and choosing Rename from the list of actions.
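The scheme removal and leftmost-delimiter split above can be sketched in Python (a sketch under the same "URL authority = web application" assumption; the sample URL is hypothetical):

```python
def split_site_url(site_url):
    """Strip the scheme, then split at the leftmost '/' into (SiteUrlDomain, SiteUrlPath)."""
    for scheme in ("https://", "http://"):
        if site_url.startswith(scheme):
            site_url = site_url[len(scheme):]
            break
    domain, sep, path = site_url.partition("/")
    return domain, sep + path

print(split_site_url("https://sub1.domain.com/sites/HRMarketing"))
# ('sub1.domain.com', '/sites/HRMarketing')
```

Note this handles both http:// and https:// in one pass, whereas the Replace Values step in Power BI needs one replacement per scheme.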

SMATPowerBIReport20

 

The final modification for the WebApplicationMetrics query will be to group the data by multiple aggregations.  Select the SiteUrlDomain column and click Group By in the ribbon menu.  Make sure that SiteUrlDomain is selected for the top Group By box.  Proceed to add additional aggregations for three columns:

  New Column Name          Operation    Column
  SiteCollectionSizeInMB   Sum          SiteSizeInMB
  SiteCollectionCount      Count Rows
  SubWebsSum               Sum          NumOfWebs

 

SMATPowerBIReport21

 

The results for the WebApplicationMetrics query should look similar to the following.

SMATPowerBIReport22

 

RiskByWebApp Query

Referring back to the original planning hierarchy there are two final queries to be created: “Risks by Web Application” and “Risks by Site Collection”.  These queries are children of the “Filter for Risk Related Columns” box.  This translates to creating 2 references from the RiskFilteredSource query.  Rename these referenced queries to RiskByWebApp and RiskBySiteCollection.

SMATPowerBIReport23

 

Starting with RiskByWebApp, perform steps similar to those used with WebApplicationMetrics.

  1. Replace the https:// or http:// scheme
  2. Split SiteUrl column on leftmost “/” delimiter

Since this query is focused on web application based information the “path” portion of URL (ex. “/sites/HRMarketing”) is not needed.  Click Choose Columns and select all columns except “SiteUrl.2”.

SMATPowerBIReport24

 

After the columns have been chosen rename the “SiteUrl.1” column to “SiteUrlDomain”.  The data should look similar to the following at this point.

SMATPowerBIReport25

 

Now is where some of the magic happens.  Credit for this specifically goes to Ken Kilty and Gene Livshin.  I want to aggregate the individual risk counts per web application while also maintaining the individual risk name.  Welcome to Unpivot Columns (<insert fireworks and fanfare>).  Think of the unpivot columns operation as transpose for a key-value pairing.  Select all of the columns for the individual risks (Alerts, Apps, etc.) but be sure to not select SiteUrlDomain.  Then click Unpivot Columns.

SMATPowerBIReport26

 

The resulting query will have SiteUrlDomain, Attribute, and Value for columns.  Rename Attribute and Value to RiskName and RiskCount respectively.

SMATPowerBIReport27

 

Notice that RiskName values will repeat if there are multiple site collections in that web application.  Click Group By from the ribbon menu.  Specify SiteUrlDomain and RiskName for the top grouping dropdowns (add a new grouping for RiskName since the default is one grouping).  For the New Column Name, Operation, and Column specify RiskCountAggregated, Sum, and RiskCount respectively.
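The unpivot-then-group sequence can be sketched in Python (a sketch only; the domains and risk counts below are hypothetical sample data):

```python
from collections import defaultdict

# Hypothetical wide rows after the split/rename steps: one row per site
# collection, with one column per risk (Alerts, Apps, etc.).
wide_rows = [
    {"SiteUrlDomain": "sub0.domain.com", "Alerts": 3, "Apps": 1},
    {"SiteUrlDomain": "sub0.domain.com", "Alerts": 7, "Apps": 0},
    {"SiteUrlDomain": "sub1.domain.com", "Alerts": 2, "Apps": 5},
]

# Unpivot: transpose each risk column into a (RiskName, RiskCount) key-value pair.
unpivoted = [
    (row["SiteUrlDomain"], risk_name, row[risk_name])
    for row in wide_rows
    for risk_name in row
    if risk_name != "SiteUrlDomain"
]

# Group by (SiteUrlDomain, RiskName) and sum RiskCount into RiskCountAggregated.
aggregated = defaultdict(int)
for domain, risk_name, risk_count in unpivoted:
    aggregated[(domain, risk_name)] += risk_count

print(dict(aggregated))
```

Each (SiteUrlDomain, RiskName) pair now carries a single summed count, which is exactly the shape the report visuals consume.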

SMATPowerBIReport28

 

The RiskByWebApp query result is finally in a usable format now.  See the following for an example of the output.

SMATPowerBIReport29

 

RiskBySiteCollection Query

The RiskBySiteCollection query will mirror the RiskByWebApp query in every step except there will not be a split column step or the cleanup of columns immediately afterwards.  The order of steps will be as follows:

  • Source
  • Replaced value – remove scheme from SiteUrl
  • Unpivoted Columns – select all columns except for SiteUrl for the unpivot
  • Rename Columns – Attribute –> RiskName, Value –> RiskCount
  • Grouped Rows – group by SiteUrl and RiskName, sum RiskCount into RiskCountAggregated.

 

The RiskBySiteCollection query should end up looking like the following:

SMATPowerBIReport30

 

Conclusion

Quite a bit of work goes into transforming and modeling the initial CSV data into more usable formats.  Hopefully this post has been helpful in illustrating steps that can be taken to plan out the end goal as well as model the queries of the SMAT output data.  In Part 2 of this post I will walk through creating a number of Power BI report pages that leverage what has been created thus far.  I’ll also attempt to share out sample source CSV data and a Power BI PBIX file but no guarantees.  Please share any feedback or questions in the comments.

 

Continue reading part 2 of this series.

 

-Frog Out

SharePoint 2013 Migration Assessment Tool Released To Web

   Microsoft quietly released the Release to Web (RTW) version of the SharePoint Migration Assessment Tool (SMAT) for SharePoint 2013 on Jan 20, 2017.  This is an update from the Release Candidate (RC) version that was released in Fall 2016.  I haven’t seen any announcements regarding this upgrade so I wanted to share here.

 

Background

  For those who are not familiar with SMAT, it is a command line tool that will check a SharePoint farm for risks migrating to SharePoint Online.  Aside from that primary purpose it is also useful for an overview audit of the farm (name and size of content databases, name and size of site collections, site collection admins and owners, and more).   There is a backing configuration file (scandef.json) which can enable or disable various checks to be scanned against a SharePoint farm.  At the time of writing there is only a version compatible with SharePoint 2013 but there is potential for a SharePoint 2010 and / or 2016 compatible version in the future.

   In terms of the checks scanned for they include:

  • Large lists
  • Checked out files
  • Full trust solutions
  • Non-default master pages
  • Unsupported web templates
  • … and more

 

   SMAT will need to run on a server within the SharePoint farm and be run as the farm administrator.  Run time will vary depending on the size and configuration of the farm.  In a tiny lab farm I had created it took minutes to run, but in a sizable lab farm (multiple TBs of data and tens of thousands of site collections) it took over 24 hrs.  The output from the tool will be 1 summary CSV along with 1 detail CSV per check scanned (~30).

 

What’s New

  Comparing the documentation from the RC version to the RTW version, I am seeing a few new risks that are scanned, but I didn’t see them output in the CSV files.  I may need to make changes to the config file in order to include the new checks.  Other than that, the execution of the tool and general process is the same as the RC version.

 

Next Steps

   So what can you do with these CSV files?  By themselves the output CSV files can be difficult to read to get a good picture of the overall migration risks of a farm.  One option is to use these output files in conjunction with the Microsoft FastTrack Center and their SharePoint 2013 to SharePoint Online Migration Offer.  See this link for details on that offer and more.

   Another option is to analyze the CSV files on your own with tools such as Excel, SQL Server, or Power BI.  Personally I am very new to Power BI but with a little research and consulting with fellow PFEs I was able to generate a useful report that aggregated the risk data and filtered it into very usable visualizations.  I will write a follow up post about the process I followed to model and visualize these output files.

 

Conclusion

   If you are hosting a SharePoint 2013 farm and looking to migrate to SharePoint Online, or simply want to audit your farm for risks and general data, I encourage you to take a look at the SharePoint Migration Assessment Tool.  Look for my next blog post where I’ll dig into interpreting the data with Power BI to find useful insights and more.

 

      -Frog Out