SharePoint CSOM to Traverse All Sites in SharePoint Online

   In the past I’ve written posts such as “PowerShell to Enumerate SharePoint 2010 or SharePoint 2013 Permissions” and “PowerShell Script To Traverse All Sites In SharePoint 2010 (or 2007) Farm” to assist with traversing all sites within a SharePoint on-prem farm.  In this post I’ll share a snippet I recently used for traversing all site collections and subsites within a SharePoint Online tenant.

Background

   If you’ve worked with the SharePoint Online Management Shell you may know that originally it was not able to retrieve Personal Sites (also known as My Sites / OneDrive for Business sites) in SharePoint Online.  As far as I’m aware this was primarily a limitation of the underlying client side libraries (Microsoft.SharePoint.Client.*).  Fast forward to a recent release (I don’t have the specific one, but I can confirm it is in the 16.1.6621.1200 release of the Microsoft.SharePointOnline.CSOM NuGet package) and retrieving Personal Sites is now supported.  In the management shell this is accomplished by calling the following:

Connect-SPOService -Url '<tenantAdminUrl>'

Get-SPOSite -Limit all -IncludePersonalSite $true

   The problem is that if you want to traverse all site collections in SharePoint Online with the client side object model (CSOM), you need to know how the SharePoint Online Management Shell implements that inclusion of Personal Sites with the rest of the site collections.  To find this out I used a disassembler (ILSpy in my case, but there are many alternatives available) on the underlying libraries to recreate the process in my own code.

Solution

   The resulting CSOM code that I came up with is below.  The outer loop processes all site collections within SharePoint Online while the inner loop processes sites (subsites / webs / “not site collections”).  Feel free to borrow this and use it in your own code, but note that it is provided as-is with no warranty.

SPOSitePropertiesEnumerable ssp = null;
SPOSitePropertiesEnumerableFilter sspFilter = new SPOSitePropertiesEnumerableFilter();
SharePointOnlineCredentials creds = new SharePointOnlineCredentials("myUsernameGoesHere", securePassword);
using (PnPClientContext cc = new PnPClientContext("myTenantAdminUrlGoesHere"))
{
    cc.Credentials = creds;
    Tenant tenant = new Tenant(cc);

    //loop through all site collections, including personal sites (even though they are not used further in this example)
    //borrowed this approach after decompiling the SPO Management Shell assemblies
    sspFilter.IncludePersonalSite = PersonalSiteFilter.Include;
    sspFilter.IncludeDetail = true;
    sspFilter.StartIndex = null;
    ssp = tenant.GetSitePropertiesFromSharePointByFilters(sspFilter);
    cc.Load(ssp);
    cc.ExecuteQuery();

    foreach (SiteProperties sp in ssp)
    {
        //DO YOUR WORK HERE FOR EACH SITE COLLECTION, such as looping through subwebs
        //open a separate context per site collection; the tenant admin context (cc)
        //always points at the admin site, not the site collection being enumerated
        using (ClientContext siteContext = new ClientContext(sp.Url))
        {
            siteContext.Credentials = creds;
            siteContext.Load(siteContext.Web, w => w.NoCrawl,
                                              w => w.Webs,
                                              w => w.Url);
            siteContext.ExecuteQuery();

            //check subweb(s)
            foreach (var subweb in siteContext.Web.Webs)
            {
                siteContext.Load(subweb, sw => sw.NoCrawl,
                                         sw => sw.Url);
                siteContext.ExecuteQuery();
            }
        }
    }
}

Conclusion

   In this post I shared a snippet for traversing all site collections in SharePoint Online with C# CSOM code.  In my daily job I’ve been using this in combination with Azure Functions for a number of interesting interactions with a SharePoint Online tenant.  More to come on those scenarios in future weeks.  For now let me know in the comments if you have any questions or issues implementing the above snippet in your own code.  Happy coding!

      -Frog Out

Controlling Office 365 Admin Access with Azure AD Privileged Identity Management (PIM)

   Controlling, monitoring, and revoking access to privileged accounts can be a difficult process.  Recently my coworker Ken Kilty shared with me a new service for Azure Active Directory called Privileged Identity Management (Azure AD PIM).  After spending some time with it I wanted to share with a broader audience since I had never heard of it previously.


 

Overview

   Please read What is Azure AD Privileged Identity Management first for a good overview of the implementation, an example scenario, and additional links to resources.  Note that Azure AD PIM requires Azure AD Premium P2 licenses.  If you would like to test this out there is a free 30-day trial of Azure AD Premium P2 for up to 100 users.

   Granting administrator access, for any application or server, to users should always be done with caution.  Sometimes what starts out as a temporary elevation of permissions turns into a permanent assignment.  Azure AD PIM answers many of the tough questions for Azure AD, Office 365, and related services such as:

  • Who has admin access to <service X>?
  • How do I grant truly temporary access to <service Y>?
  • How can I review all current admins to see if they still need admin access?

   The goal with Azure AD PIM is to allow administrators to define either permanent or “eligible” assignment of specific elevated permissions within Azure and Office 365.  Currently there are 21 roles that can be managed such as Global Administrator, Password Administrator, SharePoint Service Administrator, Exchange Administrator, and more.  See Assigning administrator roles in Azure Active Directory for a more complete listing of roles.  Users who are defined as “eligible” will be able to elevate themselves to roles they have been assigned for a set number of hours (1-72) defined by an Azure AD PIM administrator.  During this role elevation process the “eligible” user will need to verify their identity through a text / call verification or multifactor authentication (MFA) mechanism.  One of the key advantages is that this entire interaction is tracked and auditable.  Administrators can even require an incident or service ticket number prior to elevation and receive alerts when elevation requests are processed.

 

Conclusion

   I have seen privileged role access handled in many different ways at customers over the years.  Having a consistent and auditable process ensures that changes can be tracked and users who no longer need elevated permissions can be removed.  In the time I’ve tested out Azure AD Privileged Identity Management I have been very happy with the overall process and review options.  One word of advice for users elevating themselves: you will need to log out and log back in in order to update your claims token with the new elevated role claims.  Give Azure Active Directory Privileged Identity Management a try and share any feedback in the comments below.

 

      -Frog Out

SharePoint 2013 Migration Assessment Tool Released To Web

   Microsoft quietly released the Release to Web (RTW) version of the SharePoint Migration Assessment Tool (SMAT) for SharePoint 2013 on Jan 20, 2017.  This is an update from the Release Candidate (RC) version that was released in Fall 2016.  I haven’t seen any announcements regarding this upgrade so I wanted to share here.

 

Background

  For those who are not familiar with SMAT, it is a command line tool that will check a SharePoint farm for risks when migrating to SharePoint Online.  Aside from that primary purpose it is also useful for an overview audit of the farm (name and size of content databases, name and size of site collections, site collection admins and owners, and more).  There is a backing configuration file (scandef.json) which can enable or disable the various checks to be scanned against a SharePoint farm.  At the time of writing there is only a version compatible with SharePoint 2013, but there is potential for a SharePoint 2010 and / or 2016 compatible version in the future.

   In terms of the checks scanned for they include:

  • Large lists
  • Checked out files
  • Full trust solutions
  • Non-default master pages
  • Unsupported web templates
  • … and more
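
   The checks above can be toggled via the scandef.json configuration file mentioned earlier.  As a purely illustrative sketch (the exact field names are my guess; consult the scandef.json included with your SMAT download for the real schema), disabling a check conceptually amounts to flipping a flag on its entry:

```json
[
  { "Name": "LargeList", "Disabled": false },
  { "Name": "CheckedOutFiles", "Disabled": true }
]
```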

 

   SMAT will need to run on a server within the SharePoint farm and be run as the farm administrator.  Run time will vary depending on the size and configuration of the farm.  In a tiny lab farm I had created it took minutes to run, but in a sizable lab farm (multiple TBs of data and tens of thousands of site collections) it took over 24 hrs.  The output from the tool will be 1 summary CSV along with 1 detail CSV per check scanned (~30).

 

What’s New

  Comparing the documentation for the RC version to the RTW version, I see a few new risks that are scanned, but I didn’t see them output in the CSV files.  I may need to make changes to the config file in order to include the new checks.  Other than that the execution of the tool and general process is the same as the RC version.

 

Next Steps

   So what can you do with these CSV files?  By themselves the output CSV files can be difficult to read to get a good picture of the overall migration risks of a farm.  One option is to use these output files in conjunction with the Microsoft FastTrack Center and their SharePoint 2013 to SharePoint Online Migration Offer.  See this link for details on that offer and more.

   Another option is to analyze the CSV files on your own with tools such as Excel, SQL Server, or Power BI.  Personally I am very new to Power BI but with a little research and consulting with fellow PFEs I was able to generate a useful report that aggregated the risk data and filtered it into very usable visualizations.  I will write a follow up post about the process I followed to model and visualize these output files.
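
   If you want a quick first pass before pulling the CSVs into Excel or Power BI, a small script can aggregate the detail reports into per-check counts.  Below is a minimal sketch in Python; note the "<check>-detail.csv" naming and the assumption that each data row is one flagged item are my own illustration, not SMAT’s documented layout.

```python
import csv
import glob
import os
from collections import Counter

def summarize_reports(report_dir):
    """Count flagged items per check across SMAT-style detail CSVs.

    Assumes one detail CSV per check (e.g. LargeList-detail.csv) where
    each data row after the header represents one flagged site/list/file.
    """
    summary = Counter()
    for path in glob.glob(os.path.join(report_dir, "*-detail.csv")):
        check_name = os.path.basename(path).replace("-detail.csv", "")
        with open(path, newline="") as f:
            reader = csv.DictReader(f)
            summary[check_name] = sum(1 for _ in reader)
    return summary
```

   Pointing summarize_reports at the report folder yields a simple check-to-count mapping that is easy to sort, chart, or join with site size data.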

 

Conclusion

   If you are hosting a SharePoint 2013 farm and looking to migrate to SharePoint Online, or simply want to audit your farm for risks and general data, I encourage you to take a look at the SharePoint Migration Assessment Tool.  Look for my next blog post where I’ll dig into interpreting the data with Power BI to find useful insights and more.

 

      -Frog Out

My Experience Configuring Cloud Hybrid Search Service Application for SharePoint

   In this post I’ll talk through my personal experience deploying the new cloud hybrid search service application for SharePoint 2013 (also available in SharePoint 2016).  By no means am I an expert on this topic (especially in many of the supporting technologies such as AAD Connect, AD FS, etc.), but this post is meant to increase exposure to this new offering.  For an overview of cloud hybrid search and more information about the actual implementation (which I will refer back to later) please read through Cloud Hybrid Search Service Application written by two of my Microsoft peers Neil and Manas (they are the true experts).

 

Components

   Here is a list of the high level components I used for my deployment.

Note: My Azure VM configuration is not using best practices for where or how to deploy different services.  Also my mention of GoDaddy and DigiCert is purely for example purposes and not an endorsement of either company.  I just happen to use their services and products in this scenario.

  • Office 365 (O365) trial tenant (sign up for one here)
  • 4 Azure VMs
    • A1 – Active Directory Domain Services (AD DS)
    • A1 – Active Directory Federation Services (AD FS)
    • A2 – Azure Active Directory Connect (AAD Connect), Web Application Proxy (WAP)
    • A4 – SQL Server 2014, SharePoint 2013 farm with Service Pack 1 and at least Aug 2015 CU
  • Custom domain (purchased through GoDaddy but any domain registrar should work)
    • Note: Office 365 does have a partnership with GoDaddy so configuration may be easier due to automated updates that can be performed
    • Additionally I was able to modify public DNS records through GoDaddy to allow federated authentication through AD FS
  • SSL wildcard certificate purchased from DigiCert
    • Only required if you want to allow Office 365 users to open / preview a search result that resides on-prem with Office Online Server (the new version of Office Web Apps Server 2013, not discussed in this post)
    • I also used this certificate for other purposes such as securing AD FS communication and implementing Remote Desktop Gateway (the latter is unrelated to this post)
  • Custom result source to display O365 search results in my on-prem farm

 

   Next we’ll take a look at some of these components more in depth.

 

SharePoint Server

   The new cloud hybrid search service application is available in SharePoint Server 2013 with the August 2015 CU or later.  I have heard from my peers that there are some issues with cloud hybrid search as of the October, November, and December 2015 CUs.  As such use either the August or September 2015 CUs at the time of this writing (Dec 8, 2015) or wait until the Jan 2016 CU which should contain the fix (link).  The SharePoint Server 2016 IT Preview 1 also supports cloud hybrid search although I have not tested it out myself.

 

Cloud Search Service Application

   To provision a cloud hybrid search service application the property CloudIndex on the service application must be set to True.  This property is a read-only property and can only be set at creation time.  As such you will need to create a new search service application in order to utilize the cloud hybrid search service.

   I have not tested creating a new cloud hybrid search service application using a restored backup admin database from an existing search service application.  The thought behind this would be to retain at least a portion of your existing search service application.  If you try this and have any findings let me know in the comments below.

 

Custom Domain

   A custom domain is not a requirement for cloud hybrid search.  I used one so that I could allow end users (demo accounts) to log into Office 365 as a federated user “someUser@<fakecompany>.com” rather than the default domain “someUser@<O365TenantDomain>.onmicrosoft.com”.

 

AAD Connect

   In order to search for on-prem content that has been indexed by Office 365 the user will need to have an account that is synchronized to Azure Active Directory / Office 365.  This allows the search service in Office 365 to show content based on the Access Control List (ACL) defined on-prem.

   There are multiple options available for synchronizing accounts between on-prem and AAD, but the predominant ones include DirSync, AAD Sync, and AAD Connect.  Since AAD Connect is the forward-looking tool of choice among the three, I decided to use it.  AAD Connect automates many of the tedious tasks of configuring federated authentication by stepping through a wizard.

   That said, I did run into a number of issues during configuration due to missing certificates, invalid permissions, or other steps I missed or was unaware of.  If I got part of the way through the configuration and ran into a failure that I couldn’t recover from, I had to uninstall AAD Connect (do not remove all prerequisites when prompted), wipe out the contents of “<install drive>:\Program Files\Microsoft Azure AD Sync\Data”, and then re-install.

 

Display Search Results On-Prem

 

***PLEASE READ AS THIS IS IMPORTANT***

    The default scenario for cloud hybrid search is to index both on-prem and O365 content, which is then queried in O365.  It is possible to create or modify an on-prem result source to use the remote index from your Office 365 tenant, which allows for querying and displaying the combined search results on-prem.  The problem though is that when you query for and click results on-prem, the search analytics click data is not incorporated back into the cloud index to further influence search results.

Ex. I queried for “SharePoint” in the on-prem search center and clicked the 4th result on the results page.  Multiple other users also searched for “SharePoint” and clicked the same 4th result.  SharePoint search (via timer jobs and other background processes) incorporates that click data and adjusts the 4th result to appear higher in the rankings for subsequent search queries.
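
   To make that feedback loop concrete, here is a toy model (purely illustrative, not SharePoint’s actual ranking algorithm) of how accumulated click counts can lift a lower-scored result over time:

```python
def rerank(results, clicks, boost=0.1):
    """Re-order results by base relevance plus a small boost per recorded click.

    results: list of (doc_id, base_score) tuples
    clicks:  dict mapping doc_id -> accumulated click count
    Each click adds `boost` to a document's score, so a frequently clicked
    result can climb past neighbors with higher base relevance.
    """
    scored = [(doc, score + boost * clicks.get(doc, 0)) for doc, score in results]
    return [doc for doc, _ in sorted(scored, key=lambda pair: pair[1], reverse=True)]
```

   With base scores a=1.0, b=0.9, c=0.85, d=0.8 and five recorded clicks on d, d’s boosted score (1.3) moves it to the top of the list; this is the kind of adjustment that never reaches the cloud index when clicks happen on-prem.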

   I have unsuccessfully tested a few options to manually pass the search click data up to SharePoint Online.  These include creating a ClientContext object and calling the RecordPageClick() method on SearchExecutor, modifying the display template page, and more.  I did hear from a SharePoint MVP that successfully tested out a way to push search analytics data between on-prem and O365 but it took a fair amount of customizations to accomplish.  If I find out any additional information, workaround, or updates on this topic I’ll update this post to reflect that.

 

Example

   As you can see from the below screenshots I can initiate a search query from on-prem or O365 (respectively) and get the same combined result set.

 

[Screenshot: combined search results in the on-prem search center]

 

[Screenshot: the same combined search results in SharePoint Online]

 

 

Conclusion

   Due to my prior inexperience around AD FS, Web Application Proxy, AAD Connect, and other applications it took me a few days to get everything working end-to-end.  After that small hurdle I was very excited to be seeing combined on-prem and O365 search results in both on-prem and O365.  Do note though the section above calling out the current issue with search analytics data not being sent back and forth.  Aside from that I am looking forward to testing this out with customers and reaping the many benefits such as inclusion of content in the Microsoft Graph (formerly Office Graph) / Delve and other O365 only offerings.

 

      -Frog Out

Wrap Up from Microsoft Ignite 2015

   This was the first year of the Microsoft Ignite conference, which merged a number of previous conferences including TechEd, SharePoint Conference, Project Conference, and more.  With over 23,000 attendees, a new venue, and numerous Microsoft senior leaders and product group members in attendance (including CEO Satya Nadella himself) this was definitely a huge event.  Rather than re-capping the event itself I wanted to take a minute to mention a few items that I heard / saw at the conference.  I am still downloading and viewing a number of sessions that I couldn’t attend (same time as another session or the room was at capacity), but these are highlights that I wanted to share with others.

 

Recap

  • No “internal” FIM in SharePoint 2016 – SharePoint 2016 will not ship with a version of the Forefront Identity Manager product included.  This is a fairly big deal for any customers that are using the “SharePoint Synchronization” option (allows for import and export of content to / from SharePoint) for the User Profile Sync in 2010 or 2013.  Your options in 2016 will be the Active Directory Import (same as 2007 and re-introduced in 2013) or “external” FIM which is installed and managed outside of SharePoint Server.  See the following resources for more details and how to install FIM 2010 R2 + SP1 with SharePoint 2013 so that you can start planning today if you do need the full features of syncing data into and out of SharePoint.

What’s New for IT Professionals in SharePoint Server 2016 (session recording with announcement)

Configuring SharePoint 2013 for the Forefront Identity Manager 2010 R2 Service Pack 1 Portal (install overview)

  • Project Siena – Project Siena looks like a viable alternative (not replacement) for many (smaller) custom development scenarios.  Essentially it is an app that lets you build other apps.  I do not see this replacing InfoPath, Lightswitch, and half a dozen other technologies that have popped up over the past few years but I do see a promising future for this technology (HTML5 + JS based, similar to many other tech stacks that Microsoft is promoting).  Note that it is still in a beta release last time I checked but the fact that it caters to the Excel power user with similar syntax merged with an easy drag and drop interface feels like this could gain traction better than some other tools.  If you aren’t familiar with Project Siena you really need to see it to understand it.

Microsoft Project Siena: Build Apps and Create New Mobile Solutions (session recording with demos)

Microsoft Project Siena (Beta) (product site)

  • New SharePoint hybrid search option – Hybrid search is receiving a huge update / upgrade later this year.  In its current (May 2015) form SharePoint hybrid search involves separate search service applications / indices for on-prem farms and Office 365 / SharePoint Online.  If you query one source you can federate the query to the other and get results in a separate result block.  The problem though is that configuration can be fairly complex, search results aren’t integrated (in-line with each other), and you likely have a large number of servers on-prem for the search service.  Later this year (target timeframe, subject to change) Microsoft will release an update which will allow an on-prem “cloud search service application” to crawl and parse content but then push the metadata up to Office 365 for indexing, querying, etc.  The massive benefit of this is that your on-prem content will then be able to be used in other services like Delve, Office 365 data loss prevention (DLP), and others that currently have no expected on-prem release (or won’t be supported until future releases of SharePoint).  Additionally you will need a much smaller on-prem server footprint to support search (the example given was going from 10+ search servers down to 2).  This is a big win in my opinion and I can’t wait to test it out when it is released.

Implementing Next Generation SharePoint Hybrid Search with the Cloud Search Service Application (session recording)

  • Nano Server – Nano Server is a new installation option for Windows Server 10 (Server 2016 or whatever the final name ends up as) akin to Server Core in the past.  There were a number of sessions that talked about how small the footprint of Nano Server will be (400MB, yes MB, compared to 8+ GB for the server + GUI “full” edition).  The changes that this introduces not only affect performance but also require re-architecting tools to work remotely (there is no local logon or UI for Nano Server, everything must be done remotely).  Things like Event Viewer, Task Manager, Local Services, etc. can be accessed remotely in a web UI similar to the “new” Azure Portal UI (super slick, take a look).  This may sound scary to some admins who are used to having RDP or locally logging on to a server, but listen to Jeffrey Snover’s take on this.  We are IT Professionals and this is a technology that will reduce the number of reboots, make servers more secure, reduce infrastructure footprint, and have numerous other benefits.  You owe it to yourself and your company to learn about this and see if it will work for the services you provide.

Nano Server (session recording)

Nano Server: The Future of Windows Server Starts Now (session recording)

Remotely Managing Nano Server (session recording)

  • PowerShell – Getting to see Jeffrey Snover (inventor of PowerShell) and Don Jones (first follower of PowerShell, see the link in the slide deck) geek out about PowerShell was one of the best sessions I got to see at Ignite.  It is hard to describe in words, so I recommend watching the recording.  Jeffrey had some great advice about using PowerShell as a tool to explore and dive into problems or scenarios you are trying to solve.  That sense of adventure can be a motivating force for your personal and professional careers.  It was really inspiring and I love the fact that Jeffrey’s (and Don’s) mindset is spreading to so many others these days.

Windows PowerShell Unplugged with Jeffrey Snover (session recording)

 

   On a side note I also wanted to mention one of the obvious but not always talked about benefits of going to a conference like this in-person.  During the week I was able to introduce myself to a number of presenters that I had previously not met.  Some were MVPs, fellow Premier Field Engineers (PFEs), product group members, and more.  The connections you make can last for years and provide an invaluable network for sharing information and getting assistance when you are in need.  I even got a PowerShell sticker directly from Jeffrey Snover himself (another personal highlight).


 

Conclusion

   This is just a short list of some of the sessions that I attended along with highlights or key points that I wanted to share.  If I find anything else significant from the recordings I am going back to watch I’ll update this post.  For now though go check out the recordings above or the hundreds of other ones that are up on Channel 9.  I encourage you to attend next year when Ignite 2016 will be in Chicago again May 9-13.

 

      -Frog Out

I Contributed to the Office App Model Samples Project

 

<Update 2014-08-18> The Office App Model Samples project has been transitioned over to the Office 365 Developer Patterns & Practices GitHub repo.  Please use that location going forward for any references.</Update 2014-08-18>

 

  During the SharePoint Conference 2014 I had the pleasure of meeting Vesa Juvonen (@vesajuvonen) and Steve Walker (Linked In) after their session “Real-world examples of FTC to CAM transformations” (video).  This was a very valuable session to attend discussing examples of full trust code (FTC) solutions that were re-implemented / re-imagined as app model apps.  They also mentioned a new CodePlex project gathering community app model samples called Office App Model Samples (Office AMS).

 

   Over the past few years I’ve been toying around with various PowerShell scripts to enumerate permissions in an on-premises SharePoint farm (Enumerate SharePoint 2010/2013 Permissions, Enumerate SharePoint 2007 Permissions).  I was curious to see if it was possible to enumerate permissions in a SharePoint Online tenant as well.  I had tried using the official SharePoint Online Management Shell cmdlets, Gary LaPointe’s custom SharePoint Online cmdlets, and my own client side object model (CSOM) PowerShell queries with no luck.  Looking through Gary’s source code though, I found a way to get the permission information I needed via C# code and CSOM.  This felt like a great idea to submit to the OfficeAMS project.

 

   I’m happy to announce that my submission Core.PermissionListing is now published in the OfficeAMS project.  Keep in mind this is a rough proof of concept.  The sample iterates through all non-My Site site collections in a SharePoint Online tenant (something I borrowed from another OfficeAMS solution) and lists the groups or users with permissions on each site, along with the permission level assigned.  The output could definitely be cleaned up, but that will be an effort for a later date.  Hopefully you will find this and other app model samples useful.  If you’d like to contribute or improve upon a solution you find, please contact Vesa, Steve, or myself.

 

      -Frog Out