Find Azure AD Error Descriptions

 

Recently I was working with a customer to troubleshoot Azure AD authentication errors when logging into a custom application.  I knew that there is a support page for Azure AD Authentication and authorization error codes, but as the article points out, “[e]rror codes and messages are subject to change”.  More interestingly, the article also links to a page where you can get current information on error codes: https://login.microsoftonline.com/error.

 

Sample response:

AADErrorCodes1

 

Programmatic Response

If you would like to programmatically retrieve the output, you can pass the error code to that page in one of two ways:

  1. as a query string parameter
  2. as a form-data submission in the body of the request.

See the sample screenshot below.  Only one option is necessary.

AADErrorCodes2
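
As a rough example of the query string approach, here is how you might pull the same page from PowerShell.  Note this is only a sketch: I am assuming the parameter is named “code”, and error code 50058 is used purely as an illustration.

$code = "50058"    # example AADSTS error code; substitute the code you are troubleshooting
$response = Invoke-WebRequest -Uri "https://login.microsoftonline.com/error?code=$code"
# the body comes back as HTML (more on that below), so you still have to read it as markup
$response.Content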

As you may notice, the response after submission is an HTML response and not JSON, XML, or another text format.  If you would like to see alternate output formats please upvote this Azure feedback suggestion to add JSON support.

Error details website – JSON support
https://feedback.azure.com/forums/169401-azure-active-directory/suggestions/40310266-error-details-website-json-support

 

Conclusion

In this post I showed a quick tip on how to retrieve current information on Azure AD authentication / authorization error codes.  Additionally, I showed how you can retrieve the output programmatically if needed, as well as a feedback suggestion to upvote if you would like additional output formats.  Hopefully this can help save you time troubleshooting should you need it.

-Frog Out

Workaround for No Locations Available with Azure DevTest Labs

   In this post I’ll walk through a workaround to the “There are no locations available. You may not…” error when trying to provision a new instance of Azure DevTest Labs in the current preview (as of 2015/10/12).

 

Problem

   A few weeks ago during AzureCon 2015 there was an announcement that the new DevTest Labs offering was available in preview.  For those of you unfamiliar, DevTest Labs allows an administrator to set quotas for money spent per month, the sizes of VMs available, automatic shut down times for VMs, and more.  I immediately tried to register and followed the instructions to wait 30-60 minutes.  Later on I saw the DevTest Labs section available in the Create blade (note this requires using this link from the above page, which as far as I can tell includes the “Microsoft_Azure_DevTestLab=true” querystring parameter to “enable” the DevTest Labs pieces in the UI).  When I attempted to create a new DevTest Labs instance I ran into an error stating that “there are no locations available”.

   I waited a little while longer and refreshed the browser but still had the same issue.  Even days and weeks later nothing had changed and I still received the same error.  Thankfully I ran across a support forum post that led me in the right direction to resolve the issue.

Can’t create new lab in Azure DevTest Labs preview

https://social.microsoft.com/Forums/en-US/0ad3218b-6d18-44ac-915c-5ccd15b14f33/cant-create-new-lab?forum=DevTestLabs

 

Workaround

   As a fellow forum poster “runninggeek” mentioned, there was an issue with the Microsoft.DevTestLab provider in my subscription.  Others who registered after me did not have this problem, as a problem with the registration backend was fixed shortly after the announcement went out.  Here is the PowerShell script I ran to work around my issue.  You can also download it from my OneDrive.

 

 

Switch-AzureMode AzureResourceManager 
Add-AzureAccount 

# if you have multiple subscriptions tied to account may need to select specific one for below command 
Get-AzureSubscription | Select-AzureSubscription 
Unregister-AzureProvider -ProviderNamespace Microsoft.DevTestLab 
# confirm that provider is unregistered 
Get-AzureProvider -ProviderNamespace microsoft.devtestlab 
Register-AzureProvider -ProviderNamespace Microsoft.DevTestLab 
# confirm that provider is at least registering (mine took 1 minute to fully register) 
Get-AzureProvider -ProviderNamespace microsoft.devtestlab 

   Essentially you need to connect in Azure Resource Manager mode and unregister the Microsoft.DevTestLab provider.  Wait until the provider is unregistered and then re-register the provider.  Close all browser sessions logged in to the Azure Portal and re-launch it from the appropriate link.
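
   If you would rather script the wait instead of re-running Get-AzureProvider by hand, here is a rough sketch using the same pre-GA Azure PowerShell cmdlets as above (and assuming the returned provider object exposes a RegistrationState property):

Register-AzureProvider -ProviderNamespace Microsoft.DevTestLab
# poll every 10 seconds until the provider reports "Registered" (mine took about a minute)
do
{
    Start-Sleep -Seconds 10
    $provider = Get-AzureProvider -ProviderNamespace Microsoft.DevTestLab
    Write-Output $provider.RegistrationState
}
while ($provider.RegistrationState -ne "Registered")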

 

Conclusion

   Hopefully very few people ran into this issue, as it appears to be caused by the timing of when you registered for the Azure DevTest Labs preview.  Thanks to “runninggeek” for pointing me in the right direction to resolve this.  I provisioned an instance of DevTest Labs this afternoon and am starting to pore through the documentation and the initial set of offerings.

 

      -Frog Out

Getting Started with ProcMon for Troubleshooting Long Running Processes

   In this post I will cover a few tips for getting started with using ProcMon (Process Monitor in the Sysinternals Suite) for troubleshooting long running processes.  Note that I am not an expert in ProcMon by a long shot, so this is more of a selfish post to remind myself of some key settings to be sure to configure.

   Side note.  You can run the Sysinternals tools from the web at Sysinternals Live without needing to download the tools to your local machine.  This is useful if your customer / organization doesn’t allow installing / running 3rd party tools or has concerns about running them directly on a machine.
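
   For example, assuming the live.sysinternals.com file share is reachable from your network, you can launch Process Monitor straight from the UNC path without copying anything locally:

c:\> \\live.sysinternals.com\tools\procmon.exe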

 

Background

   Skip down to the Tips section if you don’t want to read the back story.

   At least once a month I have a customer scenario where two or more applications are not playing nice with each other: a .Net website and anti-virus software, SharePoint and server backup software, etc.  Usually the problem involves one piece of software placing a write-lock or exclusive hold on a file / registry entry while the other software expects the same access.  In such a scenario we need to monitor a file / registry entry from a fresh start (restart of the application pool, process, etc.) until the file / registry access error happens, which could take hours or days.  Since we are monitoring for such a long time we want to make sure that ProcMon only captures the data that we need and is mindful of memory / disk space usage.

 

Tips

1) Start ProcMon with no tracing

   If you start ProcMon by double clicking the executable, it will begin capturing data immediately.  Instead, launch it from the command line with the /noconnect parameter.

c:\sysinternals> procmon /noconnect

 

2) Specify a backing file

    By default ProcMon will store event data in virtual memory.  Since we could be capturing hours’ or days’ worth of data, it may be preferable to store it on a disk with lots of free space (multiple GBs or more depending on the expected duration and number of events).

   Navigate to File –> Backing Files… for these settings.

ProcMonTips1

 

   Change the radio button from “Use virtual memory” to “Use file named:” and then specify the filename you would like to use as a backing file.  .PML is the default extension used so I followed that convention.

ProcMonTips2
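
   If you prefer to set this up front, ProcMon also has a /BackingFile command line switch that should accomplish the same thing.  A sketch, assuming d:\Traces is on a drive with plenty of free space:

c:\sysinternals> procmon /noconnect /backingfile d:\Traces\LongRunningCapture.pml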

 

3) Apply filters

    By default all processes accessing any files and / or registry locations will be monitored.  Instead we want to filter events based on our criteria.  The most common scenarios that I run into (in the order that I see them) are 1) filtering for a location when we don’t know the process that is locking it or 2) filtering for a specific process when we know the process but not the location that is being locked.

   Click on the filter icon in the top menu or click Filter –> Filter… to access these settings.

ProcMonTips3

 

    In the example below we filter for any path that begins with “c:\MyAppFolder”.  By doing this we include any subfolders of our application folder.

ProcMonTips4

 

4) Drop filtered events

   By default ProcMon will capture all events whether they match your filter or not.  To save memory (or file space if you use a backing file) you can drop filtered events.  There is a chance that, if you created your filter incorrectly, you may miss the events needed for troubleshooting, but keeping your filter broad enough should avoid that issue.

   Click on Filter and select the Drop Filtered Events menu item.  If there is a check mark next to the menu item then you have correctly configured ProcMon to drop filtered events.

ProcMonTips5
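
   Putting these tips together, a rough end-to-end command line for kicking off an unattended long running capture might look like the following.  The .PMC file is a filter configuration exported from ProcMon ahead of time, and both paths are only examples:

c:\sysinternals> procmon /AcceptEula /Quiet /Minimized /BackingFile d:\Traces\LongRunningCapture.pml /LoadConfig d:\Traces\MyAppFolderFilter.pmc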

 

Conclusion

   In this post I walked through a few quick tips for configuring ProcMon when you need to troubleshoot a long running process.  Hopefully these tips will help you avoid out of memory issues or having to parse through hundreds of thousands of events.

 

    Special thanks to my peer and teammate Ken Kilty for providing the research and background info for this blog post.

 

       -Frog Out

SharePoint 2007 Content Deployment Job Error “Specified argument was out of the range of valid values”

   This week I ran into an interesting error with a customer.  The customer has defined a SharePoint 2007 content deployment path to push content from one SharePoint 2007 farm to another SharePoint 2007 farm.  They can complete 1 full deploy and 1 incremental deploy, but then all subsequent incremental deploys fail with the following message: “Specified argument was out of the range of valid values”.

ContentDeploymentFailure1r

 

Cause

   In order to get more insight into the error I used ULSViewer to inspect the ULS logs on the server (ULSViewer is not required to read the ULS logs; it just makes it easier to sort through them).  Looking at the selected log message, there was an error when attempting to clean up the content deployment job reports list.

ContentDeploymentFailure2r
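
   If you don’t have ULSViewer handy, a quick and dirty alternative is to search the ULS log files directly from PowerShell.  The path below is the default 12 hive LOGS location, so adjust it for your install:

Get-ChildItem "C:\Program Files\Common Files\Microsoft Shared\Web Server Extensions\12\LOGS\*.log" |
    Select-String -Pattern "Specified argument was out of the range" |
    Select-Object -First 20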

 

Solution

   The solution to this issue is fairly simple.  On the customer farm the number of reports to keep had been set to 1.  The default value is 20, but they had changed it to 1 to conserve space, which was limited on this server.  Apparently this caused an issue in which the timer job would error when attempting to clean up the reports list.  We changed the value to 2 and re-ran the job.  The job completed successfully on every run that we tried.  Problem solved.

ContentDeploymentFailure3r

 

SharePoint 2010 Consideration

   After testing this out in SharePoint 2007 I decided to try the same thing in SharePoint 2010.  Thankfully it appears this has been corrected in 2010.  I set the number of reports to keep to 1 and was able to execute the job multiple times.

ContentDeploymentFailure5r

 

ContentDeploymentFailure4r

 

Conclusion

   If you are using content deployment jobs in SharePoint 2007, be sure to set the number of reports to keep to a number greater than 1.  Apparently there is an issue with a value of 1 where the timer job cannot clean up the reports list due to an out of range exception.  This does not appear to be an issue in SharePoint 2010.

 

      -Frog Out

Fixing “Failed to create field: Field type is not installed properly” Error on SharePoint 2010 List

Upgrading a SharePoint farm can reveal hidden issues that may not be causing any visible consequences in your current environment.  This was especially the case on a recent customer visit to assist with an upgrade from SharePoint 2007 to SharePoint 2010.  During the upgrade we encountered the error “Failed to create field: Field type <field name> is not installed properly” while attempting to upgrade hundreds of SharePoint lists and document libraries.  The issue my customer faced related to a single 3rd party company’s custom field, whose name I have removed to maintain confidentiality.  Below is a recap of what we encountered and how we worked around the issue.  I am also making available the PowerShell script I wrote to find all instances of a field name in a web application.

Problem

The field that was causing the issue was a hidden column (SPField.Hidden = true, MSDN link) that was installed by a 3rd party solution the customer no longer wanted to use.  The customer had uninstalled the 3rd party WSP file, but the custom field still existed on hundreds of lists and document libraries.  Unfortunately, because the column was hidden it didn’t show up on the list settings page.

When the customer tried to upgrade the environment they had an entry in the ULS logs with the below information.  The message is the key piece of information.

Area: SharePoint Foundation
Category: Fields
Event ID: 8l1l
Level: High
Message: Failed to create field: Field type <field name removed> is not installed properly.  Go to the list settings page to delete this field.
Correlation ID: <correlation ID>

As the error message stated, we could go to the list settings page to delete the field.  Unfortunately the error message (and the next ULS entry with the stack trace) didn’t specify which lists contained the problematic field.  I thought it best to use PowerShell to traverse the thousands of lists and libraries in the environment to find which ones contained a field matching the problematic field name.

In order to do this I repurposed a script I had written to display all site collection administrators in a web application.  I thought finding instances of the field name would be as easy as calling SPList.Fields.Contains(“field name looking for”), but I was wrong.  The problem was that the SPFieldCollection enumerator was not able to successfully enumerate the fields on any list that contained the problematic field.  I received an error similar to what I saw in the ULS logs about the problematic field not being installed properly.

Receiving an error was not entirely useless though.  I knew that receiving an exception while enumerating the list fields would point me to the lists that I needed to correct.  I decided to use the Try / Catch construct to capture the exception and add that list (and its parent SPWeb) to an array of results.  Below is a slightly modified version of the script that I ended up using.  You can download the script below as well.

###############################################################
#SP_Display-InstancesOfErroringFieldNameInWebApp4.ps1
#
#Author: Brian T. Jackett
#Last Modified Date: Dec. 7, 2011
#
#Traverse the entire web app site by site to display
# instances of a field name on lists.  Expectation is that
# enumeration of list will produce an error that can be
# captured.  Does not work against external lists at this time.
###############################################################

$url = read-host -Prompt "Enter a web application URL"

$result = @()

Start-SPAssignment -Global
$webApp = Get-SPWebApplication $url

foreach($site in $webApp.Sites)
{
    write-debug "Site: $($site.url)"
    foreach($web in $site.allwebs)
    {
        write-debug "   Web: $($web.name)"
        foreach($list in $web.lists)
        {
            try
            {
                #if an error occurs when enumerating through the list fields then record a result
                $list.Fields | out-null
            }
            catch
            {
                write-debug "      List Error: $($list.title)"

                #disable allowing multiple content types on list (used later in blog post)
                $list.ContentTypesEnabled = $false
                $list.Update()

                #store the web URL and list title with the erroring field
                $result += @{$($web.url)=$($list.title)}
            }
        }
    }
}

write-output "****Results****"
$result

Stop-SPAssignment -Global
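
A quick usage note: the script relies on Write-Debug for its per-site / per-web / per-list progress messages, so set $DebugPreference before running it from the SharePoint 2010 Management Shell (or a session with the SharePoint snap-in loaded) if you want to see them:

#show Write-Debug output without prompting on each message
$DebugPreference = "Continue"
.\SP_Display-InstancesOfErroringFieldNameInWebApp4.ps1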

The output of the script showed that the customer had hundreds of lists and libraries that contained the problematic field.  Since this was affecting hundreds of lists, I thought it would be more efficient to use PowerShell to loop through all lists that were experiencing this issue and remove the field using the server Object Model API for SPField.Delete().  Unfortunately when I used that API method I received an error message stating that it is not possible to remove a SharePoint field that is hidden.  Frustration ensued.
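
For reference, a rough sketch of that deletion attempt is below.  The field display name is hypothetical; substitute the 3rd party field you are dealing with.  This is the call that produced the hidden field error:

#hypothetical field display name for the problematic 3rd party field
$field = $list.Fields["ProblemFieldName"]
$field.Delete()       #errors because the field is marked as hidden
$list.Update()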

Solution

Since programmatic deletion of the field was not possible, I resorted to following the error message’s instructions and removing the field by hand on the list settings page.  I was able to delete the field from about 20% of the lists using this method (see following screenshot.)

DeleteInvalidSPListField1

For the remaining 80% of the problematic lists, I received an error page stating that the list settings page was not viewable because the problem field was not installed properly.  More frustration ensued.

At this point I looked for a pattern between lists whose settings pages I could view and those I could not.  One pattern did emerge.  Lists that I could view did not allow multiple content types.  Lists that I could not view had the setting for “Allow management of content types” set to Yes (see following screenshot.)

DeleteInvalidSPListField2

Now, because I was not able to view the list settings page, I couldn’t turn off management of content types through the UI.  I was able to modify the setting using PowerShell and the server Object Model API for SPList.ContentTypesEnabled though.  I updated my above script to include the below lines.

$list.ContentTypesEnabled = $false

$list.Update()

After the lists were updated to disallow multiple content types I was then able to view the list settings page through the UI.  This allowed us to clean up an additional 75% of the lists.  Sadly there were a few dozen lists whose list settings pages I was still unable to view.  For those isolated lists it was decided to either migrate the content from the existing 2007 farm by hand (download and re-upload) or have users delete and recreate the content.

Conclusion

In order to fix the lists that contained the problematic field we had to go through the UI to manually remove the field.  To get a listing of the affected lists I used the PowerShell script above to find all lists that could not enumerate through their fields.  Additionally, most lists required that I disable multiple content types.  This may be an unacceptable option in your environment, but it was an acceptable loss in this one.  Beyond that, there were still lists that could not be salvaged in their current form.

Whenever you install or use any 3rd party solutions, know that there can be risks associated with them.  Despite my customer having uninstalled the solution and tested the migration previously, this issue still occurred.  With all of the support staff involved we ended up losing days’ worth of time due to this problematic field.  Hopefully if you run into this same scenario you can use this audit script and process to troubleshoot your issue more quickly.

-Frog Out