In past years I set goals at the beginning of the year and then recapped my progress on them the following year (see my retrospectives from 2010, 2011, 2012, 2013). Unfortunately I posted my goals for 2014 but never followed up (as I came to realize last week). As such I felt this might be a good time to switch things up; personally I found that my goals were either repeating themselves or becoming too formulaic. Instead I'll focus more on writing about my past year's accomplishments and sharing a few things I'm interested in.
2015 was a big year. It was the first full year with our daughter Clara and the first full year living in our new house. We also completed a number of home projects, including a new patio (the previous one was starting to sink in places) and a remodeled master bath. We're glad to have both of those behind us, but we're already finding new things that need to be fixed or replaced in 2016. The joys of home ownership.
On the technology side I've been digging into Azure Infrastructure as a Service (IaaS) and have given a number of internal and external presentations on the topic. Additionally I've been following along with the progression of PowerApps (read my Start Learning About PowerApps post here). SharePoint 2016 will be released in 2016 and I've been lucky to have access to early bits to put through their paces. This is all part of my process of continually learning new things, and partially my natural desire to tinker with cool technology. Walt Disney put it best when he said "[w]hen you're curious, you find lots of interesting things to do."
One side interest of mine has always been personal productivity and ways to track it. Many systems exist, such as Getting Things Done (I read David Allen's book of the same name a few years ago), Kanban, and more. Recently I've taken up Trello as my personal (non-work) task tracking system. I like the concept of creating columns / lists for my daily tasks. I have a backlog plus 2-4 daily lists (columns), and I move cards from my backlog into the current day's column once they are completed. I can quickly and easily archive a daily list to keep things tidy while still being able to look back at prior days for a quick retrospective. My target is to complete at least 2 tasks each day. It is a work in progress, but after 2+ weeks it is working better than any prior system I've tried. See below for an example of my recent tasks.
Over the past 6+ years I've started getting back into reading pretty heavily. At first it started with audiobooks during the 30-45 minute commutes to various customers when I was a consultant with Sogeti. I listened to some excellent audiobooks including The Lord of the Rings trilogy, the Asimov Robot series, and more. I chose to get my audiobooks from the local library, which, while it had a number of excellent choices, was still limited.
When I joined Microsoft I was traveling 2-5 times a month, usually driving or flying anywhere from 1-7 hours. At this point my reading started shifting more towards physical books. There is something about holding a physical book in your hands that resonates with me. Perhaps it harkens back to my grade school days and summers spent reading. Either way, there are a number of used book stores and libraries that provide plenty of options.
A few of the recommendations from the last year or two:
During a summer vacation in 2009 (I distinctly remember the occasion) my oldest brother turned me on to a podcast called Stuff You Should Know. I had started listening to audiobooks not long before this, so I was getting used to audio content but wasn't fully ingrained in it as a medium. Things changed after I started listening to this podcast. Josh and Chuck (after a brief stint with a different starting duo) bring a blend of information, entertainment, inside jokes, and levity to a huge variety of topics (they have amassed over 700 episodes). My wife also enjoys listening to them during car trips.
Over time I added other podcasts to listen to during long drives, workouts, or time relaxing at home. Some podcasts haven't kept my interest and I've stopped listening, but my current shows are a mix of entertainment and technology information.
Hopefully by reading this retrospective and sharing of interests you are inspired to reflect on your own past year. I find it invigorating and recharging looking back at the past year’s progress while looking forward to what can be accomplished the next year. If you have any recommendations on books, podcasts, or other technology feel free to share. Thanks for reading.
Note: Amazon links are referral links on my blog and go towards paying for hosting, domain registration, and writing this blog which is done on my own time.
In this post I'll talk about PowerApps, a new service for building enterprise applications, and share resources on where to find more information.
Note: PowerApps is currently in private preview and is subject to change after this article is posted. As such this article may contain out of date information by the time you read this. Additionally I am a Microsoft employee but the views and opinions expressed in this article are my own and not reflective of Microsoft or the PowerApps product group.
On Nov 30th, 2015 at the European Convergence conference Microsoft unveiled a new service for building enterprise applications called PowerApps. At a high level, PowerApps allows power users and developers to build scalable applications that connect to numerous services (Office 365, Salesforce, OneDrive, Dropbox, etc.) using PowerPoint- and Excel-like tools, and to consume those apps on Windows, iOS, and Android. These applications can be built once and then consumed on any platform; there is no need to re-compile, design separate UIs per platform, etc. as you see with the current state of most mobile or web development.
I'll briefly walk through some highlights of the different components and aspects to be aware of.
Tools and client player
Currently it is possible to create and consume PowerApps apps on Windows and iOS. I can't speak to the final plans, but it is my understanding that the ability to create and consume on all platforms (Windows PC and mobile, iOS, Android, and web) is on the roadmap. You will not be limited to consuming only on the platform you created on, though.
PowerApps is designed so that apps can be authored using Excel- and PowerPoint-type skills; there is no need to code your solution. That said, if you are a developer and wish to code backend interactions or create a custom API to connect to, that option is available (with the Enterprise plan, more on that below).
Out of the box PowerApps ships with a dozen or so connectors for pulling data from or pushing data to the following sources. By configuring a connector to one of these services you can perform simple CRUD (create, read, update, and delete) operations on its data.
- Dynamics CRM Online
- Google Drive
- Microsoft Translator
- Office 365 Outlook
- Office 365 Users
- OneDrive [consumer version]
- SharePoint Online
Establishing a connection to these services is as simple as logging into the service. Once you establish a connection it is persisted to the PowerApps cloud and will be available on any device that you log into with your account.
Speaking of logging into accounts, authentication for PowerApps is handled by Azure Active Directory. As such you will need an Azure Active Directory identity / domain in order to utilize PowerApps. Thus you can view PowerApps as more of an enterprise solution than a consumer solution, even though you do have access to consumer-focused connections (e.g. Twitter, OneDrive, etc.).
PowerFlows (also called Logic Flows) are still a work in progress, but the goal is to provide simple yet robust workflows for data. Think along the lines of If This Then That (IFTTT, www.ifttt.com), a popular website for connecting data from disparate sources and taking action when specific triggers are met. Ex. when the forecast is predicting rain tomorrow, send me a text message and put an entry on my calendar to bring an umbrella to work. IFTTT also integrates with home automation software, smartphone devices, design websites, and more.
On the PowerFlows side you can define a trigger and then take actions based on that incoming trigger. Ex. when a new tweet from Twitter contains specific data, create a new entry in a SharePoint list, send me an email, and then create a case in Salesforce. When used in conjunction with apps from PowerApps this can be a powerful complementary toolset.
When it comes time to share your PowerApps app with others, you can simply type in their email address and share it with them. There is no need to worry about downloading the application, OS incompatibility, or other traditional blockers for enterprise applications. In the Enterprise plan it is possible to restrict access to the app so that only specific users are able to view and access it.
Speaking of plans, there are 3 plan levels, as follows.
- Free – create and use unlimited apps, 2 connections to SaaS data per user, shared infrastructure
- Standard - create and use unlimited apps, unlimited connections to SaaS data per user, shared infrastructure
- Enterprise - create and use unlimited apps, unlimited connections to SaaS data per user, dedicated infrastructure, app governance, API management
The last piece of the Enterprise plan is interesting to me. It allows an organization to package up an API to line of business (LOB) or other data (e.g. SQL Server, on-prem SharePoint, etc.) and publish it to Azure. That data source can then be consumed by PowerApps apps.
Sign up for preview
PowerApps is currently in private preview, but I encourage everyone to request an invite to gain access at the following URL. Note that you may not be accepted right away, but you will be added to the list for future inclusion.
Request invite to PowerApps
I am very excited to see PowerApps finally come to private preview. I have been following Project Siena (precursor to PowerApps) for over a year now and tinkering around with the alpha and beta builds of both. There is no release date yet for PowerApps but I encourage you and your organization to sign up for the preview and take a look at the videos and tutorials linked below.
Lastly, a few parting thoughts. Think of all of the LOB apps that exist in your company or organization and all of the time, effort, and money that goes into developers and / or designers creating and maintaining those applications. Many of them are simple data entry, approval workflow, or similar applications. By exposing enterprise data in a structured and secure manner you can empower end users to create those types of applications much more quickly while freeing up resources and people for other business needs.
Introducing Microsoft PowerApps
Microsoft PowerApps main site (and registration)
Microsoft PowerApps tutorials
Microsoft PowerApps videos on Channel 9
Microsoft takes the wraps off PowerApps
In this post I'll talk through my personal experience deploying the new cloud hybrid search service application for SharePoint 2013 (also available in SharePoint 2016). By no means am I an expert on this topic (especially in many of the supporting technologies such as AAD Connect, AD FS, etc.); this is more meant to increase exposure to this new offering. For an overview of cloud hybrid search and more information about actual implementation (which I will refer back to later), please read through Cloud Hybrid Search Service Application, written by two of my Microsoft peers, Neil and Manas (they are the true experts).
Here is a list of the high level components I used for my deployment.
Note: My Azure VM configuration is not using best practices for where or how to deploy different services. Also my mention of GoDaddy and DigiCert are purely for example purposes and not an endorsement for either company. I just happen to use their services and products in this scenario.
- Office 365 (O365) trial tenant (sign up for one here)
- 4 Azure VMs
- A1 - Active Directory Domain Services (AD DS)
- A1 - Active Directory Federation Services (AD FS)
- A2 – Azure Active Directory Connect (AAD Connect), Web Application Proxy (WAP)
- A4 - SQL Server 2014, SharePoint 2013 farm with Service Pack 1 and at least Aug 2015 CU
- Custom domain (purchased through GoDaddy but any domain registrar should work)
- Note: Office 365 does have a partnership with GoDaddy so configuration may be easier due to automated updates that can be performed
- Additionally I was able to modify public DNS records through GoDaddy to allow federated authentication through AD FS
- SSL wildcard certificate purchased from DigiCert
- Only required if you want to allow Office 365 users to open / preview a search result that resides on-prem with Office Online Server (the new version of Office Web Apps Server 2013; not discussed in this post)
- I also used this certificate for other purposes such as securing AD FS communication and implementing Remote Desktop Gateway (the latter is unrelated to this post)
- Custom result source to display O365 search results in my on-prem farm
Next we’ll take a look at some of these components more in depth.
The new cloud hybrid search service application is available in SharePoint Server 2013 with the August 2015 CU or later. I have heard from my peers that there are some issues with cloud hybrid search as of the October, November, and December 2015 CUs. As such, use either the August or September 2015 CU at the time of this writing (Dec 8, 2015), or wait for the Jan 2016 CU which should contain the fix (link). SharePoint Server 2016 IT Preview 1 also supports cloud hybrid search, although I have not tested it myself.
Cloud Search Service Application
To provision a cloud hybrid search service application, the CloudIndex property on the service application must be set to True. This property is read-only and can only be set at creation time. As such you will need to create a new search service application in order to utilize cloud hybrid search.
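As a rough sketch, creating the service application from the SharePoint 2013 Management Shell looks something like the following. My understanding is that the patched New-SPEnterpriseSearchServiceApplication cmdlet exposes CloudIndex as a parameter; the service application, application pool, and database names below are placeholders for your own environment.

```powershell
# Run from the SharePoint 2013 Management Shell on a farm patched with the Aug 2015 CU or later.
# Names below are placeholders -- adjust for your environment.
$appPool = Get-SPServiceApplicationPool -Identity "SearchServiceAppPool"
$ssa = New-SPEnterpriseSearchServiceApplication -Name "Cloud SSA" `
    -ApplicationPool $appPool `
    -DatabaseName "CloudSearch_AdminDB" `
    -CloudIndex $true
New-SPEnterpriseSearchServiceApplicationProxy -Name "Cloud SSA Proxy" -SearchApplication $ssa
# CloudIndex is read-only after creation -- confirm it was set
$ssa.CloudIndex
```

Refer to the Cloud Hybrid Search Service Application post linked above for the full, supported onboarding steps (including the Office 365 onboarding script).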
I have not tested creating a new cloud hybrid search service application using a restored backup admin database from an existing search service application. The thought behind this would be to retain at least a portion of your existing search service application. If you try this and have any findings let me know in the comments below.
A custom domain is not a requirement for cloud hybrid search. I used one so that I could allow end users (demo accounts) to log into Office 365 as a federated user “someUser@<fakecompany>.com” rather than the default domain “someUser@<O365TenantDomain>.onmicrosoft.com”.
In order to search for on-prem content that has been indexed by Office 365 the user will need to have an account that is synchronized to Azure Active Directory / Office 365. This allows the search service in Office 365 to show content based on the Access Control List (ACL) defined on-prem.
There are multiple options available for synchronizing accounts between on-prem AD and AAD, but the predominant ones are DirSync, AAD Sync, and AAD Connect. Since AAD Connect is the forward-looking tool of choice of the three, I decided to use it. AAD Connect automates many of the tedious tasks of configuring federated authentication by stepping through a wizard.
That said, I did run into a number of issues during configuration due to missing certificates, invalid permissions, or other steps I missed or was unaware of. If I got part of the way through the configuration and hit a failure I couldn't recover from, I had to uninstall AAD Connect (do not remove all prerequisites when prompted), wipe out the contents of "<install drive>:\Program Files\Microsoft Azure AD Sync\Data", and then re-install.
Display Search Results On-Prem
***PLEASE READ AS THIS IS IMPORTANT***
The default scenario for cloud hybrid search is to index both on-prem and O365 content, which is then queried in O365. It is possible to create or modify an on-prem result source to use the remote index from your Office 365 tenant, which allows for querying and displaying the combined search results on-prem. The problem, though, is that when you query for and click results on-prem, the search analytics click data is not incorporated back into the cloud index to further influence search results.
Ex. I query for "SharePoint" in an on-prem search center and click the 4th result on the result page. Multiple other users also search for "SharePoint" and click the same 4th result. SharePoint search (via timer jobs and other background processes) incorporates that click data and adjusts the 4th result to appear higher in the rankings on subsequent queries.
I have unsuccessfully tested a few options for manually passing the search click data up to SharePoint Online. These include creating a ClientContext object and calling the RecordPageClick() method on SearchExecutor, modifying the display template page, and more. I did hear from a SharePoint MVP who successfully tested a way to push search analytics data between on-prem and O365, but it took a fair amount of customization to accomplish. If I find any additional information, workarounds, or updates on this topic I'll update this post to reflect them.
As you can see from the below screenshots I can initiate a search query from on-prem or O365 (respectively) and get the same combined result set.
Due to my prior inexperience with AD FS, Web Application Proxy, AAD Connect, and other applications it took me a few days to get everything working end-to-end. After that small hurdle I was very excited to see combined on-prem and O365 search results in both on-prem and O365. Do note, though, the section above calling out the current issue with search analytics data not being sent back and forth. Aside from that, I am looking forward to testing this out with customers and reaping the many benefits, such as inclusion of on-prem content in the Microsoft Graph (formerly Office Graph) / Delve and other O365-only offerings.
In this post I’ll walk through a workaround to the “There are no locations available. You may not…” error when trying to provision a new instance of Azure DevTest Labs in the current preview (as of 2015/10/12).
A few weeks ago during AzureCon 2015 there was an announcement that the new DevTest Labs offering was available in preview. For those of you unfamiliar, DevTest Labs allows an administrator to set quotas for money spent per month, sizes of VMs available, automatic shutdown times for VMs, and more. I immediately tried to register and followed the instructions to wait 30-60 minutes. Later on I saw the DevTest Labs section available in the Create blade (note this requires using this link from the above page, which as far as I can tell includes the "Microsoft_Azure_DevTestLab=true" querystring parameter to "enable" the DevTest Labs pieces in the UI). When I attempted to create a new instance of DevTest Labs I ran into an error stating that "there are no locations available".
I waited a little while longer and refreshed the browser but still hit the same issue. Even days / weeks later nothing had changed and I still got the same error. Thankfully I ran across a support forum post that led me in the right direction to resolve the issue.
Can’t create new lab in Azure DevTest Labs preview
As fellow forum poster "runninggeek" mentioned, there was an issue with the Microsoft.DevTestLab provider in my subscription. Others who registered after me did not have this problem, as a problem with the registration backend was fixed shortly after the announcement went out. Here is the PowerShell script I ran to work around my issue. You can also download it from my OneDrive.
# connect in Azure Resource Manager mode first (the provider cmdlets below are not available in the default Service Management mode)
Switch-AzureMode -Name AzureResourceManager
# if you have multiple subscriptions tied to your account you may need to select a specific one for the commands below
Get-AzureSubscription | Select-AzureSubscription
Unregister-AzureProvider -ProviderNamespace Microsoft.DevTestLab
# confirm that the provider is unregistered before re-registering
Get-AzureProvider -ProviderNamespace Microsoft.DevTestLab
Register-AzureProvider -ProviderNamespace Microsoft.DevTestLab
# confirm that the provider is at least registering (mine took about a minute to fully register)
Get-AzureProvider -ProviderNamespace Microsoft.DevTestLab
Essentially you need to connect in Azure Resource Manager mode and unregister the Microsoft.DevTestLab provider. Wait until the provider is unregistered, then re-register it. Close all browser sessions logged into the Azure Portal and re-launch it from the appropriate link.
Hopefully very few people ran into this issue, as it appears to be caused by the timing of when you registered for the Azure DevTest Labs preview. Thanks to "runninggeek" for pointing me in the right direction to resolve this. I provisioned an instance of DevTest Labs this afternoon and am starting to pore through the documentation and the initial set of offerings.
Thank you to all of the attendees at my "Running your Dev / Test VMs in Azure for Cheap" presentation at SharePoint Saturday Cincinnati 2015 (or, as the locals liked to call it, ScarePoint Saturday Spookinnati, due to the Halloween theme). The slides and scripts from my presentation are below. Enjoy.
A big thank you to everyone who attended my PowerShell for Your SharePoint Tool Belt session at Dog Food Con this year. We packed quite a few demos into the 60 minute session. My slides and demo scripts are below.
Demo PowerShell Scripts
In a few weeks I'll be speaking at SharePoint Saturday Cincinnati as well as Dog Food Con. I've spoken at both conferences in the past and am excited to have been accepted to speak at both again. If you are in the Ohio area and can attend, I highly recommend registering. The abstracts for both sessions are below; I hope to see you in one of them if you can join.
Dog Food Con
When: Oct 7-8, 2015
Title: PowerShell for Your SharePoint Tool Belt
Abstract: PowerShell is becoming the command line interface for all Microsoft server products including SharePoint. If you haven’t started using PowerShell you will want to add it to your set of tools in your tool belt. In this demo heavy session we will show tips and tricks for using the PowerShell console and ISE, traverse through all sites in a farm, create reports, and create a secure remote connection with whitelisted commands through constrained endpoints. We will also cover some of the more intermediate to advanced techniques available within PowerShell that will improve your work efficiency. This session is targeted to administrators and developers and assumes a basic familiarity with PowerShell.
SharePoint Saturday Cincinnati
When: Oct 10, 2015
Title: Running Your Dev / Test VMs in Azure for Cheap
Abstract: With an MSDN subscription you can run your dev / test SharePoint environment in Azure IaaS for less than the cost of a cup of coffee each day. In this session we will overview the basics of Azure IaaS (Infrastructure as a Service), the pieces you will use to be successful deploying SharePoint in Azure (including the new Azure Resource Manager templates), and how to use resources as efficiently as possible to reduce your costs and boost your farm performance. This session is targeted to SharePoint developers and administrators. Prior knowledge of Azure is helpful but not a requirement.
In this post I will cover a few tips for getting started with ProcMon (Process Monitor in the Sysinternals Suite) when troubleshooting long running processes. Note that I am not an expert in ProcMon by a long shot, so this is more of a selfish post to remind myself of some key settings to configure.
Side note. You can run the Sysinternals tools from the web at Sysinternals Live without needing to download the tools to your local machine. This is useful if your customer / organization doesn’t allow installing / running 3rd party tools or has concerns about running them directly on a machine.
Skip down to the Tips section if you don’t want to read the back story.
At least once a month I have a customer scenario where two or more applications are not playing nice with each other: a .Net website and anti-virus software, SharePoint and server backup software, etc. Usually the problem involves one piece of software placing a write lock or exclusive hold on a file / registry entry while the other software expects the same access. In such a scenario we need to monitor a file / registry entry from a fresh start (restarted application pool, process, etc.) until the access error happens, which could take hours or days. Since we are monitoring for such a long time we want to make sure that ProcMon captures only the data we need, and we want to be mindful of memory / disk space usage.
1) Start ProcMon with no tracing
If you start ProcMon by double clicking the executable it will start capturing data immediately. Instead, launch it from the command line with the /noconnect parameter.
c:\sysinternals> procmon /noconnect
2) Specify a backing file
By default ProcMon stores event data in virtual memory. Since we could be capturing hours' or days' worth of data, it is preferable to store it on a disk with lots of free space (multiple GBs or more, depending on the expected duration and number of events).
Navigate to File –> Backing Files… for these settings.
Change the radio button from "Use virtual memory" to "Use file named:" and then specify the filename you would like to use as a backing file. .PML is the default extension, so I followed that convention.
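For long unattended captures, the backing file can also be specified when launching ProcMon from the command line, so the capture is configured before it starts. A sketch using ProcMon's documented switches as I understand them (the path is a placeholder):

```
c:\sysinternals> procmon /AcceptEula /BackingFile c:\traces\longrun.pml /Quiet /Minimized
```

/AcceptEula suppresses the license prompt (handy on a fresh machine), /Quiet skips the filter confirmation dialog at startup, and /Minimized starts the capture minimized to the taskbar.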
3) Apply filters
By default, all processes accessing any file and / or registry location are monitored. Instead we want to filter events based on our criteria. The two most common scenarios I run into (in the order I see them) are 1) filtering on a location when we don't know the process that is locking it, or 2) filtering on a specific process when we know the process but not the location being locked.
Click on the filter icon in the top menu or click Filter –> Filter… to access these settings.
In the example below we filter for any path that begins with “c:\MyAppFolder”. By doing this we include any subfolders of our application folder.
4) Drop filtered events
By default ProcMon captures all events whether they match the filter or not. To save memory (or file space if you use a backing file) you can drop filtered events. If you created your filter incorrectly you may miss the events needed for troubleshooting, but keeping your filter broad enough should avoid that.
Click on Filter and select the Drop Filtered Events menu item. If there is a check mark next to the menu item then you have correctly configured ProcMon to drop filtered events.
In this post I walked through a few quick tips for configuring ProcMon when you need to troubleshoot a long running process. Hopefully these tips will help you avoid out of memory issues or having to parse through hundreds of thousands of events.
Special thanks to my peer and teammate Ken Kilty for providing the research and background info for this blog post.
<Update 2015/7/28 2:30pm> I received clarification that the SharePoint product group does support installing .Net 4.6 onto an existing SharePoint 2013 farm server. It is the installer for SharePoint 2013 that fails to detect .Net 4.5 if .Net 4.6 is already installed and thus throws an error. A future update should correct this scenario with the installer.
On a related note I was able to successfully uninstall .Net 4.6 from a server (remove the KB as mentioned at bottom of this post) and then install SharePoint 2013.
Quick publish on this item; I'll update once I have more details. One of my customers is exploring Visual Studio 2015 / .Net 4.6, which was released a week or two ago. During some testing I found that (as of July 28, 2015 when this was published) you cannot install the SharePoint 2013 binaries onto a server that has .Net 4.6 (or Visual Studio 2015, which includes .Net 4.6) installed. I received the below error message.
Since .Net 4.6 is an in-place upgrade of .Net 4 / 4.5 / 4.5.1 / 4.5.2, SharePoint has an issue finding .Net 4.5.x after 4.6 is applied. I am testing whether removing the associated KB for .Net 4.6 reverses this, should you accidentally deploy it to a dev / test farm. I'm also testing whether you can install .Net 4.6 / Visual Studio 2015 onto an existing SharePoint 2013 farm.
Removing associated KB…
- On Windows Vista SP2 / Windows 7 SP1/ Windows Server 2008 SP2 / Windows Server 2008 R2 SP1, you will see the Microsoft .NET Framework 4.6 as an installed product under Programs and Features in Control Panel.
- On Windows 8 / Windows Server 2012 you can find this as Update for Microsoft Windows (KB3045562) under Installed Updates in Control Panel.
- On Windows 8.1 / Windows Server 2012 R2 you can find this as Update for Microsoft Windows (KB3045563) under Installed Updates in Control Panel.
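Alternatively, the update can be removed from an elevated command prompt with wusa.exe; substitute the KB number that matches your OS from the list above (the example uses the Windows 8.1 / Server 2012 R2 KB):

```
C:\> wusa /uninstall /kb:3045563 /quiet /norestart
```

The /quiet and /norestart switches are optional; a restart will likely still be needed before re-running the SharePoint installer.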
Download .Net framework
Hopefully this helps someone before they run into issues with their farm. Feel free to leave a comment if you find out any additional details or workarounds.
This was the first year of the Microsoft Ignite conference, which merged a number of previous conferences including TechEd, SharePoint Conference, Project Conference, and more. With over 23,000 attendees, a new venue, and numerous Microsoft senior leadership and product group members in attendance (including CEO Satya Nadella himself), this was definitely a huge event. Rather than recapping the event itself I wanted to take a minute to mention a few items that I heard / saw at the conference. I am still downloading and viewing a number of sessions that I couldn't attend (they ran at the same time as another session or the room was at capacity), but these are highlights I wanted to share with others.
- No "internal" FIM in SharePoint 2016 - SharePoint 2016 will not ship with a version of the Forefront Identity Manager product included. This is a fairly big deal for any customers using the "SharePoint Synchronization" option (which allows import and export of user profile data to / from SharePoint) for User Profile Sync in 2010 or 2013. Your options in 2016 will be Active Directory Import (same as 2007 and re-introduced in 2013) or an "external" FIM installed and managed outside of SharePoint Server. See the following resources for more details and for how to install FIM 2010 R2 + SP1 with SharePoint 2013, so that you can start planning today if you need the full features of syncing data into and out of SharePoint.
What's New for IT Professionals in SharePoint Server 2016 (session recording with announcement)
Configuring SharePoint 2013 for the Forefront Identity Manager 2010 R2 Service Pack 1 Portal (install overview)
- Project Siena – Project Siena looks like a viable alternative (not a replacement) for many (smaller) custom development scenarios. Essentially it is an app that lets you build other apps. I do not see it replacing InfoPath, LightSwitch, and the half dozen other technologies that have popped up over the past few years, but I do see a promising future for this technology (HTML5 + JS based, similar to many other tech stacks Microsoft is promoting). Note that it was still in a beta release last time I checked, but the fact that it caters to the Excel power user with similar syntax, merged with an easy drag and drop interface, feels like it could gain traction better than some other tools. If you aren't familiar with Project Siena you really need to see it to understand it.
Microsoft Project Siena: Build Apps and Create New Mobile Solutions (session recording with demos)
Microsoft Project Siena (Beta) (product site)
- New SharePoint hybrid search option - Hybrid search is receiving a huge update / upgrade later this year. In its current (May 2015) form, SharePoint hybrid search involves separate search service applications / indices for on-prem farms and Office 365 / SharePoint Online. If you query one source you can federate the query to the other and get results in a separate result block. The problems, though, are that configuration can be fairly complex, search results aren't integrated (in-line with each other), and you likely need a large number of servers on-prem for the search service. Later this year (target timeframe, subject to change) Microsoft will release an update which will allow an on-prem "cloud search service application" to crawl and parse content but then push the metadata up to Office 365 for indexing, querying, etc. The massive benefit is that your on-prem content can then be used in other services like Delve, Office 365 data loss prevention (DLP), and others that currently have no expected on-prem release (or won't be supported until future releases of SharePoint). Additionally you will need a much smaller on-prem server footprint to support search (the example given was going from 10+ search servers down to 2). This is a big win in my opinion and I can't wait to test it out when it is released.
Implementing Next Generation SharePoint Hybrid Search with the Cloud Search Service Application (session recording)
- Nano Server – Nano Server is a new installation option for Windows Server 10 (Server 2016, or whatever the final name ends up being), akin to Server Core in the past. A number of sessions talked about how small the footprint of Nano Server will be (400MB, yes MB, compared to 8+ GB for the "full" server + GUI edition). The changes this introduces not only affect performance but also require re-architecting tools to work remotely (there is no local logon or UI for Nano Server; everything must be done remotely). Things like Event Viewer, Task Manager, local services, etc. can be accessed remotely through a web UI similar to the "new" Azure Portal UI (super slick, take a look). This may sound scary to admins who are used to RDP or logging on to a server locally, but listen to Jeffrey Snover's take on this. We are IT professionals, and this is a technology that will reduce the number of reboots, make servers more secure, reduce infrastructure footprint, and bring numerous other benefits. You owe it to yourself and your company to learn about this and see if it will work for the services you provide.
Nano Server (session recording)
Nano Server: The Future of Windows Server Starts Now (session recording)
Remotely Managing Nano Server (session recording)
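To make the "everything must be done remotely" point concrete, here is a minimal sketch of managing a headless server over standard PowerShell remoting. The server name "NANO01" is hypothetical, and this assumes WinRM is enabled and the host is reachable; treat it as an illustration, not a Nano Server-specific procedure.

```powershell
# Nano Server has no local logon or GUI, so day-to-day management
# happens remotely (e.g. over PowerShell remoting / WinRM).
# NOTE: "NANO01" is a hypothetical server name for illustration.

# In a workgroup scenario you may first need to trust the target host:
Set-Item WSMan:\localhost\Client\TrustedHosts -Value "NANO01" -Force

# Open an interactive remote session:
$cred = Get-Credential
Enter-PSSession -ComputerName NANO01 -Credential $cred

# Inside the session, cmdlets stand in for the missing local tools:
Get-Service | Where-Object Status -eq 'Running'   # instead of the Services console
Get-Process | Sort-Object CPU -Descending         # instead of Task Manager
```

The same pattern scales up with `Invoke-Command` when you need to run a script block against many servers at once.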
- PowerShell – Getting to see Jeffrey Snover (inventor of PowerShell) and Don Jones ("first follower" of PowerShell; see the link in the slide deck) geek out about PowerShell was one of the best sessions I attended at Ignite. It is hard to describe in words, so I recommend going to watch the recording. Jeffrey had some great advice about using PowerShell as a tool to explore and dive into problems or scenarios you are trying to solve. That sense of adventure can be a motivating force in your personal and professional life. It was really inspiring, and I love the fact that Jeffrey's (and Don's) mindset is spreading to so many others these days.
Windows PowerShell Unplugged with Jeffrey Snover (session recording)
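In that spirit of exploration, PowerShell's built-in discovery cmdlets let you feel your way through an unfamiliar problem space without leaving the console. A quick sketch (the service noun is just an example target):

```powershell
# Discover what commands exist for a topic:
Get-Command -Noun Service

# Learn a cmdlet from its examples rather than the full reference:
Get-Help Get-Service -Examples

# Inspect the objects a command returns - their properties and methods:
Get-Service | Get-Member

# Then drill into the actual data:
(Get-Service)[0].Status
```

This Get-Command → Get-Help → Get-Member loop is the exploration workflow Jeffrey advocates: the shell teaches you as you go.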
On a side note, I also wanted to mention one of the obvious but not always talked about benefits of attending a conference like this in person. During the week I was able to introduce myself to a number of presenters I had not previously met, including MVPs, fellow Premier Field Engineers (PFEs), product group members, and more. The connections you make can last for years and provide an invaluable network for sharing information and getting assistance when you are in need. I even got a PowerShell sticker directly from Jeffrey Snover himself (another personal highlight).
This is just a short list of some of the sessions I attended, along with highlights or key points I wanted to share. If I find anything else significant in the recordings I am going back to watch, I'll update this post. For now, go check out the recordings above or the hundreds of others up on Channel 9. I encourage you to attend next year when Ignite 2016 will be in Chicago again, May 9-13.