The new Windows Terminal was announced at Build this year and has been available since then if you build it from source via the Microsoft/Terminal GitHub repo. A very early preview of the Windows Terminal is now available through the Microsoft Store (link). I haven’t verified this myself, but I believe you’ll need a few prerequisites in order to install it. Out of the box, the current preview of Windows Terminal offers the following shells:
PowerShell Core (v6)
Windows PowerShell (v5)
CMD (command prompt)
WSL (Windows Subsystem for Linux, legacy console)
After installing the Windows Terminal from the Microsoft Store, open the Settings from the dropdown menu (or press Ctrl + ,).
Find the “profiles” array in the settings file and add the following JSON snippet to it. Verify that the “commandline” and “icon” paths match where PowerShell v7 Preview 1 is installed on your machine.
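The snippet below shows roughly what that profile element looks like; the GUID and install paths here are illustrative and should be adjusted to match your machine:

```json
{
    "guid": "{00000000-0000-0000-0000-000000000001}",
    "name": "PowerShell 7 Preview",
    "commandline": "C:\\Program Files\\PowerShell\\7-preview\\pwsh.exe",
    "icon": "C:\\Program Files\\PowerShell\\7-preview\\assets\\Powershell_av_colors.ico",
    "startingDirectory": "%USERPROFILE%",
    "hidden": false
}
```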
After you’ve added the above element, you’ll see PowerShell v7 Preview in your dropdown list of available shells with the proper icon. As an added bonus, I also added Zsh (with the Oh My Zsh framework for auto-suggestions and more) after seeing demos from Jeff Hollan showcasing that shell.
The screenshot below shows PowerShell v7 Preview in use, verifying the PSVersion.
I’m very excited to begin using the Windows Terminal more in my daily tasks. Being able to switch back and forth between multiple shells (specifically PowerShell v5 and v6/v7) is possible in Visual Studio Code but this will be a much easier solution for many of my scenarios. Hopefully someone else will find this helpful as well.
I was reading a blog post by Mark Heath, This resource group will self destruct in 30 minutes, which leverages an open source Azure CLI extension to automatically delete resource groups. The extension accomplishes the deletion by scheduling a logic app in the same resource group. I was curious whether I could accomplish the same effect without the Azure CLI (i.e., for any resource group created via the portal, PowerShell, etc.). In this post I’ll show how to configure a “self destruct” mechanism using an Azure Alert and a Logic App.
Finalize Logic App – add the JSON schema from the alert HTTP request, plus an action to remove the resource group, to the logic app
1) Create Resource Group
If you don’t already have one, create a resource group to host the resources for the self destruct solution.
2) Create Logic App
Create a logic app in the resource group from the previous step. Add a trigger for When an HTTP request is received. We will not fill out any of the details for the trigger at this time. Click Save.
3) Activity Log alert
Navigate to Azure Monitor (can search from the top search bar or go through All Services) for the subscription where the solution will be deployed.
Click Alerts in the left-hand navigation, then click New Alert Rule.
Define the resource to be monitored:
Signal Type: Activity Log
Monitor Service: Activity Log – Administrative
Signal Name: Create Resource Group (subscriptions/resourceGroups)
Alert Logic – Status: Succeeded
Action Group Name: SelfDestructResourceGroup
Short Name: SelfDestruct
Action Group Action: LogicApp
4) Trigger Alert
Create a new resource group that will then trigger the alert defined in step 3 and consequently fire the HTTP trigger for the logic app defined in step 2.
5) Gather JSON Schema
Navigate to the logic app created in step 2. Assuming the logic app was successfully triggered, select the successful execution under Run history. On the “logic app run” screen, expand the trigger and click Show raw outputs.
Save this JSON for use in the next step. If you are unable to collect this JSON schema, you can use the sample below, which is already converted to the final format needed.
Note: the resourceGroupName element is buried many levels deep. We will query for this when needed.
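For reference, a trimmed-down Activity Log alert payload looks roughly like the following — the values here are placeholders, so capture your own raw outputs where possible. Note how far down the resourceGroupName element sits:

```json
{
    "schemaId": "Microsoft.Insights/activityLogs",
    "data": {
        "status": "Activated",
        "context": {
            "activityLog": {
                "operationName": "Microsoft.Resources/subscriptions/resourceGroups/write",
                "resourceGroupName": "my-test-rg",
                "status": "Succeeded",
                "subscriptionId": "00000000-0000-0000-0000-000000000000"
            }
        }
    }
}
```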
The final logic app will look similar to the following:
Test out the solution by adding a new resource group to the monitored subscription. Check the Azure Monitor alerts to see if the Activity Log entry for successful resource group creation triggered the Logic App.
Then verify the logic app execution history shows the resource group deletion was successful.
More than likely you will not want to delete the resource group right after you create it. In that case you can add a Delay action, which pauses the logic app for a specified number of minutes (e.g. 60). Additionally, you could apply conditional logic to check whether the name of the resource group matches (or doesn’t match) a specific pattern. There are many ways to extend this solution to fit your needs.
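In the underlying workflow definition, a Delay action is expressed as a Wait action; a sketch of a 60-minute pause looks like this (the action name is illustrative):

```json
"Delay_before_delete": {
    "type": "Wait",
    "inputs": {
        "interval": {
            "count": 60,
            "unit": "Minute"
        }
    },
    "runAfter": {}
}
```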
In this post I walked through an adaptation of the Azure CLI extension that Mark Heath linked to. We leveraged an Azure Monitor alert together with a logic app to provide an option to self destruct any resource group no matter where or how it was created. If you have any feedback or additional suggestions feel free to leave them in the comments.
In Linux and Unix there is a “touch” command which updates the timestamp of a file without modifying its contents. It can also create an empty file without opening an application, among other things. Windows has no direct equivalent, but you can get close by using the “copy” command with a “+” at the end of the filename while specifying no destination file (for example, copy /b myfile.txt+,,). The + symbol points the copy operation back to the source file, which updates the timestamp while leaving the contents of the file unmodified.
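For comparison, here’s a small cross-platform sketch of the same idea in Python — update a file’s timestamp without changing its contents, and create the file if it doesn’t exist (the file name is illustrative):

```python
import os
from pathlib import Path

def touch(path):
    """Update the file's access/modification times, creating it if missing."""
    p = Path(path)
    if p.exists():
        os.utime(p, None)  # None sets both times to "now"; contents are untouched
    else:
        p.write_bytes(b"")  # create an empty file, as touch would

touch("example.txt")
```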
This process was helpful for my customer testing out automated CI/CD processes with Azure DevOps and Git. Hopefully this will be useful to someone else. Enjoy.
In this post I’ll share a set of linked Azure ARM templates (link) that can be used to deploy a number of Azure resources together. This sample uses externally linked ARM templates with a parent-to-child relationship, and includes a few “complex” scenarios like conditional logic and assigning permissions through role assignments. Note there are also two ways to deploy these templates: 1) at the subscription level (owner) and 2) at the resource group level (contributor).
<Update 2019-04-09>After reading this article on ARM template lifecycle management do’s and don’ts, I’m rethinking the use of externally linked ARM templates. The example below still contains them, but I may combine everything into a single ARM template at a later date once I’ve had a chance to test.</Update>
For a recent customer project we had a need to deploy the following resources.
Azure Storage Account
Azure Function App
Azure Application Insights
Azure Log Analytics workspace
Azure Automation Account
A few of these resources have dependencies on each other, such as the Function App requiring a storage account to store solution data in blobs / queues, the Automation Account requiring Log Analytics for pushing diagnostic stream data to a workspace, etc. While it is possible to deploy all of these resources in one (very) large ARM template, we also wanted to be able to re-use pieces of this solution in future projects that will have a similar (but slightly different) architecture. We decided to create a set of parent-child ARM templates to mimic the first two levels of the above hierarchy.
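In the parent template, each child template is referenced through a Microsoft.Resources/deployments resource. A minimal sketch — the template URI, SAS token parameter, and deployment name here are illustrative, not the exact names from the repo:

```json
{
    "type": "Microsoft.Resources/deployments",
    "apiVersion": "2018-05-01",
    "name": "storageAccountDeployment",
    "properties": {
        "mode": "Incremental",
        "templateLink": {
            "uri": "[concat(parameters('templateBaseUri'), 'storageAccount.json', parameters('templateSasToken'))]"
        },
        "parameters": {
            "storageAccountName": {
                "value": "[parameters('storageAccountName')]"
            }
        }
    }
}
```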
The sample files are stored in my Blog-Samples repo under the ARM-External-Templates folder. There is a brief README file to walk through the necessary steps to update parameter files, upload the linked templates to a storage account, etc. This is not a fully documented process at the moment but if you wish to submit a PR or open an issue I’ll update more as I have time.
A few key pieces to highlight.
The “Contributor” scenario assumes that the deployment account has contributor permissions to a specific resource group or subscription (more common for production or similar environments)
The “Owner” scenario assumes that the deployment account has owner permissions to the entire subscription (ex. development environment)
The “Owner” scenario optionally deploys role assignments (queue sender, workspace reader, and runbook operator) to various resources only if the corresponding parameter values are not empty (otherwise skip assignment)
The Automation Account is pre-populated with a PowerShell runbook and also configured to send runbook job and stream output to the provisioned log analytics workspace
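The conditional role assignment pattern used in the “Owner” scenario can be sketched as follows — the parameter and role names here are illustrative placeholders; see the repo for the exact templates. The condition element skips the resource entirely when the principal ID parameter is empty:

```json
{
    "type": "Microsoft.Authorization/roleAssignments",
    "apiVersion": "2018-09-01-preview",
    "name": "[guid(resourceGroup().id, parameters('queueSenderPrincipalId'), 'queue-sender')]",
    "condition": "[not(empty(parameters('queueSenderPrincipalId')))]",
    "properties": {
        "roleDefinitionId": "[subscriptionResourceId('Microsoft.Authorization/roleDefinitions', parameters('queueSenderRoleDefinitionId'))]",
        "principalId": "[parameters('queueSenderPrincipalId')]"
    }
}
```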
After running the deployment script the deployment and resource group should look as follows.
The linked ARM templates from this sample are meant for illustration purposes only. Hopefully you’ll find them helpful in terms of what is possible from a deployment and configuration perspective. If you have any feedback or suggestions please submit a PR on the repo or open an issue on GitHub. Good luck with automating your Azure deployments.
In this post I’ll share a script I developed for a customer to find which connectors are used by which PowerApps apps. Currently this is not something available through the Power Platform Admin Center. Feel free to use this script as you see fit. This script is provided as-is without any warranties of any kind. If you update or adapt it and decide to re-post please provide attribution.
You may notice that the output contains a complex property for the Connections. It has been some time since I worked with formatting output in PowerShell, so if you have an improvement to the formatting, please share your suggestion and I’ll update the sample script.
In this post I’ll look ahead at what I have coming up in 2019 and what I look to accomplish.
In previous years I have typically blogged at the start and end of the year about my goals and retrospective. In mid-2018 I started capturing a monthly retrospective (see How I Do A Personal Monthly Retrospective). While I’m doing these personal retrospectives more frequently (monthly vs. yearly) they usually contain more personal things than I would feel comfortable sharing publicly.
Ahead in 2019
Baby number 3
The biggest thing I’m looking forward to in 2019 is that my wife Sarah and I are expecting our third child later this year. We’re finally starting to share the news outside of our immediate family. I’m very excited and happy that our family is growing and I look forward to sharing our love with our new baby.
For the past several years I’ve been running a few 5k races, quarter marathon races, and obstacle course / mud runs. This has been a good way to push myself to stay active and healthy. This year I’ve already signed up for a 5k and an obstacle course race in the first half of the year. I’ve already started training for both and it’s amazing how much better I feel physically and mentally after getting into a good workout routine. I find that I have more energy and motivation to accomplish things and hope that I can keep this up in the coming months.
My Catholic faith has always been a big part of my life, but this year especially is an important year as my wife is preparing to enter fully into the Catholic Church. I’m proud to support her on her faith journey as well as continuing to raise our children in our shared faith.
In 2018 I went through a program at Microsoft called Technical Leadership Development Program (TLDP). The purpose of this program is to identify individual contributors (ICs, not a manager of people) and grow their technical leadership skills to make a bigger impact inside and outside the company. While there were many takeaways from this program one of the big ones for me was the following progression of stages (starting at bottom and going up) for a technical leader:
Stage 4 – Drive Impact Through Strategy
Stage 3 – Drive Impact Through Others
Stage 2 – Master Individual Impact
Stage 1 – Grow Individual Impact
At the start of 2018 I would estimate that I was in the 2nd stage, mastering individual impact, not realizing (at the time) that the next step was to move on to driving impact through others. Note that progressing to stage 3 doesn’t require moving into people management; you can remain an individual contributor while still increasing your impact through other people. I tested the waters with this through the 30 Days of Microsoft Graph blog series and a few other side projects. I’m looking forward to continuing this progression in 2019.
Each year I read 2-4 books on average, usually during slow weeks around holidays when I can devote more attention to them. In the past year I’ve found a number of great books across several topics, including science fiction, religion, and technical leadership. I should probably join Goodreads but have not done so yet. Instead I usually hear about a good book from a friend or coworker and then pick it up from the local library, borrow it from a friend, or similar. I’m not always able to reserve time each day or week, but I do find that I sleep better when I read for at least 10 minutes before going to bed.
These are the things I’m currently looking forward to in 2019. Thanks to my mentor Sean McDonough for urging me to get this written. I know much will change throughout this year, especially once baby #3 arrives. Here’s to a successful start to the year and continued growth. If you have your own goals or plans for the year please share in the comments.
Microsoft Graph is the unified API for any developer working with data inside Office 365, Azure Active Directory (Azure AD), Windows 10, and more. In this post we’ll cover a quick introduction and share resources from the 30 Days of Microsoft Graph blog series to show how to authenticate and make calls against Microsoft Graph with C# and .Net Core (v2.1 as of the time of writing). Each of the referenced articles aims to take 5-15 minutes to get you up to speed as quickly as possible while also providing hands-on exercises. If you’d like to skip the background reading and start building a .Net Core console application that calls Microsoft Graph from scratch, read through the README for the base-console-app within dotnetcore-console-sample.
Microsoft Graph overview
Microsoft Graph offers developers (and IT pros / admins) the ability to access data and insights across a number of services within Microsoft 365. This includes:
Office 365 services
Enterprise Mobility and Security services
Advanced Threat Analytics
Advanced Threat Protection
Windows 10 services
By providing a unified endpoint for accessing all of these services Microsoft Graph removes a number of barriers including:
Discovering the service-specific endpoint URL
Authenticating to each endpoint separately
Managing different permission models
Working with incompatible data formats
All requests made to Microsoft Graph are sent as REST calls to https://graph.microsoft.com and leverage a common authentication model based on Azure AD and OAuth permissions, along with a consent framework for users and admins. The quickest way to see Microsoft Graph requests in action is to navigate to the Microsoft Graph Explorer (https://aka.ms/ge, ge = Graph Explorer). For more information on using Graph Explorer, please read Day 3 – Graph Explorer from the 30 Days of Microsoft Graph series. Additionally, you can make requests against Microsoft Graph using API development tools such as Postman; please read Day 13 – Postman to make Microsoft Graph requests for more information on using Postman with Microsoft Graph.
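For example, one of the simplest Graph requests retrieves the signed-in user's profile; the shape of the call is roughly the following, where the token placeholder must be replaced with a real Azure AD access token:

```
GET https://graph.microsoft.com/v1.0/me
Authorization: Bearer {access-token}
```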
Getting started sample
Seeing requests and their responses in a browser or tool is useful, but making requests from code or scripts is the more common scenario. In the examples below we will cover C# and .Net Core, as .Net Core is available cross-platform, can be built in Visual Studio Code (also cross-platform), and offers many hosting options (console app, web app, serverless functions, and more).
All requests to Microsoft Graph require an authenticated context, either delegated or app-only. Delegated is a union of the logged-in user’s context along with the application’s context. App-only (as the name implies) is only the application’s context without any user involvement. Please read Day 8 – Authentication roadmap and access tokens and Day 9 – Azure AD applications on V2 endpoint for more information about creating an Azure AD application and getting an authenticated context. On a similar note, you are highly encouraged to leverage Microsoft Authentication Library (MSAL) for creating your authentication context as this is the forward-focused version as opposed to the older Active Directory Authentication Library (ADAL).
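Under the covers, an app-only (client credentials) token request against the Azure AD v2 endpoint is an HTTP POST shaped roughly like the following — the tenant ID, client ID, and client secret are placeholders for your own Azure AD application's values:

```
POST https://login.microsoftonline.com/{tenant-id}/oauth2/v2.0/token
Content-Type: application/x-www-form-urlencoded

client_id={client-id}
&scope=https%3A%2F%2Fgraph.microsoft.com%2F.default
&client_secret={client-secret}
&grant_type=client_credentials
```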
Microsoft Graph SDK
While it is entirely possible to call Microsoft Graph with an HttpClient (or similar) object, the Azure AD identity and Microsoft Graph product groups recommend leveraging the Microsoft Graph SDK (Microsoft.Graph on NuGet). This SDK provides a number of benefits, including:
Strongly typed entities and Microsoft Graph responses
Fluent API syntax
In future releases Microsoft Graph SDK will also provide abstractions for authentication prerequisites, automatic handling of retry logic or error handling, and more.
As mentioned at the beginning of this post if you’d like to build a working application from scratch (or clone the repo and configure the necessary settings) you can find the base-console-app within dotnetcore-console-sample. Extracting the bare essential lines of code from this sample results in the following for authenticating to Microsoft Graph.
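Stripped to its essentials, the setup looks something like the sketch below — the configuration values and the MsalAuthenticationProvider class name are placeholders, so see the sample repo for the full working code:

```csharp
// Build a confidential client application
// (clientId, tenantId, and clientSecret come from your app configuration)
IConfidentialClientApplication clientApplication = ConfidentialClientApplicationBuilder
    .Create(clientId)
    .WithTenantId(tenantId)
    .WithClientSecret(clientSecret)
    .Build();

// Wrap it in an authentication provider and hand it to the Graph client
var authProvider = new MsalAuthenticationProvider(clientApplication);
var graphClient = new GraphServiceClient(authProvider);

// Example call: list users in the tenant
var users = await graphClient.Users.Request().GetAsync();
```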
The following class implements the IAuthenticationProvider interface, which is used to retrieve an Azure AD access token and attach it to each request sent to Microsoft Graph. An out-of-the-box implementation of this class will be provided within the Graph SDK at a later date.
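A minimal implementation of that interface might look like the following sketch — the class name is arbitrary, and the .default scope assumes app-only permissions granted to the Azure AD application:

```csharp
public class MsalAuthenticationProvider : IAuthenticationProvider
{
    private readonly IConfidentialClientApplication _clientApplication;
    private readonly string[] _scopes = { "https://graph.microsoft.com/.default" };

    public MsalAuthenticationProvider(IConfidentialClientApplication clientApplication)
    {
        _clientApplication = clientApplication;
    }

    // Called by the Graph SDK before each request; attaches a bearer token
    public async Task AuthenticateRequestAsync(HttpRequestMessage request)
    {
        AuthenticationResult result = await _clientApplication
            .AcquireTokenForClient(_scopes)
            .ExecuteAsync();

        request.Headers.Authorization =
            new AuthenticationHeaderValue("Bearer", result.AccessToken);
    }
}
```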
In this blog post we covered a quick introduction to Microsoft Graph and linked to additional resources within the 30 Days of Microsoft Graph blog series for background reading. We also covered a barebones implementation of calling Microsoft Graph from a C# .Net Core console application. Full instructions can be found in the base-console-app within dotnetcore-console-sample. Thank you for reading along, and please open an issue on the GitHub repo if you run into any issues with the sample project. Enjoy the rest of The Second Annual C# Advent.