If you’ve ever had a Patch My PC demo call or watched a Patch My PC video you’ve probably seen the Patch My PC demo environment. What you might not know about that demo environment is that there are EIGHTEEN of them, all in Offline Servicing Mode.
If you are lucky, the extent of your involvement with the service connection point in Configuration Manager has been: install it and then never think about it ever again. If you are only somewhat lucky, you’ve painfully flipped the connection point back and forth between offline and online mode.
If you’ve been extra unlucky, you have Configuration Manager in an airgapped network, and are either very familiar with the service connection tool, or have found this post in the hope of finding a way to update Configuration Manager.
What will I learn from this?
In this post, I will briefly explain what the service connection point is, and cover how to use the service connection tool in Configuration Manager to update your offline Configuration Manager environment.
You’ll also walk away with a PowerShell script that will do all of these actions for you.
As an extra treat I’ll showcase a few tips and tricks on how Patch My PC uses this script to prepare eighteen different demo environments for upgrade simultaneously.
What is the Service Connection Point
The Service Connection Point is a feature that was introduced with the release of Current Branch for Configuration Manager. The original idea of the service point was that it would be able to better handle, and optimize, the updating of a VERY complicated system. Configuration Manager pre-1511 was not nearly as fun to update as it is now, and knowing which hotfix, cumulative update (CU), or other servicing data applied to your site could be challenging.
The Service Connection point is designed to optimize this by knowing what version you are currently on and what versions you are currently allowed to update to.
This isn’t all it does though; it also supports or enables features like the implementation of a Cloud Management Gateway (CMG), Discovery of users and Groups in Azure AD, and uploading usage data.
Let’s jump into how the service connection point’s states impact the product.
If it’s offline, can’t you just make it online?
Sure, but only if you’re then willing to wait the required amount of time for Configuration Manager to calculate usage data, upload it, get a response and then prepare the required content for installation.
This can take anywhere from sixty minutes to eight hours to complete, and if you’re like me with eighteen labs to update, you probably want a little more precision.
Enter the Service Connection Tool
Microsoft recognized early on that there are scenarios where companies, or entire industries, wouldn’t be able to, or wouldn’t want to, support what might be a tier 0 application with direct access to the Internet.
Now, we can argue all day long about how well defined the list of required Internet access is. However, that argument, right, wrong, or indifferent, falls before regulations and policies mandated by governing bodies.
As a result, Microsoft designed the Service Connection Tool.
The Service Connection Tool allows an administrator to collect the data into a compressed CAB file, and then upload this data to Microsoft from an Internet-connected device.
Using The Service Connection Tool
Microsoft defines using the service connection tool process as three distinct steps:
Prepare: Gather the data required.
Connect: Connect to, and share the data with, Microsoft.
Import: Import the results of the connect step into ConfigMgr.
Finding the Service Connection Tool
In order to use the service connection tool, you first must FIND the Service Connection Tool. By default, the required tooling lives in the CD.LATEST directory. If you’re like me and lazy, or maybe you’re just a consultant and new to the environment, you might not know where that directory is on a server. Fortunately, a little help from PowerShell will show us the way.
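As a minimal sketch, run on the site server itself: the Configuration Manager install directory is recorded in the registry under `HKLM:\SOFTWARE\Microsoft\SMS\Setup`, and the tool lives in a well-known subfolder beneath CD.Latest. Something along these lines will surface the path:

```powershell
# Run on the site server: the install directory is recorded in the registry
$installDir = (Get-ItemProperty -Path 'HKLM:\SOFTWARE\Microsoft\SMS\Setup').'Installation Directory'
$toolDir    = Join-Path -Path $installDir -ChildPath 'cd.latest\SMSSETUP\TOOLS\ServiceConnectionTool'
$toolDir
```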
Running the Prepare Step
So you’ve found the directory where the Service Connection Tool lives! Now it’s time to gather our data.
In order to run this step, it’s important you have an EMPTY directory created ahead of time. If you do not provide an empty directory, the gather process WILL fail. Additionally, the directory MUST already exist; the tool will NOT create the directory/path if it doesn’t exist.
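A sketch of the prepare step, assuming `$toolDir` holds the ServiceConnectionTool folder found earlier and using a placeholder staging path:

```powershell
# The staging directory must exist and be EMPTY before the tool runs
$usageDataDir = 'C:\SCT\UsageData'   # placeholder path
New-Item -Path $usageDataDir -ItemType Directory -Force | Out-Null
& (Join-Path $toolDir 'ServiceConnectionTool.exe') -prepare -usagedatadest (Join-Path $usageDataDir 'UsageData.cab')
```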
Running the prepare step generates a usage CAB. The usage CAB then needs to be copied to a machine with access to the Internet (or at least to all required Microsoft services).
Running the Connect Step
You’re now on a machine with the Internet, and you’re ready to get your data.
You’ve copied your offline information over and you’ve created another empty directory to house the download content.
Once again: In order to run this step you MUST have an empty directory to copy the content to, otherwise it will fail.
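A sketch of the connect step, run from a copy of the ServiceConnectionTool folder on the Internet-connected machine (paths are placeholders):

```powershell
$usageDataSrc   = 'C:\SCT\UsageData'    # folder containing the copied UsageData.cab
$updatePackDest = 'C:\SCT\UpdatePack'   # download destination; must exist and be EMPTY
New-Item -Path $updatePackDest -ItemType Directory -Force | Out-Null
.\ServiceConnectionTool.exe -connect -usagedatasrc $usageDataSrc -updatepackdest $updatePackDest
```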
The connect step will take a variable amount of time, and the download can be multiple GB in size.
Running the Import Step
Once you’ve downloaded the content you’ll need to copy the data to a location the Configuration Manager server has access to and run the import step.
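Back on the site server, a sketch of the import step, again assuming `$toolDir` from earlier and a placeholder path for the copied download:

```powershell
# Point the tool at the folder of downloaded content copied back from the Internet-connected machine
& (Join-Path $toolDir 'ServiceConnectionTool.exe') -import -updatepacksrc 'C:\SCT\UpdatePack'
```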
Upon completion of the import step, the Configuration Manager Updates tab will then have the newest updates available.
You would then run the update like normal.
So, what does this look like for 18 Lab Environments?
I mentioned earlier that we currently have 18 different Configuration Manager lab environments. Updating these environments is a bit different, as you probably don’t want to do all of this manually, or at least I don’t. Fortunately, PowerShell can help solve this.
Now, I can cheat, as my “offline” service connection points DO all have Internet access, but since we don’t want the download, or evaluation, consuming resources during a demo, we keep them in offline mode.
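A condensed sketch of what a single-environment script might look like, wrapped in a hypothetical `Invoke-SCTUpdate` function (the function name and paths are illustrative; this shortcut of running all three steps on one box only works because these labs really do have Internet access):

```powershell
function Invoke-SCTUpdate {
    # Chains the prepare, connect, and import steps on one site server
    param(
        [string]$WorkingRoot = 'C:\SCT'   # placeholder staging root
    )
    $installDir = (Get-ItemProperty -Path 'HKLM:\SOFTWARE\Microsoft\SMS\Setup').'Installation Directory'
    $tool = Join-Path $installDir 'cd.latest\SMSSETUP\TOOLS\ServiceConnectionTool\ServiceConnectionTool.exe'

    $usage  = Join-Path $WorkingRoot 'UsageData'
    $update = Join-Path $WorkingRoot 'UpdatePack'
    foreach ($dir in $usage, $update) {
        # The tool requires both directories to exist and be empty
        if (Test-Path $dir) { Remove-Item -Path $dir -Recurse -Force }
        New-Item -Path $dir -ItemType Directory | Out-Null
    }

    & $tool -prepare -usagedatadest (Join-Path $usage 'UsageData.cab')
    & $tool -connect -usagedatasrc $usage -updatepackdest $update
    & $tool -import -updatepacksrc $update
}
```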
While scripting these steps solves things for a single environment, how do we solve the challenge of needing to do this 18 times?
There are many ways to run PowerShell remotely on another machine. For today, we are going to use Invoke-Command. Invoke-Command has a couple of features that make it ideal. First, we can pass functions that exist within memory through to other machines. Second, we can create those runs as JOBS, which allows us to check on their state and see if they are done.
Note how we use “function:(Cmdlet we are passing)”. The function: prefix refers to PowerShell’s Function: drive, where function definitions live, and it indicates a cmdlet’s logic should be passed through the script block to the other machine.
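For example, assuming a locally defined `Invoke-SCTUpdate` function and a placeholder server name:

```powershell
# ${function:Name} reads the function body from the Function: PSDrive as a script block,
# which Invoke-Command can then execute on the remote machine as a background job.
# 'CM01' and Invoke-SCTUpdate are placeholders.
Invoke-Command -ComputerName 'CM01' -ScriptBlock ${function:Invoke-SCTUpdate} -AsJob
```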
This will then run the prep work, download the content, and import it on a single remote machine! With a little bit of a nudge and a foreach loop, we can make it run on all of the labs.
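A sketch of that loop, again with placeholder server names and a hypothetical `Invoke-SCTUpdate` function:

```powershell
$labServers = 'CM01', 'CM02', 'CM03'   # placeholder list; in practice, all 18 lab site servers
foreach ($server in $labServers) {
    # Each lab runs as its own named background job, so they all update in parallel
    Invoke-Command -ComputerName $server -ScriptBlock ${function:Invoke-SCTUpdate} -AsJob -JobName "SCT-$server" | Out-Null
}
```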
We can then check to see how the jobs of preparing the content are doing by running:
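Assuming the jobs were created with a name prefix like `SCT-`, a quick status check might look like:

```powershell
# Any job still 'Running' means that lab is still preparing content
Get-Job -Name 'SCT-*' | Select-Object -Property Name, State
```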
This will let us know when our machines are ready to start the next phase: running PowerShell to start the upgrades, and a script to update the clients.
Jordan Benzing loves patching and has had the opportunity to present on stages all around the world, including the Midwestern Management Summit in Minneapolis, on subjects such as reporting, patching, and that wonderful thing no one likes doing: documentation. Jordan has been an avid content creator and educator since 2016. Jordan has been fortunate enough to earn the Microsoft MVP award from 2020-2023 in the Enterprise Mobility category. He also has six, yes that’s right, six dogs: two Golden Retrievers, a Shiba Inu, two German Shepherds, and a Belgian Malinois.