PSHSummit: A Wrap-Up

This year I had the privilege and honor of being selected to speak at the PowerShell Global Summit. It was a new experience for me, but that seems to be the theme this year: new things (more on that later).

I will admit that, as much as I prepared and tried to make the talk a good one, I still feel as though I failed to do as well as I wanted. Regardless, I did end up hearing from a few people that they thought it was good - so there is that (that darn imposter syndrome kicking in). And if I get selected for next year, I know I will be able to take some of the lessons from this year and make things (hopefully) better.

But I did learn a few things from doing this year's summit, and probably the biggest lesson is to take everything in, no matter what. What I mean by that is: yes, attend the sessions, but the side conversations between them are the most valuable part. I learned so much about how others are implementing things in their environments. It was also nice to learn that I was not alone in some of the issues I face at work - namely, WinRM being blocked on the work network. And it was great to connect with others who are trying to lead PowerShell user groups in their areas and hear what is and isn't working for them.

It's funny to say that the social aspect of the conference was one of the best things this year, especially coming from someone as introverted as myself. That said, in short bursts I can handle being around a lot of people at once. And it seems that being as uncomfortable as I was actually helped me take in what I needed from those around me. Because I wasn't comfortable, I was paying closer attention than usual, and (for some reason) the information sank in better.

So what did I learn? Be uncomfortable - even if for a little bit. You never know what you may learn when that happens.

Royal TS And Dynamic Folders

I have been using Royal TS since version 3 of the application. When I opened it up recently, I was greeted with a message that the application had been updated to version 5. Well, I'm always interested in the newest versions of applications, especially those I use frequently, so I started reading the release notes for version 5.

New Icons. Check.

High DPI support. Check.

At that point nothing jumped out at me saying that this required a full version update. That is until my eyes landed on the line Dynamic Folders and credentials. This piqued my curiosity so I downloaded version 5 and started looking at the documentation for the new RoyalJSON specification.

Beyond the specification documentation, there are also examples for all of the supported script interpreters inside of RoyalTS itself. The examples do not show how to dynamically create JSON files but instead show how to create static ones. This is useful for learning how to create your object, and it is a good first step and a welcome example from RoyalTS.

The example provided by Royal for PowerShell shows how to create a credential object, a folder, and a connection to a computer through the terminal.

Using this as the basis, along with some PowerShell knowledge, I quickly created a script that queries my test domain for the servers under the Enterprise Servers OU.

Using the example provided by RoyalTS as my guide, I noticed the hash table created by the example script had a key of Objects whose value was the array of computer objects. Structuring it this way allows the JSON output to be formatted correctly for RoyalTS.

Import-Module ActiveDirectory

# Build one RoyalJSON connection object per computer found under the OU
[System.Collections.ArrayList]$array = @()
foreach ($computer in Get-ADComputer -SearchBase "ou=Enterprise Servers,dc=company,dc=pri" -Filter * -SearchScope Subtree -Properties CanonicalName)
{
    [void]$array.Add(
        [PSCustomObject]@{
            "Type"           = "RemoteDesktopConnection"
            "Name"           = $computer.Name
            "ComputerName"   = $computer.Name
            "credentialName" = "mikes"
            # Folder path = the computer's canonical name minus its own name
            "Path"           = $computer.CanonicalName.Replace("/$($computer.Name)", "")
        }
    )
}
$array = $array | Sort-Object -Property Path
$hash = @{ }
$hash.Add("Objects", $array)
$hash | ConvertTo-Json

Something I found (and didn't see documented - or maybe I missed it) is that RoyalTS will automatically create the full folder path for a computer object if you place the computer in a path that does not exist. In the case of my script, I am putting each computer in the same folder it is found in within Active Directory, and RoyalTS automatically creates the folder path for me without my having to define it first.

(Screenshot: the folder structure in RoyalTS matching the OU structure in Active Directory.)

Notice that the two folder structures match. The OU TestOU is not present in RoyalTS since no computer is defined under it - which is a result of my code, not of RoyalTS. If I had written code to first generate the OU structure based on Get-ADOrganizationalUnit, those items would indeed have been present in the folder list in RoyalTS.
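As a sketch of that idea (using the same hypothetical test domain as the script above, and the Folder object type from the RoyalJSON specification), emitting a Folder object for every OU would make even empty OUs like TestOU show up:

```powershell
Import-Module ActiveDirectory

# Sketch: one RoyalJSON Folder object per OU under the search base, so OUs
# with no computers in them still appear in RoyalTS. Domain/OU names are the
# same hypothetical ones used in the script above.
$folders = foreach ($ou in Get-ADOrganizationalUnit -SearchBase "ou=Enterprise Servers,dc=company,dc=pri" -Filter * -SearchScope Subtree -Properties CanonicalName)
{
    [PSCustomObject]@{
        "Type" = "Folder"
        "Name" = $ou.Name
        # Parent path = the OU's canonical name minus its own name
        "Path" = $ou.CanonicalName.Replace("/$($ou.Name)", "")
    }
}
@{ "Objects" = @($folders) } | ConvertTo-Json
```

These folder objects would go into the Objects array ahead of the computer objects, letting RoyalTS build the empty folders explicitly instead of relying on the auto-created paths.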

By using the properties defined in the RoyalJSON specification, we are able to set the credentials each connection uses by name, by id, or explicitly. In the video below, I go over how to create a RoyalTS dynamic folder and show that it can query AD to create a dynamic list of connections.

Using Credentials In Production Scripts

Sometimes you have to save credentials for a scheduled task. There are ways to save them in the Windows Credential Manager, but sometimes you need to store them as a file. In this post I go over how I store credentials securely in a file. While there are some limitations and things to remember, this approach is useful when working with scheduled tasks.
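One common approach (not necessarily exactly what the post describes) is Export-Clixml, which on Windows protects the password with DPAPI so it can only be read back by the same user on the same machine. The path and account name below are hypothetical:

```powershell
# One-time setup would normally be interactive, run as the task's account:
#   Get-Credential | Export-Clixml -Path C:\Scripts\task.cred
# For illustration here, build a throwaway credential programmatically.
$credPath = Join-Path ([IO.Path]::GetTempPath()) "task.cred"   # hypothetical path
$password = ConvertTo-SecureString "not-a-real-password" -AsPlainText -Force
$cred     = [pscredential]::new("svc-task", $password)
$cred | Export-Clixml -Path $credPath

# Inside the scheduled task's script, load the credential back:
$saved = Import-Clixml -Path $credPath
$saved.UserName   # -> svc-task
```

Note that the DPAPI protection ties the file to one user on one machine - which is exactly why it has to be exported under the same account the scheduled task runs as.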

Read More

Remote PowerShell Commands With WinRM Disabled And Windows PowerShell

WinRM is extremely useful when using Windows PowerShell - but what do you do when it is disabled? How do you work with PowerShell on remote machines without breaking security policy? I explore a method for doing this using existing remote technologies, one that will not make your security team upset with you.

Read More

Get-ADUser times out after 2 minutes

This past week I was approached by a coworker whose script was timing out in less than 30 minutes - the default ADWS timeout. They were using Get-ADUser with a few conditions in the Filter parameter, resulting in only about 30 user accounts being returned. The query was randomly timing out, and the error message gave no immediate clue as to why. This was being run against a complex AD environment with 80k+ users. After some searching around, it turned out the error results from the timeout between paginated results. After finding this page, the answer became obvious.

Because the filter returns so few results in such a large environment, and the default pagination size is 256 results, the two-minute timeout between pages was being hit. Since only 30 or fewer results matched, a 256-result page was never filled, so whenever gathering a page took more than two minutes an error was returned. The simple solution is to reduce the number of results returned on each "page" by setting the ResultPageSize parameter to a lower number. In this case, because only about 30 of the 80k+ users were expected to match, we set ResultPageSize to 1. This means every returned result refreshes the two-minute time limit, preventing the query from timing out.
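In code, the fix was a one-parameter change. The filter below is a hypothetical stand-in for my coworker's actual conditions:

```powershell
Import-Module ActiveDirectory

# ResultPageSize 1 means every matched user that comes back resets the
# two-minute ADWS timeout, so sparse matches in a huge directory no
# longer let a single page run past the time limit.
Get-ADUser -Filter 'Enabled -eq $true -and Department -eq "Payroll"' -ResultPageSize 1
```

A smaller page size costs more round trips to ADWS, but for a query that only matches a handful of accounts that overhead is negligible compared to a query that fails outright.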