Showing posts from March, 2018

Configuring a Certificate on Exchange Receive Connector

Today's article is about configuring Exchange receive connectors with specific certificates. Out of the box, Exchange uses self-signed certificates to provide TLS-secured mail flow. This will definitely be an issue if you expose the SMTP protocol to client computers, since they won't trust the certificate. In this article we are going to configure a certificate that was issued by a third-party authority on the Client Frontend receive connector. We'll start with getting the thumbprint of the certificate using the Get-ExchangeCertificate cmdlet:

[PS] C:\> Get-ExchangeCertificate

Thumbprint                                Subject
----------                                -------
241B864DC82C664FECBA18B8D54987AAFB65B4C2  CN=*, ...
D4D210886B34E690191A1F008C78FDD0E7325DD4  CN=Exchange2013A
960171662EB261162F9C8CBE12E0B75D6F06ABB0  CN=Microsoft Exchange Server Auth Certificate
2690324B827A9F2B75D59104F81CAAA57CDD627B  CN=WMSvc-Exchange2013A

[PS]
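Once you have the thumbprint, the certificate can be bound to the connector through its TlsCertificateName property, which combines the certificate issuer and subject. The sketch below assumes an Exchange 2013 server named Exchange2013A; adjust the thumbprint and connector identity to your environment.

```
# Hypothetical values: replace the thumbprint and connector name with your own.
$cert = Get-ExchangeCertificate -Thumbprint 241B864DC82C664FECBA18B8D54987AAFB65B4C2

# TlsCertificateName is built from the issuer and subject of the certificate.
$tlsCertName = "<I>$($cert.Issuer)<S>$($cert.Subject)"

# Bind the certificate to the Client Frontend receive connector.
Set-ReceiveConnector -Identity "Exchange2013A\Client Frontend Exchange2013A" `
    -TlsCertificateName $tlsCertName
```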

PowerShell Dynamic Validate Set

When writing advanced PowerShell functions you may have to add a parameter with a validate set that is created dynamically. Let me use an example to make things clearer. I was writing a function that has the ability to query different Active Directory forests. That function is actually a wrapper for the Find-LDAPObject cmdlet and provides a better interface by saving the configuration of each directory in a CSV file. Each row of the CSV file contains the name of the directory, the server and port to connect to, the root DN of the directory and the credentials to use. What I wanted to achieve was to create a validate set that would contain all the names of the directories in the CSV file, so that the user would be able to choose amongst them. Martin Schvartzman has posted a great article on dynamic validate sets that proved to be very helpful! The way to create a dynamic validate set is to create the entire parameter in the function code. We'll start with the standard adv
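The pattern described above can be sketched as follows. The function name, the CSV path and the "Name" column are assumptions for the example; the mechanics of the DynamicParam block are the standard ones.

```
function Find-DirectoryObject
{
    [CmdletBinding()]
    param()

    DynamicParam
    {
        # Build the ValidateSet values from the directory names in the CSV file.
        # Path and column name are placeholders for this sketch.
        $names = (Import-Csv -Path 'C:\Config\Directories.csv').Name

        $attribute = New-Object System.Management.Automation.ParameterAttribute
        $attribute.Mandatory = $true

        $attributes = New-Object System.Collections.ObjectModel.Collection[System.Attribute]
        $attributes.Add($attribute)
        $attributes.Add((New-Object System.Management.Automation.ValidateSetAttribute($names)))

        # Define the -Directory parameter at runtime.
        $parameter = New-Object System.Management.Automation.RuntimeDefinedParameter(
            'Directory', [string], $attributes)

        $dictionary = New-Object System.Management.Automation.RuntimeDefinedParameterDictionary
        $dictionary.Add('Directory', $parameter)
        return $dictionary
    }

    process
    {
        # The dynamic parameter is only available through $PSBoundParameters.
        $PSBoundParameters['Directory']
    }
}
```

With this in place, tab completion on -Directory offers exactly the names found in the CSV file.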

Verify the Agent Connectivity to OMS Workspace

Now that we have configured our first agent to send data to the OMS workspace, how do we know that everything is working as it should? The first place to check the status of the agent is the agent itself. Open the Control Panel and search for the Microsoft Monitoring Agent. If you switch to the Azure Log Analytics tab, the status of the agent is shown as follows: In case you need more details about the status of the agent and its operations, events are written to the Operations Manager event log: The last way to check the connectivity of an agent to the workspace is to query the heartbeat logs on the workspace itself. The agents send heartbeat messages to the workspace and those messages are available to query. Here we see that the OMSGateway.lab.local computer has sent a heartbeat message on 2018-03-22 at 16:00 GMT. The IP address of the computer is also available along with the type and version of operating system and agent and other useful information. Th
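A heartbeat check like the one described can be expressed as a Log Analytics query along these lines (the computer name matches the lab example above; the projected columns are the usual Heartbeat table fields):

```
Heartbeat
| where Computer == "OMSGateway.lab.local"
| sort by TimeGenerated desc
| take 1
| project TimeGenerated, Computer, ComputerIP, OSType, OSName, Version
```

If the query returns a recent record, the agent is successfully reaching the workspace.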

Install and Configure the OMS Windows Agent

In the last article of the Azure OMS series, we configured OMS to collect the log entries from the Application and System logs and the values for a few performance counters. It's time to set up the first agent to send data to the workspace! All we need is the setup file that is available on the Azure portal, at the Advanced Settings blade of the OMS workspace. The first step of the agent installation is to right-click the file and run it as administrator. After the UAC confirmation, the welcome screen should appear. Click Next to get to the Software License Terms and agree after reading them. The next screen is the installation folder selection. In most cases, the default location is going to be fine. Next, we have to configure the agent to connect to OMS by selecting the second checkbox. If you would like to also connect to SCOM, tick the third checkbox. On the next step, we have to specify the OMS workspace to connect to by using the Workspace I
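For larger deployments, the same installation can be scripted. A minimal unattended sketch, assuming the 64-bit Microsoft Monitoring Agent installer and placeholder workspace values:

```
# Extract the installer package first (destination folder is a placeholder).
.\MMASetup-AMD64.exe /c /t:C:\Temp\MMA

# Run the extracted setup silently, passing the workspace ID and key.
C:\Temp\MMA\setup.exe /qn NOAPM=1 ADD_OPINSIGHTS_WORKSPACE=1 `
    OPINSIGHTS_WORKSPACE_ID="<workspace id>" `
    OPINSIGHTS_WORKSPACE_KEY="<workspace key>" `
    AcceptEndUserLicenseAgreement=1
```

The workspace ID and key are the same values shown on the Advanced Settings blade mentioned above.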

Configuring Log and Performance Counter collection on the OMS Workspace

Following the introduction to the Azure Advanced Analytics article, where we provisioned an OMS Workspace and took a quick look at its settings, we are now going to add the Windows logs we would like to collect along with some system performance counters. To get to the Data settings of the workspace, we are going to use the Advanced Settings option on the main workspace blade. Here, you have to type the name of the log and then click the plus sign in order to add it. As shown below, I've started with the Application and System logs that contain very useful information about the system and the applications that may be running on it. When done adding logs, click the save button to save the configuration. Now that we have configured the logs to collect, we are going to move on to the performance counters by clicking on "Windows Performance Counters". The Advanced Analytics team suggests a few counters related to disk, memory, processor and network, so go
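The same data sources can also be configured from PowerShell with the AzureRM.OperationalInsights module. A sketch, with placeholder resource group and workspace names:

```
# Collect the Application event log (errors, warnings and information).
New-AzureRmOperationalInsightsWindowsEventDataSource `
    -ResourceGroupName 'OMS-RG' -WorkspaceName 'MyWorkspace' `
    -Name 'ApplicationLog' -EventLogName 'Application' `
    -CollectErrors -CollectWarnings -CollectInformation

# Collect processor time every 60 seconds across all instances.
New-AzureRmOperationalInsightsWindowsPerformanceCounterDataSource `
    -ResourceGroupName 'OMS-RG' -WorkspaceName 'MyWorkspace' `
    -Name 'ProcessorTime' -ObjectName 'Processor' -InstanceName '*' `
    -CounterName '% Processor Time' -IntervalSeconds 60
```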

Updates on the CPolydorou.PSISE PowerShell module

The 1.2.1 version of the CPolydorou.PSISE module has just been published and contains updates to two of its functions. The first function that has been updated is "New-PSISETab". I've added a new parameter called "NoSwitch" that, when specified, keeps the active tab set to the tab from which the command was executed. This will allow you to keep working on the old tab after creating the new one. The other updated function is "Open-PSISEFile". I have a lot of script files that I'm using as templates that get saved every time I change something and run the script. To avoid changing the files, I've added a parameter called "AsCopy". If this switch is used, a new blank file is added to the PowerShell tab and the content is copied from the file specified in the "Path" parameter. This way, the script can be executed but the file is not updated. You can also open the file on another PowerShell tab by using its name or it'
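The two new switches described above are used like this (the template path is a placeholder):

```
# Create a new ISE tab but keep focus on the current one.
New-PSISETab -NoSwitch

# Open a template as a copy, so edits and runs never touch the original file.
Open-PSISEFile -Path 'C:\Scripts\Template.ps1' -AsCopy
```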

Introduction to Azure Log Analytics

For the last couple of years I've had the chance to work with OMS and take advantage of its great features. In my opinion, log analytics is a great way to take control of operations and produce all kinds of reports related to the applications, the infrastructure that supports them and even the clients that are using them. I've decided to write a few articles on configuring the OMS workspace, an OMS Gateway, installing the agents on Windows and Linux computers and of course analyzing data and creating dashboards and graphs. I hope you have the chance to work with OMS in a production environment, and I strongly recommend giving it a try in your lab using the free plan. We'll start with provisioning an OMS Workspace using the Azure Portal. After logging in, select "Create a resource" and then search for "Log Analytics". This will bring you to a page like the one below, to start the creation of the workspace. The next step is to name your workspa
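If you prefer scripting the provisioning over the portal steps above, the workspace can also be created with the AzureRM.OperationalInsights module. A sketch using placeholder names, location and the free SKU mentioned above:

```
# Create a resource group to hold the workspace (names are placeholders).
New-AzureRmResourceGroup -Name 'OMS-RG' -Location 'West Europe'

# Provision the Log Analytics workspace on the free pricing tier.
New-AzureRmOperationalInsightsWorkspace `
    -ResourceGroupName 'OMS-RG' `
    -Name 'MyLabWorkspace' `
    -Location 'West Europe' `
    -Sku 'Free'
```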

Managing Active Directory User Certificates using PowerShell

I first came across user certificates when I was working with email certificates a few years ago, and I have to admit that I had trouble updating the certificates on the objects! Most organizations have a Microsoft Active Directory Certification Authority that issues the certificates used internally. When a certificate is issued to a user, Active Directory Certificate Services saves the public key in Active Directory. The userCertificate attribute is a multi-valued attribute that contains the DER-encoded X509v3 certificates issued to the user. Although we rarely need to pay attention to this attribute, there are cases where we have to update it. To make things easier, I've written PowerShell functions to Get, Remove, Import and Export the certificates in that attribute. To get the list of certificates for an object, use the Get-ActiveDirectoryObjectCertificate function: PS C:\> Get-ActiveDirectoryObjectCertificate -UserPrincipalName cpolydorou@lab.local DistinguishedName
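To show what the function works with under the hood, the raw attribute can also be read directly with the Active Directory module and decoded into certificate objects. A sketch, assuming the RSAT ActiveDirectory module and the lab account from the example above:

```
# Read the raw userCertificate values for the account.
$user = Get-ADUser -Identity 'cpolydorou' -Properties userCertificate

# Each value is a DER-encoded certificate stored as a byte array.
$user.userCertificate | ForEach-Object {
    New-Object System.Security.Cryptography.X509Certificates.X509Certificate2(,$_)
} | Select-Object Subject, NotAfter, Thumbprint
```

Note the leading comma in the constructor call, which passes the byte array as a single argument instead of splitting it into individual bytes.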