Showing posts from April, 2018

Updating the product key and activating a Windows Server installation

A very common task when installing a Windows operating system is to activate the license and, if required, update the product key. Although this is a very simple task using the control panel, it can be tricky when the command line is all you've got, just like on Server Core installations. The slmgr script can provide information about the license status of a server, update the product key and activate the license. Let's take a look at the parameters of the script. To get a short description of the licensing status, use the "/dli" parameter: If you need a more detailed status, you can use the "/dlv" parameter: The "/xpr" parameter displays the version of the server and the status of the activation: The "/ipk" parameter allows us to change the product key of the server: After changing the product key, we have to activate the license using the "/ato" parameter: After updating the product key
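The switches above can be sketched as a short command sequence, run from an elevated prompt (the product key below is a placeholder, not a real key):

```powershell
# Short and verbose license status
slmgr /dli
slmgr /dlv

# Edition and activation expiration date
slmgr /xpr

# Install a new product key (placeholder shown), then activate online
slmgr /ipk XXXXX-XXXXX-XXXXX-XXXXX-XXXXX
slmgr /ato
```

Note that slmgr.vbs runs under the default script host, so the output appears in a message box unless you invoke it explicitly with `cscript C:\Windows\System32\slmgr.vbs /dli` to keep everything in the console.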

DHCP Server Migration From Windows Server 2012 R2 to Windows Server 2016

With the end of mainstream support coming up in a few months, I decided to start upgrading my lab environment to Windows Server 2016. I started with Active Directory and its related services, like DHCP. In this article we'll migrate the DHCP service from a machine running Windows Server 2012 R2 to one that's running 2016. The first step of the process is to install and configure the DHCP server role on the new machine. I'm not going to describe the installation process since it's fairly simple; I will, however, show you the post-installation configuration. The first screen is the description of the operation, followed by the authorization step: Here, we have to select the account that is going to be used to authorize the server: On the last step, the server is authorized and the groups used for delegating access are created: Now that our new DHCP server is ready, we have to migrate the configuration from the old one. The serv
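The excerpt cuts off before the migration step itself, but the standard approach uses the built-in DHCP Server cmdlets; a minimal sketch, with placeholder paths and server names:

```powershell
# On the source (2012 R2) server: export scopes, options and leases to a file
Export-DhcpServer -ComputerName 'dhcp-old' -File 'C:\Temp\DhcpExport.xml' -Leases

# On the destination (2016) server: import everything from the export file.
# -BackupPath is mandatory: the existing configuration is backed up there first.
Import-DhcpServer -ComputerName 'dhcp-new' -File 'C:\Temp\DhcpExport.xml' -Leases -BackupPath 'C:\Temp\DhcpBackup'
```

Including `-Leases` carries over the active lease database, so clients keep their addresses after the cutover.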

CPolydorou.PSISE module updates

The latest release of my PowerShell module for the PowerShell ISE host contains a new function named "Insert-PSISEQuotes" that updates the text in the currently open file. There are many times when I have been given a list of items by email, or I have copied them from another file or even written them line by line. To transfer these items to a variable, I usually copy the text into a new text file and then use the Get-Content cmdlet to read the contents of the file and store them in a variable for later use. The "Insert-PSISEQuotes" function will insert double quotes at the start and end of each line in the file currently open in ISE, followed by a comma at the end of the line. It can also add a variable assignment or just parentheses to make the text an array. Let's take a list of email addresses for example: You can convert the list to an array using the "Array" parameter: The "Variable" parameter will convert the text to an
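The core transformation is easy to picture outside of ISE; a rough sketch of the quoting logic (illustrative only, not the module's actual implementation, and the `$addresses` name is made up):

```powershell
# A list of items, one per line, as you might paste from an email
$text = @"
alice@example.com
bob@example.com
carol@example.com
"@

# Quote each line and append a comma
$quoted = ($text -split "`r?`n") | ForEach-Object { '"{0}",' -f $_ }

# Drop the trailing comma and wrap in @( ) to form an array assignment
$body = ($quoted -join "`n").TrimEnd(',')
'$addresses = @(' + "`n" + $body + "`n" + ')'
```

Pasting the resulting text into a script gives a ready-to-use array assignment instead of a round trip through a temporary file and Get-Content.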

Collecting IIS log files on OMS

Moving on to the next article about Azure OMS, we'll start collecting IIS log files in order to be able to examine the requests and report on website usage. To start collecting IIS logs, we first have to enable the feature on the workspace. Don't forget to click Save after enabling the setting. Give it some time and the agents should pick up the change and start uploading the logs. Moving on to the query part, I'm going to use my Exchange servers again, since they have websites that I would like to get statistics for. First, we'll summarize the requests per website: Then, we'll extract statistics regarding the user agents used: Finally, we'll extract the number of requests on each Exchange virtual directory: I should mention here that the IIS log files have to be stored in W3C format, and custom fields or IIS Advanced Logging are not supported at this time. Have fun! Related articles: Introduction to Azure Advanced Analyti
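As a sketch, the per-website summary might look like this in the Log Analytics query language (OMS stores these records in the W3CIISLog table; the one-day time window is an assumption):

```kusto
// Requests per website during the last day
W3CIISLog
| where TimeGenerated > ago(1d)
| summarize RequestCount = count() by sSiteName
| order by RequestCount desc
```

Similar summaries over the `csUserAgent` or `csUriStem` columns give the user-agent and virtual-directory breakdowns mentioned above.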

Exchange Mailbox Database Log Copy and Replay Monitoring

The design of the latest versions of on-premises Exchange Server has moved closer and closer to the cloud version. One of the major changes was the introduction of Database Availability Groups, which are similar in concept to SQL Server availability groups and allow us to have more than one copy of a mailbox database. This increases the availability of the mailbox service but introduces additional cost for the required storage. Although storage is cheap these days, there's no need to invest much in it for Exchange, since the system will recover from a failed disk (always talking about the volumes holding the database files). However, you should always plan the hardware and system requirements using the Exchange Requirements Calculator. A hardware, software or network event may lead to missed logs between the database copies and result in growing copy and/or replay queue lengths. Depending on the time taken to resolve the issue and restore connectivity, you may have to resume the m
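A quick way to check the queues described above is the Get-MailboxDatabaseCopyStatus cmdlet; a minimal sketch, run from the Exchange Management Shell:

```powershell
# Copy and replay queue lengths for every database copy in the DAG,
# worst offenders first
Get-MailboxDatabaseCopyStatus * |
    Sort-Object CopyQueueLength -Descending |
    Format-Table Name, Status, CopyQueueLength, ReplayQueueLength -AutoSize
```

Sustained non-zero queue lengths usually point to a replication or connectivity problem worth investigating before the copies fall too far behind.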

Querying OMS for Events

In the previous article of the series, we queried the OMS workspace for performance statistics and created graphs to present the data. In this article we are going to query for events in the Windows logs. The query below will return all the servers that were shut down unexpectedly during the last day: The "project" operator allows us to pick the columns of the result. Role failover in failover clusters is another thing I usually monitor: Since these events are written to the Microsoft-Windows-FailoverClustering/Operational log, we first need to add it to the Windows event logs that OMS is collecting. You can also monitor trends in application events: As shown in the above chart, the number of error events in the application log of one of my Exchange servers increased significantly a few days ago. Data can also be rendered in multiple dimensions. The graph below shows the number of Error, Warning, Information and Success events for the last day, per
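A sketch of the unexpected-shutdown query, assuming the standard Event table schema and Event ID 6008 ("the previous system shutdown was unexpected"):

```kusto
// Servers with an unexpected shutdown during the last day
Event
| where TimeGenerated > ago(1d)
| where EventLog == "System" and EventID == 6008
| project TimeGenerated, Computer, RenderedDescription
```

The final `project` line keeps only the columns we care about, as described above.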

Querying OMS for Performance Data

In the previous articles of the OMS series, we configured the workspace and installed the agents on the machines. It's time to start examining the data accumulated in the workspace and create reports, diagrams and alerts. In this article we are going to create diagrams that show the system performance statistics of the servers. Let's start with a query that will display the total CPU utilization. We are querying the performance data produced by servers whose names start with "DC" in order to get data related to the domain controllers. I would like to take a moment here and focus on the importance of naming conventions. Having proper names for your machines is very important when it comes to OMS, since the computer name is the easiest way to relate the data to the machine. On the third line of the query, we specify that we'd like the data for the "% Processor Time" counter and the "_Total" instance of the "Processor
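A sketch of the CPU query described above, against the standard Perf table (the "DC" name prefix is the naming convention from my lab):

```kusto
// Average total CPU utilization of the domain controllers, in 5-minute buckets
Perf
| where Computer startswith "DC"
| where ObjectName == "Processor" and CounterName == "% Processor Time" and InstanceName == "_Total"
| summarize AvgCPU = avg(CounterValue) by bin(TimeGenerated, 5m), Computer
| render timechart
```

The third line is the one picking the object, counter and instance; changing it to another counter such as "Available MBytes" under the "Memory" object gives the equivalent memory chart.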

Deploying the OMS Windows Agent using DSC

In this article we are going to deploy the OMS Windows agent on a server using DSC. Before proceeding with the DSC part, there are some requirements that have to be met. We need to store the agent file on a file share that is accessible by the server, and we need to know the product ID of the agent software in order to use it with the Package resource. In my lab environment, I've configured a FreeNAS virtual machine to provide all kinds of file shares, and I'm using a public SMB share to host the agent installation file. You can host it in any other location, such as a web server, as long as there's a way to get it using DSC. I've also configured a DSC pull server and I'm publishing the configurations there, but since the configuration does not use any custom resources, you can publish it directly to the machines. An easy way to get the product ID is to install the agent on a computer and then execute the following PowerShell command: PS C:\> Get-WmiObject Win32_Produc
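A minimal sketch of such a configuration using the built-in Package resource (the share path, ProductId, installer file name and workspace values below are placeholders you'd replace with your own):

```powershell
Configuration DeployOMSAgent
{
    Import-DscResource -ModuleName PSDesiredStateConfiguration

    Node 'Server01'
    {
        Package OMSAgent
        {
            Ensure    = 'Present'
            Name      = 'Microsoft Monitoring Agent'
            Path      = '\\fileserver\share\MMASetup-AMD64.exe'
            # Identifying number of the installed MSI, retrieved via Win32_Product
            ProductId = '00000000-0000-0000-0000-000000000000'
            Arguments = '/C:"setup.exe /qn AcceptEndUserLicenseAgreement=1 ADD_OPINSIGHTS_WORKSPACE=1 OPINSIGHTS_WORKSPACE_ID=<id> OPINSIGHTS_WORKSPACE_KEY=<key>"'
        }
    }
}
```

The ProductId must match the installed package's identifying number exactly, which is why the post retrieves it from WMI on a machine where the agent is already installed.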