
Thursday, April 9, 2015

What you say? Windows Nano Server? Looking great, but…

As we all know by now, Microsoft announced yesterday that a Windows Nano Server edition will be released in the next version of Windows Server. Here are some of my thoughts about it :).
In my years as an IT consultant, architect, trainer and scripting fanatic I love to see developments in the IT landscape.

As an early adopter of PowerShell, back in 2006 when it more or less launched together with Exchange Server 2007, I really love to see what you can do with it. I wrote, and still write, a lot of advanced scripts to make our lives easier in administration and migration work. I am also involved in the design and architecture of new Windows-based infrastructures, and I try to get customers and IT admins to adopt the new and rich possibilities that the Windows Server and Windows client OS bring.


For some years now I have been trying to convince a lot of my customers to move to PowerShell and the Windows Server Core edition (no, personally not 2008, which I don't think was really good yet, but since 2012). Still, a lot of IT pros and IT admins don't know how to use PowerShell or what you can do with it. To be more clear: I still see IT admins trying to change the same setting for 4,000+ users within Active Directory by hand. Not very efficient, I think, and also very error prone, while a single PowerShell one-liner, like the sketch right below, does the same job in seconds. As a result, the adoption of the Windows Server Core edition is not happening as quickly as I would have liked.
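Just to illustrate the point, a minimal sketch of such a bulk change with the ActiveDirectory module; the filter and the office value are made-up examples, not a real customer case:

#Change one attribute for every user that matches a filter in one go (values are examples)
Import-Module ActiveDirectory
Get-ADUser -Filter 'Department -eq "Sales"' | Set-ADUser -Office "Amsterdam"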

And this is where I have mixed feelings about Windows Nano Server. Personally I think it's great to see that you don't have any overhead anymore in the form of a GUI, local logon, WOW64 support and many other things. I love to work with remote PowerShell. I think connecting to a server with RDP and then doing your work on it is slow, and you are limited to a maximum number of sessions. All these components also require updating and the installation of a lot of hotfixes and patches, and so on.
Removing all this overhead leaves you with a much more hardened server and a lower footprint (less hard disk space, memory, etc.). It also saves you the installation of a lot of updates that you don't need, updates that would leave you with a lot of potential security risks if you don't install them.
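Coming back to remote PowerShell: a minimal sketch of the kind of remote management I mean, assuming PowerShell remoting (WinRM) is enabled and using a made-up server name:

#Run a command on a remote server without opening an RDP session (server name is an example)
Invoke-Command -ComputerName SRV01 -ScriptBlock { Get-Service | Where-Object { $_.Status -eq 'Running' } }

#Or work interactively in a remote session
Enter-PSSession -ComputerName SRV01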

However, as I wrote earlier, my experience is that the adoption of these new developments by the current generation of IT pros (yes, unfortunately I know a lot of IT consultants and engineers who still don't know how to use PowerShell) and IT admins is something to worry about when it comes to getting the Nano and Core editions of Windows Server adopted.

I do hope it will be a success. Personally I will adopt it and recommend it in my advice and designs to customers, but there needs to be a real change in the IT landscape and in the mindset of IT people. I think…

Do you guys have any thoughts on this?

Let me know… I'd like to have an interesting discussion about it :)


Reference:
http://blogs.technet.com/b/windowsserver/archive/2015/04/08/microsoft-announces-nano-server-for-modern-apps-and-cloud.aspx

Wednesday, November 27, 2013

Auto add specific Office 365 licenses to all users

Ever needed to add only Exchange Online licenses to thousands of users in Office 365? Well, I did. Since I didn't want to click, I created a PowerShell script for it.

I created this particular script for an educational institution, so the plan and the service options are specific to an educational organization.

The script assigns licenses to users based on the "Office" attribute: if this attribute is empty the user is treated as an employee (faculty), and if it is not empty the user is treated as a student. The script saves its output to a log file and also writes it to the display (including a progress bar :)).

If you want to know which plans are available in your tenant, you can type the commands below in a PowerShell session:
Import-Module MSOnline
Connect-MsolService -Credential (Get-Credential)
Get-MsolSubscription
 


Once you have the subscriptions, you can narrow it down to only the services available in a subscription by using the command below (fill in the SubscriptionId from the previous output):
Get-MsolSubscription -SubscriptionId <SubscriptionId> | Select-Object -ExpandProperty ServiceStatus

Here is the script. Have fun with it!
 
Import-Module MSOnline
Connect-MsolService -Credential (Get-Credential)
$GlobalLog = "D:\Scripts\Logs\$($env:COMPUTERNAME)_$(Get-Date -Format hhmm_ddMMyyyy).log"
$Date = Get-Date -Format "hh:mm:ss - dd MMMM yyyy"
$Users = Get-MsolUser -All
$Location = "NL"
#Service plans that will be disabled within the license SKUs
$LicOptions = 'SHAREPOINTWAC_EDU','MCOSTANDARD','SHAREPOINTSTANDARD_EDU'

#Faculty members
$Faculty = "<TenantID>:STANDARDWOFFPACK_FACULTY"
$PlanFc = New-MsolLicenseOptions -AccountSkuId $Faculty -DisabledPlans $LicOptions 

#Students
$Student = "<TenantID>:STANDARDWOFFPACK_STUDENT"
$PlanSt = New-MsolLicenseOptions -AccountSkuId $Student -DisabledPlans $LicOptions

function LogLine {
 Param(
 [string]$LogInput
 )
 Add-content $GlobalLog -value $LogInput
}

function LogHeader {
 LogLine "--------------------------------------------------------------------------------"
 LogLine "Adding user licenses:"
 LogLine "Time: $Date"
 LogLine "--------------------------------------------------------------------------------"
}

function LicAdd {
 $Users | ForEach-Object -Begin {Clear-Host; $i = 0; $Stu = 0; $Fac = 0} -Process `
 {
  #Only touch users that are not licensed yet
  if (-not $_.IsLicensed)
  {
   Set-MsolUser -UserPrincipalName $_.UserPrincipalName -UsageLocation $Location
   #A filled "Office" attribute marks a student, an empty one a faculty member
   if ($null -ne $_.Office)
   {
    $User = $_.UserPrincipalName
    Set-MsolUserLicense -UserPrincipalName $User -AddLicenses $Student -LicenseOptions $PlanSt
    LogLine "Added student license for user $User"
    Write-Host "Added student license for user $User" -ForegroundColor 'Gray'
    #Start-Sleep -Milliseconds 30
    $Stu++
   }
   else
   {
    $User = $_.UserPrincipalName
    Set-MsolUserLicense -UserPrincipalName $User -AddLicenses $Faculty -LicenseOptions $PlanFc
    LogLine "Added faculty license for user $User"
    Write-Host "Added faculty license for user $User" -ForegroundColor 'Green'
    #Start-Sleep -Milliseconds 30
    $Fac++
   }
  }
  Write-Progress -Activity "Adding licenses" -Status "Progress:" -PercentComplete ($i / $Users.Count * 100)
  $i++
 } `
 -End {}
 LogLine "--------------------------------------------------------------------------------"
 LogLine "Number of licenses added for students: $Stu"
 LogLine "Number of licenses added for faculty employees: $Fac"
 LogLine "--------------------------------------------------------------------------------"
 LogLine ""
}

LogLine ""
LogHeader
LicAdd

Wednesday, June 5, 2013

Thinking Out of the Box: Exchange 2013 and backup

What else would you want to do on a sunny Wednesday afternoon than write an article about Exchange Server 2013 and backup ;). No, really, it has been quite a while since I posted a useful article about Exchange, so I thought: why not write something about backup.

Over the last few weeks I received a lot of questions from colleagues and customers about backup and disaster recovery in the new Exchange Server 2013. These questions really seemed to show that organizations still have a pretty outdated understanding of backup and recovery. All customers still want to have item-level backup while their data usage keeps growing.

So I thought, this is a good opportunity to write an article about backup and disaster recovery (DR) with Exchange Server 2013 (Exchange).

Introduction

First of all, you can divide backup into two main concerns:

  1. You'll probably need backup to perform a point-in-time restore of a single item or a complete mailbox.
  2. In any enterprise production environment you'll need a way to recover your data in case of an emergency.

In the old days, the answer to the first concern in Exchange was to buy and implement a backup solution that provided single item backup and recovery. This feature enabled the IT organization within a company to restore one or more items back into a user's mailbox when the user accidentally deleted them.

The demand for this solution was high, so everybody implemented it and it performed well within the requirements of the time. However, a few years ago the amount of mail data began to grow and backup windows began to shrink because of hypes like "The new way to work" and "Work/Life integration". These hypes created more flexible working hours and therefore shorter backup windows. On top of that, users kept their e-mail in their mailboxes until the end of time.

These developments created real challenges for IT organizations trying to back up mail data within the time window available.

As the years went by, Microsoft optimized its database structure and implemented new features in Exchange to cope with these problems. This resulted in even bigger mailbox databases, but the mindset of organizations concerning the backup of mail data did not change. Even today customers want single item backup in their Exchange environment, and when you ask them how many times they actually used this functionality in the past year, they can't give you a real answer.

The second concern is how to cope with outages and emergencies and how to get your data back (Disaster Recovery or Emergency Recovery). To describe this concern I'll give you a short explanation of DR.

DR can best be divided into two objectives:

  1. RPO (Recovery Point Objective) and
  2. RTO (Recovery Time Objective).
 
RPO
RPO is the maximum tolerable period in which data might be lost from an IT service due to a major incident. In other words: how much data (measured in time) is an acceptable loss in case of an emergency.

RTO
RTO is the duration of time within which a business process must be restored after a disaster (or disruption) in order to avoid unacceptable consequences associated with a break in business continuity. In other words: how quickly do the services need to be restored in case of an emergency.

So how does this all relate to Exchange Server 2013? Well, I will try to explain this in the following paragraphs.

Backing up Exchange Server 2013

Third party backup solutions

At the time of writing this article, support for backing up Exchange Server 2013 by third party backup solutions/providers is marginal. The following overview covers the most common ("enterprise ready") backup solutions and their support for Exchange Server 2013.

Note: Microsoft states that all backup solutions need to make use of the Volume Shadow Copy Service (VSS) in order to create a successful and consistent backup. For more information about these requirements, see the Microsoft documentation.




 

  1. Symantec NetBackup: supported from version 7.5.0.6 (restore level: database)
  2. Symantec Backup Exec: supported from version 2012 Service Pack 2 (restore level: N/A)
  3. NetApp SnapManager: supported from version 7 and higher (restore level: database)
  4. CommVault: supported from version 9 and higher (restore level: database)
  5. VEEAM: not supported yet; support is planned for version 7, release date unknown (restore level: N/A)
  6. HP Data Protector: supported from version 8 (restore level: database)
  7. EMC Avamar/Networker: not supported (restore level: N/A)
  8. IBM Tivoli Storage Manager: not supported (restore level: N/A)
As you can see, there isn't much third party support for Exchange Server 2013 yet. Why the backup software vendors don't have a solution yet is unclear. But the question is: is this a potential problem when you want your organization to move forward with implementing Exchange Server 2013? Personally I think not. Better said, I personally don't think you'll need a third party backup solution at all! And why is that, you say?

Well, the explanation is pretty simple. All the features needed to address both backup concerns are built into Exchange (provided, of course, that you design it properly). In the next paragraphs I will go deeper into this, so keep on reading ;).
 
Exchange Item Restore
When you ask your customers or the management of your organization whether it is really necessary to be able to get single items back from backup in case of a user error, they will probably say yes. But if you ask them up to what point in time, most of the time they don't have a direct answer. If you then ask them whether they are comfortable with a restore period of, let's say, one month for recoverable items, they will probably say that is OK. Keep in mind that restoring single items from backup has its limitations: single item backup (which, in combination with Exchange Server 2013, is not even possible yet) means long backup times and probably performance loss.

Exchange, however, has the ability to keep deleted items for a specific period of time. This is called deleted item retention. By default, all deleted items (meaning items that are removed from the user's "Deleted Items" folder) are kept for 14 days. This means that users are able to restore them themselves from within Outlook during those 14 days.

So, to fulfill the need to restore single items, you can simply use or extend the deleted item retention period for recoverable items. This is configured on the mailbox database. You do, however, have to take this extra data into account in your mailbox storage requirements design.
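As a small illustration, extending the deleted item retention window to 30 days on a mailbox database could look roughly like this (the database name is just an example):

#Extend deleted item retention to 30 days on a mailbox database (name is an example)
Set-MailboxDatabase -Identity "MDB01" -DeletedItemRetention 30.00:00:00

#Verify the new setting
Get-MailboxDatabase -Identity "MDB01" | Format-List Name,DeletedItemRetention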

The advantages of this approach are numerous:

  • It saves you a lot of time backing up single items with any software;
  • It saves you storage in case of snapshot backups on the storage level;
  • It saves you storage on your backup tapes;
  • It saves your IT helpdesk the burden of answering calls about single item restores;
  • And last but probably most important: users don't have to call the IT department anymore. They can do it themselves! And that means one less way of pissing off users ;).

Exchange HA and Site Resiliency
Great! And what about Disaster Recovery, I hear you say? Well, Exchange has a built-in solution for that too. It will require you to think carefully about your design, so I'll only describe the features and technologies needed to achieve the goal.

Since Exchange Server 2010 there is a thing called Database Availability Groups, or DAGs. DAGs are the successor of the pain-in-the-ass Cluster Continuous Replication (CCR) that was available in Exchange Server 2007. In Exchange Server 2013 the use of DAGs is continued and improved. With a DAG you can create highly available passive copies of your mailbox databases across up to 16 Exchange Mailbox servers. The advantage of a DAG is that (although Windows Failover Clustering is still used in the background) the configuration is relatively simple. You will, however, need extra storage for every copy of a database. It is also possible to spread your DAG over separate data centers to ensure services remain available and data loss is kept to a minimum. This tackles your immediate HA requirement.
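To give an idea of how simple that configuration is, a hedged sketch using made-up names (the DAG, witness server, mailbox server and database names are examples, not from a real design):

#Create a DAG with a file share witness and add two mailbox servers (names are examples)
New-DatabaseAvailabilityGroup -Name "DAG01" -WitnessServer "FS01" -WitnessDirectory "C:\DAG01"
Add-DatabaseAvailabilityGroupServer -Identity "DAG01" -MailboxServer "MBX01"
Add-DatabaseAvailabilityGroupServer -Identity "DAG01" -MailboxServer "MBX02"

#Add a highly available passive copy of an existing database to the second member
Add-MailboxDatabaseCopy -Identity "MDB01" -MailboxServer "MBX02" -ActivationPreference 2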

But what if, for whatever reason, your active database gets corrupted? Are my passive copies then also affected? Uhh, yes, they probably are. The reason is that each passive copy of a database in a DAG is kept up to date by means of transaction log shipping. If corruption is introduced into the active database, the logs will simply be replayed into the copies too.

But don't worry, there is a solution for this and it's called "lagged copies". In every DAG you can create, next to the regular HA copies, a lagged copy. A lagged copy simply means that you tell Exchange to insert a lag (a delay in time) before it replays changes into that copy of the database. So if corruption gets into the active database, the lag ensures the corruption does not immediately end up in the lagged copy.
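A hedged sketch of adding such a lagged copy with a seven day replay lag (server and database names are examples again):

#Add a lagged copy that delays log replay by 7 days (names are examples)
Add-MailboxDatabaseCopy -Identity "MDB01" -MailboxServer "MBX03" -ReplayLagTime 7.00:00:00 -ActivationPreference 3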

Lagged copies have been around since Exchange Server 2007, and therefore also exist in Exchange Server 2010. However, lagged copies were a bit hard to handle in Exchange Server 2010. Also, if an organization needed a zero data loss RPO, it was simply not possible, because the logs were gone if all "normal" copies of the databases were lost and the mail queue was empty.

In Exchange Server 2013 this issue is solved by a feature called Safety Net. Safety Net is the successor of the transport dumpster and is a layer that is not part of the databases or the DAG. What Safety Net does is hold a copy of every message that passes through transport (incoming or outgoing mail, for example) until the message has been delivered to all copies (including the lagged copy) of the databases in a DAG.
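The period that Safety Net keeps messages is configurable; as a rough sketch (the seven day value is only an example, chosen here to match the replay lag of the lagged copy above):

#Set how long Safety Net keeps copies of delivered messages (value is an example)
Set-TransportConfig -SafetyNetHoldTime 7.00:00:00

#Verify
Get-TransportConfig | Format-List SafetyNetHoldTime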




This all basically means that without any backup software you can tackle item level restore and you can reach a 0 day RTO and RPO together. Of course your design needs to be right and you'll need enough data centers and servers to do the job for you.

Accreditations
A special thanks to Martijn Moret (Data Management Consultant at PQR, @MMMoret) for providing me with a table of all backup providers and their support for Exchange Server 2013.

Updates

09-07-2013: Updated support matrix for Symantec NetBackup and HP Data Protector
09-08-2013: Updated support matrix for Symantec Backup Exec

Tuesday, August 14, 2012

Unable to on-board and off-board mailboxes in an Exchange Hybrid Configuration

This blog post describes the situation where you are unable to move an Exchange Online (Office 365) mailbox to an on-premises Exchange 2010 server in a hybrid configuration.

If an Exchange Online mailbox is created via the Exchange 2010 Management Console, the ExchangeGUID of the online mailbox is not properly set in the remote mailbox configuration of the Active Directory user object. In most cases this is no problem at all, but if you want to move an online mailbox from the cloud back to your on-premises Exchange 2010 server, the process fails with the error "Exception has been thrown by the target of an invocation.".
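The full post goes into the fix; as a general sketch of the usual workaround (not necessarily the exact steps from the post), you read the ExchangeGuid from the cloud mailbox and stamp it on the on-premises remote mailbox object. The UPN and GUID below are placeholders:

#Against Exchange Online (remote PowerShell session): read the cloud mailbox GUID (UPN is a placeholder)
Get-Mailbox user@contoso.com | Format-List ExchangeGuid

#On-premises in the Exchange 2010 Management Shell: stamp that GUID on the remote mailbox object
#(the GUID here is a placeholder for the value returned above)
Set-RemoteMailbox user@contoso.com -ExchangeGuid "12345678-90ab-cdef-1234-567890abcdef"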

Wednesday, July 18, 2012

Exchange Server 2013 – A first glimpse – part 1

On 11 July 2012 Microsoft released the long expected preview version of Exchange Server 2013 (also known as Exchange 15). In this multipart blog I will try to give you a glimpse of what's new in Exchange Server 2013.

  • In part 1 I will describe the new features and changes that this new version of Exchange is going to offer;

  • In part 2 I will guide you through the installation of Exchange Server 2013;

  • In part 3 we will have a deeper look into the management of Exchange Server 2013.


Thursday, July 12, 2012

Could not bind port 80 on TMG with Windows Server 2008 R2 SP1

During a new implementation of a reverse proxy solution for Exchange Server 2010 OWA, based on a Threat Management Gateway 2010 server, I encountered an issue where I couldn't bind port 80 for redirection to port 443. The server on which I tried to install and configure TMG was a Windows Server 2008 R2 SP1 machine.

The following post will guide you through the issues I had and give you a solution to this problem.
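The actual cause and solution are in the post itself; as a generic first diagnostic step (not necessarily the fix described there), you can check which process already owns port 80 on the box:

#Show who is listening on port 80 and look up the owning process (run in an elevated prompt)
netstat -aon | findstr ":80 "
Get-Process -Id 4   #replace 4 with the PID from the netstat output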

Wednesday, July 20, 2011

Creating a wildcard webserver certificate with your internal CA

It is possible to create a wildcard web server certificate using your internal Enterprise CA based on Windows Server 2008 R2. To do this you need to have an Enterprise CA with the Web Server certificate template deployed.

The question you'll probably ask yourself is: "Why do I need this?". Well, the answer is simple. You probably don't want to use this certificate in a production environment, but you can use it for testing purposes without having to buy an expensive commercial wildcard certificate, especially when the test results may lead you to decide that a wildcard certificate is not the way to go.
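As a rough sketch of the idea, not necessarily the exact steps from the full post (the subject name is a placeholder and the default Web Server template is assumed), the request can be built and submitted with certreq from a PowerShell prompt on a domain-joined machine:

#Write a request file for a wildcard subject (subject name is a placeholder)
@'
[Version]
Signature = "$Windows NT$"

[NewRequest]
Subject = "CN=*.corp.local"
KeyLength = 2048
Exportable = TRUE
MachineKeySet = TRUE
RequestType = PKCS10
'@ | Set-Content wildcard.inf

#Create the request, submit it against the Web Server template and install the issued certificate
certreq -new wildcard.inf wildcard.req
certreq -submit -attrib "CertificateTemplate:WebServer" wildcard.req wildcard.cer
certreq -accept wildcard.cer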