Managing Office 365 licenses using FIM 2010

When starting to use Office 365 at large scale you soon realize that although DirSync solves most of your synchronization needs, it does not solve the problem of assigning the correct license to each user. In this post I will show you how I solved this problem using FIM 2010 with a PowerShell MA.

The problem at hand was not only to license unlicensed users, but also to change and remove licenses based on information in FIM.

Background information

The customer has almost 60,000 users in FIM: 900 staff and a little over 57,000 students.

In FIM we use the employeeType attribute to decide the type of user. In this context we have three values to consider: Staff, Student and Alumni. Each type should get a different license assigned, and since a Student will typically become an Alumni eventually, we need to manage that transition as well.

In FIM we have also created a simple boolean attribute called off365Licensed. This attribute is the main “switch” deciding whether a user should have a license at all.

Office 365 Licenses

Different subscriptions in Office 365 give you different licenses to work with, so keep in mind that this post only covers one type of subscription. The EDU subscription gives you a couple of “free” licenses to use, but before using them we need to learn what they look like, since in most cases we will actually need to disable parts of some licenses.

As an example I will show you the Faculty licenses that are “free”.
You have the STANDARDWOFFPACK_FACULTY and also the PROJECTONLINE_PLAN_1_FACULTY.

But they consist of different parts.
The STANDARDWOFFPACK_FACULTY has four parts:

  • SHAREPOINTWAC_EDU
  • MCOSTANDARD
  • SHAREPOINTSTANDARD_EDU
  • EXCHANGE_S_STANDARD

The PROJECTONLINE_PLAN_1_FACULTY has three parts:

  • SHAREPOINT_PROJECT_EDU
  • SHAREPOINTWAC_EDU
  • SHAREPOINTENTERPRISE_EDU

The problem here is that they overlap: both contain SharePoint plans that cannot be assigned at the same time. So in our scripts we need to disable the parts of one license that collide with the other, if we want to assign both.
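
To see what the licenses in your own tenant look like, you can list the SKUs and their service plans with the MSOnline cmdlets. This small sketch assumes an existing Connect-MsolService session:

```
# Requires the MSOnline module and an authenticated Connect-MsolService session.
# Lists every SKU in the tenant and the service plans each one contains.
Get-MsolAccountSku | ForEach-Object {
    Write-Host $_.AccountSkuId
    foreach ($plan in $_.ServiceStatus) {
        Write-Host ("  " + $plan.ServicePlan.ServiceName)
    }
}
```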

PowerShell MA

We decided to use the PowerShell MA developed by Søren Granfeldt.
I do not do provisioning and de-provisioning in this case; that part I leave to DirSync. In this MA we import the users, join them to the corresponding user in the Metaverse, and then make sure the correct license is assigned.

Below I show you some parts of the scripts involved but please download O365MAScripts to get the complete scripts.

Schema

The schema script in this case is quite simple. I used the UPN as the anchor.

$obj = New-Object -Type PSCustomObject
$obj | Add-Member -Type NoteProperty -Name "Anchor-UPN|String" -Value 1
$obj | Add-Member -Type NoteProperty -Name "objectClass|String" -Value "user"
$obj | Add-Member -Type NoteProperty -Name "IsLicensed|Boolean" -Value $true
$obj | Add-Member -Type NoteProperty -Name "LicenseType|String" -Value "Staff"
$obj

Import Script

The import script imports the current MsolUsers

$Users = Get-MsolUser -MaxResults 50000 -DomainName company.com

but also checks the current license assigned

if($User.IsLicensed)
{
    if($User.Licenses.AccountSkuId[0].EndsWith("FACULTY"))
        {$obj.Add("LicenseType", "Staff")}
    elseif($User.Licenses.AccountSkuId[0].EndsWith("STUDENT"))
        {$obj.Add("LicenseType", "Student")}
    elseif($User.Licenses.AccountSkuId[0].EndsWith("ALUMNI"))
        {$obj.Add("LicenseType", "Alumni")}
    else
        {$obj.Add("LicenseType", "Unknown")}
}

Export

The export script picks up the users from O365

$msoluser=Get-MsolUser -UserPrincipalName $_.DN
$IsLicensed=$msoluser.IsLicensed

I do not bother to look at the current license of the user, since changing licenses involves removing the old license and adding the new one. My script simply removes the old licenses and assigns the correct ones.
Removing the old licenses can be done in a single line:

Set-MsolUserLicense -UserPrincipalName $msolUser.UserPrincipalName -RemoveLicenses $msoluser.Licenses.AccountSkuId

Adding the correct license involves defining the licenses, including the excludes discussed earlier in this post.
Please note that before we can assign any licenses we need to set the UsageLocation on the user.

if(!$msoluser.UsageLocation)
{Set-MsolUser -UserPrincipalName $msolUser.UserPrincipalName -UsageLocation $DefaultUsageLocation;}
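
The SKU strings and license-option variables used in the snippet below are defined earlier in the full script. As a sketch of what they could look like, the Faculty variants might be defined like this; the tenant name "company" is a placeholder, and exactly which plans you disable depends on your tenant (the Student variants are defined the same way):

```
# Tenant name "company" is a placeholder; check yours with Get-MsolAccountSku.
$STANDARDWOFFPACK_FACULTY     = "company:STANDARDWOFFPACK_FACULTY"
$PROJECTONLINE_PLAN_1_FACULTY = "company:PROJECTONLINE_PLAN_1_FACULTY"

# Disable the SharePoint parts of the standard pack so they do not collide
# with the plans contained in PROJECTONLINE_PLAN_1_FACULTY.
$FacultyStnd = New-MsolLicenseOptions -AccountSkuId $STANDARDWOFFPACK_FACULTY `
    -DisabledPlans "SHAREPOINTWAC_EDU","SHAREPOINTSTANDARD_EDU"
```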

We then assign the license depending on the employeeType in FIM.

if($LicenseType -eq "Staff")
{
    Set-MsolUserLicense -UserPrincipalName $msolUser.UserPrincipalName -AddLicenses $STANDARDWOFFPACK_FACULTY -LicenseOptions $FacultyStnd;
    Set-MsolUserLicense -UserPrincipalName $msolUser.UserPrincipalName -AddLicenses $PROJECTONLINE_PLAN_1_FACULTY;
}
elseif($LicenseType -eq "Student")
{
    Set-MsolUserLicense -UserPrincipalName $msolUser.UserPrincipalName -AddLicenses $STANDARDWOFFPACK_STUDENT -LicenseOptions $StudStnd;
    Set-MsolUserLicense -UserPrincipalName $msolUser.UserPrincipalName -AddLicenses $PROJECTONLINE_PLAN_1_STUDENT;
}
elseif($LicenseType -eq "Alumni")
{
    Set-MsolUserLicense -UserPrincipalName $msolUser.UserPrincipalName -AddLicenses $EXCHANGESTANDARD_ALUMNI;
}
else
{
    throw "Unknown LicenseType"
}

Synchronization Rule

Finally we need to configure the outbound synchronization rule. We need two flows, and I need a condition on LicenseType to only set it on licensed users.

O365 MA Outbound Synchronization Rule


Conclusion

In this post I have shown you one example of how to manage Office 365 licenses. As you can imagine, this task can grow substantially if we were also to manage individually selectable licenses in some way. We have discussed this at this particular client: using the FIM Portal for self-service to add additional licenses. That would of course require some major changes to the current license management.

Managing Primary Group using FIM 2010

When working with educational customers, a typical AD group is Students. This group, however, might contain hundreds of thousands of users, making it hard to manage. One solution is to make the Students group the primary group for the students. In this post I will show you one way of managing this using FIM 2010 R2.

The primary group in AD is stored in the attribute primaryGroupID as an integer. The value is the RID (Relative Identifier) of the group in AD. When you create a new user in AD this value becomes 513, which corresponds to the Domain Users group. We cannot set this to any other value during provisioning, so we need to change it after provisioning is completed. If you look at the group in AD you will find that the value we are looking for is the last part of the objectSID.
DomainUsersSID
A thing to remember about primary groups: once a group is configured as a user’s primary group, the user is no longer listed in that group’s member attribute. If you later change the primaryGroupID, the user is added back as a member of the previous primary group.

Now let’s see how we handle this in FIM.
We already have the Students group using some membership criteria.
StudentsBeforePrimary
The group needs to be created and provisioned to AD before we can record the primaryGroupID for this particular group. In the picture below you can see my group got the ID 1204.
StudentsSID
You can also get the value using PowerShell.

$group = Get-ADGroup "Students"
$groupSid = $group.SID
[int]$GroupID = $groupSid.Value.Substring($groupSid.Value.LastIndexOf("-")+1)

We need to add primaryGroupID to the attributes in the AD MA and import the current value into a new MV attribute called, for example, primaryGroupID (Number).
If you are using dynamic groups, like I always try to do, you also need to add the attribute to the FIM Service schema, allow Sync to manage it, and allow it in filters.

Once the primaryGroupID is set in Active Directory the user will no longer show up in the member attribute, so FIM will try to add them again, failing with an “object already exists” error.
ErrorWhenAddingPrimaryGroupMember
We therefore need to modify the criteria so that the group no longer contains the users that have it as their primary group.
StudentsAfterPrimary
The outbound synchronization rule could then look something like this:
IIF(IsPresent(primaryGroupID),1204,Null())
where “1204” is the primaryGroupID of the new group.

Once the account is created and the new primaryGroupID has been set, we can run this clean-up to remove the users from the Domain Users group, if that is desired.

Get-ADUser -SearchBase "OU=Students,DC=ad,DC=company,DC=com" -Filter {primaryGroupID -ne 513} | ForEach-Object{Remove-ADGroupMember "Domain Users" -Members $_}

Automate SSPR registration in FIM 2010 R2

Since customers started using OTP (One-Time Password) authentication for SSPR (Self-Service Password Reset), I’ve had several discussions about whether registration should be manual or automatic. In a recent case the decision was that an external system was to be the master for the email address used for OTP-based authentication. I then needed to configure FIM to automate this task. In this post I will show you how I did it in this particular case, and hopefully it will give you some ideas on how to solve your own automatic registration.

Workflow

The registration management is implemented as an action workflow in FIM. I decided to use the PowerShell Workflow Activity you can find on CodePlex. The workflow adds two parameters, AccountName and Email, before calling the PowerShell activity. This was a single-domain environment; if you have multiple domains, just add the domain as a workflow attribute the same way and modify the script below accordingly.
SSPRAutoRegistrationWF
The Workflow data is using the built-in Function Evaluator activity.
GetOTPEmail
Before you can start using the PowerShell activity there are two things you need to configure.

  • The FIM Service account will make the request, so it needs to be defined as a user in FIM, and this user needs to be in the Administrators set in order to perform the registration actions.
  • The FIM Service configuration file (Microsoft.ResourceManagement.Service.exe.config) needs to be updated. The
    <resourceManagementClient resourceManagementServiceBaseAddress="localhost" />

    needs to be changed to contain the full URL of the FIM Service, something like

    <resourceManagementClient resourceManagementServiceBaseAddress="http://fimserver:5725" />

Finally we need the script that we use in our PowerShell activity. This example uses the Email workflow data to decide whether registration or unregistration should happen. You can also download the script here.

Add-PSSnapin FIMAutomation
$AccountName = $fimwf.WorkflowDictionary.AccountName
$Email = $fimwf.WorkflowDictionary.Email
$Domain = "AD"
if($Email)
{
$template = Get-AuthenticationWorkflowRegistrationTemplate -AuthenticationWorkflowName 'Password Reset Email OTP AuthN Workflow'
$template.GateRegistrationTemplates[0].Data[0].Value = $Email
Register-AuthenticationWorkflow -UserName "$Domain\$AccountName" -AuthenticationWorkflowRegistrationTemplate $template
}
else
{
Unregister-AuthenticationWorkflow -UserName "$Domain\$AccountName" -AuthenticationWorkflowName 'Password Reset Email OTP AuthN Workflow'
}

Synchronization Rules

I extended the Synchronization Service schema with a new attribute called OTPEmail.
The external system had an attribute controlling the status of the user. Using the status as a criterion I ended up with this inbound flow:
IIF(Enabled,extEMailAddress,Null()) -> OTPEmail
On the FIM Service MA I added the flow
msidmOneTimePasswordEmailAddress <- OTPEmail
NOTE! This requires that the msidmOneTimePasswordEmailAddress attribute is added to the MPR allowing the Synchronization Account to manage users.

MPR (Management Policy Rule)

The last thing to solve was how to trigger the workflow. I initially considered a Set transition MPR but decided to go for a Request MPR. This MPR will fire off the workflow whenever the OTP email address is changed. This is one way of detecting the inactivation of a user in the external system, which will clear the OTPEmail attribute in the MV according to the sync rule above.
The MPR has the following properties.

  • Name: Update SSPR Registration if OTP Email is changed.
  • Type: Request
  • Requestor: All People
  • Operation: Modify a single-valued attribute
  • Target: Password Reset Users using Email OTP
  • Target Attribute: One-Time Password Email Address

Create FIM 2010 CM service accounts using PowerShell

During a recent customer case I created a small PowerShell script that creates all the service accounts used by FIM 2010 CM and configures the required SPN and delegation for Kerberos to work.

In the script just replace the initial parameters with your own before running it.
The script will prompt for the password to use for each of the accounts as it runs.
You can also download the script here.

#Script needs to run as domain admin on computer with AD DS Admin Tools.
$FIMCMUPNDomain = "ad.company.com"
$IssuingCA = "ca01"
$OU = "OU=ServiceAccounts,DC=ad,DC=company,DC=com"
$FIMCMPortalHostname = "cm.company.com"
$FIMCMPool = "svcFIMCMPool"
$FIMCMAgent = "svcFIMCMAgent"
$FIMCMEnrollAgent = "svcFIMCMEnrollAgent"
$FIMCMKRAgent = "svcFIMCMKRAgent"
$FIMCMAuthZAgent = "svcFIMCMAuthZAgent"
$FIMCMCAMngr = "svcFIMCMCAMngr"
$FIMCMService = "svcFIMCMService"

#FIM CM Pool
New-ADUser $FIMCMPool -SamAccountName $FIMCMPool -GivenName FIMCM -Surname Pool -DisplayName "FIM CM Pool" -UserPrincipalName "$FIMCMPool@$FIMCMUPNDomain" -Path $OU -Description "Application pool account for FIM CM Portal, cm.company.com" -AccountPassword (Read-Host -AsSecureString "$FIMCMPool Password") -ChangePasswordAtLogon $false -PasswordNeverExpires $true -Enabled $true

#Kerberos settings
#SPN
SETSPN -S http/$FIMCMPortalHostname $FIMCMPool
#Delegation for rpcss/issuingca
Get-ADUser $FIMCMPool| Set-ADObject -Add @{"msDS-AllowedToDelegateTo"="rpcss/$IssuingCA","rpcss/$IssuingCA.$FIMCMUPNDomain"}

#FIM CM Agent.
New-ADUser $FIMCMAgent -SamAccountName $FIMCMAgent -GivenName FIMCM -Surname Agent -DisplayName "FIM CM Agent" -UserPrincipalName "$FIMCMAgent@$FIMCMUPNDomain" -Path $OU -Description "FIM CM Agent account" -AccountPassword (Read-Host -AsSecureString "$FIMCMAgent Password") -ChangePasswordAtLogon $false -PasswordNeverExpires $true -Enabled $true

#FIM CM Enrollment Agent
New-ADUser $FIMCMEnrollAgent -SamAccountName $FIMCMEnrollAgent -GivenName FIMCM -Surname EnrollAgent -DisplayName "FIM CM Enroll Agent" -UserPrincipalName "$FIMCMEnrollAgent@$FIMCMUPNDomain" -Path $OU -Description "FIM CM Enrollment Agent account" -AccountPassword (Read-Host -AsSecureString "$FIMCMEnrollAgent Password") -ChangePasswordAtLogon $false -PasswordNeverExpires $true -Enabled $true

#FIM CM Key Recovery Agent
New-ADUser $FIMCMKRAgent -SamAccountName $FIMCMKRAgent -GivenName FIMCM -Surname KRAgent -DisplayName "FIM CM KR Agent" -UserPrincipalName "$FIMCMKRAgent@$FIMCMUPNDomain" -Path $OU -Description "FIM CM Key Recovery Agent account" -AccountPassword (Read-Host -AsSecureString "$FIMCMKRAgent Password") -ChangePasswordAtLogon $false -PasswordNeverExpires $true -Enabled $true

#FIM CM Authorization Agent
New-ADUser $FIMCMAuthZAgent -SamAccountName $FIMCMAuthZAgent -GivenName FIMCM -Surname AuthZAgent -DisplayName "FIM CM AuthZ Agent" -UserPrincipalName "$FIMCMAuthZAgent@$FIMCMUPNDomain" -Path $OU -Description "FIM CM Authorization Agent account" -AccountPassword (Read-Host -AsSecureString "$FIMCMAuthZAgent Password") -ChangePasswordAtLogon $false -PasswordNeverExpires $true -Enabled $true

#FIM CM CA Manager account
New-ADUser $FIMCMCAMngr -SamAccountName $FIMCMCAMngr -GivenName FIMCM -Surname CAMngr -DisplayName "FIM CM CA Mngr" -UserPrincipalName "$FIMCMCAMngr@$FIMCMUPNDomain" -Path $OU -Description "FIM CM CA Manager account" -AccountPassword (Read-Host -AsSecureString "$FIMCMCAMngr Password") -ChangePasswordAtLogon $false -PasswordNeverExpires $true -Enabled $true

#FIM CM Service account
New-ADUser $FIMCMService -SamAccountName $FIMCMService -GivenName FIMCM -Surname Service -DisplayName "FIM CM Service" -UserPrincipalName "$FIMCMService@$FIMCMUPNDomain" -Path $OU -Description "FIM CM Service account" -AccountPassword (Read-Host -AsSecureString "$FIMCMService Password") -ChangePasswordAtLogon $false -PasswordNeverExpires $true -Enabled $true
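
When the script has finished you can verify the Kerberos settings it made. A quick read-back, using the same variables as above:

```
# Read back the SPN registered for the pool account.
SETSPN -L $FIMCMPool

# Read back the delegation targets written to the pool account.
Get-ADUser $FIMCMPool -Properties "msDS-AllowedToDelegateTo" |
    Select-Object -ExpandProperty "msDS-AllowedToDelegateTo"
```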

Is it possible in FIM 2010 R2 to…?

I got a few questions today about FIM 2010 R2 and thought I should share the answers with you all.

The questions were:

I just want to know if the following are technically possible, and how to proceed.

  • Is it possible to update or delete Managed Accounts totally, using an interface?
  • Is it possible to make integration with Active Directory SAP 6.0? Any tool or utility?
  • Is it possible to manage the ACLs at File Server? For example can file owner manage ACL and membership on FIM Portal?

The short and quick answer to all questions is… Yes! Basically you can do anything with FIM 2010 R2 🙂
The longer answer I will give for each question.

Is it possible to update or delete Managed Accounts totally, using an interface?

Any account in FIM 2010 R2 (managed or not) can be updated and deleted using the FIM 2010 R2 Portal. By “managed accounts” you perhaps mean linked accounts, like an administrative account; I have solutions at customers where, for example, the linked account is automatically updated based on events on the main account.
When it comes to deleting, I strongly recommend reading the article on deprovisioning written by Carol Wapshere. My personal recommendation, if you plan on implementing deletes in for example AD, is to use a rules extension so you can filter the deletions to only happen on certain objects.
DeprovisionWithRulesExtension

Is it possible to make integration with Active Directory SAP 6.0? Any tool or utility?

I am not that familiar with the different versions of SAP, but if we look at the Management Agents available for FIM 2010 R2, I would think that the Connector for Web Services would do the trick. If it doesn’t, I am pretty sure a generic adapter like the PowerShell MA can be used to solve the integration with SAP 6.0.

Is it possible to manage the ACLs at File Server? For example can file owner manage ACL and membership on FIM Portal?

To manage folder objects (I do hope you’re not working with file permissions 😉) I would use a PowerShell connector. Microsoft has its own on Connect (Release Candidate at the moment), or you can use the great PowerShell MA developed by Søren Granfeldt. Using PowerShell it’s quite easy to work with the security descriptors on your folders; if you look at my example regarding the HomeFolder MA you might get an idea of how to do it. In FIM we need to extend the schema to hold the folder objects. To begin with, I would have three multi-valued reference attributes (Read, Modify, FullControl) to assign the permissions. You would also need an Owner (or Manager) attribute in order to use the Portal for self-service, and an MPR like “Folder owners can manage permissions on folders they own”. In reality I would also make sure to get some new forms by adding some RCDCs.
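
As an illustration of how simple the security-descriptor part is in PowerShell, here is a minimal sketch that grants Modify on a folder. The path and group name are made-up examples; in a real MA export they would come from the FIM attributes described above:

```
# Hypothetical example values; in the MA these would come from FIM.
$folder = "D:\Shares\Projects\ProjectX"
$group  = "AD\ProjectX-Modify"

# Add a Modify ACE that inherits to subfolders and files.
$acl  = Get-Acl -Path $folder
$rule = New-Object System.Security.AccessControl.FileSystemAccessRule($group,"Modify","ContainerInherit,ObjectInherit","None","Allow")
$acl.AddAccessRule($rule)
Set-Acl -Path $folder -AclObject $acl
```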

As you can see… We can do a lot of things in FIM! If you have follow-up questions please comment.

FIM when COTS and CBC matters!

I have been working with many FIM projects over the last 4 years. One common question in all projects is… Why FIM? A common answer is COTS and CBC.

All customers I have worked with have had some kind of IdM solution in place. Commonly it’s a homegrown solution, and the customers have finally started to realize the true cost of maintaining it. They realize they want a COTS, Commercial Off The Shelf, product.

Another common request is to introduce self-service. Not only things like SSPR, Self-Service Password Reset, but also delegating tasks like creating users and managing them. Using the built-in FIM Portal for this has actually proven quite useful. Many complain about the lack of functionality in the RCDCs (FIM Portal forms), but customers realizing the cost of homegrown solutions accept these shortcomings in order to go for CBC, Configure Before Code (or Configure Before Customize).

So if you ever wondered… Why FIM? I think a simple answer is because you want COTS and CBC!

Using PowerShell MA to replace ECMA 1.0 used for ODBC

One of my customers has a number of old ECMA 1.0 Management Agents that use ODBC (the NotesSQL driver in this case) to talk to IBM Notes. Since ECMA 1.0 is being deprecated, it was time to look at alternatives. One option was to upgrade the old MAs to ECMA 2.0, but I gave the PowerShell MA from Søren Granfeldt a try first. What I discovered was that doing so was much easier and more cost-effective than writing new ECMA 2.0 based MAs.

What I also discovered was that I ended up with something that could be called a “generic” ODBC MA. With only minor changes to the scripts I was able to use it for all the IBM Notes adapters, including the one managing the Notes Address Book.

If you would like to give it a try, the setting I used in the PS MA was to export the “old” CSEntryChange object. Since NotesSQL is only available as a 32-bit driver, I also had to configure the MA to use the x86 architecture. You can download my sample scripts here. One nice thing I am doing in this example is reading the schema.ps1 file to get all the “columns” to manage in the script. With that, all you need in order to add support for reading and updating a new column in the ODBC source is to add it to the schema.ps1 file.
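
To give an idea of the “generic” part, this is roughly how the import side can re-use schema.ps1 to learn which columns to fetch and build the ODBC query from them. Treat it as a sketch: the DSN, table name and anchor column (ShortName) are made-up examples, and the real scripts are in the download above.

```
# Made-up DSN and table; the schema.ps1 path matches the MA configuration.
$MAFolder = "D:\PS-Scripts\NotesMA"

# Run the schema script and derive the column names from its properties,
# stripping the "|Type" suffix and skipping the anchor and objectClass.
$schemaObj = & (Join-Path $MAFolder "schema.ps1")
$columns = $schemaObj.PSObject.Properties.Name |
    ForEach-Object { ($_ -split '\|')[0] } |
    Where-Object { $_ -ne "objectClass" -and $_ -notlike "Anchor-*" }

# Query the source over ODBC (NotesSQL is 32-bit, so the MA runs as x86).
$conn = New-Object System.Data.Odbc.OdbcConnection("DSN=NotesAddressBook")
$conn.Open()
$cmd = $conn.CreateCommand()
$cmd.CommandText = "SELECT ShortName, $($columns -join ',') FROM Person"
$reader = $cmd.ExecuteReader()
while ($reader.Read()) {
    $obj = @{ "objectClass" = "person"; "ShortName" = $reader["ShortName"] }
    foreach ($col in $columns) {
        if ($reader[$col] -isnot [DBNull]) { $obj.Add($col, $reader[$col]) }
    }
    $obj   # hand the hashtable back to the MA
}
$conn.Close()
```

Adding a new column to manage then really is just a matter of adding it to schema.ps1.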

HomeFolder script for PowerShell MA

A short while ago Søren Granfeldt released a new version of his fantastic PowerShell MA. One of the nice things is that it now supports sending error messages back to the MA. I implemented it this week for home folder management at a customer, and this resulted in a new example script I wanted to share with you all.

The new example script can be downloaded here: PSMA.4.5.HomeFolder.Example. This sample script is based on using the “old” CSEntryChange object, rather than configuring the MA to use the new feature allowing you to export simple objects, which uses PSCustomObject instead.

The Global Parameters for the MA this script is used in are set as in the picture below.

Global Parameters for PowerShell MA 4.5


Using SharePoint Foundation 2013 with FIM

With the new SP1 released for FIM 2010 R2 it is now supported to use SharePoint Foundation 2013 on the FIM Portal server. Installing and configuring SPF 2013 to work with FIM is however not that straightforward. In this post I will tell you how to do it and also give you some handy scripts and files you might need.

The scenario is that I am installing FIM 2010 R2 SP1 on Windows Server 2012, which means I will use SharePoint Foundation 2013 for my FIM Portal. I also use the same SQL Server as for my FIM Service, but with a separate SQL alias for SPF so it can easily be moved later if required.

All files discussed in this post can be downloaded from my SkyDrive (approx. 52 MB). Make sure you read the ReadMeFirst.txt file before you use anything. The configuration script is attached directly in this post if that is all you need.

Prerequisites for SharePoint Foundation 2013
If you just try to install SPF 2013 on your Windows Server 2012 you will likely fail, since the prerequisites installation will probably not succeed. One reason is that .NET Framework 3.5 cannot be added as a feature by the installer, since the source files for it are not available by default in Server 2012.
To install the required roles and features you can use the following PowerShell command.

Import-Module ServerManager
Add-WindowsFeature Net-Framework-Features,Web-Server,Web-WebServer,Web-Common-Http,Web-Static-Content,Web-Default-Doc,Web-Dir-Browsing,Web-Http-Errors,Web-App-Dev,Web-Asp-Net,Web-Net-Ext,Web-ISAPI-Ext,Web-ISAPI-Filter,Web-Health,Web-Http-Logging,Web-Log-Libraries,Web-Request-Monitor,Web-Http-Tracing,Web-Security,Web-Basic-Auth,Web-Windows-Auth,Web-Filtering,Web-Digest-Auth,Web-Performance,Web-Stat-Compression,Web-Dyn-Compression,Web-Mgmt-Tools,Web-Mgmt-Console,Web-Mgmt-Compat,Web-Metabase,Application-Server,AS-Web-Support,AS-TCP-Port-Sharing,AS-WAS-Support,AS-HTTP-Activation,AS-TCP-Activation,AS-Named-Pipes,AS-Net-Framework,WAS,WAS-Process-Model,WAS-NET-Environment,WAS-Config-APIs,Web-Lgcy-Scripting,Windows-Identity-Foundation,Server-Media-Foundation,Xps-Viewer -Source R:\sources\sxs

As you can see, that is quite a one-liner, and the key to success is the -Source parameter at the end, which needs to point to a valid Server 2012 sources folder, like the DVD. The source path can also be managed by setting a group policy; if you would like to do that, a quick web search will find the details.

At one customer I also ran into the problem that the FIM server did not have Internet access, so I had to download the prerequisites for SPF 2013 and run the prerequisites installer offline. Note that you cannot install the prerequisites one by one yourself; you have to let the SharePoint prerequisites installer do the job, pointing it at the downloaded files. In the download on my SkyDrive you will find the required files as well as a cmd file to run the installer. In order to use this you need to extract the SharePoint.exe file into a folder so you can run prerequisiteinstaller.exe from the command line.

Installing SharePoint Foundation 2013
I did not automate the installation, since there is only one choice to make and one thing to remember.
I select Full installation during setup; this adds all the bits and pieces required by SPF but does not configure anything. NOTE! At the end of the installation, remember to UNCHECK “Run the SharePoint Products Configuration Wizard now.” We will do this manually.

Configuring SharePoint Foundation 2013 for FIM
If you look at the official TechNet guideline for using SharePoint Foundation 2013 with FIM, you will find that it only points out a few things to remember. But since I also want to do the whole configuration of SPF, I have created a PowerShell script that does all the things mentioned on TechNet, plus everything else required to use SPF. In order to use it you will need to walk through it and change the passphrase, account names and website names to fit your needs. The script will take a few minutes to run, so don’t be alarmed if nothing happens for a while. Once the script has finished, your environment is ready for you to install the FIM Portal.

Replacing OpenLDAP MA with PS MA

By replacing the OpenLDAP XMA with Søren Granfeldt’s PowerShell MA I gained a 20-30% performance improvement, got delta import support, and at the same time reduced the amount of managed code by hundreds of lines.

One of my customers is using OpenDJ as a central LDAP directory for information about users and roles. In order to import this information into FIM we have been using the OpenLDAP XMA from CodePlex. But since this MA is built on the deprecated ECMA 1.0 framework, and also has some issues that forced me to rewrite it into a customer-specific solution, I decided to move away from it and start using a PowerShell MA instead.

I will try to point out the most critical parts of the PowerShell I used, if you would like to try this yourself.
You can also download a sample script to look at if you like.

This PS MA sends three parameters as input to the PowerShell script
param ($Username,
$Password,
$OperationType)

but also holds a RunStepCustomData parameter that can be useful when doing delta imports. If you look at the example downloads listed on the PS MA homepage you will see some examples using this RunStepCustomData parameter. In this particular case I chose to store the delta cookie in a file instead, allowing the FIM administrator some flexibility to manually change the cookie value if needed.

Let’s move on to the script then.

I use System.DirectoryServices.Protocols to do the LDAP queries, so I need to start by adding a reference to that assembly.
Add-Type -AssemblyName System.DirectoryServices.Protocols

I need to define the credentials I will use and the LDAP Server to connect to.
$Credentials = New-Object System.Net.NetworkCredential($username,$password)
$LDAPServer = "ldap.company.com"

With that we can now create the LDAP connection to the OpenDJ server.
$LDAPConnection = New-Object System.DirectoryServices.Protocols.LDAPConnection($LDAPServer,$Credentials,"Basic")

Now it’s all about defining the search filters and searching the LDAP directory for the objects we are interested in. It could look something like this:
$BaseOU = "ou=People,dc=company,dc=com"
$Filter = "(&(objectClass=EduPerson)(!EduUsedIdentity=*))"
$TimeOut = New-Object System.TimeSpan(1,0,0)
$Request = New-Object System.DirectoryServices.Protocols.SearchRequest($BaseOU, $Filter, "Subtree", $null)
$Response = $LDAPConnection.SendRequest($Request,$TimeOut)

Within the $Response object we now have the result of our search and can iterate through it, setting the values on the objects returned to the Management Agent.
ForEach($entry in $Response.Entries) {
    $obj = @{}
    $obj.Add("OpenDJDN", $entry.DistinguishedName)
    $obj.Add("objectClass", "eduPerson")
    If($entry.Attributes["cn"]){$obj.Add("cn",$entry.Attributes["cn"][0])}
    If($entry.Attributes["uid"]){$obj.Add("uid",$entry.Attributes["uid"][0])}
    $obj
}

You now have a working import from OpenDJ. But we have not added the delta support just yet. Once you have verified that your full import is working you can start to extend your script to also support delta imports from OpenDJ.

OpenDJ works with a changelog that increases the changenumber for each entry. In order to read the changelog from the correct changenumber we need to store the last known changenumber. In my example I store this in what I call the cookie file.
$CookieFile = "D:\PS-Scripts\OpenDJ\Cookie.bin"
In this file I store the integer value of the last changenumber we have worked with.
To read the value I use the following line.
$changeNumber = [int](Get-Content -Path $CookieFile)
We can then search the changelog for entries with higher changeNumber than we have already seen.
It would look something like this.

$ChangeFilter = "(&(targetdn=*$($BaseOU))(!cn=changelog)(changeNumber>=$($changeNumber)))"
$ChangeTimeOut = New-Object System.TimeSpan(0,10,0)
$ChangeRequest = New-Object System.DirectoryServices.Protocols.SearchRequest("cn=changelog", $ChangeFilter, "Subtree", $null)
$ChangeResponse = $LDAPConnection.SendRequest($ChangeRequest,$ChangeTimeOut)

We now run into two problems that need to be resolved.
First: the changelog might give me the same object twice if multiple changes have been made to it.
Second: the changelog object does not contain all the attributes I need to return to the MA.

The first problem is solved by filtering the response from the search and only collect unique objects.
In my case I used the following line to do that.
$UniqueDN = $ChangeResponse.Entries | ForEach{$_.Attributes["targetdn"][0]} | Get-Unique
We then need to iterate through these unique DNs to get the actual objects and get the attributes we need.
ForEach($DN in $UniqueDN){
    $GetUserReq = New-Object System.DirectoryServices.Protocols.SearchRequest($DN, "(objectClass=EduPerson)", "Base", $null)
    $GetUser = $LDAPConnection.SendRequest($GetUserReq)
    If($GetUser.Entries.Count -eq 0){continue}
    $entry = $GetUser.Entries[0]
    If($entry.Attributes["EduUsedIdentity"]){continue}
    $obj = @{}
    $obj.Add("OpenDJDN", $entry.DistinguishedName)
    $obj.Add("objectClass", "eduPerson")
    If($entry.Attributes["cn"]){$obj.Add("cn",$entry.Attributes["cn"][0])}
    If($entry.Attributes["uid"]){$obj.Add("uid",$entry.Attributes["uid"][0])}
    $obj
}

One thing we also need to remember is to save the last changenumber back to our cookie file for the next run.
$LastChangeNumber = [int]$ChangeResponse.Entries[($ChangeResponse.Entries.Count -1)].Attributes["changeNumber"][0]
Set-Content -Value $LastChangeNumber -Path $CookieFile

You also need to do this during your full import runs, so within the script where you process full imports you need to add a search that gets the current latest entry in the changelog.

I have attached a demo script for you to download to get you started on your own exploration of using PowerShell to work with OpenDJ. If you look at it you will likely find that it can be optimized in some ways, but I kept it this way to make it easy for non-PowerShell geeks to read and understand.

One thing you might have noticed is that I have not added support for detecting deletes. This is just a matter of adding some logic to read the changetype in the changelog, but I have not yet had the time to do it. At this customer, deletes will be detected every night when the full import runs anyhow.
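
For reference, that changetype logic could look something like the untested sketch below, inside the loop over the changelog entries. The exact key the MA expects for signaling a delete is an assumption here; check the PS MA documentation for the convention your version uses.

```
# Untested sketch: inspect the changetype of each changelog entry and
# emit a delete to the MA instead of re-reading the object.
ForEach($change in $ChangeResponse.Entries) {
    If($change.Attributes["changetype"][0] -eq "delete") {
        $obj = @{}
        $obj.Add("OpenDJDN", $change.Attributes["targetdn"][0])
        $obj.Add("objectClass", "eduPerson")
        # Key name assumed; verify against the PS MA documentation.
        $obj.Add("changetype", "Delete")
        $obj
    }
}
```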