Configuration Migration might break after MIM 2016 SP1 upgrade

I just ran into a “bug” related to SP1 in MIM 2016. The problem stems from schema changes made to the FIMService in SP1.

My situation where the problem showed itself might not be the most common one, but as I will explain later in this post, it also has implications in other scenarios.

My situation was at a customer where we started building a new MIM 2016 solution about a year ago. We installed MIM 2016 RTM (4.3.1935.0) in the Dev/Test environment.
The development of the solution took some time and we updated the Dev/Test environment to SP1 (4.4.1302.0) when it came around.

It was now time to go to production with the solution and the production environment was installed using SP1 (4.4.1302.0). By the looks of it, we had the same version.

Now the problem showed itself when we started migrating the configuration from Dev/Test to Production. We used the “standard” method described by Microsoft:

  • export configuration
  • compare configuration
  • import changes
At the import step we got a big fat error when importing the schema changes. Drilling down into the error, I found that the base schema differed depending on how the system had arrived at 4.4.1302.0.
    If you look in the schema on your MIM 2016 SP1 (4.4.1302.0) or later installation and search for attributes named Domain, you should find three different attributes, as in the picture below.

    If you cannot see the last one, called “Domain FQDN” with the system name msidmDomain, you are affected.
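If you prefer to check from PowerShell instead of browsing the Portal schema, a query like the following should tell you. This is a sketch assuming the FIMAutomation snap-in, run on the FIMService server:

```powershell
Add-PSSnapin FIMAutomation -ErrorAction SilentlyContinue
# Query the FIMService schema for the attribute by its system name
$attr = Export-FIMConfig -OnlyBaseResources -CustomConfig "/AttributeTypeDescription[Name='msidmDomain']"
if ($attr) { "msidmDomain (Domain FQDN) is present - you should be fine" }
else       { "msidmDomain is missing - this installation is affected" }
```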

    When the configuration import found this missing attribute, and two bindings related to it, it could NOT import the changes. The FIMService gave the following error in the event log:
    Microsoft.ResourceManagement.Service: Microsoft.ResourceManagement.WebServices.Exceptions.PermissionDeniedException: SystemConstraint ---> System.InvalidOperationException: The system name provided includes a reserved prefix.
    The msidm prefix seems to be reserved and you are not allowed to add attributes starting with that prefix.
    Even though we never use or need this attribute, it will make our future configuration migrations fail unless we remove this error from the process.
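For reference, the export → compare → import steps described above boil down to a short FIMAutomation pipeline. This is a condensed sketch from memory of Microsoft's sample migration scripts (ExportSchema.ps1, SyncSchema.ps1, CommitChanges.ps1); the file names and join settings here are illustrative, so treat it as an outline rather than a drop-in replacement:

```powershell
# Export the schema configuration (run in each environment)
Export-FIMConfig -SchemaConfig -CustomConfig "/SynchronizationFilter" |
    ConvertFrom-FIMResource -File pilot_schema.xml

# Compare the two exports and save the delta
$pilot      = ConvertTo-FIMResource -File pilot_schema.xml
$production = ConvertTo-FIMResource -File production_schema.xml
$matches    = Join-FIMConfig -Source $pilot -Target $production -DefaultJoin DisplayName
$matches | Compare-FIMConfig | ConvertFrom-FIMResource -File changes.xml

# Import the changes into the target environment
$imports = ConvertTo-FIMResource -File changes.xml
$imports | Import-FIMConfig
```

It is the last step that fails when the schemas have diverged the way this post describes.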

    Conclusion

    The MIM 2016 SP1 setup has some issues with the schema. The product team has told me that these issues are fixed in version 4.4.1459.0, so by making sure both source and target systems run the 4.4.1459.0 version of the schema, I should be safe from this issue.

    Implications

    Not many people will have my situation, where production is installed using newer media than dev/test was. But this problem will also show itself for anyone doing a disaster recovery with a configuration restore but without a data restore.

    Typically you reinstall reusing the FIMService database, but there are occasions when you would like to make a fresh install, import the configuration, and then reload all data into the FIMService again. If this is your scenario and your current configuration is based on RTM while the reinstall is made using SP1, you might be affected by this problem.

    Bulk update objects in FIM/MIM based on CSV file

    A customer asked me to make a script that allowed them to bulk update objects in MIM based on a CSV file. The CSV file itself would be created and updated using their favorite tool, Excel.

    Since they were already using the Lithnet PowerShell Module, I decided to use that in my script.
    This time I also decided to start sharing my scripts on gist.github.com to make them easier for others to use.

    The script I share with you does not have any error handling, so if you start using it you might want to add some.
    I also intend to extend the script with the ability to manage Reference and MultiValue attributes. If YOU feel that's a good idea, please comment on the Gist and tell me how you would like these to be represented in the CSV file.
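To give you an idea of the approach (this is not the Gist script itself), a minimal bulk update with the Lithnet module could look something like the sketch below. The CSV columns and attribute names are made up for the example:

```powershell
Import-Module LithnetRMA

# Hypothetical CSV layout: AccountName,JobTitle,Department
Import-Csv -Path .\updates.csv | ForEach-Object {
    # Look up each object by its anchor attribute
    $resource = Get-Resource -ObjectType Person -AttributeName AccountName -AttributeValue $_.AccountName
    if ($resource) {
        $resource.JobTitle   = $_.JobTitle
        $resource.Department = $_.Department
        Save-Resource $resource
    }
}
```

Since Excel owns the CSV, the column headers double as the list of attributes to update, which keeps the script generic.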

    Hope you find this a useful tool in your FIM/MIM toolkit.

    Scheduling MIM with advanced options

    Building a good schedule for FIM 2010 and MIM 2016 Synchronization Service is a crucial part of every design.
    In this post I will show you an example of the PowerShell script I use to schedule the Synchronization Service.
    TaskScheduler
    This script uses a few techniques that I have found useful.

    • Running Parallel Profiles
    • Custom Profile to add Sleep option
    • Custom Profile to add Script option

    Parallel Profiles

    Using the capability in PowerShell to start background jobs, I can run some profiles in parallel.
    Please note that it is NOT supported to run any profiles that include Synchronization in parallel with other profiles.
    Since a typical flow in scheduling is…

    1. Import
    2. Synchronize
    3. Export

    and the Import and Export can be done in parallel, I have three sections in the script. The first and last sections allow for parallel execution, and the middle one is for sequential runs.

    Sleep Option

    I have added Sleep:seconds as a profile option. It is useful when you know it's a good idea for the service to wait a few seconds before moving to the next step, for example when some SQL actions or workflow actions you know are happening will take a few seconds to complete.

    Script Option

    The Script:ScriptToRun option is very useful. It could run scripts that fire a SQL stored procedure or, as in the example, run WaitForWF.ps1 to make sure all FIMService workflows are finished before we continue.

    PowerShell Script

    ############
    # PARAMETERS
    ############
    
    $scriptpath = Split-Path $MyInvocation.MyCommand.Path #Used to call other scripts
    
    $ImportAsJob = 
    @(
    	@{
    		MAName="HR";
    		ProfileToRun="Full Import";
    	};
    
        @{
    		MAName="FIMService";
    		ProfileToRun="Delta Import";
    	};
    
        @{
    		MAName="AD";
    		ProfileToRun="Delta Import";
    	};
    );
    
    
    $SyncProfilesOrder = 
    @(
    	@{
    		MAName="HR";
    		profilesToRun=@("Delta Sync");
    	};
    
        @{
    		MAName="FIMService";
    		profilesToRun=@("Export";"Sleep:15";"Script:WaitForWF.ps1";"Delta Import";"Delta Sync");
    	};
    );
    
    $ExportAsJob = 
    @(
    	@{
    		MAName="AD";
    		ProfileToRun="Export";
    	};
    
        @{
    		MAName="HR";
    		ProfileToRun="Export";
    	};
    );
    
    
    ############
    # DATA
    ############
    $MAs = @(get-wmiobject -class "MIIS_ManagementAgent" -namespace "root\MicrosoftIdentityIntegrationServer" -computername ".")
    
    ############
    # FUNCTIONs
    ############
    function RunFIMAsJob
        {
        param([string]$MAName, [string]$Profile)
        Start-Job -Name $MAName -ArgumentList $MAName,$Profile -ScriptBlock {
            param($MAName,$Profile)
            $MA = (get-wmiobject -class "MIIS_ManagementAgent" -namespace "root\MicrosoftIdentityIntegrationServer" -computername "." -Filter "Name='$MAName'")
            $return = $MA.Execute($Profile)
            (Get-Date -Format 'yyyy-MM-dd HH:mm:ss') + ": " + $MAName + " : " + $Profile + " : " + $return.ReturnValue
            }
        }
    
    ############
    # PROGRAM
    ############
    (Get-Date -Format 'yyyy-MM-dd HH:mm:ss') + ": Starting Schedule"
    
    #ImportAsJob
    (Get-Date -Format 'yyyy-MM-dd HH:mm:ss') +": Starting ImportJobs"
    foreach($MAToRun in $ImportAsJob)
        {
            (Get-Date -Format 'yyyy-MM-dd HH:mm:ss') + ": Starting : " + $MAToRun.MAName + " : " + $MAToRun.ProfileToRun
            $void = RunFIMAsJob $MAToRun.MAName $MAToRun.ProfileToRun
        }
    Get-Job | Wait-Job | Receive-Job -Keep
    (Get-Date -Format 'yyyy-MM-dd HH:mm:ss') +": Finished ImportJobs"
    
    #Removing Jobs to release resources
    Get-Job | Remove-Job
    
    #Sync (not as job)
    foreach($MAToRun in $SyncProfilesOrder)
        {
        foreach($profileName in $MAToRun.profilesToRun)
            {
            if($profileName.StartsWith("Sleep"))
                {Start-Sleep -Seconds $profileName.Split(":")[1]}
            elseif($profileName.StartsWith("Script"))
                {& ($scriptpath +"\"+ ($profileName.Split(":")[1]))}
            else
                {
                $return = ($MAs | ?{$_.Name -eq $MAToRun.MAName}).Execute($profileName)
                (Get-Date -Format 'yyyy-MM-dd HH:mm:ss') + ": " + $MAToRun.MAName + " : " + $profileName + " : " + $return.ReturnValue
                }
            }
    	}
    
    #ExportAsJob
    (Get-Date -Format 'yyyy-MM-dd HH:mm:ss') +": Starting ExportJobs"
    foreach($MAToRun in $ExportAsJob)
        {
            (Get-Date -Format 'yyyy-MM-dd HH:mm:ss') + ": Starting : " + $MAToRun.MAName + " : " + $MAToRun.ProfileToRun
            $void = RunFIMAsJob $MAToRun.MAName $MAToRun.ProfileToRun
        }
    Get-Job | Wait-Job | Receive-Job -Keep
    (Get-Date -Format 'yyyy-MM-dd HH:mm:ss') +": Finished ExportJobs"
    
    #Removing Jobs to release resources
    Get-Job | Remove-Job
    
    (Get-Date -Format 'yyyy-MM-dd HH:mm:ss') + ": Finished Schedule"
    

    Summary

    Designing a good schedule in MIM and FIM will decrease the time it takes for changes to be applied in the different systems, so please spend some time thinking about how to optimize it.

    Wait for WorkFlow in MIM

    When running sync (FIM 2010 / MIM 2016) or scripting against the FIMService, we sometimes have to wait for some workflow to finish before we continue.
    Using a small script that can be called to do the waiting is a solution I use at several customers.

    The example script below uses my FIMService PowerShell Module.
    It takes two parameters.
    The $Creator is the Creator whose request we are waiting for. The default creator is the Built-in Synchronization Account.
    The $RetryInterval is the sleep time between checks of the requests still in PostProcessing.

    At some customers I have also added some more logic to query for Failed requests and act upon them as well.

    <#
        Script that waits for WF triggered by requests.
        Used in Schedules to get dynamic sleep time before importing results after export to FIM Service.
        Default Creator is Built-in Synchronization Account.
    #>
    PARAM(
        [string]$Creator = 'fb89aefa-5ea1-47f1-8890-abe7797d6497',
        [int]$RetryInterval = 30
        )
    
    if(!(Get-Module -Name FIMServiceModule))
    {
    Import-Module 'D:\FIMData\FIMServiceModule.psm1' -ArgumentList 'http://idmservice.ad.konab.net:5725/resourcemanagementservice'
    }
    
    $NotDone = $true
    
    while($NotDone)
    {$Running=GetByxPath -xPath "/Request[Creator='$Creator' and RequestStatus='PostProcessing']"
        if($Running)
        {
        (Get-Date -Format 'yyyy-MM-dd HH:mm:ss') + ": WF is not done.. Waiting " + $RetryInterval + " seconds"
        Start-Sleep -Seconds $RetryInterval
        }
        else
        {
        (Get-Date -Format 'yyyy-MM-dd HH:mm:ss') + ": WF is done. Continuing"
        $NotDone=$false
        }
    }
    

    Environment Variables in RCDC

    When using the FIM or MIM Portal you will often struggle with the RCDC configuration. Making it dynamic is often one of the things we try to achieve. A nice little feature we can use in this case is the set of Environment Variables available in RCDC.

    The following variables are available to us

    %LoginID%
    Displays the ObjectID of the user who is currently logged in.
    This can for example be used to list “My Direct Reports” using an xPath similar to /Person[Manager='%LoginID%']

    %LoginDomain%
    Displays the domain of the user who is currently logged in.

    %Today%
    Displays the current date and time

    %FromToday_nnn%
    Displays the current date plus nnn days, and the time. nnn is an integer.

    %ObjectID%
    The RCDC primary resource ObjectID. This can be used to list Groups using an xPath similar to /Group[DisplayedOwner='%ObjectID%']

    %Attribute_xxx%
    Returns a specified attribute, xxx, of the RCDC primary resource. This can be used to search or list based on some attribute of the object we are managing. It can also be used in text.
    This little control example shows how to create a MailLink that takes the Email attribute of the object to build a UocHyperLink with mailto: function.

    <my:Control my:Name="MailLink" my:TypeName="UocHyperLink" my:Caption="{Binding Source=schema, Path=Email.DisplayName}" my:Description="" my:RightsLevel="{Binding Source=rights, Path=Email}">
     <my:Properties>
  <my:Property my:Name="Text" my:Value="{Binding Source=object, Path=Email, Mode=TwoWay}"/>
      <my:Property my:Name="NavigateUrl" my:Value="mailto:%Attribute_Email%"/>
     </my:Properties>
    </my:Control>
    

    In the UI this could look something like in the picture below.
    MailLink.RCDC.Example
    Using Environment Variables when using the RCDC really increases the usability of the Portal so please start using them!

    Criteria based Deprovisioning without ERE

    In a current project I needed criteria-based Deprovisioning. If one set of criteria was true, the user should be deleted from one system; if another set of criteria was true, deleted from another system. Using declarative Provisioning, the only way to do this without code in FIM 2010 R2 and MIM 2016 is to use EREs. In this case, however, I would introduce over 2 million ERE objects in order to solve the puzzle. I concluded that was not going to be the best solution. All Provisioning at this customer is made using Declarative rules and Outbound Scoping Filters.

    Another codeless option is available from Søren Granfeldt and his Codeless Provisioning Framework available on CodePlex. It was just that in this special case third-party add-ons would be “hard” to get approved.

    So I decided that I would, for the first time in years, write an MVExtension. Just a warning! Before you start building any code or configuring any Deprovisioning, you should read the old but still very valid article written by Carol Wapshere on Account Deprovisioning Scenarios. I decided that the solution described by Carol as “Deprovision based on attribute change” was what I needed.

    Let me show you how I solved my problem using about 10 lines of code.

    Criteria Logic

    I like my FIM Service and Synchronization Service, and together they can solve almost all logical problems I face. In this case I extended the schema with some boolean attributes, one for each MA. Using a combination of Synchronization and FIM Service logic, I could then mimic the criteria the customer had defined. Each boolean flag was set to false whenever I wanted the Deprovisioning to happen.

    Configuration File

    I created a configuration file I could use to make the code in the MVExtension as generic as possible.
    Below is an example.

    <configuration>
      <MA Name="Exchange" Deprovision="true" Flag="flagExchange">
        <person Deprovision="true" />
      </MA>
      <MA Name="AD" Deprovision="true" Flag="flagAD">
        <person Deprovision="true" />
    	<group Deprovision="false" />
      </MA>
    </configuration>
    

    Each MA can be turned on/off, and Deprovisioning can also be turned on/off per MetaVerse ObjectType. In the configuration I also define the boolean attribute in the MetaVerse that decides whether Deprovisioning should happen or not.

    MVExtension.dll

    Time to start Visual Studio and create the MVExtension.dll. Forgive me if the code is not optimal; I just don't write code as often as I did 10 years ago.

    I first define the XmlDocument Config as a global variable.

    public class MVExtensionObject : IMVSynchronization
    {
      //Config is a global variable
      System.Xml.XmlDocument Config = new System.Xml.XmlDocument();
    

    In the Initialize method I load the configuration file.

    void IMVSynchronization.Initialize ()
    {
      // Read .config file
      Config.Load(Utils.ExtensionsDirectory + @"\" + "MVExtension.config");
    }
    

    And finally in the Provision method I add the lines to perform the Deprovisioning based on the information in configuration file and the boolean value on the defined attribute.

    void IMVSynchronization.Provision (MVEntry mventry)
    {
     foreach (System.Xml.XmlNode MA in Config.SelectSingleNode("/configuration").SelectNodes("MA"))
     {
      //Check if ObjectType is in scope or not
      if (MA.SelectSingleNode(mventry.ObjectType) != null)
      {
       string MAName = MA.Attributes.GetNamedItem("Name").Value;
       bool MADeprovision = Convert.ToBoolean(MA.Attributes.GetNamedItem("Deprovision").Value);
       bool TypeDeprovision = Convert.ToBoolean(MA.SelectSingleNode(mventry.ObjectType).Attributes.GetNamedItem("Deprovision").Value);
       string Flag = MA.Attributes.GetNamedItem("Flag").Value;
    
       //We have the information. Let's go!
   if (MADeprovision && TypeDeprovision && mventry[Flag].IsPresent)
   {
    if (mventry.ConnectedMAs[MAName].Connectors.Count > 0 && mventry[Flag].BooleanValue == false)
    {mventry.ConnectedMAs[MAName].Connectors.ByIndex[0].Deprovision();}
   }
       }
      }
     }
    }
    

    Conclusion

    Since Microsoft does not offer a viable way of performing criteria-based deprovisioning, writing your own MVExtension is sometimes the only logical solution. By using the solution presented here, I can now deprovision based on any type of criteria on any type of resource. All I need is to get my Synchronization, FIM Service and maybe Workflows to do their jobs, keeping the boolean flags correctly set. In this particular project I actually ended up re-using these flags in the Outbound Scoping filters, making them a lot less complex than before. So now true = Provision and false = Deprovision.

    FIMService PowerShell Module

    A nice-to-have PowerShell module for doing different kinds of PowerShell scripting against the FIMService in FIM 2010 R2 or MIM 2016, using just the built-in FIMAutomation PSSnapin.

    On CodePlex and other places there are some great PowerShell libraries to use when scripting against FIM/MIM. But I often run into customers where third-party DLLs and add-ons are hard to get approved. So I started to make myself a small PS module so that I didn't have to rewrite the most common things over and over. In the latest version I have also added pipeline support to some of the functions, making it possible, for example, to run the following command.

    GetByxPath -xPath "/Person[EmployeeType = 'Contractor']" | ImportObject | DeleteObject

    This would delete all Contractors in the FIM Service.
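Creating an object works the same way by chaining the functions in the module below; for example (the attribute values are of course made up):

```powershell
Import-Module .\FIMServiceModule.psm1 -ArgumentList 'http://localhost:5725/ResourceManagementService'

# Build a new Person, set a couple of single-value attributes and commit it
$new = CreateObject -ObjectType Person
SetAttribute -ImportObject $new -AttributeName AccountName -AttributeValue 'ExampleUser'
SetAttribute -ImportObject $new -AttributeName DisplayName -AttributeValue 'Example User'
$new | SaveObject
```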

    Feel free to use it as is or cut functions into your own scripts when you need one.

    <#
    .SYNOPSIS
    	PowerShell module containing functions to work with the FIMService in FIM 2010 R2 and/or MIM 2016.
    	Accepts FIMService URL as Argument.
    	Errorhandling within module is very limited. Calling script needs to handle errors.
    .EXAMPLE
    	Import-Module FIMServiceModule.psm1 -Argumentlist 'http://idmservice.konab.net:5725/ResourceManagementService'
    .EXAMPLE
    	Import-Module FIMServiceModule.psm1
    	Will use default URI http://localhost:5725/ResourceManagementService
    #>
    
    PARAM([string]$URI = "http://localhost:5725/ResourceManagementService")
    if(@(Get-PSSnapin | Where-Object {$_.Name -eq "FIMAutomation"} ).count -eq 0) {Add-PSSnapin FIMAutomation -ErrorAction SilentlyContinue}
    
    function CreateObject
    {
        <#
        .SYNOPSIS
    		Creates a new object of type Microsoft.ResourceManagement.Automation.ObjectModel.ImportObject.
    		This object needs to be saved using the SaveObject in order to be commited to the FIMService.
    		Supports pipeline input.
    	.PARAMETER objectType
    		The system name of the FIMService resource type.
    	.EXAMPLE
    		CreateObject -objectType Person
    	.EXAMPLE
    		Objects.Type | CreateObject
        #>
    
        PARAM(
             [Parameter(ValueFromPipeline=$true)]
             [string[]]$ObjectType
             )
        BEGIN{}
        PROCESS
        {
           foreach($Type in $ObjectType)
           {
           $NewObject = New-Object Microsoft.ResourceManagement.Automation.ObjectModel.ImportObject
           $NewObject.ObjectType = $Type
           $NewObject.SourceObjectIdentifier = [System.Guid]::NewGuid().ToString()
       [Microsoft.ResourceManagement.Automation.ObjectModel.ImportObject]$NewObject   # emit (not return) so every pipeline item produces an object
           }
        }
        END{} 
    }
    
    function ImportObject
    {
    	<#
    	.SYNOPSIS
    		Converts an $ExportObject of type [Microsoft.ResourceManagement.Automation.ObjectModel.ExportObject] 
    		to an $ImportObject of type [Microsoft.ResourceManagement.Automation.ObjectModel.ImportObject]
    		Supports pipeline input.
    	#>
        PARAM
            (
            [Parameter(ValueFromPipeline=$true)]
            [Microsoft.ResourceManagement.Automation.ObjectModel.ExportObject[]]$ExportObject
            )
        BEGIN{}
        PROCESS
        {
           foreach($RMObject in $ExportObject.ResourceManagementObject)
           {
           $ImportObject = New-Object Microsoft.ResourceManagement.Automation.ObjectModel.ImportObject
           $ImportObject.ObjectType = $RMObject.ObjectType
           $ImportObject.TargetObjectIdentifier = $RMObject.ObjectIdentifier
           $ImportObject.SourceObjectIdentifier = $RMObject.ObjectIdentifier
           $ImportObject.State = 1
       [Microsoft.ResourceManagement.Automation.ObjectModel.ImportObject]$ImportObject   # emit (not return) so every object gets converted
           }
        }
        END{}
    }
    
    function SetAttribute
    {#Only for SingleValue attributes
        PARAM([Microsoft.ResourceManagement.Automation.ObjectModel.ImportObject]$ImportObject, [string]$AttributeName, $AttributeValue)
        END
        {
            $ImportChange = New-Object Microsoft.ResourceManagement.Automation.ObjectModel.ImportChange
            $ImportChange.Operation = 1
            $ImportChange.AttributeName = $AttributeName
            $ImportChange.AttributeValue = $AttributeValue
            $ImportChange.FullyResolved = 1
            $ImportChange.Locale = "Invariant"
            if ($ImportObject.Changes -eq $null) {$ImportObject.Changes = (,$ImportChange)}
            else {$ImportObject.Changes += $ImportChange}
        }
    }
    
    function AddMultiValue
    {
        PARAM([Microsoft.ResourceManagement.Automation.ObjectModel.ImportObject]$ImportObject, [string]$AttributeName, $AttributeValue)
        END
        {
            $ImportChange = New-Object Microsoft.ResourceManagement.Automation.ObjectModel.ImportChange
            $ImportChange.Operation = 0
            $ImportChange.AttributeName = $AttributeName
            $ImportChange.AttributeValue = $AttributeValue
            $ImportChange.FullyResolved = 1
            $ImportChange.Locale = "Invariant"
            if ($ImportObject.Changes -eq $null) {$ImportObject.Changes = (,$ImportChange)}
            else {$ImportObject.Changes += $ImportChange}
        }
    }
    
    function RemoveMultiValue
    {
        PARAM([Microsoft.ResourceManagement.Automation.ObjectModel.ImportObject]$ImportObject, [string]$AttributeName, $AttributeValue)
        END
        {
            $ImportChange = New-Object Microsoft.ResourceManagement.Automation.ObjectModel.ImportChange
            $ImportChange.Operation = 2
            $ImportChange.AttributeName = $AttributeName
            $ImportChange.AttributeValue = $AttributeValue
            $ImportChange.FullyResolved = 1
            $ImportChange.Locale = "Invariant"
            if ($ImportObject.Changes -eq $null) {$ImportObject.Changes = (,$ImportChange)}
            else {$ImportObject.Changes += $ImportChange}
        }
    }
    
    function SetFilter
    {
    	<#
    	.SYNOPSIS
		Special function to set the filter attribute in a Set or Group. 
    		Expects the inner xPath query as input and adds the XML around it before setting the value.
    		Does not support pipeline input.
    	#>
        PARAM([Microsoft.ResourceManagement.Automation.ObjectModel.ImportObject]$ImportObject,[string]$xPath)
    	BEGIN{}
    	PROCESS
        {
            $FilterXMLBegin ='<Filter xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns:xsd="http://www.w3.org/2001/XMLSchema" Dialect="http://schemas.microsoft.com/2006/11/XPathFilterDialect" xmlns="http://schemas.xmlsoap.org/ws/2004/09/enumeration">'
            $FilterXMLEnd = '</Filter>'
            $Filter = $FilterXMLBegin + $xPath + $FilterXMLEnd
        SetAttribute -ImportObject $ImportObject -AttributeName Filter -AttributeValue $Filter
        }
    	END{}
    }
    
    function GetReference
    {
    	<#
    	.SYNOPSIS
    		Returns the ObjectId (Guid) as string of the returned object. Expects a single object to be found in the query.
            .PARAMETER objectType
                The system name of the FIMService resource type.
            .PARAMETER attributeName
                The system name of the FIMService attribute.
            .PARAMETER attributeValue
                The value in the format expected by the attribute.
    	#>
        PARAM([string]$ObjectType,[string]$AttributeName, $AttributeValue)
        END
        {
            $xPath = "/"+$ObjectType+"["+$AttributeName+"='"+$AttributeValue+"']"
        $ExportObject = export-fimconfig -uri $URI -OnlyBaseResources -customconfig $xPath
            if($ExportObject)
                {return $ExportObject.ResourceManagementObject.ObjectIdentifier.Substring(9)}
            else
                {return $null}
         } 
    }
    
    function GetObject
    {
        <#
            .SYNOPSIS
                Returns object based on single attribute/value match.
                Intended to be used when only single result is expected.
                Does not support pipeline input.
            .PARAMETER objectType
                The system name of the FIMService resource type.
            .PARAMETER attributeName
                The system name of the FIMService attribute.
            .PARAMETER attributeValue
                The value in the format expected by the attribute.
        #>
        PARAM([string]$ObjectType,[string]$AttributeName, $AttributeValue)
        BEGIN
        {}
        PROCESS
        {
            $xPath = "/"+$ObjectType+"["+$AttributeName+"='"+$AttributeValue+"']"
        $ExportObject = export-fimconfig -uri $URI -OnlyBaseResources -customconfig $xPath
        }
        END
        {
            return $ExportObject
        }
    }
    
    function GetByxPath
    {
    	<#
    	.SYNOPSIS
    		Returns array of [Microsoft.ResourceManagement.Automation.ObjectModel.ExportObject] based on xPath query.
    		Returns only base resources not dereferenced objects.
    		Does not support pipeline input.
    	.PARAMETER xPath
    		xPath query to use.
    		Example: /Person[AccountName='Kent' and extActive=True]
    	#>
    	PARAM([string]$xPath)
    	BEGIN
        {}
    	PROCESS
    	{
		$ExportObject = export-fimconfig -uri $URI -OnlyBaseResources -customconfig $xPath
    	}
    	END
    	{
    		return $ExportObject
    	}
    }
    
    function SaveObject
    {
    	<#
    	.SYNOPSIS
    		Saves/Updates the object in FIM Service.
    		Supports pipelining.
    	.PARAM importObject
    		The [Microsoft.ResourceManagement.Automation.ObjectModel.ImportObject] to commit to FIM Service.
    	#>
        PARAM(
    		[Parameter(ValueFromPipeline=$true)]
    		[Microsoft.ResourceManagement.Automation.ObjectModel.ImportObject[]]$ImportObject
    	)
    	BEGIN{}
    	PROCESS
    	{
    		foreach($Object in $ImportObject)
    		{
    			$Object | Import-FIMConfig -Uri $URI -ErrorAction Stop
    		}
    	}
        END{}
    }
    
    function DeleteObject
    {
    	<#
    	.SYNOPSIS
    		Delete the object in FIMService.
    		Supports pipelining
    	.PARAM importObject
    		The [Microsoft.ResourceManagement.Automation.ObjectModel.ImportObject] to delete in FIM Service.
    	#>
        PARAM(
    		[Parameter(ValueFromPipeline=$true)]
    		[Microsoft.ResourceManagement.Automation.ObjectModel.ImportObject[]]$ImportObject
    	)
    	BEGIN{}
    	PROCESS
    	{
    		foreach($Object in $ImportObject)
    		{
    			$Object.State = 2
    			$Object | Import-FIMConfig -Uri $URI -ErrorAction Stop
    		}
    	}
        END{}
    }
    

    Using FIM Portal as administrative tool for AD

    In the April FIM Team User Group meeting I talked about how to use the FIM Portal as an administrative tool using just the built-in OOB functionality in FIM.
    If you have any questions regarding this session, please comment on this post.

    Training for Microsoft Identity Manager

    The syllabus for the next version of my FIM/MIM training is now starting to take shape. After the summer, delivery of my new Mastering Microsoft Identity Manager class will begin.

    In co-operation with partners like Labcenter in Sweden and TrueSec Inc in the USA, I will also make this training available as a remote class.

    For all my customers in the US, I am happy to announce that we are planning to have classroom training available in Seattle around October/November… Stay tuned for final dates.
    [EDIT 2015-04-25] The first date for training in US (Redmond, WA) is now set to Oct 27-29 2015. Register here.