Saturday, November 14, 2015

Resources from WIN332 Microsoft Ignite Session

Thanks to all who came along to my session at Microsoft Ignite today. Here are the resources I presented in session WIN332 – From Fortran to FIM: Dragging your identity management system out of the dark ages.

Resources for Engineers and Admins

Lithnet ACMA Codeless business rules engine

ACMA is a fast, efficient, codeless way of implementing business rules that can create and transform information within your FIM/MIM implementation. ACMA comes with a UI editor for your rules file, a PowerShell module for modifying ACMA objects directly, and a unit testing engine that allows you to test all the rules you have created. Check out the video link below for a more detailed demonstration of ACMA's capabilities.

Lithnet Universal MA Rules Extension (UMARE)

UMARE is a codeless rules extension for FIM/MIM. It can be used on any MA to perform transform operations on incoming and outgoing identity data. It comes with over 40 transforms out of the box, covering very common scenarios we all need to support, such as converting an ‘accountDisabled’ attribute to a bitmask on the AD userAccountControl attribute, and converting FIM Service group type strings into the correct groupType value in AD. If there is a transform that’s missing, let me know and I can add it in.

Granfeldt FIM Metaverse Rules Extension

Forget DREs, EREs and sync rules. Get hold of Søren's Metaverse Rules Extension. It’s a very powerful and flexible component that can reduce the complexity of your provisioning logic. Create a provisionToAD attribute in ACMA, flow it out to the metaverse, and add a provisioning rule to the MRE to provision when that flag is true. Keep the complexity in ACMA, and let MRE handle the ‘acting’.

Visual Studio Online

If you don’t have Git or TFS in your organization, you can get a Visual Studio Online account from Microsoft that is free for up to 5 users. A version control system is a must-have for tracking your documents, scripts and code versions for your various components.

Lithnet FIM Service PowerShell Module (LithnetRMA)

The FIMAutomation module can do a lot, but I find it overly complicated when we simply want to add, update, and delete objects in the FIM service. It’s also very, very slow. The Lithnet PowerShell module abstracts the complexity of the FIM service, and exposes a more natural and much faster set of cmdlets for working with it. It also comes with cmdlets to help you build XPath queries correctly, as well as the Import-RMConfig cmdlet for importing your configuration from files, as demonstrated in today's session. People using this module have reported their scripts improving from hours to minutes, and it takes far less PowerShell code to write and maintain.
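As a taste of what this looks like in practice, here is a minimal sketch of updating a single attribute with the module. The service address and attribute values are placeholders, so check the parameter names against the module documentation.

    Import-Module LithnetRMA
    Set-ResourceManagementClient -BaseAddress 'http://fim-service:5725'   # placeholder service address

    # Get the user, change an attribute, and save the change back to the FIM service
    $user = Get-Resource -ObjectType Person -AttributeName AccountName -AttributeValue 'ryan'
    $user.JobTitle = 'Identity Architect'
    Save-Resource $user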

Resources for Developers

Lithnet FIM Service Client (LithnetRMC)

If you have had to write .NET code to talk to the FIM service endpoints, you know how daunting this can be. The fim2010client on CodePlex took us partially there by setting up the scaffolding for us, but still left us having to deal with the internals of the FIM service. The Lithnet FIM Service client is a NuGet package you can install in your project to start using simple get, update, and save operations. It’s fast, supports multi-threading out of the box, and has complete MSDN-style documentation with examples of how to use it. The LithnetRMA PowerShell module, as well as the REST API, are both lightweight wrappers for the functionality contained in this library.

Lithnet FIM Service REST API (LithnetRMWS)

Ever tried talking to the FIM service endpoints from a non-Windows device such as a Linux machine? I don't recommend trying. The Lithnet FIM Service REST API exposes the FIM service using very simple JSON and standard REST API calls.

Further learning

FIM User group presentation on ACMA

Want to see ACMA in action? Check out my presentation to the FIM Team User Group. You’ll see how to easily create business rules and unit tests, as well as some more advanced topics such as creating admin accounts with the shadow object feature, and inheriting values between referenced objects.

FIM User group presentation on the Lithnet FIM Service Toolkit

The Lithnet FIM Service toolkit contains the .NET client, PowerShell module, and REST API. You can see how these all work in this presentation to the FIM Team User Group.

FIM Team User Group

I highly recommend that you join the FIM Team User Group. The group meets monthly, and experts from around the world present on various topics relating to FIM/MIM. It’s a great way to make connections and learn how other people are solving challenges in the identity management space.

Thursday, October 22, 2015

Providing confirming imports to the sync engine when your target system doesn’t support delta imports

There are many systems out there that just don’t support delta imports. Deltas are important for ensuring the speedy and efficient operation of the FIM synchronization engine. While we can’t invent true deltas when the connected system doesn’t provide the information, sometimes it’s enough to just provide FIM with the deltas of what it has itself changed. This is especially relevant for systems where FIM is mostly responsible for the data in the connected system.
What would be handy is if, at export time, we could provide FIM with the confirmed changes made in the target system. While FIM allows us to say that an export was successful, it still expects to confirm those changes took place on the next import operation. What if we could construct our import CSEntryChange objects at export time?
The Lithnet.MetadirectoryServices library contains a set of helper classes to make writing management agents and rules extensions a little easier. One of the tools it contains is a CSEntryChangeQueue class, which gives you the ability to serialize CSEntryChange objects to XML and back again. CSEntryChange objects are placed in the queue on export and saved to disk at the end of the operation. On a delta import operation, the queue is reloaded from disk, and the CSEntryChanges are passed back to the sync engine. There are two scenarios we can explore that take advantage of this functionality.

Scenario 1: Re-playing the exported CSEntryChange back to the sync engine on delta import

This is a really quick and easy way to get confirming imports. After the successful export of an item to the target system, we can simply pass the CSEntryChange exported by FIM into the CSEntryChangeQueue, and get it back from the queue when the next import operation is called. Provided nothing went wrong with the export, we have all the information needed to provide the confirming import to FIM on the next import operation.
Be warned – you must ensure that you only replay CSEntryChanges for successful exports; otherwise you may misrepresent the state of your connected system to the sync engine, and a full import will be required for FIM to get the correct information.

Scenario 2:  Constructing the delta import CSEntryChange at export time, and saving it for the next delta import

This is a really cool way to provide deltas. Let’s say you update a resource, and upon update, the system provides you a copy of (or link to) the resource after your update. This is quite a common practice with REST APIs: after a successful PUT or POST operation, you may receive a new copy of the resource as it appears after your modification. You actually have a true delta representation that you want to provide to the sync engine, it’s just that you are in the middle of an export, and the sync engine doesn’t want it yet! So, you can construct your CSEntryChange for import as you normally would with the information returned by the system, and submit it to the delta queue. The next time an import is called, the correct data will be passed to the sync engine, without making another potentially expensive call to the target system.
Even if the target system doesn’t automatically provide you an updated copy of the resource, there’s nothing stopping you from getting the object yourself after export and constructing your import CSEntryChange. After all, at this point in time, you know the resource has been changed – once the export operation is complete, you’ve lost that information.

Remember, it’s not a true delta

Both these scenarios can potentially save a call to the target system, and each allows you to clear the pending export confirmation without having to do a full import. What you don’t get, however, is changes made outside of FIM. These will still need to be obtained via a full import. However, if your target system is only updated by the sync engine, then this process works well. In the worst case, you can have confirming imports immediately after an export, and run regular (though perhaps less frequent) full imports to pick up other changes.

Using the Lithnet.MetadirectoryServices.CSEntryChangeQueue

Let’s have a look at how to use the built-in CSEntryChangeQueue object to load and save deltas to a file after export. In a call-based ECMA 2 MA, the delta information is added to the queue at export time, and at the end of the export operation the CSEntryChanges in the queue are saved to an XML file.
Upon import, we first check whether we are doing a full or delta import. If a full import has been requested, we import directly from the source system. If a delta import is requested, we load the queue from disk and replay the CSEntryChanges back to the sync engine. Once either a delta or full import operation is completed, we clear the queue and save the empty list back to disk.

Summary

While this pattern is not necessarily applicable when dealing with ‘source’ systems, it does have a place for ‘target’ systems that are predominantly managed by FIM. Even in systems that do have a lot of changes that aren't made by the sync engine, there is still a net gain. If you have 10,000 objects in your target, and are doing hourly full imports to get those changes, these methods allow you to supplement your full imports with fast, frequent confirming delta imports. Keep in mind the following points:
  1. It doesn’t negate the need to do full imports
  2. It does negate the need to do a full import after export purely to confirm the last export run
  3. If the only changes made in the target system are those made by FIM, it’s as good as having delta support in the target system
  4. You need to ensure that the ‘delta’ CSEntryChange accurately reflects the state of the target system

Get the NuGet package today and read the documentation for full details.

Thursday, September 10, 2015

Take the guesswork out of XPath with the Lithnet FIM Service PowerShell Module

Summary

The FIM Service allows you to query for resources using a subset of the XPath 2.0 dialect. It provides quite a powerful mechanism for searching for resources, but has more than a few curiosities when it comes to constructing queries for different attribute types.
The Lithnet FIM Service PowerShell module includes three cmdlets to help take the guesswork out of constructing your XPath queries.
New-XPathQuery
The New-XPathQuery cmdlet creates a predicate that forms part of an XPath expression. The query is the Attribute = ‘value’ component of the expression.
New-XPathQueryGroup
An XPath query group contains multiple XPath query objects, or other query groups, that are combined together with an ‘and’ or ‘or’ operator.
New-XPathExpression
The XPath expression wraps the query or query group with the appropriate syntax defining the object type (e.g. /Person[query]).

Working with different attribute types

The cmdlets generate the correct syntax for each attribute type, without you having to remember all the different ways the query needs to be expressed.
For example, let’s have a look at the way we have to test for presence for each attribute type:
  • String: /Person[(starts-with(AccountName, '%'))]
  • Integer: /Person[(unixGid <= 999999999999999)]
  • DateTime: /Person[(ExpiryDate <= '9999-12-31T23:59:59.997')]
  • Boolean: /Person[((AccountDisabled = true) or (AccountDisabled = false))]
  • Reference: /Person[(Manager = /*)]
The New-XPathQuery cmdlet has a simple syntax that is independent of the attribute type.
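For example, a presence test is expressed the same way for every attribute type. This is a sketch only; treat the exact operator names as indicative and check them against the module documentation.

    # The same IsPresent query works for string, integer, date, boolean and reference attributes
    New-XPathQuery -AttributeName 'AccountName' -Operator IsPresent
    New-XPathQuery -AttributeName 'Manager' -Operator IsPresent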

The cmdlet will automatically generate the syntax that is appropriate for the attribute specified. It’s not just the IsPresent operator that is made simpler. The cmdlets support all the attribute types and operators that are supported by FIM. The underlying Lithnet RMC library used by the cmdlets has over 100 associated unit tests to ensure all combinations of operators and attributes generate the correct XPath syntax.

How do I use it?


Simple query

The following example shows a simple query that checks for an AccountName of ‘ryan’:
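A sketch of what this looks like (the -QueryObject parameter name is from memory, so verify it against the module documentation):

    $query = New-XPathQuery -AttributeName 'AccountName' -Operator Equals -Value 'ryan'
    $expression = New-XPathExpression -ObjectType 'Person' -QueryObject $query
    # Produces something equivalent to /Person[(AccountName = 'ryan')]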

Combining queries into a group

You can combine multiple queries together in a group using the New-XPathQueryGroup cmdlet. This allows you to join queries created by New-XPathQuery together with an And or Or operator. Searching for an AccountName of ‘bob’ or ‘ryan’ is shown in the example below.
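Something along these lines (again, treat the -Queries parameter name as indicative):

    $query1 = New-XPathQuery -AttributeName 'AccountName' -Operator Equals -Value 'bob'
    $query2 = New-XPathQuery -AttributeName 'AccountName' -Operator Equals -Value 'ryan'
    $group = New-XPathQueryGroup -Operator Or -Queries $query1, $query2
    $expression = New-XPathExpression -ObjectType 'Person' -QueryObject $group
    # Produces something equivalent to /Person[((AccountName = 'bob') or (AccountName = 'ryan'))]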

Nested query groups

Query groups can contain child query groups as well. You can build complex nested expressions using multiple groups. The following example looks for all users who have a display name starting with ‘ryan’ or ‘bob’ and who also have an email address:
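A sketch of a nested group. The StartsWith operator name and the Email attribute name are illustrative, so substitute the names used in your schema and the module documentation.

    # Inner group: DisplayName starts with 'ryan' or 'bob'
    $ryan = New-XPathQuery -AttributeName 'DisplayName' -Operator StartsWith -Value 'ryan'
    $bob = New-XPathQuery -AttributeName 'DisplayName' -Operator StartsWith -Value 'bob'
    $nameGroup = New-XPathQueryGroup -Operator Or -Queries $ryan, $bob

    # Outer group: the name group AND an email address must be present
    $hasMail = New-XPathQuery -AttributeName 'Email' -Operator IsPresent
    $group = New-XPathQueryGroup -Operator And -Queries $nameGroup, $hasMail

    $expression = New-XPathExpression -ObjectType 'Person' -QueryObject $group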

Nested expressions

When querying a reference attribute, you can use another expression as a query parameter. This allows you to build dereferencing expressions with ease. The following example searches for all people who have a manager with the AccountName ‘ryan’:
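A sketch of a nested expression, where the inner expression is used as the value of the Manager reference query:

    # Inner expression: people with the account name 'ryan'
    $managerQuery = New-XPathQuery -AttributeName 'AccountName' -Operator Equals -Value 'ryan'
    $managerExpression = New-XPathExpression -ObjectType 'Person' -QueryObject $managerQuery

    # Outer query: people whose Manager matches the inner expression
    $query = New-XPathQuery -AttributeName 'Manager' -Operator Equals -Value $managerExpression
    $expression = New-XPathExpression -ObjectType 'Person' -QueryObject $query
    # Produces something equivalent to /Person[(Manager = /Person[(AccountName = 'ryan')])]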

Dereferencing expressions

Creating a dereferencing expression is easy with the DereferenceAttribute parameter on the New-XPathExpression cmdlet. The following example gets the manager of the person with the AccountName ‘ryan’:
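A minimal sketch:

    $query = New-XPathQuery -AttributeName 'AccountName' -Operator Equals -Value 'ryan'
    $expression = New-XPathExpression -ObjectType 'Person' -QueryObject $query -DereferenceAttribute 'Manager'
    # Produces something equivalent to /Person[(AccountName = 'ryan')]/Manager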

Using the expression object


Passing the expression to Search-Resources

Expression objects can be passed to other cmdlets such as Search-Resources. Rather than providing an XPath string to Search-Resources, you can simply pass the expression object.
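A minimal sketch:

    $query = New-XPathQuery -AttributeName 'AccountName' -Operator Equals -Value 'ryan'
    $expression = New-XPathExpression -ObjectType 'Person' -QueryObject $query

    # Pass the expression object straight to Search-Resources instead of a raw XPath string
    Search-Resources $expression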

Filter syntax

You can also use the builder to create the Filter attribute used in sets and groups. The -WrapFilterXml parameter ensures that the <Filter> XML element is wrapped around your expression:
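A sketch of building a set filter this way (the Email attribute name is illustrative):

    # Build a filter for all people with an email address, wrapped in the <Filter> element
    $query = New-XPathQuery -AttributeName 'Email' -Operator IsPresent
    $filter = New-XPathExpression -ObjectType 'Person' -QueryObject $query -WrapFilterXml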

Setting Filter attributes directly


The library supports setting the value of a Filter attribute to an expression object directly. Ensure you set the -WrapFilterXml parameter on your expression.
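Something like this (the set name is a placeholder):

    $query = New-XPathQuery -AttributeName 'Email' -Operator IsPresent
    $filter = New-XPathExpression -ObjectType 'Person' -QueryObject $query -WrapFilterXml

    # Assign the expression object directly to the set's Filter attribute and save it
    $set = Get-Resource -ObjectType Set -AttributeName DisplayName -AttributeValue 'All people with email'
    $set.Filter = $filter
    Save-Resource $set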


Further Reading



Feedback


If you have an idea for a new feature, contact me using one of the methods below

Email: ryan@lithiumblue.com

Tuesday, August 25, 2015

Cut down on your PowerShell code with the Lithnet FIM Service PowerShell module

The FIMAutomation PowerShell module requires you to write a lot of code to perform even the most basic tasks. Take the example in which Paul Williams provides some very well-written code for updating the EmployeeEndDate attribute of a user using the FIMAutomation snap-in.


Now let’s look at doing the same thing with the Lithnet FIM Service PowerShell module:
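This is roughly what it looks like. The account name, date, and service address are placeholders, and I'm assuming the module converts DateTime values into the format the FIM service expects.

    Import-Module LithnetRMA
    Set-ResourceManagementClient -BaseAddress 'http://fim-service:5725'   # placeholder service address

    # Get the user, set the new end date, and save the change
    $user = Get-Resource -ObjectType Person -AttributeName AccountName -AttributeValue 'jdoe'
    $user.EmployeeEndDate = [datetime]'2015-12-31T17:00:00'
    Save-Resource $user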

The same task requires much less code, and is much easier to understand. Perhaps most importantly, we don’t need to understand the inner workings of the FIM Service itself (import changes, put operations, etc) to do something as simple as updating an attribute value. Just get, set, and save.

Monday, August 24, 2015

Version control your FIM Service configuration

Keeping track of your FIM Service configuration can seem like a daunting task. Even more so when you have multiple DEV, QA, and production instances that need to be kept consistent. We can make version controlling the FIM service a lot easier with some simple modifications to the schema, some clever scripts and a bit of process control.

This post will reference the configuration management capabilities of the Lithnet FIM Service PowerShell module, but the same concepts can apply even when using your own tools.

  1. Firstly, break up your FIM service design into components. A component is a collection of resources such as sets, MPRs and workflows that come together to perform a particular function. For example, the self-service password reset functionality can be grouped together as an SSPR component. You might have a group of workflows, sets and email templates that handle expiry notifications. I generally use the following components as a starting point:
    • User Interface (RCDCs, Nav bar links, etc)
    • Email notifications (welcome emails, expiry notifications, etc)
    • Schema (attributes, bindings, resource types)
    • Security model (permissions)
    • SSPR
  2. Create a custom resource in the FIM service for tracking these components and their versions. Create a changeLogEntry resource type, with bindings for new version and details attributes. You can use the Import-RMConfig cmdlet to make these schema modifications for you: save the schema changes into a configuration XML file and apply them with Import-RMConfig.

  3. Each component should have its own design document. The component design document defines the configuration of all the objects that make up that component. The document itself should be version controlled, and is the authoritative source of both the version and configuration of the component.
  4. Translate these documents into a set of scripts that can create and update the components. Each document should have its own script, and each script should be written so that it can be run repeatedly, supporting both the creation of the necessary resources and the updating of any existing objects to the documented configuration. Once again, you can use the Import-RMConfig cmdlet of the Lithnet FIM Service PowerShell Module to do this for you automatically. Each script should create or modify the changeLogEntry for its component to reflect the version in the component design document, so the change log is updated automatically as part of the same import. (A sketch of a deployment script that applies these configuration files is shown after this list.)

  5. If you have differences in parameters between your development, QA, and production configurations, make use of the variables file that the ConfigSync file supports through the <Variables import-file=""> attribute. Extract the parameters into a separate variables file for each of your environments. Each variables file should be saved and managed independently; you never want to have to change the main configuration file itself when you move between environments. When importing the configuration, copy the appropriate environment-specific variables file into the folder where the config XML is stored, and rename it environment-variables.xml (or the name you have chosen to use in your main XML file). For example, you might have a QA and a production variables file, and a placeholder such as #env# in the configuration is then substituted with either QA or PROD, depending on which variables file is used. This copy-and-import step is included in the sketch after this list.


  6. Use a source control system to store your component design documents and scripts. You can get a free Visual Studio Online account from Microsoft, or if you have an existing system such as Git, TFS, or SVN, you can use that.
  7. When you have a tested, working set of components that you are ready to deploy, bundle them together and create a release. Create a release document that details each component and its version, as well as the changes made since the last release. Move this release through the development, QA, and prod environments as a bundle. If you are using a source control system, branch your source control tree for each release; that way, you have a permanent, point-in-time copy of what each release looked like.
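To tie steps 4 and 5 together, a deployment script can copy the right variables file into place and then apply each component's configuration. The sketch below is illustrative only: the file names and folder layout are mine, and the Import-RMConfig parameters should be checked against the module documentation.

    param([ValidateSet('QA', 'PROD')][string] $Environment = 'QA')

    $configFolder = '.\config'

    # Copy the environment-specific variables file into the config folder under the
    # name referenced by the <Variables import-file=""> element of the main config file
    Copy-Item -Path ".\variables.$Environment.xml" -Destination (Join-Path $configFolder 'environment-variables.xml') -Force

    Import-Module LithnetRMA
    Set-ResourceManagementClient -BaseAddress 'http://fim-service:5725'   # placeholder service address

    # Apply each component's configuration; Import-RMConfig creates missing resources
    # and updates existing ones to match the documented configuration
    Get-ChildItem -Path $configFolder -Filter '*.xml' |
        Where-Object { $_.Name -ne 'environment-variables.xml' } |
        ForEach-Object { Import-RMConfig -File $_.FullName -Verbose }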

In summary



  1. Don’t try to version control the whole FIM service configuration. Break it down into smaller components, and release them in defined bundles
  2. Use the FIM Service to keep track of its own component versions by creating a custom change log resource
  3. Your design documents are authoritative. Ensure the appropriate controls are in place to make sure documents are kept up to date and accurately reflect your components
  4. Ensure your scripts can be run repeatedly, only making changes where needed. (And remember Import-RMConfig does this out-of-the-box)
  5. Never modify your scripts as part of a deployment. Make use of variable files to apply per-environment settings
  6. Use a source control system to give you a complete version and release history