Posted on 3. May 2017

Simple, No fuss, Dynamics 365 Deployment Task Runner

Why?

I've used the Dynamics Developer Toolkit since it was first released by MCS for CRM4! I love the functionality it brings, however the latest version is still in beta, it isn't supported on VS2017 and there is no date for when it's likely to be (yes, you can hack it to make it work, but that's not the point!).

Rather than using a Visual Studio add-in project type, I've been attracted by the VS Code style simple project approach, and so I decided to create a 'no-frills' alternative that uses a simple json config file (and that can be used in VS2017).

What?

  1. Deploy Plugins & Workflow Activities – Uses reflection to read plugin registration information directly from the assembly. This has the advantage that the plugin configuration is in the same file as the code. You can use the 'instrument' task to pull down the plugin configuration from Dynamics and add the metadata to your classes if you already have an existing project.
  2. Deploy Web Resources – Deploys web resources from file locations defined in the spkl.json configuration. You can use the 'get-webresources' task to create the spkl.json if you already have web resources deployed.
  3. Generate Early Bound Types – Uses the spkl.json to define the entities to generate each time the task is run, making the process repeatable.
  4. Profile management – An optional profile can be supplied to select a different set of configuration from spkl.json, e.g. debug and release build profiles.

How?

Let's assume you have a project in the following structure:

Solution
    |-Webresources
    |    |-html
    |    |    |-HtmlPage.htm
    |    |-js
    |    |    |-Somefile.js
    |-Plugins
    |    |-MyPlugin.cs
    |-Workflows
    |    |-MyWorkflowActivity.cs

On both the Plugins and Workflows projects, run the following from the NuGet Console:

Install-Package spkl

This will add spkl to the packages folder, together with CrmPluginRegistrationAttribute.cs – the attribute that is used to mark up your classes so that spkl can deploy them. Some simple batch files are also included that you can use to get started.

If you already have plugins deployed, you can run the following command line in the context of the Plugins folder:

spkl instrument

This will prompt you for a Dynamics Connection, and then search for any deployed plugins and their matching .cs file. If the MyPlugin.cs plugin is already deployed, it might end up with the following attribute metadata:

[CrmPluginRegistration("Create", "account",
    StageEnum.PreValidation, ExecutionModeEnum.Synchronous,
    "name,address1_line1", "Create Step", 1, IsolationModeEnum.Sandbox,
    Description = "Description",
    UnSecureConfiguration = "Some config")]
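
For context, here is a minimal sketch of how the attribute sits on a plugin class (the class name and body are illustrative, not generated by spkl):

using System;
using Microsoft.Xrm.Sdk;

[CrmPluginRegistration("Create", "account",
    StageEnum.PreValidation, ExecutionModeEnum.Synchronous,
    "name,address1_line1", "Create Step", 1, IsolationModeEnum.Sandbox,
    Description = "Description",
    UnSecureConfiguration = "Some config")]
public class MyPlugin : IPlugin
{
    public void Execute(IServiceProvider serviceProvider)
    {
        // spkl reads the attribute above via reflection at deploy time;
        // the actual plugin logic goes here
        var context = (IPluginExecutionContext)serviceProvider
            .GetService(typeof(IPluginExecutionContext));
    }
}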

A spkl.json file will be created in the project directory, similar to:

{
  "plugins": [
    {
      "solution": "Test",
      "assemblypath": "bin\\Debug"
    }
  ]
}

If you now build your plugins, you can then run the following to deploy them:

spkl plugins

You can run instrument for the workflow project using the same technique, which will result in code similar to the following being added to your workflow activity classes:

[CrmPluginRegistration(
    "WorkflowActivity", "FriendlyName", "Description",
    "Group Name", IsolationModeEnum.Sandbox)]

…and then run the following to deploy:

spkl workflow
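
For context, the attribute above sits on a workflow activity class like this sketch (the class body is illustrative):

using System.Activities;
using Microsoft.Xrm.Sdk.Workflow;

[CrmPluginRegistration(
    "WorkflowActivity", "FriendlyName", "Description",
    "Group Name", IsolationModeEnum.Sandbox)]
public class MyWorkflowActivity : CodeActivity
{
    protected override void Execute(CodeActivityContext context)
    {
        // Workflow activity logic goes here
    }
}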

To get any currently deployed webresources matched to your project files you can run the following from the Webresource project folder:

spkl get-webresources /s:new

    where 'new' is the solution prefix you've used.

This will create a spkl.json similar to the following:

{
  "webresources": [
    {
      "root": "",
      "files": [
        {
          "uniquename": "new_/js/somefile.js",
          "file": "js\\somefile.js",
          "description": ""
        },
        {
          "uniquename": "new_/html/HtmlPage.htm",
          "file": "html\\HtmlPage.htm",
          "description": ""
        }
      ]
    }
  ]
}

You can then deploy using:

spkl webresources

Profiles

For Debug/Release builds you can define multiple profiles that can be triggered using the /p:<profilename> parameter.

{
  "plugins": [
    {
      "profile": "default,debug",
      "assemblypath": "bin\\Debug"
    },
    {
      "profile": "release",
      "solution": "Test",
      "assemblypath": "bin\\Release"
    }
  ]
}

The default profile will be used if no /p: parameter is supplied. You can specify a profile using:

spkl plugins /p:release

Referencing a specific assembly rather than searching the folder

If you have multiple plugins in a single deployment folder and you just want to deploy one, you can explicitly provide the path rather than using the folder search. E.g.

{
  "plugins": [
    {
      "assemblypath": "bin\\Debug\\MyPlugin.dll"
    }
  ]
}

Adding to a solution

If you'd like to automatically add the items deployed to a solution after deployment you can use:

{
  "webresources": [
    {
      "root": "",
      "solution": "Test",
      "files": […]
    }
  ]
}

Combining spkl.json

Perhaps you want to have a single spkl.json rather than multiple ones per project. You can simply add them all together:

{
  "webresources": […],
  "plugins": […]
}

Multiple project deployments

Since the spkl.json configuration files are searched from the current folder, you can deploy multiple plugins/webresources using a single spkl call from a root folder.
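
For example, given a structure like the following (paths are illustrative), a single 'spkl plugins' call from the Solution folder would deploy the configuration found in both project folders:

Solution
    |-PluginsA
    |    |-spkl.json
    |-PluginsB
    |    |-spkl.json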

I'll be updating the GitHub documentation page as things move forwards.

Posted on 19. February 2016

SharePoint Integration Reloaded – Part 5 (belated)

In previous articles in this series we've talked about the differences between Server Side Sync and the old List Component. Since I published the first articles, a new MSDN article on the topic has been posted which I thought would be good to signpost folks to => Important considerations for server-based SharePoint integration.

One of the topics that has come up recently for people using Server Side Sync to SharePoint is the 5000 item limit of Document Libraries, which has led to a bit of panic amongst some, so I thought I'd dispel the rumours!

Here are the facts about the Throttling Limitation

1) You can see how many items you have in a Document Library by opening the site in SharePoint and selecting 'Site Content' from the left hand navigation menu. The number of items will show below the Document Library name – this includes documents and folders.

2) If you have more than 5000 items you can still use Server Side Integration with SharePoint provided that you only use the default sort order of the document view in CRM.

The default sort is by Location Ascending. You can change this to sort Descending, but if you change the sort to any other column you will receive the error "Throttling limit is exceeded by the operation".

3) If you have a record with only 2 documents in the associated document locations folder, you will still not be able to sort by any column other than Location if the root Document Library has more than 5000 items overall.

4) If the user clicks on 'Open SharePoint' they will be able to do all the sorting they need, since the limitation is not experienced by the native SharePoint interface – only the CRM document lists.

I find that this sort limitation is not an issue because I encourage users to use SharePoint freely due to its rich interface. Don't try to hide SharePoint from them, because it's important for them to understand the way in which documents are stored and the additional features that SharePoint has to offer. The documents view in CRM is simply a quick reference for documents associated with the CRM record.

Hope this helps!

Posted on 12. May 2015

Introducing SparkleXRM Metadata Server

Metadata Server Logo

Speed up your HTML web resources by caching metadata such as localised field labels and option sets.

If you've developed an HTML web resource for Dynamics CRM that includes field labels, option sets or any other element that is stored in the Dynamics CRM metadata, then you'll know about the delay each time your UI is rendered whilst this information is downloaded from the server. You could of course hard code this information in your JavaScript, but you'd suffer higher maintenance costs, especially when supporting multiple languages.

The SparkleXRM Metadata Server allows dynamic expressions to be included in JavaScript Web Resources that are evaluated on the server and then cached on the client side.

/*metadata
var fieldLabel = <@contact.fullname.DisplayName@>;
metadata*/

This will output and cache the following on the client (for an LCID of 1033):

var fieldLabel = "Full Name";

Learn more about the SparkleXRM Metadata Server!

Posted on 9. April 2015

Control Async Workflow Retries

The Dynamics CRM Async Server is a great mechanism for hosting integrations to external systems without affecting the responsiveness of the user interface. One such example is calling SharePoint, as I describe in my series – SharePoint Integration Reloaded.

A drawback of this approach (compared to using an integration technology such as BizTalk) is that any exceptions thrown during the integration will simply fail the workflow job with no retries. You might already know that the Async Server does in fact have a retry mechanism built in, as described by this excellent post from the archives - http://blogs.msdn.com/b/crm/archive/2009/03/25/when-do-asynchronous-jobs-fail-suspend-or-retry.aspx. As you'll see from that article, there are some built-in exceptions for which CRM will automatically retry a number of times, as defined by the 'AsyncMaximumRetries' deployment property. The interval between these retries is defined by the 'AsyncRetryBackoffRate' property.

So how can we make use of this retry mechanism with our custom code?

There is an undocumented and unsupported way of using this retry mechanism in your custom workflow activities. I first used this technique back in the CRM 4.0 days and I'm pleased to report that it still works with CRM 2015!

Although unsupported, the worst that could happen is that the workflow would stop retrying in a future version of Dynamics CRM and revert to simply reporting the exception. However, it still wouldn't be a good idea to rely on this behaviour for mission critical operations.

So…plz show me the codez…

catch (Exception ex)
{
    // TODO: Rollback any non-transactional operations

    OrganizationServiceFault fault = new OrganizationServiceFault();
    fault.ErrorCode = -2147204346; // This will cause the Async Server to retry
    fault.Message = ex.ToString();
    // Note: the generic FaultException<T> is required to carry the fault detail
    var networkException = new FaultException<OrganizationServiceFault>(
        fault, new FaultReason(fault.Message));
    throw networkException;
}

When an exception is thrown by your custom workflow activity you'll see the workflow enter into a 'Waiting' state and the 'Postpone until' attribute will be set to the time when the retry will take place. Once the maximum number of retries has been performed, it will enter the 'Failed' status.

You can use this technique to ensure that any temporary integration failures, such as service connectivity issues, will be dealt with by a graceful retry loop. It only remains to ensure that, before throwing the exception, you roll back any operations already performed by your code (such as creating records) so that the retries will not create duplicate records.
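
For context, here is a sketch of how that catch block sits inside a custom workflow activity, with the rollback happening before the throw (the activity name and compensation logic are illustrative):

using System;
using System.Activities;
using System.ServiceModel;
using Microsoft.Xrm.Sdk;

public class CallExternalSystemActivity : CodeActivity
{
    protected override void Execute(CodeActivityContext context)
    {
        try
        {
            // Call the external system (e.g. SharePoint) here
        }
        catch (Exception ex)
        {
            // Roll back any records created so far, so retries won't duplicate them

            var fault = new OrganizationServiceFault
            {
                ErrorCode = -2147204346, // Triggers the Async Server retry behaviour
                Message = ex.ToString()
            };
            throw new FaultException<OrganizationServiceFault>(
                fault, new FaultReason(fault.Message));
        }
    }
}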

Hope this helps!

@ScottDurow

Posted on 16. February 2015

Business Rules & “SecLib::RetrievePrivilegeForUser failed - no roles are assigned to user”

When publishing your Ribbon Workbench solution you may receive the following error:

"SecLib::RetrievePrivilegeForUser failed - no roles are assigned to user."

The first step in diagnosing these issues is to try and export the same solution using the CRM solutions area and then immediately re-import to see what error is shown. This is exactly what the Ribbon Workbench does behind the scenes.

When doing this you might see:

Upon downloading the log file you'll see that the error occurs on a 'Process' and that the same error message is shown as reported by the Ribbon Workbench. The User ID is also given, which can be used in a URL similar to the following to find the user record causing the issue:

https://<orgname>.crm.dynamics.com/userdefined/edit.aspx?etc=8&id=%7b<User GUID>%7d

You will most likely discover that this user has been disabled or has no roles assigned – this could be because they have left the company or changed job role. You will need to find any Workflows or Dialogs that are assigned to them and re-assign them to yourself before you can import the solution.

In most cases, you should not include Dialogs or Workflows in the solution that is loaded by the Ribbon Workbench, since this only slows down the download/upload publish process. There is one exception to this – Business Rules. Business Rules are associated with their particular entity and cannot be excluded from the solution. Oddly, like Workflows and Dialogs, they are also owned by a user, but this is not shown in the user interface – nor is there an option to re-assign them. There is, however, a handy option on a user record that allows you to 'Reassign Records'.

You would think that this would reassign any Business Rules but unfortunately you'll get the following error:

"Published workflow definition must have non null activation id."

The only way to reassign these Business Rules is to perform an Advanced Find for Processes of Type 'Definition' and Category 'Business Rule'.

The results can then be reassigned to yourself using the 'Assign' button on the results grid.
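
If you would rather script this than use Advanced Find, a minimal SDK sketch along these lines should achieve the same result (my own illustration, not from the Ribbon Workbench: 'service', 'oldUserId' and 'newUserId' are assumed to exist, and the type/category option values shown are my understanding – verify them in your organisation):

using Microsoft.Crm.Sdk.Messages;
using Microsoft.Xrm.Sdk;
using Microsoft.Xrm.Sdk.Query;

// Find all Business Rule definitions owned by the departed user
var query = new QueryExpression("workflow") { ColumnSet = new ColumnSet("name") };
query.Criteria.AddCondition("type", ConditionOperator.Equal, 1);      // 1 = Definition
query.Criteria.AddCondition("category", ConditionOperator.Equal, 2);  // 2 = Business Rule
query.Criteria.AddCondition("ownerid", ConditionOperator.Equal, oldUserId);

foreach (Entity rule in service.RetrieveMultiple(query).Entities)
{
    // Reassign each rule to the new owner
    service.Execute(new AssignRequest
    {
        Target = new EntityReference("workflow", rule.Id),
        Assignee = new EntityReference("systemuser", newUserId)
    });
}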

@ScottDurow

Posted on 3. February 2015

The dependent component Attribute (Id=transactioncurrencyid) does not exist. Failure trying to associate it with Workflow

I recently answered a question about this error on the Community forums, and coincidentally I've just come up against exactly the same issue!

When importing a solution you receive the error 'There was an error calculating dependencies for this component' and on downloading the log you see the full message similar to:

The dependent component Attribute (Id=transactioncurrencyid) does not exist. Failure trying to associate it with Workflow (Id=<GUID>) as a dependency. Missing dependency lookup type = AttributeNameLookup.

Although this message can appear for other attributes, this post is specifically to do with the transactioncurrencyid attribute being referenced.

When you add a money field to an entity, CRM automatically adds an N:1 relationship to the currency entity to hold the currency that the value is stored against. The foreign key attribute is always named transactioncurrencyid.

In this increasingly 'agile' software development world, attributes are added and removed fairly regularly during the development phase. If a money field is added to an entity and then removed, the transactioncurrencyid attribute is not removed with it. Because the relationship is created automatically by the system, it is not created when deploying to another environment via a solution import if that solution contains no money fields. This leaves your development environment with an additional relationship that the target environment does not have. This wouldn't cause a problem except that, when you create a workflow with a 'create' or 'update' step, the currency field is usually pre-populated with the default currency. Consequently, when you try to import this workflow into another organization that does not have the currency relationship, you will see this error.

The solution is either to delete the transactioncurrencyid field from your development environment and from the workflow create/update steps, or simply to add a dummy currency field to your target environment in order to create the relationship to currency.

@ScottDurow

Posted on 20. January 2015

The cream cracker conundrum (or customising the sub grid command bar)

I still find the streamlined user experience offered by the Command Bar a welcome change from the CRM2011 Ribbon. The sub-grid command bar is the only possible exception, with the loss of the ability to add custom sub-grid buttons. There are at most two buttons on a sub grid – 'Add' and 'Open Associated Sub-Grid'.

The user must click on the 'Open associated sub-grid' button to show the full associated view and the full command bar with custom buttons. I say 'possible exception' because in fact there are still the same number of clicks involved (users had to click on the sub grid to show the contextual ribbon before) but it does feel as though we should have the ability to add buttons to the sub-grid command bar. I can think of some good reasons why this design decision may have been made (performance for one!) – but this post details what you CAN do to the sub-grid command bar.

Because the 'Open associated sub-grid' button is a bit of a mouthful, I'll refer to it from now on as the 'cream cracker' because it kind of looks like one and is equally a bit of a mouthful! (Thanks goes to my friends at the British Red Cross who first named it this way!)

Hiding buttons

We have established that you cannot add buttons to the form sub grid, but both the 'Add New' and 'cream cracker' buttons are customisable in terms of their command and visibility (you cannot, however, change the image or the tooltip text).

To hide the sub grid buttons you have the following options:

  1. Hiding based on a security role (if the user does not have access to append a record to the parent or create new records of the sub grid type, the 'add new' button will be invisible)
  2. Hiding all of the time using a 'Hide Action'
  3. Hiding conditionally using a security role
  4. Hiding conditionally using a custom JavaScript rule.

A common requirement is to hide the Add New button on a sub-grid until specific criteria are met on the parent form. The standard ValueRule cannot be used because this rule will only work when the command button is added to a form command bar. So to achieve the conditional show/hide we must use a Custom JavaScript Rule.

The first hurdle is to determine which button needs to be customised. The sub grid 'Add New' button calls a different command depending on the nature of the relationship.

If you have a foreign-key attribute on your child entity that is marked as Field Requirement = 'Optional', then the Add New button will call the AddExistingStandard command, since it allows you to search for an existing record first. If the Field Requirement = 'Business required', then the button will call AddNewStandard.

Once you've identified the right button, you can then open the Ribbon Workbench, click 'Customise Command' and add the Value Rule as described by my User Voice article.

Changing the command actions

Although we cannot add new buttons (did I mention that?!), we can change the command actions that either of those two buttons call. Since we can't customise the button itself, the only option here is to customise the command and change its behaviour, in a very similar way to adding custom enable rules.

  1. Right click the button in the Ribbon Workbench and select Customise Command
  2. Expand the command in the Commands node in the Solution Elements panel and select the command that has been created for you to customise.
  3. Right click Edit Actions and you can simply delete the standard action and add your own custom ones.
  4. Remember to mark all the enable and display rules that you don't want to customise as IsCore=True.

One such use of this technique is to replace the standard Add New button with a custom dialog.

Refreshing the sub grid Command Bar

You will find that when the form is loaded and the sub grid is refreshed for the first time, the EnableRules are evaluated. If, however, the conditions for the EnableRules change (e.g. a value changes on the parent form), the sub grid command bar will not automatically refresh to reflect the new state. Upon adding or deleting rows in the sub grid the command bar is refreshed – but this isn't much use in this case.

The main form command bar can be refreshed using Xrm.Page.ui.refreshRibbon() however this will not refresh sub grid command bars. Instead, we can add an onchange event to the fields that are used in our ValueRule and call:

Xrm.Page.data.save();

This will refresh the sub grids and re-evaluate any of the EnableRules; however, it will also save any other dirty attributes, so it should be used with caution if you do not have auto-save enabled.

Responding to new/deleted rows in the sub grid

Since the sub grid command bar is refreshed when new rows are added or deleted, we can use the fact that the EnableRules will be re-evaluated to call custom JavaScript when the sub grid changes. This simulates a sub-grid onchange event and was first described by James Wood's blog post for CRM2011. He states on his blog that this technique doesn't work for CRM2013 – however, if we add the custom EnableRule to the existing command (rather than using a new button as James describes), then it works well in CRM2013 and CRM2015. So we can customise the command for the Add New or cream cracker button and add a Custom JavaScript Enable Rule that always returns true – in just the same way you might use an EnableRule to dynamically show/hide the button – except here we just run our onchange code.

Perhaps in the future there will be more options, but for now that about sums up the possibilities for customising the sub grid command bar.

@ScottDurow

Posted on 29. December 2014

SharePoint Integration Reloaded – Part 3

In Part 1 and Part 2 of this series we discussed how the new server-to-server integration with SharePoint works under the covers. In this post I'll show you how to integrate with SharePoint directly from a sandboxed workflow activity/plugin rather than relying on the out of the box integration.

Using the standard integration, a new folder will be created for each record underneath the default site. In some solutions you'll find that you want to modify this behaviour so that folders are created in a custom location. You may for example want to have an opportunity folder created under a site that is specific to a particular client rather than all under the same site.

The challenge with integrating with SharePoint from a CRM Online Workflow Activity/Plugin is that you can't reference the SharePoint assemblies, which makes authenticating and calling the SharePoint web service somewhat harder. Thanks goes to fellow Dynamics CRM MVP Rhett for his blog that provided a starting point for this sample - https://bingsoft.wordpress.com/2013/06/19/crm-online-to-sharepoint-online-integration-using-rest-and-adfs/. The sample code in this post shows how to create a folder in SharePoint and then associate it with a Document Location. The authentication with SharePoint works via ADFS, and since the out of the box integration uses a trust between CRM and SharePoint that is not accessible from a sandbox (even if you try and ILMerge it!), we have to provide a username and password that will act as our privileged user that can create folders in SharePoint. I have left a function where you can add your own credentials or implement a method to retrieve them from a secure entity in CRM that only administrators have access to. Look in the code for the 'GetSecureConfigValue' function.

The sample contains a custom workflow activity that works in a CRM online 2013/2015 sandbox accepting the following parameters:

  • Site – A reference to the site that you want to create a folder in. You could store a look up to a site for each customer and populate this parameter from the related account.
  • Record Dynamic Url – The 'Dynamic Record Url' for the record that you want the SharePoint document location to be related to. This uses my Polymorphic input parameter technique. You simply need to pass the Record Url (Dynamic) for the record that you wish to create the folder for.
  • Document Library Name – The name of the document location to create the folder underneath. In the out of the box integration this is the entity logical name (e.g. account)
  • Record Folder Name – The name of the folder to create. You could use the client name, client ID etc. – but it will automatically have the GUID appended to it to ensure uniqueness just like the out of the box integration.
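
In code, those parameters map onto input arguments on the activity along these lines (a sketch only – the class and property names are illustrative, see the sample download for the real implementation):

using System.Activities;
using Microsoft.Xrm.Sdk;
using Microsoft.Xrm.Sdk.Workflow;

public class CreateSharePointFolderActivity : CodeActivity
{
    [Input("Site")]
    [ReferenceTarget("sharepointsite")]
    public InArgument<EntityReference> Site { get; set; }

    [Input("Record Dynamic Url")]
    public InArgument<string> RecordDynamicUrl { get; set; }

    [Input("Document Library Name")]
    public InArgument<string> DocumentLibraryName { get; set; }

    [Input("Record Folder Name")]
    public InArgument<string> RecordFolderName { get; set; }

    protected override void Execute(CodeActivityContext context)
    {
        // Authenticate with SharePoint, create the folder and the
        // document location (see the numbered steps below)
    }
}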

Calling the workflow activity might look like:

The workflow activity is deployed using the Developer Toolkit for Dynamics CRM and performs the following:

  1. Checks if the document location already exists for the given site/document library – if so it simply returns a reference to that
  2. Checks if a document location exists for the given document library – if not, one is created
  3. Creates a SharePoint folder using the SpService class. It is worth noting that if the folder already exists, no exception is thrown by SharePoint. The SpService class must first authenticate using the SpoAuthUtility class.
  4. Creates a Document Location for the newly created folder.

You could choose to run the workflow in real time or asynchronously on create of a record – the downside of real time is that it will increase the time that the record takes to save.

Check out the code in MSDN Samples – you'll need to do a NuGet package restore to pick up the referenced assemblies.

View/Download Code

That's all for now – have a Happy New Year!

@ScottDurow

Posted on 9. August 2014

SharePoint Integration Reloaded – Part 2

Part 1 of this series described how the server-to-server SharePoint integration (new to CRM2013 SP1) works from the client interface perspective. In this post I'd like to share a bit more about how this all works from the server side.

Authentication

When this feature was introduced, my first question was about authentication. Having created a fair number of solutions that integrated SharePoint with Dynamics CRM on the server side, I knew that this is a tricky area. Since this feature is only available for CRM Online to SharePoint Online where they are in the same tenant, authentication is slightly simpler because there is already an existing trust in place between the two servers, which allows Dynamics CRM to authenticate with SharePoint and act as the calling user. The HTTP request that comes from the client is inspected by the integration component, and the UPN is used to authenticate with SharePoint as the same user rather than as the service account. This acting on behalf of the user is critical because when documents are created, checked in/out or queried, it must be performed under the account of the user and not the system account. Perhaps even more importantly, when CRM queries for documents it will only return those that the user has access to as configured in SharePoint.

If this feature is made available for On Prem customers, I would expect that a configuration option would have to be provided for the user's SharePoint username and password to use when performing server side operations.

Query SharePoint Documents

The new SharePoint sub grid that is rendered by CRM actually uses exactly the same query mechanism as any other entity – but rather than the query being sent to the CRM Database, it is handled by the SharePoint query handler. If you fire up Advanced Find, you'll see a new Entity named 'Documents' but if you query against this entity you will get the error:

The error is given because the SharePoint FetchXml-to-CAML conversion only works if a specific regarding object is provided – this means that you can only return records for a specific folder, rather than all documents in all document locations. When the refresh button is clicked on the client sub-grid, FetchXml similar to the following is executed:

<fetch distinct="false" no-lock="true" mapping="logical" page="1" count="50" returntotalrecordcount="true" >
    <entity name="sharepointdocument" >
        <attribute name="documentid" />
        <attribute name="fullname" />
        <attribute name="relativelocation" />
        <attribute name="sharepointcreatedon" />
        <attribute name="ischeckedout" />
        <attribute name="filetype" />
        <attribute name="fullname" />
        <attribute name="modified" />
        <attribute name="sharepointmodifiedby" />
        <attribute name="relativelocation" />
        <attribute name="documentid" />
        <attribute name="modified" />
        <attribute name="fullname" />
        <attribute name="title" />
        <attribute name="author" />
        <attribute name="sharepointcreatedon" />
        <attribute name="sharepointmodifiedby" />
        <attribute name="sharepointdocumentid" />
        <attribute name="filetype" />
        <attribute name="readurl" />
        <attribute name="editurl" />
        <attribute name="ischeckedout" />
        <attribute name="absoluteurl" />
        <filter type="and" >
            <condition attribute="regardingobjecttypecode" operator="eq" value="1" />
            <condition attribute="regardingobjectid" operator="eq" value="{1EF22CCD-9F19-E411-811D-6C3BE5A87DF0}" />
        </filter>
        <order attribute="relativelocation" descending="false" />
    </entity>
</fetch>

The interesting part here is that we can add filters not only by regarding object, but we could also add our own filters for name or document type. Initially I was confused because running this Fetch query in the Xrm Toolbox FetchXml tester gave no results, but as it turns out that tool uses ExecuteFetchRequest rather than RetrieveMultiple, and this new SharePoint integration is only implemented on the latter.
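
To see the difference in code, the same FetchXml succeeds when issued through RetrieveMultiple with a FetchExpression (a minimal sketch: 'service' is assumed to be an authenticated IOrganizationService and 'fetchXml' the query shown above):

using System;
using Microsoft.Xrm.Sdk;
using Microsoft.Xrm.Sdk.Query;

// RetrieveMultiple routes sharepointdocument queries through the SharePoint query handler
EntityCollection docs = service.RetrieveMultiple(new FetchExpression(fetchXml));

foreach (Entity doc in docs.Entities)
{
    Console.WriteLine("{0} ({1})",
        doc.GetAttributeValue<string>("fullname"),
        doc.GetAttributeValue<string>("relativelocation"));
}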

Internal Execute Messages

This new server-to-server functionality is exposed by a set of internal messages that are not documented in the SDK, but by using Fiddler (the tool that gives you super powers!) you can see these messages being called from the client when operations such as Check In/Check Out are performed. Here is a list of these internal messages:

RetrieveMultipleRequest (sharepointdocument)
Returns a list of document names from SharePoint for a specific document location. This query is converted into SharePoint CAML on the server and supports basic filter criteria and sorting.

NewDocumentRequest
Creates a new document location and matching SharePoint folder; called when the documents sub grid is first shown with no document locations configured.
  • FileName – Name of the file to create, including extension (e.g. NewDocument.docx)
  • RegardingObjectId – Guid of the record that the document location belongs to
  • RegardingObjectTypeCode – Object type code of the record the document location belongs to
  • LocationId – The ID of the document location to add the new document to (in case there are multiple)

CheckInDocumentRequest / CheckOutDocumentRequest / DisregardDocumentCheckoutRequest
These perform the check in/out operation on a specific document in SharePoint.
  • Entity – The document to check in/out, with the 'documentid' property populated with the List Id in SharePoint
  • CheckInComments
  • RetainCheckOut

CreateFolderRequest
Creates a new document location in CRM and the corresponding SharePoint folder.
  • FolderName – The name to give to the SharePoint folder
  • RegardingObjectId – Guid of the record that the document location belongs to
  • RegardingObjectTypeCode – Object type code of the record the document location belongs to

Now before you get excited, you can't use these requests on the server because you will get 'The request <Request Name> cannot be invoked from the Sandbox.' (Yes, I did try!) This is expected, since the sandbox does not have access to the HTTP context that contains the information about the calling user, and so the authentication with SharePoint cannot take place.

I proved this using a Custom Workflow Activity that tried to call 'CreateFolder', and saw the following error.

These requests can, however, be called easily from JavaScript, which opens up some interesting possibilities (if a little unsupported, because these messages are not actually documented in the SDK at the moment):

  1. Automatically create a document location using a different naming convention to the standard one via JavaScript onload of a record if there isn't one already.
  2. Provide a custom view of SharePoint documents using fetchxml – this could even be filtered to just show a particular file type by adding a condition similar to <condition attribute="filetype" operator="eq" value="jpeg"/>
  3. Provide custom buttons to create documents in SharePoint.

I hope you've found this interesting - next time I'll show you how to get around the sandbox limitation to perform server side operations on SharePoint from a CRM Plugin or Workflow Activity.

@ScottDurow

Posted on 9. August 2014

Early Binding vs Late Binding Performance (Revisited)

After having an interesting debate on the CRM Community forums about the performance of Early versus Late Bound entities, my friend Guido Preite pointed me at a good blog post on this subject by James Wood named 'CRM 2011 Early Binding vs Late Binding Performance'. I have always been an advocate of Early Bound types, but it is true that the SDK still states in 'Best Practices for Developing with Microsoft Dynamics CRM':

"…use of the Entity class results in slightly better performance than the early-bound entity types"

However, it also states that the disadvantage of using the late-bound Entity class is:

"…you cannot verify entity and attribute names at compile time"

I've seen many bugs introduced into code through the use of late bound types, because typos can easily be introduced into the strings that are used to determine the entity and attribute logical names. Due to the productivity gains that come with Early Bound types, I always recommend their use if the schema of your entities is known at compile time. There are times when this is not true, or when you are creating code that must run in a configurable way against many different entities or attributes, in which case the late bound Entity type is the only choice.

So what about performance?

  1. The SDK states that Late Bound types give 'slightly' better performance, citing serialization costs as the reason.
  2. James' post states a 30% increase in speed for 200 Create operations, and a <5% increase for 1500 operations.

So addressing each of these points in turn:

Although before CRM2011 there were serialization costs in using early bound types, because they were serialised as part of the web service call, with CRM2011/2013 the early bound types just inherit from the Entity class, and the early bound attribute properties simply set/get values in the underlying attribute collection. The serialization when making SDK calls is effectively the same for both early and late binding. The main difference is due to the extra work that the OrganizationServiceProxy has to do when converting the Early Bound type to the Entity type, and then back again when it's received from the server. This is done using reflection – first to find the Early Bound classes, and then to search those classes for the one that matches the logical name received. This obviously has a cost, but it seems to be work that is done once per Service Channel and then cached to avoid any further cost.
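
As a side note, this early-bound conversion only happens when proxy types are enabled on the service proxy. A minimal sketch (the URL and credentials are placeholders):

using System;
using System.ServiceModel.Description;
using Microsoft.Xrm.Sdk.Client;

var credentials = new ClientCredentials();
credentials.UserName.UserName = "user@contoso.onmicrosoft.com"; // placeholder
credentials.UserName.Password = "password";                     // placeholder

var serviceProxy = new OrganizationServiceProxy(
    new Uri("https://contoso.crm.dynamics.com/XRMServices/2011/Organization.svc"),
    null, credentials, null);

// Finds the generated early-bound types and registers them with the proxy's
// serializer so returned Entity instances are converted to their generated
// classes – this is where the one-off reflection cost is paid
serviceProxy.EnableProxyTypes();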

James' tests are interesting, but perhaps a bit misleading, because the initial cost of this additional work is included in his overall speed calculations. This is probably why the overall percentage cost of early binding goes down as the number of records increases.

To remove this initial cost from the equation, I adapted his code to introduce a warm-up period. In my tests I couldn't categorically show that either Late Bound or Early Bound had any performance difference once the OrganizationService was 'warmed up', with all reflection done and cached. In fact, sometimes the tests showed that early bound was quicker, which leads me to believe that the main influencing factor is somewhere else, such as database or server performance. To make the results easier to interpret, I have simply shown the average operation time after the warm-up period. I also separated out the tests so that the Early Bound types were not compiled and picked up in the Late Bound tests.

Each test was a warm up of creating 400 records and a run of creating 500 records.

Conclusions

Whilst it is true that using Early Bound classes incurs some cost of additional 'plumbing' – assuming that you are caching the WCF Service Channels (which the Microsoft.Xrm.Client.Services.OrganizationService does for you) – the difference in speed is so small that there really is no reason not to use the Early Bound classes, unless you have performance related issues and want to eliminate this as the cause.

If you are interested, here is the code I used (based on James' code):

using System;
using System.Diagnostics;
using Microsoft.Xrm.Client;            // CrmConnection
using Microsoft.Xrm.Client.Services;   // OrganizationService (caches the service channel)
using Microsoft.Xrm.Sdk;               // Entity

static void Main(string[] args)
{
    int warmupCount = 400;
    int runCount = 500;

    CrmConnection connection = new CrmConnection("Xrm");
    var service = new OrganizationService(connection);

    // 'Account' is the generated early-bound class
    CreateAccounts("Early Bound Test", warmupCount, runCount, () =>
    {
        Account a = new Account();
        a.Name = "Test Early Vs Late";
        service.Create(a);
    });

    CreateAccounts("Late Bound Test", warmupCount, runCount, () =>
    {
        Entity e = new Entity("account");
        e["name"] = "Test Early Vs Late";
        service.Create(e);
    });

    TidyUp(); // Deletes the test accounts (from James' original code, not shown here)

    Console.WriteLine("Finished");
    Console.ReadKey();
}

static void CreateAccounts(string name, int warmup, int runs, Action action)
{
    Console.WriteLine("\n" + name);

    // Warm up so the one-off reflection and caching cost is excluded from the timings
    for (int i = 1; i <= warmup; i++)
    {
        if (i % 10 == 0)
            Console.Write("\r{0:P0}     Warmup   ", ((decimal)i / warmup));
        action();
    }

    // Timed run – report a running average per create operation
    double runningTotal = 0;
    Stopwatch stopwatch = new Stopwatch();
    for (int i = 1; i <= runs; i++)
    {
        stopwatch.Reset();
        stopwatch.Start();
        action();
        stopwatch.Stop();
        runningTotal += stopwatch.ElapsedMilliseconds;

        double runningAverage = runningTotal / i;
        if (i % 10 == 0)
            Console.Write("\r{0:P0}     {1:N1}ms   ", ((decimal)i / runs), runningAverage);
    }
}
}