
Monday, December 30, 2013

CRM 2013 Notes Control only showing notes

CRM 2013 has a handy new feature for displaying activities, notes, and posts in a single control. This is very easy to configure by just selecting the "Notes" control in the form designer.


You can subsequently define the default view for this control in its properties. As an aside, it's also recommended to uncheck the "Display label on the form" check box, as the control displays better without it.



So far so good. Except for one small wrinkle - the "Activities" and "Posts" tabs for this control do not show...


I found a solution to this issue outlined in this blog post, although it does appear to be an issue even for forms that have already been "upgraded". Case in point: the form I was working with was a new form created in CRM 2013 for an existing entity, and the issue still appeared.

Anyway, as described in that post, the issue is fixed by changing the FormPresentation node for the form in question from 0 to 1.


After importing and publishing, the 3 tabs for this control display as expected.



Friday, December 27, 2013

CRM Manipulation Library Date Calculation Issue

Speaking of the CRM Manipulation Library solution, I encountered an issue with the date calculation logic, specifically with the somewhat misleadingly named "Add Days" function. Misleading in that if you look at the function, you will notice that in addition to adding days, you can also use it to add years, months, weeks, hours, and minutes to a given date.


This is an important clarification, as adding a month to a date is not always going to be the same as adding 30 days to it. Similarly for years (as in the case of a leap year). And obviously being able to specify the more granular hour and minute parameters is also helpful. We probably could have worked around not having the week parameter, but it does make life a little easier by not having to perform additional calculations.

Anyway, the issue I encountered is that, ironically, if you do not specify a non-zero "Days To Add" value and just supply one of the other parameters, the calculation will not work. That is, if we were to specify the parameters in the screenshot above (where we're trying to add years to the date), it would not work.

Luckily there is a simple workaround. Below is the same function call, except we trick it by passing a negative day value and then cancelling it out by passing the equivalent number of days in the week parameter.

If you needed to use the week parameter, then you could similarly pass in 24 hours and cancel it out by passing -1 to the day parameter.
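
Purely to illustrate the arithmetic (the real "Add Days" step is configured in the CRM workflow designer, not in script, and the function and parameter names below are just my own shorthand), here is a small sketch of why the cancellation trick produces the intended result:

function addOffset(date, years, months, weeks, days, hours, minutes) {
 // Mimics what an "add interval" calculation does with each parameter
 var result = new Date(date.getTime());
 result.setFullYear(result.getFullYear() + years);
 result.setMonth(result.getMonth() + months);
 result.setDate(result.getDate() + weeks * 7 + days);
 result.setHours(result.getHours() + hours);
 result.setMinutes(result.getMinutes() + minutes);
 return result;
}

var start = new Date(2013, 11, 27); // 27 Dec 2013

// What we actually want: add 2 years. Supplying 0 for "Days To Add" trips the
// bug, so pass -7 days and 1 week instead; the two cancel each other out.
var workaround = addOffset(start, 2, 0, 1, -7, 0, 0);
// Result: 27 Dec 2015, identical to simply adding 2 years.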


Thursday, December 26, 2013

Workflow Plugin CRM Online Compatibility

There are a lot of useful free add-ons available for Dynamics CRM that can add some serious customization features. The issue in many cases is that these add-ons were only released for the on-premise version of CRM.

For example, take the very useful CRM Manipulation Library, which contains some important calculation capabilities. I especially find the Date Utilities handy, as there is often a requirement to calculate an offset of days or business days as part of workflow logic. But when you try to import the solution into a CRM Online instance, you will receive an error that looks something like the following:



The good news is that this can be made compatible with CRM Online by following these steps:


  1. Unzip the solution into a folder
  2. Open the customizations.xml file
  3. Perform a search for the IsolationMode node
  4. Change the IsolationMode value from 1 to 2 (see the sketch below)
  5. Update the original zip file with the modified file
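
Step 4 boils down to a one-character change. If you prefer to script it, below is a minimal Node.js sketch of that edit; it assumes you have already unzipped the solution (steps 1-2) and will re-zip it yourself (step 5), and that the isolation mode appears as an IsolationMode element in customizations.xml - verify this against your own file first.

var fs = require("fs");

var path = "ExtractedSolution/customizations.xml";
var xml = fs.readFileSync(path, "utf8");

// 1 = none (on-premise only), 2 = sandbox (required for CRM Online)
xml = xml.replace(/<IsolationMode>1<\/IsolationMode>/g, "<IsolationMode>2</IsolationMode>");

fs.writeFileSync(path, xml, "utf8");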


Once you have done so, you should be able to import it into CRM Online. Applying this trick does not necessarily mean that the plugin will work, as it might still fail when it tries to execute. However, it may also just function without a problem, which is indeed the case with the CRM Manipulation Library. So it's definitely worth a shot. If your plugin does not work after performing this tweak, you'll have to do some debugging at the code level.

Tuesday, December 24, 2013

CRM 2013 Form Design Guideline

This post is a work in progress and is far from a definitive stance. Rather, it is somewhat exploratory/tentative/philosophical, based on experimentation with the CRM 2013 form designer. It may well be tweaked over time based on additional findings, user feedback, etc.

In previous versions of CRM, the "default" design for forms was pretty straightforward. That is, when configuring a form, you could feel fairly "safe" selecting the "tab with one section and two columns" design option as your starting point.


Sure, on occasion, when you knew you were adding lots of check box controls you might opt for a 3- or 4-column section, or conversely a 1-column section for a sub-grid. But the point is that you felt fairly certain you couldn't go too wrong picking the "tab with one section and two columns" option in terms of how the form would render, i.e. you were pretty certain that all fields would be easily visible, and if the form was lengthy, you could easily add new tabs to simplify form navigation.

Maybe it's just me, but I'm finding that I'm struggling a little bit more with the CRM 2013 forms. Suddenly we have a whole lot more form real estate to work with, such that forms that used the previous "tab with one section and two columns" standard layout (2011, 4.0, 3.0, 1.2) look rather sparse when viewed in CRM 2013. For example, take a look at the upgraded contact form in 2013.


This is not helped by the fact that Microsoft has essentially eliminated the tab "jump" navigation option, leaving you with a lot of "scroll and hunt" in upgraded forms. As an aside, I personally don't know if eliminating the tab navigation is a long term decision - it seems to me that it could easily have been incorporated without compromising the look and feel of the 2013 form layout (e.g. by having a drop down where the general tab is showing, or by running the tab names along the top of the page). In contrast, they have enabled a jump menu for navigating to different forms, so perhaps the long term design objective is to break up long forms into multiple mini-forms. Not quite sure at this point. But for now, I have to say that the decision not to provide this tab navigation "bridge" for forms upgraded from previous versions of CRM eludes me.

Anyway, the point of this post is essentially to define a new "default" for CRM 2013 forms. There are definitely a lot more form design options - a simple comparison between the 2011 and 2013 forms illustrates this. We jump from having 2 tab options to 6 tab options, which multiplies the possible form designs considerably.


And while there are benefits to having more options, there are also downsides to having so many choices. Correspondingly, I've found that I tend to vacillate a lot more than I did when working with previous versions of CRM. So I've been looking for guidance on how to think about form design in CRM 2013, but the most I've found is the following diagram, which is Microsoft's take on form ergonomics.



And while this is helpful, it doesn't really address the overall issue described above. Therefore the objective of this exercise is essentially to take a stance on "default" form designs. That is, just as with previous versions of CRM, I'd like to have a fallback design that I can feel confident in, based on the "you can't go too wrong" starting point premise. And when you boil the requirement down, I think you come to the following basic guidelines:

  1. The form should not look too sparse with all the new form real estate i.e. take advantage of the new space that is available on the form such that the need for scrolling down is greatly reduced.
  2. Bring sub-grids that will be commonly accessed to the body of the form so that most of what the user will need will be right in front of them (I'll refer to these as "prominent" sub-grids).
  3. Consider the laptop/tablet experience where monitors are smaller - some things that fit nicely on a maximized form on your desktop monitor will be cut off at these smaller screen resolutions.

We can also take a look at how Microsoft designed the core entity forms - account, contact, lead, opportunity - to hopefully draw some inspiration (as presumably they also took into account the above considerations).

When all is said and done, I will tentatively be employing the following "default" form design strategy:

  • Any form where notes/activities are relevant/tracked - I'll follow the guideline Microsoft has used, which is the "3-Column Varied Width" layout for the first form tab, with the notes/activities in the central column using the new activity, notes, and post control. This doesn't leave a generous amount of space for the other "prominent" sub-grids, which we'll place in the third column, but we'll live with that.



  • Any form that doesn't have notes/activities but has other prominent sub-grids - I'll be employing the "2-Column Varied Width" for the first form tab. The right column will be for the prominent sub-grids with a more generous allocation of space.



  • Any other form sections that are just a listing of fields - I'll be employing the "3-Column Equal Width" tab option.



That's pretty much it. This more or less throws out the "tab with one section and two columns" that was the staple of previous versions of CRM - it just looks way too sparse in CRM 2013. In fact, as a general guideline it throws out having any section with more than one column (all the examples shown above have tabs with multiple sections, not sections with multiple columns - a tad confusing). Because if you do add additional columns to the sections shown above, you're bound to run into issues with fields being cut off when the form actually renders.

When you boil it down, it means that I will be limiting myself to the 3 form options below when designing the initial CRM form (using the 3 points mentioned above as a general guideline to determine when to use each option).



... with perhaps a few exceptions here and there for very specific requirements. And then of course, forms can subsequently always be tweaked using one of the other myriad options based on user feedback.

As I mentioned at the outset - somewhat philosophical - but this guideline helps me make quick and clinical form design decisions that I can confidently feel will function as a very good starting point.

Monday, December 23, 2013

CRM 2013 Option Set Length

There are a few, let's say, particularities of the CRM 2013 UI experience. Some of them take a little getting used to, such as the new way of navigating the system, whereas others seem likely to be early release features that are destined to be "tweaked" in early product rollup releases... at least one would hope so!

A good example of this is the length of the drop down for option sets. Take for example, the following form:


As can be seen, the drop down for the title only shows two options at a time. Considering that there are at least 20 values in this option set, this experience is not likely to go down well with end users... and nor should it.

The way to get around this design issue is to add some spacers to the form as follows:


And after publishing, the form renders in a little bit more of a presentable fashion:



The downside is that when the option set is not in its drop down state there's a lot of unnecessary white space on the form.



So we have a workaround for this behavior, but I find it difficult to understand why it was designed in this way (maybe someone can enlighten me?). The way it worked in all previous versions seems much better, and hopefully this will be rectified shortly (at which point we'll likely have to go through all those forms and remove the spacers that were added as a workaround - so hopefully sooner rather than later).

Thursday, December 19, 2013

Send email with PDF attachment

Frequently there is a need to send out an email from CRM containing a generated PDF attachment. The following approach can be used to allow an end user to manually initiate the process with the click of a button via the user interface.

But what if we wanted this to happen automatically rather than relying on the user to manually execute the action? From my experience this is by far a more common scenario. A good example might be auto-generating an invoice and then emailing it to a customer once an invoice has been approved. Another one might be to generate a quote and have it sent off once the quote has been approved. The natural solution that we can look to in such a scenario would be to leverage the CRM workflow capability.

The first step would be to design some kind of report that would be used as the template for generating the PDF document. Once this is in place we can leverage the technique outlined in my previous post for publishing the report externally and making it context sensitive which is a key requirement.

This allows us to design a workflow where we can construct a URL that will produce a report for the record in question. For example, we can design a workflow as follows:


  1. Generate GUID for the record against which the workflow is running (using the workflow productivity tools)
  2. Create the URL passing in the GUID generated above as a dynamic parameter i.e. http://crm2011db01/ReportServer/Pages/ReportViewer.aspx?%2fCRMEZE2011TEST_MSCRM%2fUser+Summary&CRM_FilteredSystemUser=select  [systemuser0].*  from  FilteredSystemUser as systemuser0  where  (systemuser0.systemuserid = N'%7B{0}%7D')&rs:Command=Render    [%7B and %7D are the URL encodings for { and }, and {0} is the placeholder to be replaced by Param 1]
  3. Send email using the link generated above




Finally, you can also take the same URL and use the technique mentioned in this post to convert the report to a PDF and send it as an attachment via email.
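
For reference, here is a small jscript-style sketch of how the two URL variants can be constructed. The template is the one from step 2 above; the helper names are mine, and rs:Format=PDF is the standard SSRS URL access parameter for rendering straight to PDF (in practice the substitution happens via the workflow dynamic values, and the attachment handling is covered by the code in the linked post).

// URL template from step 2; {0} is the placeholder for the record GUID
var template = "http://crm2011db01/ReportServer/Pages/ReportViewer.aspx" +
               "?%2fCRMEZE2011TEST_MSCRM%2fUser+Summary" +
               "&CRM_FilteredSystemUser=select [systemuser0].* from FilteredSystemUser as systemuser0 " +
               "where (systemuser0.systemuserid = N'%7B{0}%7D')" +
               "&rs:Command=Render";

function buildReportUrl(recordGuid) {
 // Substitute the GUID generated in step 1 for the {0} placeholder
 return template.replace("{0}", recordGuid);
}

function buildPdfUrl(recordGuid) {
 // rs:Format=PDF asks SSRS to render the report directly as a PDF stream
 return buildReportUrl(recordGuid) + "&rs:Format=PDF";
}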

Note: Make sure that the service account under which the asynchronous service is running has access to the SSRS report folder containing the externally published CRM report.

Wednesday, December 18, 2013

Publish External Reports - Dynamic Content

Time to complete what I started... In my last post we walked through the steps to make an externally published report accessible to non-CRM users, which would seem to be the objective of the whole publish-externally exercise, as documented in the introduction to this topic.

However, following those steps leaves us with a fairly static report based on the parameters of the originally published report, which is going to be of limited benefit. For example, consider the scenario where I want to email a link to a non-CRM user that would allow them to view data from within CRM - for instance, a report on an opportunity whereby clicking the link lets them view the opportunity details. Unless we're able to pass parameters to this published report, we will be prevented from doing so, i.e. it will lack the necessary context sensitivity.

In order to overcome this limitation we'll be manipulating the CRMFilteredEntity report parameter mentioned in my previous post. We'll continue using the example of the User Summary report for the sake of illustration.

The first thing to do is to navigate to the report using the Report Server navigation i.e.:


Now click on your externally published report (in our case "User Summary") and retrieve the execution URL i.e. http://crm2011/ReportServer/Pages/ReportViewer.aspx?%2fCRM_MSCRM%2fUser+Summary&rs:Command=Render and you should be presented with something that looks like this:


Now take the URL and carefully graft on the value from the CRMFilteredEntity parameter (provided in my previous post). That is, your URL should now include the CRM_FilteredSystemUser parameter and look something like the following:

http://crm2011/ReportServer/Pages/ReportViewer.aspx?%2fCRM_MSCRM%2fUser+Summary&CRM_FilteredSystemUser=select  [systemuser0].*  from  FilteredSystemUser as "systemuser0"  where  ("systemuser0".systemuserid = N'{43AF6573-C97C-E011-9556-00155DA5304E}')&rs:Command=Render

And voila - the report will now render the User Summary report, returning only the user specified by the report parameter.


Also test with a non-CRM user to ensure that this is working outside of CRM authentication.

This exercise proves that we can manipulate the URL of this report to pass in parameters. Essentially all you need to do is update the query specified in the CRM_FilteredEntity parameter and this will filter the results of the report. Of course the same approach can be applied to any other report in the system. For example, if we were working with an account report we'd manipulate the query for the CRM_FilteredAccount parameter; if we were working with an opportunity report we'd manipulate the query for the CRM_FilteredOpportunity parameter. And so on and so forth.
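
To generalize the pattern, here is a small sketch that constructs such a URL for an arbitrary entity. The function, its parameters, and the simplified alias are mine; the example values reproduce the User Summary URL above, and you should verify the exact parameter value against what your own published report shows in Report Manager.

function buildExternalReportUrl(reportServerUrl, reportPath, entityName, recordId) {
 var filteredView  = "Filtered" + entityName;       // e.g. FilteredSystemUser
 var parameterName = "CRM_Filtered" + entityName;   // e.g. CRM_FilteredSystemUser
 var query = "select [e0].* from " + filteredView + " as e0 " +
             "where (e0." + entityName.toLowerCase() + "id = N'{" + recordId + "}')";

 return reportServerUrl + "/Pages/ReportViewer.aspx" +
        "?" + encodeURIComponent(reportPath) +
        "&" + parameterName + "=" + encodeURIComponent(query) +
        "&rs:Command=Render";
}

// Reproduces the User Summary example above
var url = buildExternalReportUrl(
 "http://crm2011/ReportServer",
 "/CRM_MSCRM/User Summary",
 "SystemUser",
 "43AF6573-C97C-E011-9556-00155DA5304E");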

Now of course this result doesn't lend itself very well to specifying report parameters manually - your users would lynch you if you tried proposing this to them!  But it does lend itself quite well to automated mechanisms such as via workflow whereby you can construct a URL using dynamic workflow fields resulting in a report URL in an email that a user just has to click on in order to see the relevant, context-sensitive, dynamic report content.

In the next post we'll take a look at performing those steps as well as seeing how this approach can also be leveraged to generate PDF documents that can be sent to external users or customers.

Tuesday, December 17, 2013

CRM 2013 Report Error: "An error has occurred. Try this action again..."

We were getting an odd error when running a CRM report in an environment that had been upgraded from CRM 2011 Online to CRM 2013 Online.

In short, we were getting the following rather unhelpful generic error message:


This error only occurred with a few of the records, but the report ran fine for others. In order to troubleshoot, I removed all the front end components of the report so that it would essentially produce a blank rendering. Even after doing this, the report still returned the error above.

So this seemed to point to the fact that there was some issue with the data set...

As it happened, this data set had a link-entity join clause, and I discovered that when you added the linked record such that the join would succeed, the report ran fine. In cases where the join did not bring back results, the report error occurred - this being our particular case.

So this explained why it occurred only for certain records and not for others in the system. However, it does not explain why this error occurs in the first place. At most, I would expect the report to render no results when a join does not bring back results; after all, in many cases it is perfectly valid to have a join that does not succeed. Even if there were an error caused by the missing join, I would expect to see it manifest itself in the returned report, as is the case, for example, when running a report where the sub-report it calls does not exist.

All in all, while I'm glad I discovered the issue, this does seem very much like a bug in the platform to me.

Wednesday, September 18, 2013

Publish External Reports - Credentials

My previous post introduced the topic of publishing reports for external usage and the conclusion drawn was that - as far as I can tell - the only reason why this action would be performed would be to grant non-CRM users a view into CRM data. But unless we make some changes to the report credentials, external users will be prevented from running the report. This post describes how to go about performing this step.

We'll use the default "User Summary" report for the sake of this exercise. In order to demonstrate how this works end to end, first modify the Default Filter for this report and arbitrarily add some filter criteria. We'll add a filter by user name.


Subsequently, edit the report and select the "Publish Report for External Use" option.


The report will now appear in the CRM report server home page (http://CRM/Reports/Pages/Folder.aspx?ItemPath=%2fCRM_MSCRM&ViewMode=Detail). In its current state it will only be accessible to CRM users. In order to make it accessible to other users, we need to have it impersonate a user who has the necessary credentials to run the report. By doing so, you will actually be giving permission to anyone who has the URL to run the report.

To do so, hover over the report name and click Manage from the menu that appears.


Let's first take a look at the report parameters. What's interesting to note is that anything that was specified as part of the default filter will be stuffed into a single parameter called CRM_FilteredEntity (in our case CRM_FilteredSystemUser).




So you'll see something like the following appear for this parameter based on the default filter that we used:
select  [systemuser0].*  from  FilteredSystemUser as "systemuser0"  where  ("systemuser0".systemuserid = N'{43AF6573-C97C-E011-9556-00155DA5304E}')

The "systemuserid" is the identity (GUID) of Adam Barr's user record (see above). If we had instead specified a default filter of "last name = Smith", then this parameter would look like this:

select  [systemuser0].*  from  FilteredSystemUser as "systemuser0"  where  ("systemuser0".lastname = N'Smith')

You get the picture. Understanding how this parameter is constructed will feature prominently in the next related post. Now let's move onto updating the report credentials.

Click on "Data Sources", create a custom data source, and specify a user that has the necessary permissions as shown.



Alternatively, create a shared data source using the same input as shown above and reference it via the "A shared data source" option. This is the recommended approach because if you have more than one external report, you can link each of them to the same shared data source rather than specifying the same credentials in each case (which also helps with maintenance when passwords need to change and the like). And finally, bear in mind that if your report has sub-reports then you already have more than one report that needs its credentials updated (yes, this needs to be applied to all of the generated reports).

Now that you've performed this action, test the report by running it with a non-CRM user and it should return a report that looks something like what we have below. Note how the result is being limited to the default filter defined for this report in CRM.


Next: We will build on this walk through by manipulating the parameters of the report in order to have the output be dynamic and context sensitive versus the static report output that we've just presented. Being able to do so provides all kinds of interesting options.

Tuesday, September 17, 2013

Publishing reports externally

It's been a while since I last posted. Things got a little busy over the summer. Anyway, during that period I collected quite a few tidbits here and there and hope to be posting them over the course of the next few weeks. We'll see how it goes.

As anyone who has been using CRM for a while knows, CRM reports can be published externally as shown in the screenshot below. Note: this option is only available for CRM On Premise, since there is no accessible report server with CRM Online.


After the report has been published it is copied to the Org_MSCRM home page in the report server i.e. http://reportserver/Reports/Pages/Folder.aspx?ItemPath=%2fOrg_MSCRM&ViewMode=Detail. Navigating to this location you will see the result of your publish action - the report along with any sub-reports that it referenced will be published as shown. In this case, the published report contained 3 sub-reports resulting in 4 reports being published.



But the question has to be asked - it seems like a nifty little piece of functionality, but what is the practical application of it? To compound it further, if you try to run this report from the reporting server location, it will only work for a CRM user that has access to the CRM database. So why would you not just execute the same report from within CRM? The latter option provides you with the CRM pre-filter option and a generally much better overall user experience - so there is no reason I can think of for a CRM user to execute the report from anywhere other than within CRM.

In light of all of the above, the only reason that I can think of for using this feature is to provide a report that can be accessed by non-CRM users. There are a number of scenarios in increasing degrees of complexity:

  1. Just provide a URL for external users to be able to access the report
  2. Use workflow to send out a report link to non-CRM users based on a given business scenario
  3. Use workflow to email a PDF document based on an externally published report

As will be demonstrated over the next few posts, each of these builds on top of the previous one. An issue that we'll have to contend with in both cases 2 and 3 is context sensitivity, i.e. typically if you're automating something, you're going to want it to generate a report that is specific to the quote, order, invoice, etc. that it was generated from.

The first thing we need to deal with is giving access to an externally published report. Because, if you've been paying attention, there is an inherent contradiction - externally published reports cannot be accessed by non-CRM users... By default.

In the next post, I will walk through the steps for allowing external users to access the CRM published report.

Thursday, June 6, 2013

CRM 2011: Changing SQL Port

By default the CRM installation wizard uses the default SQL port i.e. 1433. I don't believe you are able to specify a different port as part of the initial installation, so you'll need to ensure this port is left open for this step. Once CRM is installed you can modify the SQL port that it uses.

To do so you can follow the steps described in this post, which covers the procedure for a CRM 4.0 installation. The steps are only marginally different for CRM 2011.

The difference lies in the modifications required to the MSCRM key in the registry: you only need to modify the configdb connection string (typically by appending the new port to the Data Source value, e.g. SQLSERVER,1434). The other values (database, metabase) are no longer relevant for CRM 2011.

I'd recommend the following sequence for implementing the above change, so that the port change is isolated in case of issues:
  1. Open port 1433
  2. Install CRM and confirm installation
  3. Make port changes above in the CRM installation
  4. Make sure the new port is opened
  5. Restart IIS on the CRM server and restart the SQL service on the database server
  6. Refresh CRM to ensure it is still running. If not, you did something wrong.
  7. Block port 1433
  8. Refresh CRM to ensure it is still running. 

Wednesday, May 29, 2013

Deleting iPhone Contacts

One of the side benefits of writing blog posts is that over time you create your own knowledge base that you can refer back to. I know I am not alone when I say that there have been countless occasions where I seem to recall having implemented a particular solution in the past... only to struggle to figure out exactly where, when, and how it was done. My experience is that once you have blogged about a topic, it is much more likely to be committed to long term memory, which, when combined with the power of search... voila! I guess blogging is an example of the proverb: "You never really own something until you're able to give it away" (Bill Nichols).

In light of that long-winded intro...

As part of rolling out the Outlook client functionality to end users, we of course needed to clean up existing contacts so that duplicate contact records would not be created when syncing from CRM.

In most cases, this is just a matter of cleaning up your Outlook contacts: since Outlook contacts sync to the various mobile apps, cleaning up in Outlook results in a clean-up everywhere else.

We were however presented with a case where the contacts were stored directly on the iPhone and we needed to get rid of them. Given that various users had thousands of contacts, going through the list manually was not really an option.

Fortunately we were able to purchase a very inexpensive iPhone app in order to solve the problem. The app name is Contacts In : Import CSV & Manage Groups. Steps:

Select All Contacts:


 Click Action menu and choose "Delete contacts":


Tuesday, May 28, 2013

CRM 2011 Installation Requirements

This question gets posed to me every so often. In light of that, this post provides information describing the installation requirements for CRM 2011. I will also highlight what I consider to be practical best-practice recommendations, and at the end I have provided server breakdown installation scenarios.

First of all, please refer to the very useful link below. It contains a step by step visual walk through of all the installation components, including SQL, CRM prerequisites, CRM 2011, Email Router etc. The rest of this post makes abundant reference to this walk through.

http://social.technet.microsoft.com/wiki/contents/articles/3176.how-to-install-microsoft-dynamics-crm-2011.aspx

The following link similarly provides a visual walk through for installing and configuring the Microsoft Dynamics CRM E-mail Router (Online & Exchange Online).

http://social.technet.microsoft.com/wiki/contents/articles/4686.microsoft-dynamics-crm-e-mail-router-configuration-crm-2011-online-exchange-online.aspx

Finally, bookmark this link. It contains a very useful set of Q&A for commonly asked questions related to CRM 2011 server setup.

http://blogs.msdn.com/b/crminthefield/archive/2012/02/11/crm-2011-server-setup-commonly-asked-questions.aspx

Hardware and Software Requirements

  • See sections "Hardware Requirements" and "Software Requirements" 
Credentials
  • Installation User - see section "Required for installation of Dynamics CRM 2011" for permissions required for this user. 
    • Recommendation: Create a user called "CRMAdmin" under which you will install CRM
  • Services - CRM creates a number of windows services
    • Recommendation: Have a separate account for each of these. For example: 
      • Sandboxing service – e.g. create account CRMSandbox (see section “Microsoft Dynamics CRM Sandbox Processing Service” for permissions required for this user)
      • Async service – e.g. create account CRMAsyncProcess (see section “Microsoft Dynamics CRM Asynchronous Processing” for permissions required for this user)
      • Web Deployment service – e.g. create account CRMDeploymentWS (see section “Web Deployment Service (CRMDeploymentServiceAppPool Application Pool identity)” for permissions required for this user)
      • Application service – e.g. create account CRMAppService (see section “Application Service (IIS Application Pool identity CRMAppPool)” for permissions required for this user)
Ports

See section "Ports" for a full description of default ports that are used as part of the CRM solution. You can also refer to the following post for a visual representation and link to a white paper covering the topic.

Installation Summary

The reference provides a walk through of each of these procedures:
  • SQL Server 2008 R2 – Scroll to section “Installing SQL 2008 R2” for a step by step guide. I’d follow these instructions although I’d select Mixed Mode (step 5)
  • CRM Prerequisites – Scroll to section “Installing pre-requisites in Dynamics CRM Server 2011” – Walks through server roles that need to be installed on the CRM server.
  • CRM Installation – Scroll to section “2011 Dynamics CRM Installation” – this walks you through the wizard installer screens. Note:
    • License key - you should obtain this prior to beginning the CRM installation.
    • Server Roles - See write up below assuming a 3 server configuration.
    • Organizational Unit – This links to AD. 
      • Recommended: Create an OU called MSCRM or similar (it can hang off the main OU). Note: the OU is required for the installation; CRM just places its security groups in the OU (UserGroup, ReportingGroup, PrivUserGroup, SQLAccessGroup). It is good practice to use a separate OU rather than an existing one.
    • Service Accounts – per above
    • Web site – I recommend installing to the Default Web Site. If that’s an issue, we can use port 5555. There is no real practical difference other than the URL (if configuring IFD, the IFD URL will override this)
    • Email Router – this will point to the server which has the Email Router installed.
    • Organization Settings 
      • Display name should just be Your Company Name. This will create a database called YourCompanyName_MSCRM. Specify a different display name if you wish.
      •  Review other settings - generally no need to change the defaults supplied
    •  Reporting Server – This can point to any reporting server. Typically you would point to the report server of the database against which CRM is installed but if you have a dedicated reporting server, you can point it to that location.
  • CRM Reporting Extensions – Scroll to section “Installing Microsoft Dynamics CRM Reporting Extensions”
    • This will be installed on the SQL Server Reporting Services instance which essentially allows authentication from CRM to pass through to SSRS. 
  • Email Router – See server breakdown below for recommendation on which server to install
  • Rollups – Applied as necessary to all the CRM products (CRM Server, Email Router, Reporting Extensions). The latest rollup can be obtained from here.

Production Server Breakdown

The following is the recommended setup in a production environment. If you do not have two application servers, then combine the two application server roles onto a single server.

Database Server
  • SQL
  • Reporting Extensions
Front End Server 
  • CRM Pre-requisite roles (see above)
  • Install CRM (all roles) and Disable the Back End Server Roles (Async Processing Service, Sandboxing Service)
    • Having the roles available allows these roles to be started as a failover solution

Async Server (Mid Tier) 
  • CRM Pre-requisite roles (see above)
  • Install CRM and disable Front End Server Roles (Web Application Server, Organization Web Service, Discovery Web Service, Help Server)

Development/Test Server Breakdown

Typically it is sufficient for a development/test environment to be hosted on a single server. If that is the case, then install SQL and CRM with all server roles on the server.


Tuesday, April 30, 2013

Contact Address Solution

In my previous post I provided some background with regards to the challenge of managing account and contact addresses in CRM. I also provided a solution for handling a simple scenario where the contact's account only ever has a single address. But the reality in most cases is more involved than that.

We will be walking through the scenario described at the end of my previous post, i.e.:

"It would therefore be better to stick to a single account with multiple addresses and allow for the contact records to be associated to one of these account addresses. In the next post, I will present an elegant approach to handling this requirement."
The objective of this solution is to provide the flexibility of a custom lookup solution while not compromising any features/functions in the system by ensuring that the default (out of the box) address fields are populated with the address information.

This requirement covers a number of configuration scenarios and therefore may be useful as a good overall "configuration" tutorial. Topics referenced:

  • Default vs. Custom entities - when to resort to custom
  • Use case for the "Related Records Filtering" configuration option
  • REST API
  • Distribute Workflow
  • Plugin requirement

First of all, we need to create a new address entity. The reason is that the ability to leverage the default More Addresses entity is limited; specifically, the system does not allow custom relationships to be created against More Addresses, which is a key requirement for the solution we are trying to implement. So we create a custom entity like the one shown, and in order to eliminate user confusion we rename and remove the default More Addresses from the account and contact forms.


 Next we create an N:1 relationship with Account - this essentially replaces the More Addresses link in the account entity.


... and a 1:N relationship with Contact - this is going to allow for an address selector at the contact level.


We now add the relationship field created in the step above to the contact form. In the "Related Records Filtering" section we need to filter it to the Parent Account. This will limit the addresses shown in the lookup dialog to addresses defined at the parent account level.



The next thing we need to do is add some jscript to the form. The objective here is to retrieve the address fields from the address that has been selected and copy them to the corresponding address fields on the contact record. The jscript below performs the following logic:

  • Copy/clear the address fields based on the address selected from the lookup, using the REST API (note this uses the retrieveEntityById function described in this post)
  • Make fields read only if address is selected via the "picker" (the idea here is that you can enter an ad-hoc address if you want at the contact level, but if you select from one of the lookup values then the address values will be controlled by the official parent address records).

function AddressPicker_OnChange() {

 var entity = "snt_address";
 var entityid = Xrm.Page.getAttribute("snt_addressid").getValue();
 var fields = "?$select=snt_Street1,snt_Street2,snt_Street3,snt_City,snt_State,snt_Zip,snt_Country";
 
 if (entityid != null) {
  // Retrieve the selected address and copy its fields in the callback below
  retrieveEntityById(entity, entityid[0].id, fields, UpdateAddressFields);
 } else {
  Xrm.Page.getAttribute("address1_line1").setValue(null);
  Xrm.Page.getAttribute("address1_line2").setValue(null);
  Xrm.Page.getAttribute("address1_line3").setValue(null); 
  Xrm.Page.getAttribute("address1_city").setValue(null);
  Xrm.Page.getAttribute("address1_stateorprovince").setValue(null); 
  Xrm.Page.getAttribute("address1_postalcode").setValue(null); 
  Xrm.Page.getAttribute("address1_country").setValue(null);   
 }

 Xrm.Page.getAttribute("address1_line1").setSubmitMode("always");
 Xrm.Page.getAttribute("address1_line2").setSubmitMode("always");
 Xrm.Page.getAttribute("address1_line3").setSubmitMode("always");
 Xrm.Page.getAttribute("address1_city").setSubmitMode("always");
 Xrm.Page.getAttribute("address1_stateorprovince").setSubmitMode("always");
 Xrm.Page.getAttribute("address1_postalcode").setSubmitMode("always");
 Xrm.Page.getAttribute("address1_country").setSubmitMode("always"); 
 
 AddressRender();
}

function UpdateAddressFields(entityData) {
 if (entityData != null) {  
  Xrm.Page.getAttribute("address1_line1").setValue(entityData.snt_Street1);
  Xrm.Page.getAttribute("address1_line2").setValue(entityData.snt_Street2);
  Xrm.Page.getAttribute("address1_line3").setValue(entityData.snt_Street3);
  Xrm.Page.getAttribute("address1_city").setValue(entityData.snt_City);
  Xrm.Page.getAttribute("address1_stateorprovince").setValue(entityData.snt_State);
  Xrm.Page.getAttribute("address1_postalcode").setValue(entityData.snt_Zip);
  Xrm.Page.getAttribute("address1_country").setValue(entityData.snt_Country);
 } 
}

function AddressRender() {
  // Also call this function from the form OnLoad event
  var disabled;
  if (Xrm.Page.getAttribute("snt_addressid").getValue() != null)
   disabled = true;
  else
   disabled = false;
   
  Xrm.Page.getControl("address1_line1").setDisabled(disabled);
  Xrm.Page.getControl("address1_line2").setDisabled(disabled);
  Xrm.Page.getControl("address1_line3").setDisabled(disabled);
  Xrm.Page.getControl("address1_city").setDisabled(disabled);
  Xrm.Page.getControl("address1_stateorprovince").setDisabled(disabled);
  Xrm.Page.getControl("address1_postalcode").setDisabled(disabled);
  Xrm.Page.getControl("address1_country").setDisabled(disabled);  

}

For the sake of illustration, after the above jscript is implemented, when an address record is selected the fields are copied as shown:


Now we need to implement logic that will keep addresses in sync. For example, if the address shown in the screenshot is changed and there are 5 contacts that are linked to this address, then we want to make sure that the address fields are propagated to all 5 contact records. This is fairly easy to accomplish using the distribute workflow. The screenshots below show the 2 workflows required in order to configure this option.


The last piece of the puzzle requires a plugin. Basically at this juncture we can define addresses against the account entity that will appear in the lookup for contacts tied to this account:


But what about the address defined on the account itself? This in essence is the primary address of the account, and we also want it available for selection for the contacts. Therefore the plugin logic required here is to automatically maintain the PRIMARY address record shown above based on the values entered on the account form (as additional validation, you should also define jscript on the address entity to make the fields read only when the address name = PRIMARY, and prevent users from manually keying in an address with this name, i.e. reserve PRIMARY for the plugin only). A sketch of that companion jscript is shown below.
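
Here is a minimal sketch of that companion jscript for the custom address form. The snt_name schema name (and the lowercase snt_* field names) are assumptions based on the entity created earlier, so substitute your own; wiring it up is just a matter of registering AddressForm_OnLoad on the form's OnLoad event.

function AddressForm_OnLoad() {
 var nameAttr = Xrm.Page.getAttribute("snt_name");
 var isPrimary = (nameAttr != null && nameAttr.getValue() == "PRIMARY");

 var fields = ["snt_name", "snt_street1", "snt_street2", "snt_street3",
               "snt_city", "snt_state", "snt_zip", "snt_country"];

 for (var i = 0; i < fields.length; i++) {
  var control = Xrm.Page.getControl(fields[i]);
  if (control != null) {
   // Lock the record down when it is the plugin-maintained PRIMARY address
   control.setDisabled(isPrimary);
  }
 }
}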

The end result? A lookup on the contact form that allows you to select from the specified account addresses.



Tuesday, April 23, 2013

Handling Account and Contact Addresses

In my experience one of the most challenging issues to deal with in a CRM implementation is dealing with account and contact addresses. As is well known, the customer in CRM can either be the account or the contact. In a B2C business model, this is not likely to be a big deal as your contact record becomes the center of your universe and therefore all address information should likewise be on the contact record.

In a B2B business model, your customer is represented by account records and the contacts (usually employees) belonging to those accounts. The challenge in this case is that you have address information on both account and contact records, and the storage is neither efficient nor easy to maintain - at least out of the box. Before delving into the justifications for that statement, let's first stipulate that having the address information stored on the contact record in the default address fields (address1_line1, etc.) is important, because many default functions rely on this information. For example:


  • Syncing contacts to Outlook - if you don't have the address information stored on the contact, the address information will be missing on the synced contact
  • Mail merge - if you want to pull in contact address details while creating your merge documents, these will need to exist on the record

The reason why the storage of address information is not efficient is that in most cases, contacts of an account have the same address as the parent account (for example, with employees the address information is generally the business address, not a personal address). Therefore storing it redundantly in 2 places is, well, redundant.

The reason why it is not easy to maintain is because it needs to be updated in multiple places. For example, let's assume you have an account with 5 contacts all with the same address information. If the business address changes then all 5 linked contact records should be updated as well. This of course can be done manually and perhaps even "optimized" by using tools such as the bulk edit tool (in which case the above scenario results in 2 manual steps) - however I am firmly of the opinion that if such updates are not automated the data will inevitably fall out of sync. Call me a skeptic but users can generally not be relied on to apply such manual procedures with rigor.

Fortunately if the scenario that you have resembles the one described above you can apply tools such as the Copy Utility or the Distribute Workflow to automate the updates. Both of these tools can be configured to copy down updates made from the account record to the contact records beneath them.

Frequently, however, companies have more than one office and the employees will be spread among these offices. So simply applying a straight-down copy will not work. A simple way to approach this is to use the account hierarchy, whereby each office represents a separate account record with the relevant employees linked to each of the offices (with each having a separate address that will feed to the contact records). I would however argue that creating a separate account record solely to deal with the address issue is a little bit of overkill. And it results in disjointed information, such that in order to "see" everything related to an account you need to view all the account records and/or write custom reports to pull all the information together. Essentially you are creating "artificial" accounts in order to address the problem, and I am not a fan of that. I believe that the account hierarchy should be used when the accounting requirements (reporting etc.) necessitate it.

Aside from this, there are also other practical reasons why this approach can introduce problems. For example, if you have a mail merge requirement (or similar) whereby you want to pull the account name into the body of the letter, email, etc., the customer may not be too happy with the fact that the letterhead will go out from "ACME Medford" instead of just plain old "ACME". On the other hand, naming all these accounts with the same name of "ACME" will result in "duplicates" which will confound end users.

It would therefore be better to stick to a single account with multiple addresses and allow for the contact records to be associated to one of these account addresses. In the next post, I present an elegant approach to handling this requirement.

Thursday, April 11, 2013

Searching Workflow Processes

I was recently upgrading a 4.0 environment to 2011, and in the process I needed to identify cases where workflow plugins were used in order to ensure that they were compatible with the upgraded version. The problem is: how do you go about finding which workflows use a specific workflow plugin?

The answer is fairly straightforward and is useful not only for upgrade scenarios but also in other cases where you might be searching for a particular workflow that has a specific condition, action, or workflow plugin. That is, you can use this as a general purpose mechanism for finding workflows with a particular clause.

Let's start with identifying workflow plugins. For example, let's say that I want to find workflows that use the following workflow plugins:



The following query will find cases where the GetDate method is used. You could similarly search for the other methods by simply replacing the search clause.

select distinct  name from FilteredWorkflow
where statecodename = 'Published'
and activities like  '%DateUtilities%'
and activities like  '%GetDate%'

This returns a result of "App Server Install". And sure enough, if I open up that workflow in CRM, I see that the workflow plugin is called.



Alternatively let's say I was searching for another clause within the workflow e.g. 

select distinct  name from FilteredWorkflow
where statecodename = 'Published'
and activities like  '%region= Australia%'

Once again this returns the "App Server Install" workflow, and if I scroll a little further down I can see that clause in the workflow as well:



By now you should be getting the idea. Interestingly, though, if you try the same search in a CRM 2011 environment you will not get any results back. To make it work in 2011, you need to tweak the queries very slightly, replacing the "Activities" column with the "Xaml" column. The following will work in your CRM 2011 environment:

select distinct  name from FilteredWorkflow
where statecodename = 'Published'
and Xaml like  '%DateUtilities%'
and Xaml like  '%GetDate%'

select distinct  name from FilteredWorkflow
where statecodename = 'Published'
and Xaml like  '%region= Australia%'

Wednesday, April 10, 2013

Spell Checking Options

I came across this topic in my email archives. I think it's still a relevant topic so I thought I'd post it. It deals with options for spell checking within Dynamics CRM.

One of the simplest options of course is to use the spell checkers that come built into the various add-on toolbars such as the Google toolbar.


More sophisticated options employ a spell checker API, where the experience is similar to spell checking in applications such as Microsoft Word. I have implemented 2 such versions of the spell checker.

The first one, powered by webspellchecker.net, is a paid-for solution ($250 per year for up to 10000 sessions and then $0.025 for each additional session - see the web site for additional pricing options). This tool is very intuitive and will be familiar to anyone who has used the Microsoft Word spell checker.



The second option was to use a free product called NetSpell – this is deployed entirely on the server, i.e. the check occurs against a dictionary defined on the server. This tool is also fairly intuitive, however the spell checker is not quite as comprehensive as webspellchecker.net, which I think is evident when comparing the screenshots.



In summary, below are the pros/cons of each approach:

webspellchecker.net Pros

  • Quite a bit more comprehensive in terms of functionality including an easier to use interface as well as grammar and thesaurus options
  • Very intuitive
  • Allows for custom dictionary to be defined
  • Is a supported product and will be maintained over time presumably with incremental improvements, dictionary term additions, etc.


webspellchecker.net Cons:


  • Is not free: $250 per 10000 sessions. To get a true cost, estimate the number of users who will be using this on a daily basis multiplied by the number of times they are likely to use it per day, e.g. annual sessions = #users * #dailysessions * 200 (200 being roughly the number of work days per year), and map that to the pricing tiers
  • Is an online solution although there is also a licensed downloadable version for a not insignificant cost

NetSpell Pros

  • Fairly comprehensive
  • Fairly intuitive
  • Allows for custom dictionary to be defined
  • On premise solution so not reliant on network performance
  • It is free


NetSpell Cons

  • Is not a supported product in case of product issues
  • Dictionary is not auto maintained
  • User interface not as slick as that of webspellchecker.net 


Thursday, April 4, 2013

TDE and CRM

Although the title of this post is "TDE and CRM", the reality is that it could be titled "TDE and SharePoint" or "TDE and Any App Running on SQL Server". The point is that the CRM database, like any other application database running on SQL, can be encrypted using TDE. There really are no CRM-specific settings required when setting up TDE.

TDE stands for Transparent Data Encryption and became available with SQL Server 2008 Enterprise. TDE essentially protects your "data at rest", i.e. it protects your database files and backups from being useful if stolen - as in someone backing up the database and restoring it in some other location where the contents can be analyzed via SQL.

Essentially, when you apply TDE you are applying a certificate that encrypts the back-end database files (it only encrypts the files on disk; when you access a TDE database via SQL there is no difference to the usual experience). You will only be able to restore the database to another database server if that server has the same TDE certificate applied. Otherwise you will be prevented from restoring the database - you will receive the "cannot find certificate with thumbprint" error:


The backed up database file will be similarly unintelligible.

As the name itself implies, TDE is transparent. Meaning this configuration is applied on the database itself, and any application with a back-end SQL database running on top of it will be oblivious to TDE. The only impact will be a slightly increased level of CPU utilization on the database server as it handles the encryption for SQL transactions.

In light of this, it is difficult not to recommend using TDE. It is simple to apply, it is completely transparent to applications, and it protects your data... with the only downside being higher CPU utilization, which for most databases won't make a difference (assuming the old adage "use it or lose it") and is easily compensated for in cases where the additional churn pushes your CPU over the edge.

Below are some scripts for setting up TDE on a SQL 2008 R2 Enterprise server:


/* If you wish to copy the certificate from an old server you will need to back it up */
--Backup the certificate from the 4.0 database server:

USE master
BACKUP CERTIFICATE TDECert
TO FILE = 'C:\TDECRMCERT.BAK'
WITH PRIVATE KEY ( FILE = 'C:\TDE.key',
ENCRYPTION BY PASSWORD = 'EncryptionPassword' )

/* On a new server you will need to create a certificate using the backed up one from the old server.
   This procedure assumes you copied the backed up certificate to C:\ root on the new server
 */


-- Create a master key. NB: Record the password for future reference.
USE master
CREATE MASTER KEY ENCRYPTION BY PASSWORD = 'MasterKeyPassword'

-- Backup the master key. NB: Store the backup file and record the password for future reference.
USE master;
BACKUP MASTER KEY TO FILE = 'C:\Master_Key.bak'
ENCRYPTION BY PASSWORD = 'MasterKeyBackupFilePassword'

-- Create a certificate by importing the certificate backed up in the first step.
USE master
CREATE CERTIFICATE TDECRM 
    FROM FILE = 'C:\TDECRMCERT.BAK'
    WITH PRIVATE KEY (FILE = 'C:\TDE.key',
    DECRYPTION BY PASSWORD = 'EncryptionPassword')

-- Backup the certificate. If you’re using the imported certificate this step is optional since the file 
-- you used to create the new certificate is already the backup file so you’ll just be recreating this file. 
-- NB: Store the backup file and record the password for future reference.

BACKUP CERTIFICATE TDECRM
TO FILE = 'C:\TDECRMCERT.BAK'
 WITH PRIVATE KEY ( FILE = 'C:\TDE.key',
ENCRYPTION BY PASSWORD = 'EncryptionPassword' )

-- For each database you wish to restore to from a TDE enabled database you will need to apply the certificate
-- and enable encryption.
USE Vanilla_MSCRM
GO
CREATE DATABASE ENCRYPTION KEY WITH ALGORITHM = AES_256
Encryption By Server Certificate TDECRM
GO

ALTER DATABASE Vanilla_MSCRM
SET ENCRYPTION ON