Jul 03

CRM 2011 Upgrade Analysis Toolkit

This post is mainly for my own benefit – to save me Googling these tools each time I need them.

Handy tools for analysing a Dynamics CRM 2011 system to check for issues when upgrading to CRM 2015 (via 2013).

 

Jun 25

Dynamics CRM Exchange Server Side Synchronisation Troubleshooting

I recently had to help a client troubleshoot their Dynamics CRM 2013 and Exchange 2010 server-side synchronisation configuration, which was not working in one of their environments. We were attempting to use impersonation so that the CRM Async Service account could synchronise mailboxes for CRM users.

We kept getting HTTP Unauthorized errors after running the ‘Test and Enable Mailboxes’ process. I tried all the different types of configuration – storing the credentials in CRM, Windows Authentication (which relies upon the service account), even storing the credentials for individual users. At the time I couldn’t confirm that impersonation was configured correctly, but I felt something else was wrong.

When individual user accounts failed to work with their credentials stored in CRM I knew something was definitely wrong. Continually getting this HTTP Unauthorized error message was driving me mad because I knew the credentials were correct – I could log on to Outlook Web Access with that account (which also confirmed that network connectivity from the CRM box to the Exchange box worked).

In the end I checked the Exchange IIS settings for the EWS virtual directory, which hosts the Exchange Web Services called by Dynamics CRM. For some reason Windows Authentication in IIS was disabled. This meant that no clients were being challenged (prompted for credentials), so no credentials were being sent back, which meant that all calls were unauthenticated. This matched what I saw in the IIS logs – 3 x 401 errors with no username logged. The fix was simple – enable Windows Authentication in IIS for the EWS virtual directory – and voila, individual user credentials stored in CRM started working. Progress.
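
If you’d rather check (and fix) this from the command line than click through IIS Manager, here’s a minimal PowerShell sketch using the WebAdministration module. Run it on the Exchange server; “Default Web Site/EWS” assumes a default Exchange install, so adjust if yours differs.

    Import-Module WebAdministration

    # Is Windows Authentication enabled on the EWS virtual directory?
    Get-WebConfigurationProperty -PSPath "IIS:\" -Location "Default Web Site/EWS" -Filter "system.webServer/security/authentication/windowsAuthentication" -Name enabled

    # If that reports False, enable it
    Set-WebConfigurationProperty -PSPath "IIS:\" -Location "Default Web Site/EWS" -Filter "system.webServer/security/authentication/windowsAuthentication" -Name enabled -Value $true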

I then had some minor problems to fix with the impersonation configuration, but that led me to the next challenge – how do you know for sure that impersonation is configured correctly? I found a very useful tool, initiated from PowerShell, that is essentially a SOAP client. It allows you to provide credentials for an account (in my case the service account), communicate with the Exchange server via EWS and attempt to open other mailboxes (plus much more). This helps prove whether the configuration is correct or not.

This tool is called the Simple Exchange Email Client For PowerShell and is available here. It also relies upon these DLLs available here. Install it on a server (it doesn’t have to be the Exchange server, though running it there rules out other complications), run it up and give it a go. Invaluable. A more manual method would be to use a tool like SoapUI to hand-craft the web service calls – but without being completely familiar with the API that would take longer.
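
If you want to script the same test yourself, the EWS Managed API DLLs mentioned above can be driven straight from PowerShell. A minimal sketch – the DLL path, URL and account names here are placeholders for your own values:

    Add-Type -Path "C:\Program Files\Microsoft\Exchange\Web Services\2.0\Microsoft.Exchange.WebServices.dll"

    $service = New-Object Microsoft.Exchange.WebServices.Data.ExchangeService([Microsoft.Exchange.WebServices.Data.ExchangeVersion]::Exchange2010_SP2)
    $service.Credentials = New-Object System.Net.NetworkCredential("svc-crmasync", "ThePassword", "CONTOSO")
    $service.Url = New-Object Uri("https://mail.contoso.com/EWS/Exchange.asmx")

    # Impersonate a CRM user's mailbox - the same thing the CRM Async Service does
    $service.ImpersonatedUserId = New-Object Microsoft.Exchange.WebServices.Data.ImpersonatedUserId([Microsoft.Exchange.WebServices.Data.ConnectingIdType]::SmtpAddress, "some.user@contoso.com")

    # Binds to the user's Inbox if impersonation is configured correctly; throws if not
    $inbox = [Microsoft.Exchange.WebServices.Data.Folder]::Bind($service, [Microsoft.Exchange.WebServices.Data.WellKnownFolderName]::Inbox)
    $inbox.DisplayName

If the Folder.Bind call returns the Inbox, the service account can successfully impersonate that user and CRM should be able to do the same.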

Then, finally, you can configure the CRM Server Profile in the usual manner and enjoy much success.

 

 

May 19

There are too many active security negotiations or secure conversations at the service. Please retry later.

If you’ve got a load-balanced Dynamics CRM on-premises installation, be careful how you configure the load balancer. An incorrect setting can limit the capacity of your system.

During load testing of a client’s Dynamics CRM environment we encountered a problem where the CRM web services appeared to be saturated and would no longer accept connections – logging the following error:

There are too many active security negotiations or secure conversations at the service. Please retry later.

When load testing I expect to reach a breaking point eventually, but this was happening with very few connections (fewer than 50 over a few seconds).

The basic configuration was as per the diagram below.

[Diagram: custom .NET web servers behind one load balancer calling two CRM servers behind another]

 

Some custom .NET web services had been written using the CRM 2013 SDK and deployed to two load-balanced web servers. The CRM itself had two servers (both with the Full Server role) behind a load balancer as well. The load balancers were Kemp LoadMaster virtual appliances.

When we generated a whole lot of requests to the custom .NET web services the errors started appearing very quickly, resulting in the CRM web services no longer accepting connections and requiring an iisreset to restore normal order. By generating load against the CRM OData web service directly we knew we had ample capacity in the infrastructure and that the native product had no problem handling waaaaay more requests than we saw from the custom web services. We pointed the finger at the developers for not closing their connections, but they showed us the code and indeed they were closing them.

So the next thing we checked was the load balancers. The NLB fronting the CRM servers (purple above) had a session persistence setting that disagreed with the whole setup. WCF secure conversations are negotiated over a multi-message handshake, so without correct persistence consecutive messages from the same client can land on different CRM servers, leaving half-open negotiations that accumulate until the service hits its limit and throws the error above. Changing the persistence mode to Super HTTP resolved the issue and the custom web services were then able to handle the desired number of requests. This issue may not be limited to the Kemp LoadMaster and may crop up in other scenarios if you don’t have persistence configured correctly.

This wasn’t found with ‘normal’ user-based testing of the system (because that was only a couple of users generating minimal traffic), but by using automated tools to generate the load (Visual Studio 2013 Ultimate). The load which flushed out the issue was waaaaay less than the system would have experienced once live, so this time the exercise of load testing certainly paid for itself by identifying the issue in advance. Happy ending.

Apr 18

Microsoft Dynamics Compatibility

This is one of those posts for my own bookmarking benefit – the following links give you the detailed breakdown on compatibility with Microsoft Dynamics CRM 2011 / 2013 / 2015.

Compatibility with Microsoft Dynamics CRM 2011

Compatibility with Microsoft Dynamics CRM 2013

Compatibility with Microsoft Dynamics CRM 2015

And while we’re at it: Dynamics CRM Build Numbers

Mar 02

Dynamics CRM performance troubleshooting

If you’ve got a Dynamics CRM 2011 / 2013 / 2015 on-premises installation that isn’t performing the way you (or your users) think it should, then this article gives you information on how to diagnose where the problem(s) lie and some suggested solutions. Some of these tips and tricks apply to CRM Online as well, so read on, cloud customers.

 

Stating the obvious

The first thing you’re going to need to understand is: what is slow about your CRM, and how have you measured that? You’re going to need a handle on which aspects of performance are being reported as problematic if you’re going to solve them. Often there’s a multitude of factors that lead to poor performance, so it’s important to establish some kind of baseline measurement of what the current ‘poor’ performance is, understand what acceptable performance looks like, and then work out what is causing the difference.

  • Is it ‘the vibe’ reported by users – otherwise known as the user perception of performance, when accessing the web UI?
  • Is it finding data?
  • Is it editing and creating data?
  • Is it web service calls from some custom integration?
  • Is the poor performance consistent (easily reproduced), or intermittent?
  • Has performance gotten worse over time?
    • Could it be related to increased number of users, or increased data?

 

Diagnostic Tools

Following are some of the tools that I find handy when troubleshooting performance problems with a CRM installation; they should be familiar to most developers and infralopers worth their salt.

  • Fiddler – for capturing traffic between the browser and server
  • SOAP-UI – for reproducing web service calls – works on both SOAP and REST web services
  • CRM’s Built-in diagnostic tools – for measuring browser web UI performance, and network metrics between browser and server
  • SQL Server Management Studio – for examining underlying SQL Server performance, executing queries, checking execution plans

I like to be ‘scientific’ in my approach and capture timings for operations using tools wherever possible. Multiple test runs, using the average as the measure (but beware the magic of caching on secondary runs). The tricky issue of end user perception of performance is often measured in feelpinions, so you need to get users to try and time their operations where possible. The good news is there are built-in tools that can help you with this.

 

Health Check

First up, have you run the Microsoft Dynamics CRM 2013 Best Practices Analyzer on your installation? If it highlighted any errors you should address those – they might not be having a direct impact on the performance of your system but you’re best to eliminate those before you get too far.

Also while you’re at it, how up to date are your patches? I know there’s times when you’re hamstrung and can’t apply the latest patch because of some #enterprisey reason but if there’s nothing stopping you, ensure you’re up to date.

If you’re not sure what patch level you’re at, check the CRM Build Numbers here.

 

Web Interface

Before you even try – are you using a supported browser?

From the Web UI perspective there’s some great built-in tools that can help you diagnose problems and measure performance.

CRM Browser Diagnostics

If you go to http://<CRM-URL>/tools/diagnostics/diag.aspx you’ll see the page below. Click the Run button and you’ll get the results of some tests executed from the browser that reveal any issues between browser and server, or with the browser itself. Note that the URL is for the root of the CRM website – it’s not for a particular organisation. This is a very handy way of getting end users to capture the performance of the CRM from their end and send it in to you. It also helps to run this at several different times of the day if performance varies throughout the day.

[Screenshot: CRM browser diagnostics page with Run button]

 

As of CRM 2013 SP1 onwards there’s a new browser diagnostic tool in town. Hit Ctrl + Shift + Q in IE and you’ll see the following.

[Screenshot: CRM 2013 SP1 browser diagnostics panel]

 

Now click the Enable button and load a form you’re trying to analyse. Hit Ctrl + Shift + Q again and now you’ve got an excellent breakdown of the form performance.

[Screenshot: form performance breakdown]

This can be handy to compare performance from different browsers / end-users, and also to see the impact your form design is having. Loaded up with a billion sub-grids? That’s a paddling.

 

Form Design

I’ll keep this brief – review your form design carefully. Consider your end users and tailor the form layout appropriately; you might not need all the fields on there all the time. Maybe designing a lightweight form that is used 90% of the time, and allowing the user to switch to the detailed form for the remaining 10%, leaves users more productive compared to one gigantic form trying to be everything to everyone. Likewise, remember to consider end user roles and segregate different form layouts that way. Yes, it leads to a bit of extra development and maintenance, but if it results in a more useful form that performs better, the system is more likely to be used than a slow-as-molasses form loaded with a billion sub-grids that might come in handy.

 

Web Services

CRM comes with a great web services API that allows integration by other systems. A pattern I’ve seen often involves getting .NET developers to write a simplified set of web services that conform to organisation-specific data models, acting as a wrapper to the CRM. This simplifies integration and transforms CRM objects into the required models; it also provides a bit of abstraction so you can minimise disruption if you upgrade the CRM installation later. Sounds awesome, and you can bash out some code pretty damn quickly that gives you the desired results using LINQ. But like Peter Parker’s Uncle Ben said – with great power comes great responsibility. Getting code-complete quickly doesn’t mean you’ve written an efficient system. Assuming you’re writing queries against the OData service:

  • Test your queries using SOAP-UI directly against the CRM OData service
    • Use the OData Query Designer in Dynamics XRM Tools, or be brave and just work out the URL format yourself (there’s a sketch of a hand-crafted query after this list)
    • Now test the custom web service that performs this same operation – the difference is the overhead of your custom web service (e.g. +200 ms)
  • Understand that your LINQ query may result in multiple OData web service calls. Which happen sequentially. Which adds up to lost time.
    • Check the IIS logs of the CRM server to see the number of requests coming into the OData web service
    • Can you refactor the query to reduce the number of calls?
  • Only return the attributes and links that you need
    • Friends don’t let friends write “SELECT *” queries, and similarly you shouldn’t load more attributes in the CRM entities than you need
    • Specify only the attributes that you need and then execute the query
    • Additional unnecessary attributes just result in additional overhead of serialising / de-serialising.
  • Compare the results to SQL Server Filtered Views – try T-SQL in SQL Server Management Studio that gets similar result-sets, how does that perform by comparison?
    • One option for reading data is to connect to the SQL Server Filtered Views – go straight to the heart of the beast.
    • Don’t jump into this without considering the future implications – it won’t work in a CRM Online world for instance, but if the bulk of the operations for your web services are read-oriented it may be worth checking out.
  • A handy way to log the timing of your custom web services is to ensure ‘time-taken’ is logged in IIS (assuming ASP.NET web services). You can then analyse this for queries exceeding your target times.
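
To make the first couple of bullets concrete, here’s a hand-crafted OData query from PowerShell that selects only two attributes and times the call. The organisation URL, entity set and attribute names are placeholders:

    $base  = "https://crm.contoso.com/MyOrg/XRMServices/2011/OrganizationData.svc"
    $query = "$base/AccountSet?`$select=Name,AccountNumber&`$top=50"

    $sw = [System.Diagnostics.Stopwatch]::StartNew()
    $resp = Invoke-RestMethod -Uri $query -UseDefaultCredentials -Headers @{ Accept = "application/json" }
    $sw.Stop()

    "{0} rows in {1} ms" -f $resp.d.results.Count, $sw.ElapsedMilliseconds

Run the same query through your custom web service and the difference between the two timings is roughly the overhead your wrapper is adding.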

 

CRM Maintenance Jobs

CRM has a number of built-in maintenance jobs that perform things such as index recalculation. By default these kick off once a day, roughly around the time of day that the CRM was installed – which is usually business hours. An excellent tool to review these schedules and change them to a time of less interruption to users is the CRM Job Editor – with editions for CRM 2011 / 2013 / 2015.

 

Networking and Infrastructure

Have the underlying Windows servers been given the appropriate amount of RAM and CPU? Check the 2011 / 2013 / 2015 Implementation Guides for the recommended values.

What is the average memory and CPU usage in perfmon (or, for more manager-friendly graphs, Resource Monitor)?

If you’ve got multiple CRM servers deployed behind a load balancer, are you able to sidestep the load balancer and browse a CRM server directly (from the end user desktop environment), and does this make any difference to the performance? If it does, then check the NLB configuration for issues.

What does the performance of the CRM Web UI seem like from one of the servers itself (via Remote Desktop)? How does it compare?

Are the Windows Event Logs full of errors or other information indicating problems with disk, networking (connectivity, DNS), authentication or other critical things?

What’s the network topology? When you’re browsing the CRM Web UI and getting your own feel of system performance, are you doing it only metres away from the data centre, while the users complaining about performance are in some regional office connected over a wet piece of string? If the performance symptoms being complained about seem to be geo-specific, replicate testing from their end as much as possible (see the built-in diagnostic tools in the Web Interface section).

Have you got latency issues between end users and your CRM servers? CRM can be a bit ‘chatty’ and this can cause you pain over a connection with high latency (e.g. think London to Canberra). In some organisations I’ve seen better performance through Citrix because the browser to CRM server chattiness occurred “locally” within the same data centre. Your mileage will vary and so will the tools at your disposal to tackle this.
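
For a rough but repeatable latency measurement from an end-user machine, something like the following beats feelpinions (hostname and organisation name are placeholders):

    # Reachability and TCP connect time to the CRM front end
    Test-NetConnection crm.contoso.com -Port 443

    # Time a full page fetch a few times - watch for wild variance as well as the average
    1..5 | ForEach-Object {
        (Measure-Command {
            Invoke-WebRequest -Uri "https://crm.contoso.com/MyOrg/main.aspx" -UseDefaultCredentials | Out-Null
        }).TotalMilliseconds
    }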

 

IIS Settings

Double-check that Dynamic Compression is enabled for your CRM website in IIS 8 / 8.5. After you’ve done that, check the outputCache setting for omitVaryStar as per this CRM Tip of the Day. Yes, it applies to IIS 8.5 as well, and it’s not just a CRM issue – it affects all sites hosted in IIS. Without this setting you may find that output isn’t being cached by the browser properly, which causes a performance drag by making additional requests for content upon page load. Be sure to test before and after with browsers that have had their caches emptied (then load a few different forms) to measure the performance difference.
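
Both settings can be inspected (and set) from PowerShell. A sketch, assuming the default CRM website name and that omitVaryStar is exposed through the site’s system.web/caching/outputCache section in web.config:

    Import-Module WebAdministration
    $site = "IIS:\Sites\Microsoft Dynamics CRM"

    # Is dynamic compression on for the CRM site?
    Get-WebConfigurationProperty -PSPath $site -Filter "system.webServer/urlCompression" -Name doDynamicCompression

    # Is omitVaryStar set, so responses aren't stamped 'Vary: *' (which defeats browser caching)?
    Get-WebConfigurationProperty -PSPath $site -Filter "system.web/caching/outputCache" -Name omitVaryStar

    # To change either setting:
    Set-WebConfigurationProperty -PSPath $site -Filter "system.webServer/urlCompression" -Name doDynamicCompression -Value $true
    Set-WebConfigurationProperty -PSPath $site -Filter "system.web/caching/outputCache" -Name omitVaryStar -Value $true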

 

SQL Server

Of course Dynamics CRM performance is heavily dependent on the performance of the underlying SQL Server installation. So, have you run the SQL Server Best Practices Analyzer (2012 edition)?

  • Memory and CPU – is SQL crying out for any more of either of these?
  • Physical location of Data and Log files – are the data and log files on separate physical disks?
  • Max Degree of Parallelism (MAXDOP) – it is recommended that this is set to 1. It affects the entire instance, and changes occur immediately. Tread carefully before making this change.
  • tempdb – it is generally recommended to have the same number of physical data files for tempdb as there are CPUs on the server. By default there will be one file. (The sketch after this list shows a quick way to check this, along with MAXDOP.)
  • Growth of database files – check the auto-growth settings of the database files and pre-grow them to a larger size if your database is growing regularly. This can reduce the number of ‘disk grabs’ SQL makes as it expands the databases.
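
Here’s a quick way to check a couple of the above from PowerShell, assuming Invoke-Sqlcmd is available (it ships with the SqlServer module) and an instance name of CRMSQL01 – a placeholder for your own:

    # Current MAXDOP for the instance
    Invoke-Sqlcmd -ServerInstance "CRMSQL01" -Query "SELECT value_in_use FROM sys.configurations WHERE name = 'max degree of parallelism';"

    # tempdb data file count vs logical CPUs - ideally these match
    Invoke-Sqlcmd -ServerInstance "CRMSQL01" -Query "SELECT (SELECT COUNT(*) FROM tempdb.sys.database_files WHERE type_desc = 'ROWS') AS tempdb_data_files, (SELECT cpu_count FROM sys.dm_os_sys_info) AS logical_cpus;"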

Dynamic Management Views – DMVs

SQL Server keeps some excellent statistics regarding costly queries, index usage and other tuning-related things. From a CRM perspective these can help reveal what your most costly queries are. This article from Nuno Costa does a much better job of explaining these than I can, so check it out.
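
As a taster, this is the sort of thing the DMVs can tell you – the top CPU-consuming queries since the last SQL restart. Instance and database names are placeholders, and you’ll need VIEW SERVER STATE permission:

    $query = "
    SELECT TOP 10
        qs.execution_count,
        qs.total_worker_time / qs.execution_count AS avg_cpu_microseconds,
        SUBSTRING(st.text, 1, 200) AS query_snippet
    FROM sys.dm_exec_query_stats AS qs
    CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle) AS st
    ORDER BY qs.total_worker_time DESC;"

    Invoke-Sqlcmd -ServerInstance "CRMSQL01" -Database "MyOrg_MSCRM" -Query $query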

 

Indexes

If you’re doing a lot of queries that involve WHERE clauses or ORDER BYs on custom attributes, chances are you can benefit from an index on those attributes – particularly if the number of records is large. Adding an index to a SQL table is the only supported change you can make in the SQL database. Things you need to consider: how unique is the data? How often are you reading from it versus writing to it (inserts, updates)? The cost will come in terms of index recalculation as you make changes. These days this calculation can happen ‘online’ and doesn’t block, but it still taxes CPU and memory of course.

But how will you know which attributes need indexes?

Run queries similar to the ones that are performing slowly directly in SQL Server Management Studio, and make sure to include the execution plan. SQL will tell you the cost of the query components and reveal whether an index would benefit that query.
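
If the execution plan points at a custom attribute, adding the index might look like the sketch below. This is illustrative only – the table and column names are hypothetical (in CRM 2011 custom attributes live on the entity’s ExtensionBase table; from CRM 2013 onwards they’re on the Base table), and ONLINE = ON requires Enterprise Edition:

    $ddl = "
    CREATE NONCLUSTERED INDEX IX_AccountBase_new_region
        ON dbo.AccountBase (new_region)
        WITH (ONLINE = ON);"

    Invoke-Sqlcmd -ServerInstance "CRMSQL01" -Database "MyOrg_MSCRM" -Query $ddl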

What if I’m a CRM Online customer?

If you put a support call in to Microsoft you can have them add the index for you.

 

More Information

I’m yet to come across an update beyond the CRM 2011 version, but there’s an extensive performance and optimisation whitepaper from Microsoft and a lot of the same principles still apply.

 

Conclusion

There’s a lot of moving parts to Dynamics CRM. Every performance analysis will be different, as context is everything for an on-premises installation. However, I hope you’ve found this a helpful overview of some important things to be aware of when optimising and troubleshooting Dynamics CRM performance.

Nov 04

Rich powerful note taking with OneNote

I really love OneNote and have used it for years. Which is why I’ve just added a guide to OneNote on FreemiumTools.com.

 

Oct 27

Freemiumtools.com

A while ago I put together a website that explains popular cloud services in an introductory manner with the aim of letting people read about some of the tools that are out there and decide if they can use them for their business (or life). The content isn’t for the tech savvy, but hopefully it’s still helpful to people on the learning curve. So if you’re interested in finding out more about cloud tools and software for productivity and general business, why not check out freemiumtools.com.

Oct 21

Anonabox – online security, privacy and anonymity

The recent Kickstarter campaign for Anonabox proved to be insanely popular but ultimately unsuccessful after the campaign was suspended by Kickstarter. If you’re unfamiliar with it, it was a small network device that you could plug into an existing network connection; you’d then either use the provided Wi-Fi hotspot or plug in another network cable, and all traffic routed through the device would go over Tor. The idea behind the device was to make it dead easy for people to use Tor holistically, preventing software from making direct requests to the internet. The other advantage is that it would work without individual applications having to be aware of Tor or configured to use it. Just plug it in and go.

I think the intent behind this device is great, but there’s a bunch of things people forget and confuse when it comes to online privacy, anonymity and security. The danger with a device like this would have been lulling users into a false sense of security – the illusion that they couldn’t be traced / tracked / monitored / discovered / whatever it is they thought they were achieving by using it. This prompted me to think about a bunch of things, and following is my brain dump on the differences between privacy, anonymity and security, and why you might want to pursue any or all of these online. This is stuff I’ve been mulling over for months after reading books like Black Code and No Place To Hide and following the details of the Snowden revelations, as well as the metadata collection debate currently happening in Australia.

Security

Let’s start with online security. This covers your basic approach to doing things securely on the internet – using a password manager to enable strong, unique passwords on each and every site you use; using anti-virus software and personal firewalls; enabling two-factor authentication; and using extensions such as HTTPS Everywhere to enforce the usage of https on websites. You make a conscious effort to avoid insecure websites and applications that do dumb things like email you your password, don’t use https, have stupid password restrictions and so on.

More advanced approaches to security include encrypting communications (instant messaging, text messaging, emails), encrypting files and whole drives (computer, smartphone).

Why do we do these things? We do them to avoid having our accounts compromised, money stolen, identity revealed (more on this later) or personal information leaked – including nude selfies. The consequences range from a pain in the arse to a major impact on our lives.

Privacy

Privacy is about controlling information about yourself – consenting to provide that information with an understanding of how it will be used, your rights regarding deleting it, and how long it is stored for. Clearly a major trend of the last decade is the erosion of our privacy in the online world through constant mishandling of our personal information, leading to leaks.

This is not to be confused with scenarios where we opt in to applications to receive a benefit – providing personal information when there is a net benefit in applications like Facebook. These systems work because without opting in and providing your information you won’t be able to establish the connections with your friends. Essentially you get to use this application for free because you’re providing personal information. That the application aggregates all of this personal information and uses it for marketing purposes is something (most of us) are consciously aware of and acknowledge. The benefit we receive is worth it to us.

What complicates privacy is that there are many different facets of information about ourselves, and control over who we reveal each of them to is very granular.

Anonymity

Anonymity is about the right to pursue your life anonymously without having to provide identifying information. In an online world this means the ability to use pseudonyms and non-identifying information when interacting with applications and other users on the internet.

A disturbing angle relating to anonymity is the practice of having your online habits tracked across multiple sites over a period of time through advertising networks. While we can read and agree to the privacy statements of individual sites and receive a pretty obvious benefit in return for providing some information to the Facebooks of the world, the benefit we get from being tracked is far less obvious. ‘More targeted advertising’ is usually the answer, but for most people that’s a pretty dubious benefit. It’s great for business, but not for the individual.

Trying to be anonymous on the internet can include trying to opt out of, or actively block, this kind of tracking (if you’re interested, check out Disconnect.Me). Modern browsers have privacy modes that attempt to limit some of this, but they’re really only a quick and convenient way of browsing a few sites you don’t want to appear in your local history. They’re not known as porn mode for nothing. These browser modes do nothing to prevent your requests from being monitored by your ISP, tracked by the servers you’re requesting information from, and more.

So what about Anonabox?

Anonabox seemed to be popular because it makes using Tor easier. Tor enables you to cloak details about your web requests. Requests are routed through the Tor network rather than straight out from your ISP. The Tor network is a series of nodes around the internet that bounce requests between them. The idea is that you’re making it harder for people at the remote end to trace back to you, and you’re also disrupting people who may be monitoring your traffic (ISP, government, local Wi-Fi snoop).

I think the main demand for this box (in the campaign) came from people who are aghast at the Snowden revelations and want to stymie mass surveillance of the internet by governments. But I think that’s flawed – the Anonabox isn’t the panacea it seems.

The accusations against the Five Eyes governments involve mass surveillance of the internet with the (begrudging?) cooperation of the telcos and internet companies who provide these services. Using Anonabox or Tor only scrambles your network traversal; they still have access to your information, either straight from the pipe or from the company itself. Furthermore, using a normal browser with Anonabox still means you’re subject to the same advertising-based tracking, so you’ve defeated nothing. (To be fair, they recommended using the Tor Browser Bundle in conjunction with Anonabox.)

Whistleblowers and people who face persecution in their country (for sexual orientation, political or other reasons) are pretty serious about security, privacy and anonymity. In order to keep themselves safe they have probably already researched effective ways to keep their identities hidden. While Anonabox is trying to make this easier, without the proper education of users there is still a risk that they make mistakes that reveal their identity or personal information.

I don’t think there’s a single easy solution to achieving anonymity, maintaining full privacy and staying secure online. Anonabox looked like a step in the right direction, but seemed at risk of giving people a false sense that they are totally anonymous and private on the internet.

Oct 17

Driverless cars

As the technology for driverless cars continues to improve, we inevitably approach the point where we want this amazing technology to go mainstream. There will be resistance to allowing these cars on our roads on a number of fronts, though.

At some point in the future people will die or be seriously injured as a result of a driverless car accident. To think that this will never happen is ridiculous. However this shouldn’t be the first thing we think about. People seem to be afraid of introducing driverless cars because they don’t know who to blame, or hold accountable when this happens.

At the moment we have a system where drivers are licensed and held responsible for their actions. Car owners must register their car and meet regulatory requirements to ensure their car is roadworthy. Car manufacturers are accountable for the quality of the cars they produce in regard to meeting safety standards and legislated requirements. And governments are accountable for the system of road rules and safety standards that cars must meet. The only thing that changes in the driverless car scenario is the removal of the driver.

A driverless car is still owned and registered by someone – they are the ones who would be accountable for the actions of their car. Manufacturers could provide some kind of surety / guarantee about the quality of their car, and perhaps provide liability insurance or protection on behalf of the owner in an attempt to sell the car and assure them it was safe.

The frustrating thing is that this ‘problem’ which will delay the introduction of driverless cars is a problem of skewed perception. I am confident that the introduction of driverless cars will dramatically reduce the number of deaths and injuries on our roads. We don’t need these cars to be perfect, they only need to be better than the current system of human driven cars.

Currently we measure the number of people who die on Australian roads each year in the hundreds. This is way too high, and it doesn’t count the thousands of people who are seriously injured as a result of road accidents. If the introduction of driverless cars cut this in half – wouldn’t that be amazing?

 

Oct 15

Why use Dynamics CRM as a platform for xRM development?

For a few years now, COTS-based solutions have been all the rage – taking an existing off-the-shelf product and configuring and customising it to meet your organisation’s needs rather than building from scratch. Stand on the shoulders of giants! This article does a pretty good job of articulating the pros and cons of using Microsoft Dynamics CRM 2011 / 2013 as that platform for your organisation’s needs.
