Dynamics CRM performance troubleshooting


If you’ve got a Dynamics CRM 2011 / 2013 / 2015 on-premise installation that isn’t performing the way you (or your users) think it should, then this article gives you information on how to diagnose where the problem(s) lie, plus some suggested solutions. Some of these tips and tricks apply to CRM Online as well, so read on, cloud customers.



Stating the obvious

The first thing you’re going to need to understand is: *what is slow about your CRM, and how have you measured that?* You are going to need to get a handle on which aspects of performance are being reported as problematic if you’re going to solve them. Often there’s a multitude of factors that lead to poor performance, so it is important that you establish some kind of baseline measurement of the current ‘poor’ performance, understand what acceptable performance looks like, and then work out what is causing the difference.

  • Is it ‘the vibe’ reported by users – otherwise known as the user perception of performance, when accessing the web UI?
  • Is it finding data?
  • Is it editing and creating data?
  • Is it web service calls from some custom integration?
  • Is the poor performance consistent (easily reproduced), or intermittent?
  • Has performance gotten worse over time? Could it be related to an increased number of users, or increased data volumes?



Diagnostic Tools

Following are some of the tools that I think are handy when troubleshooting performance problems with a CRM installation, and they should be familiar to most developers and infralopers worth their salt.

  • Fiddler – for capturing traffic between the browser and server
  • SOAP-UI – for reproducing web service calls – works on both SOAP and REST web services
  • CRM’s Built-in diagnostic tools – for measuring browser web UI performance, and network metrics between browser and server
  • SQL Server Management Studio – for examining underlying SQL Server performance, executing queries, checking execution plans

I like to be ‘scientific’ in my approach and capture timings for operations using tools wherever possible: multiple test runs, using the average as the measure (but beware the magic of caching on secondary runs). The tricky issue of end user perception of performance is often measured in feelpinions, so you need to get users to try and time their operations where possible. The good news is there are built-in tools that can help you with this.
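If you want to put your own rough numbers around a particular operation, even a throwaway console app like the sketch below will do the job. The URL is a placeholder – point it at whatever page or service call you’re measuring, and note the first (cold-cache) run is reported separately from the average.

```csharp
// Quick-and-dirty timing sketch – illustrative only, not production code.
// Assumes Windows authentication and a placeholder CRM URL.
using System;
using System.Collections.Generic;
using System.Diagnostics;
using System.Linq;
using System.Net;

class Program
{
    static void Main()
    {
        const string url = "http://crm2015web/OrgName/main.aspx"; // placeholder

        var timings = new List<long>();
        for (int run = 0; run < 6; run++)
        {
            var sw = Stopwatch.StartNew();
            using (var client = new WebClient { UseDefaultCredentials = true })
            {
                client.DownloadString(url);
            }
            sw.Stop();
            timings.Add(sw.ElapsedMilliseconds);
        }

        // The first run pays the cold-cache cost, so report it separately.
        Console.WriteLine("First run: {0} ms", timings.First());
        Console.WriteLine("Average of remaining runs: {0:F0} ms", timings.Skip(1).Average());
    }
}
```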



Health Check

First up, have you run the Microsoft Dynamics CRM 2013 Best Practices Analyzer on your installation? If it highlights any errors you should address them – they might not be having a direct impact on the performance of your system, but you’re best to eliminate them before you get too far.

Also, while you’re at it, how up to date are your patches? I know there are times when you’re hamstrung and can’t apply the latest patch because of some #enterprisey reason, but if there’s nothing stopping you, ensure you’re up to date.

If you’re not sure what patch level you’re at, check the CRM Build Numbers here.



Web Interface

Before you even try – are you using a supported browser?

From the Web UI perspective there are some great built-in tools that can help you diagnose problems and measure performance.

CRM Browser Diagnostics

If you go to http://crm2015web/tools/diagnostics/diag.aspx (substituting your own CRM server name) you’ll see the page below. Click the Run button and you’ll get the results of some tests executed from the browser that reveal any issues between browser and server, or with the browser itself. Note that the URL is for the root of the CRM website, not for a particular organisation. This is a very handy way of getting end users to capture the performance of the CRM from their end and send it in to you. It also helps to run this at different times of the day if your scenario involves performance varying across the day.

CRM Diagnostics



As of CRM 2013 SP1 onwards there’s a new browser diagnostic tool in town. Hit Ctrl + Shift + Q in IE and you’ll see the following.

Start the diagnostics



Now click the Enable button and load a form you’re trying to analyse. Hit Ctrl + Shift + Q again and now you’ve got an excellent breakdown of the form performance.

Browser diagnostics results

This can be handy to compare performance from different browsers / end-users, and also to see the impact your form design is having. Loaded up with a billion sub-grids? That’s a paddling.



Form Design

I’ll keep this brief – review your form design carefully. Consider your end users and tailor the form layout appropriately; you might not need to have all the fields on there all the time. Maybe designing a lightweight form that is used 90% of the time, and allowing the user to switch to the detailed form for the remaining 10%, leaves users more productive compared to one gigantic form trying to be everything to everyone. Likewise, remember to consider end user roles and segregate different form layouts that way. Yes, it leads to a bit of extra development and maintenance, but if it results in a more useful form for users that performs better, the system is more likely to be used than a slow-as-molasses form loaded with a billion sub-grids that might come in handy.

Web Services

CRM comes with a great web services API that allows integration by other systems. A pattern I’ve seen often involves getting .NET developers to write a simplified set of web services that conform to organisation-specific data models, acting as a wrapper to the CRM. This simplifies integration and transforms CRM objects into the required models; it also provides a bit of abstraction so you can minimise disruption if you upgrade the CRM installation later. Sounds awesome, and you can bash out some code pretty damn quickly that gives you the desired results using LINQ. But like Peter Parker’s Uncle Ben said – with great power comes great responsibility. Getting to code-complete quickly doesn’t mean you’ve written an efficient system. Assuming you’re writing queries against the OData service:

  • (Update September 2016) Use the PFE Core Library to avoid reinventing the wheel
  • Test your queries using SOAP-UI directly against the CRM OData service
      • Use the OData Query Designer in Dynamics XRM Tools, or be brave and just work out the URL format yourself
      • Now test the custom web service that performs the same operation – the difference is the overhead of your custom web service (e.g. +200ms)
  • Understand that your LINQ query may result in multiple OData web service calls, which happen sequentially, which adds up to lost time
      • Check the IIS logs of the CRM server to see the number of requests coming into the OData web service
      • Can you refactor the query to reduce the number of calls?
  • Only return the attributes and links that you need (see the sketch after this list)
      • Friends don’t let friends write “SELECT *” queries, and similarly you shouldn’t load more attributes on your CRM entities than you need
      • Specify only the attributes that you need and then execute the query
      • Additional unnecessary attributes just result in additional serialising / de-serialising overhead
  • Compare the results to SQL Server Filtered Views – try some T-SQL in SQL Server Management Studio that returns similar result-sets; how does that perform by comparison?
      • One option for reading data is to connect to the SQL Server Filtered Views – go straight to the heart of the beast
      • Don’t jump into this without considering the future implications – it won’t work in a CRM Online world for instance, but if the bulk of the operations for your web services are read-oriented it may be worth checking out
  • A handy way to log the timing of your custom web services is to ensure ‘time-taken’ is logged in IIS (assuming ASP.NET web services). You can then analyse this for queries exceeding your target times
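To make the ‘only return what you need’ point concrete, here’s a rough sketch of the sort of OData requests involved – the server, organisation, attribute names and filter values are purely illustrative. The first request drags back every attribute on every matching contact; the second uses $select (and $top) to return only what the caller actually uses:

```
# Heavyweight – every attribute on each matching contact comes back over the wire
http://crm2015web/OrgName/XRMServices/2011/OrganizationData.svc/ContactSet?$filter=LastName eq 'Smith'

# Leaner – only the attributes the caller needs, capped at 50 records
http://crm2015web/OrgName/XRMServices/2011/OrganizationData.svc/ContactSet?$select=FullName,EMailAddress1&$filter=LastName eq 'Smith'&$top=50
```

Paste either form into SOAP-UI (or a browser) to time it directly against the CRM OData endpoint, then compare with your custom web service performing the equivalent call – the difference is your wrapper’s overhead.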



CRM Maintenance Jobs

CRM has a number of built-in maintenance-related jobs that perform things such as index recalculations. By default these kick off once a day, roughly around the time of day that the CRM was installed – which is usually business hours. An excellent tool for reviewing these schedules and moving them to a time that causes less interruption to users is the CRM Job Editor – with editions for CRM 2011 / 2013 / 2015.



Networking and Infrastructure

Have the underlying Windows servers been given the appropriate amount of RAM and CPU? Check the 2011 / 2013 / 2015 Implementation Guides for the recommended values.

What does average memory and CPU usage look like in perfmon (or, for more manager-friendly graphs, Resource Monitor)?

If you’ve got multiple CRM servers deployed behind a load balancer, are you able to sidestep the load balancer and browse a CRM server directly (from the end user desktop environment), and does this make any difference to the performance? If it does, then check the NLB configuration for any issues.

What does the performance of the CRM Web UI seem like from one of the servers itself (via Remote Desktop)? How does it compare?

Are the Windows Event Logs full of errors or other information indicating problems with disk, networking (connectivity, DNS), authentication or other critical things?

What’s the network topology? When you’re browsing the CRM Web UI and getting your own feel of system performance, are you doing it only metres away from the data centre, while the users complaining about performance are in some regional office connected over a wet piece of string? If the performance symptoms being complained about seem to be geo-specific, replicate testing from their end as much as possible (see the built-in diagnostic tools in the Web Interface section).

Have you got latency issues between end users and your CRM servers? CRM can be a bit ‘chatty’ and this can cause you pain over a connection with high latency (e.g. think London to Canberra). In some organisations I’ve seen better performance through Citrix because the browser to CRM server chattiness occurred “locally” within the same data centre. Your mileage will vary and so will the tools at your disposal to tackle this.



IIS Settings

Double check that Dynamic Compression is enabled for your CRM website in IIS 8 / 8.5. After you’ve done that, check the outputCache setting for omitVaryStar as per this CRM Tip of the Day. Yes, it applies to IIS 8.5 as well, and it’s not just a CRM issue – it affects all sites hosted in IIS. Without this setting you may find that output isn’t being cached by the browser properly, which causes a performance drag by making additional requests for content on every page load. Be sure to test before / after with browsers that have had their cache emptied (then load a few different forms) to measure the performance difference.
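For reference, the omitVaryStar setting lives under the <system.web> section of the website’s web.config. A minimal sketch of the relevant fragment is below – verify it against the linked tip, back the file up, and test in a non-production environment first:

```xml
<configuration>
  <system.web>
    <caching>
      <!-- Stops ASP.NET adding a "Vary: *" header to output-cached responses,
           which otherwise discourages browsers from caching the content -->
      <outputCache omitVaryStar="true" />
    </caching>
  </system.web>
</configuration>
```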



SQL Server

Of course, Dynamics CRM performance is heavily dependent on the performance of the underlying SQL Server installation. So, have you run the SQL Server Best Practices Analyzer (2012 edition)?

  • Memory and CPU – is SQL crying out for any more of either of these?
  • Physical location of Data and Log files – are the data and log files on separate physical disks?
  • Max Degree of Parallelism (MAXDOP) – it is recommended that this is set to 1. It affects the entire instance, and changes take effect immediately, so tread carefully before making this change (see the sketch after this list for how to check the current value).
  • tempdb – it is generally recommended to have the same number of physical data files for tempdb as the number of CPUs on the server. By default there will be one file.
  • Growth of database files – check the auto-growth settings of the database files and pre-grow them to a larger size if your database is growing regularly. This can reduce the number of ‘disk grabs’ SQL makes as it expands the databases.
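Here’s a sketch of how you might check a couple of these from SQL Server Management Studio before deciding whether to change anything – illustrative only, not a script to run blindly against production:

```sql
-- Current MAXDOP setting for the instance (viewing it requires 'show advanced options').
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
EXEC sp_configure 'max degree of parallelism';

-- How many data files does tempdb currently have?
SELECT COUNT(*) AS tempdb_data_files
FROM tempdb.sys.database_files
WHERE type_desc = 'ROWS';

-- If, after due consideration, you decide to set MAXDOP to 1 for the instance:
-- EXEC sp_configure 'max degree of parallelism', 1;
-- RECONFIGURE;
```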

Dynamic Management Views – DMVs

SQL Server keeps some excellent statistics regarding costly queries, index usage and other tuning-related things. From a CRM perspective these can help reveal what your most costly queries are. This article from Nuno Costa does a much better job of explaining these than I can, so check it out.
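As a starting point, a query along these lines (a common DMV pattern, not CRM specific) will show the top statements by CPU time since SQL Server last restarted:

```sql
-- Top 10 statements by total CPU time; swap the ORDER BY for total_elapsed_time
-- or total_logical_reads if that's your definition of 'costly'.
SELECT TOP 10
    qs.execution_count,
    qs.total_worker_time / 1000  AS total_cpu_ms,
    qs.total_elapsed_time / 1000 AS total_elapsed_ms,
    qs.total_logical_reads,
    SUBSTRING(st.text, (qs.statement_start_offset / 2) + 1,
        ((CASE qs.statement_end_offset
              WHEN -1 THEN DATALENGTH(st.text)
              ELSE qs.statement_end_offset END
          - qs.statement_start_offset) / 2) + 1) AS statement_text
FROM sys.dm_exec_query_stats AS qs
CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle) AS st
ORDER BY qs.total_worker_time DESC;
```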

Update 18 Oct 2015: Performance Analyzer for Microsoft Dynamics (aka DynamicsPerf) is a toolset developed by Microsoft Premier Field Engineering. It is a set of SQL scripts to collect SQL Server DMV data and Microsoft Dynamics (CRM, AX, GP, NAV, SL) specific product data for quick resolution of performance issues on Microsoft Dynamics products.

Indexes

If you’re doing a lot of queries that involve WHERE clauses or ORDER BYs on custom attributes, chances are you can benefit from an index on those attributes – particularly if the number of records is large. Adding an index to a SQL table is the only supported change you can make in the SQL database. Things you need to consider: how unique is the data? How often are you reading from it vs writing to it (inserts, updates)? Because the cost will come in terms of index recalculation as you make changes. These days this calculation happens ‘online’ and doesn’t block, but it still taxes CPU and memory of course.

But how will you know which attributes need indexes?

Run queries similar to the ones that are performing slowly directly in SQL Server Management Studio, and make sure to include the execution plan. SQL will tell you the cost of the query components and reveal whether an index would benefit that query.
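For example (the view, column and index names below are made up – substitute your own), you might reproduce a slow contact search against the filtered view with the actual execution plan included, and only add an index if the plan shows an expensive scan:

```sql
-- In SSMS, turn on "Include Actual Execution Plan" (Ctrl+M) and mimic the slow query.
-- Filtered views follow the Filtered<Entity> naming pattern and only return rows when
-- you connect as someone who is also a CRM user with read privileges.
SELECT fullname, emailaddress1
FROM FilteredContact
WHERE new_membernumber = 'ABC-1234';   -- hypothetical custom attribute

-- If the plan shows a costly scan on that attribute, an index on the table holding it
-- is the supported fix (ExtensionBase tables in CRM 2011; the base table from CRM 2013
-- onwards – check where the column actually lives in your organisation database).
CREATE NONCLUSTERED INDEX IX_Contact_new_membernumber
    ON dbo.ContactBase (new_membernumber);
```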

What if I’m a CRM Online customer?

If you put a support call in to Microsoft you can have them add the index for you.



More Information

I’m yet to come across an update beyond the CRM 2011 version, but there’s an extensive performance and optimisation whitepaper from Microsoft and a lot of the same principles still apply.

Update 2nd September 2015: Microsoft released some CRM 2015 specific information on scalable Dynamics CRM – check it out.

Conclusion

There are a lot of moving parts to Dynamics CRM. Every performance analysis will be different, as context is everything for an on-premise installation. However, I hope you’ve found this a helpful overview of some important things to be aware of when optimising and troubleshooting Dynamics CRM performance.