Mar 02

Dynamics CRM performance troubleshooting

If you’ve got a Dynamics CRM 2011 / 2013 / 2015 on-premise installation that isn’t performing the way you (or your users) think it should, then this article gives you information on how to diagnose where the problem(s) lie and some suggested solutions. Some of these tips and tricks apply to CRM Online as well, so read on, cloud customers.

 

Stating the obvious

The first thing you’re going to need to understand is – what is slow about your CRM, and how have you measured that? You are going to need to get a handle on what aspects of performance are being reported as problematic if you’re going to solve them. Often there’s a multitude of factors that lead to poor performance, so it is important that you establish some kind of baseline measurement of the current ‘poor’ performance, understand what acceptable performance looks like, and then work out what is causing the difference.

  • Is it ‘the vibe’ reported by users – otherwise known as the user perception of performance, when accessing the web UI?
  • Is it finding data?
  • Is it editing and creating data?
  • Is it web service calls from some custom integration?
  • Is the poor performance consistent (easily reproduced), or intermittent?
  • Has performance gotten worse over time?
    • Could it be related to increased number of users, or increased data?

 

Diagnostic Tools

Following are some of the tools that I think are handy when troubleshooting performance problems with a CRM installation and they should be familiar to most developers and infralopers worth their salt.

  • Fiddler – for capturing traffic between the browser and server
  • SOAP-UI – for reproducing web service calls – works on both SOAP and REST web services
  • CRM’s Built-in diagnostic tools – for measuring browser web UI performance, and network metrics between browser and server
  • SQL Server Management Studio – for examining underlying SQL Server performance, executing queries, checking execution plans

I like to be ‘scientific’ in my approach and capture timings for operations using tools wherever possible. Do multiple test runs and use the average as the measure (but beware the magic of caching on secondary runs). The tricky issue of end user perception of performance is often measured in feelpinions, so you need to get users to try and time their operations where possible. The good news is there are built-in tools that can help you with this.

 

Health Check

First up, have you run the Microsoft Dynamics CRM 2013 Best Practices Analyzer on your installation? If it highlighted any errors you should address those – they might not be having a direct impact on the performance of your system but you’re best to eliminate those before you get too far.

Also while you’re at it, how up to date are your patches? I know there’s times when you’re hamstrung and can’t apply the latest patch because of some #enterprisey reason but if there’s nothing stopping you, ensure you’re up to date.

If you’re not sure what patch level you’re at, check the CRM Build Numbers here.

 

Web Interface

Before you even try – are you using a supported browser?

From the Web UI perspective there’s some great built-in tools that can help you diagnose problems and measure performance.

CRM Browser Diagnostics

If you go to http://<CRM-URL>/tools/diagnostics/diag.aspx you’ll see the page below. Click the Run button and you’ll get the results of some tests executed from the browser that reveal any issues between browser and server, or with the browser itself. Note that the URL is for the root of the CRM website – it’s not for a particular organisation. This is a very handy way of getting end users to capture the performance of the CRM from their end and send it in to you. It also helps to run this at different times of the day if your scenario involves performance that varies throughout the day.

image

 

From CRM 2013 SP1 onwards there’s a new browser diagnostic tool in town. Hit Ctrl + Shift + Q in IE and you’ll see the following.

image

 

Now click the Enable button and load a form you’re trying to analyse. Hit Ctrl + Shift + Q again and now you’ve got an excellent breakdown of the form performance.

image

This can be handy to compare performance from different browsers / end-users, and also to see the impact your form design is having. Loaded up with a billion sub-grids? That’s a paddling.

 

Form Design

I’ll keep this brief – review your form design carefully. Consider your end users and tailor the form layout appropriately; you might not need to have all the fields on there all the time. Maybe designing a lightweight form that is used 90% of the time and allowing the user to switch to the detailed form for the remaining 10% leaves users more productive compared to one gigantic form trying to be everything to everyone. Likewise, remember to consider end user roles and segregating different form layouts that way. Yes, it leads to a bit of extra development and maintenance, but if it leads to a more useful form for users that performs better, the system is more likely to be used than a slow-as-molasses form loaded with a billion sub-grids that might come in handy.

 

Web Services

CRM comes with a great web services API that allows integration by other systems. A pattern I’ve seen often involves getting .NET developers to write a simplified set of web services that conform to organisation-specific data models, acting as a wrapper to the CRM. This simplifies integration and transforms CRM objects into the required models; it also provides a bit of abstraction so you can minimise disruption if you upgrade the CRM installation later. Sounds awesome, and you can bash out some code pretty damn quickly that gives you the desired results using LINQ. Like Peter Parker’s Uncle Ben said – with great power comes great responsibility. Getting code-complete quickly doesn’t mean you’ve written an efficient system. Assuming you’re writing queries against the OData service:

  • Test your queries using SOAP-UI directly against the CRM OData service
    • Use the OData Query Designer in Dynamics XRM Tools, or be brave and just work out the URL format yourself
    • Now test the custom web service that performs this same service – the difference is the overhead of your custom web service (e.g. +200ms)
  • Understand that your LINQ query may result in multiple OData web service calls. Which happen sequentially. Which adds up to lost time.
    • Check the IIS logs of the CRM server to see the number of requests coming into the OData web service
    • Can you refactor the query to reduce the number of calls?
  • Only return the attributes and links that you need
    • Friends don’t let friends write “SELECT *” queries, and similarly you shouldn’t load more attributes in the CRM entities than you need
    • Specify only the attributes that you need and then execute the query
    • Additional unnecessary attributes just result in additional overhead of serialising / de-serialising.
  • Compare the results to SQL Server Filtered Views – try T-SQL in SQL Server Management Studio that returns a similar result-set and see how it performs by comparison (there’s a sketch of both approaches after this list)
    • One option for reading data is to connect to the SQL Server Filtered Views – go straight to the heart of the beast.
    • Don’t jump into this without considering the future implications – it won’t work in a CRM Online world for instance, but if the bulk of the operations for your web services are read-oriented it may be worth checking out.
  • A handy way to log the timing of your custom web services is to ensure ‘time-taken’ is logged in IIS (assuming ASP.NET web services). You can then analyse this for queries exceeding your target times.

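To make that comparison concrete, here’s a rough sketch of the two approaches (the entity and attribute choices are just illustrative – substitute your own schema names). The URL assumes the standard CRM 2011/2013 OData endpoint format, and the T-SQL runs against the corresponding filtered view from SQL Server Management Studio – note that filtered views apply CRM security for the calling Windows user.

  -- OData query returning only the attributes needed (illustrative):
  --
  --   http://<CRM-URL>/<org>/XRMServices/2011/OrganizationData.svc/AccountSet?$select=Name,AccountNumber&$top=50
  --
  -- Rough T-SQL equivalent against the filtered view, for comparison in SSMS:
  SELECT TOP 50 name, accountnumber
  FROM   dbo.FilteredAccount;

If the filtered view query comes back in a fraction of the time, that’s a useful data point about how much overhead the OData / custom web service path is adding.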
 

CRM Maintenance Jobs

CRM has a number of built-in maintenance related jobs that perform things such as index recalculations. By default these kick off once a day roughly around the time that the CRM was installed. Which is usually business hours. An excellent tool to review these schedules and change them to a time of less interruption to the user is the CRM Job Editor – with editions for CRM 2011 / 2013 / 2015. 

 

Networking and Infrastructure

Have the underlying Windows servers been given the appropriate amount of RAM and CPU? Check the 2011 / 2013 / 2015 Implementation Guides for the recommended values.

What seems to be the average memory use and CPU usage in perfmon (or, for more manager-friendly graphs, Resource Monitor)?

If you’ve got multiple CRM servers deployed behind a load balancer, are you able to side step the load balancer and browse the CRM server directly (from the end user desktop environment), and does this make any difference to the performance? If it does then check out the NLB configuration for any issues.

What does the performance of the CRM Web UI seem like from one of the servers itself (via Remote Desktop)? How does it compare?

Are the Windows Event Logs full of errors or other information indicating problems with disk, networking (connectivity, DNS), authentication or other critical things?

What’s the network topology? When you’re browsing the CRM Web UI and getting your own feel of system performance, are you doing it only metres away from the data centre, while the users complaining about performance are in some regional office connected over a wet piece of string? If the performance symptoms being complained about seem to be geo-specific, replicate testing from their end as much as possible (see the built-in diagnostic tools in the Web Interface section).

Have you got latency issues between end users and your CRM servers? CRM can be a bit ‘chatty’ and this can cause you pain over a connection with high latency (e.g. think London to Canberra). In some organisations I’ve seen better performance through Citrix because the browser to CRM server chattiness occurred “locally” within the same data centre. Your mileage will vary and so will the tools at your disposal to tackle this.

 

IIS Settings

Double check that Dynamic Compression is enabled for your CRM website in IIS 8 / 8.5. After you’ve done that, check the outputCache setting for omitVaryStar as per this CRM Tip of the Day. Yes it applies to IIS 8.5 as well, and it’s not just a CRM issue – it affects all sites hosted in IIS. Without this setting you may find that output isn’t being cached by the browser properly which causes a performance drag by making additional requests for content upon page load. Be sure with this one to test before / after with browsers that have had their cache emptied (then load a few different forms) to measure performance difference.

 

SQL Server

Of course Dynamics CRM performance is heavily dependent on the performance of the underlying SQL Server installation. So, have you run the SQL Server Best Practices Analyzer (2012 edition)?

  • Memory and CPU – is SQL crying out for any more of either of these?
  • Physical location of Data and Log files – are the data and log files on separate physical disks?
  • Max Degree of Parallelism (MAXDOP) – it is recommended that this is set to 1. It affects the entire instance, and changes take effect immediately. Tread carefully before making this change (a minimal sp_configure sketch follows this list).
  • tempdb – it is generally recommended to have the same number of physical files for tempdb data as the number of CPUs on the server. By default there will be 1 file.
  • Growth of database files – check the auto-growth settings of the database files and pre-grow them to a larger size if your database is growing regularly. This can reduce the number of ‘disk grabs’ SQL makes as it expands the databases.
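If you do follow the MAXDOP recommendation, the change is made at the instance level with sp_configure. A minimal sketch – run against the SQL instance hosting the _MSCRM database, and test the impact before leaving it in place:

  -- Check the current value first
  EXEC sp_configure 'show advanced options', 1;
  RECONFIGURE;
  EXEC sp_configure 'max degree of parallelism';

  -- Set MAXDOP to 1 for the whole instance (takes effect immediately, no restart needed)
  EXEC sp_configure 'max degree of parallelism', 1;
  RECONFIGURE;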

Dynamic Management Views – DMVs

SQL has some excellent statistics that it keeps regarding costly queries, index usage and other tuning related things. From a CRM perspective these can help reveal what your most costly queries are. This article from Nuno Costa does a much better job of explaining these than I can, so check it out.

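As a taster, here’s a sketch of one commonly used DMV query – it lists the top queries by average elapsed time for the plans currently in the cache. Run it against the SQL instance hosting your _MSCRM database:

  -- Top 10 cached queries by average elapsed time (times are in microseconds)
  SELECT TOP 10
         qs.execution_count,
         qs.total_elapsed_time / qs.execution_count AS avg_elapsed_time_us,
         SUBSTRING(st.text, (qs.statement_start_offset / 2) + 1,
             ((CASE qs.statement_end_offset
                   WHEN -1 THEN DATALENGTH(st.text)
                   ELSE qs.statement_end_offset
               END - qs.statement_start_offset) / 2) + 1) AS statement_text
  FROM   sys.dm_exec_query_stats qs
  CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle) st
  ORDER BY avg_elapsed_time_us DESC;

The index usage and missing-index DMVs are similarly useful when you get to the Indexes section below.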
 

Indexes

If you’re doing a lot of queries that involve WHERE clauses on custom attributes, or ORDER BYs on custom attributes, chances are you can benefit from an index on those attributes – particularly if the number of records is large. Adding an index to the SQL table is the only supported change you can make in the SQL database. Things you need to consider: how unique is the data? How often are you reading from it vs writing to it (inserts, updates)? The cost comes in terms of index maintenance as you make changes. These days this calculation happens ‘online’ and doesn’t block, but it still taxes CPU and memory of course.

But how will you know which attributes need indexes?

Run queries similar to the ones that are performing slowly, directly in SQL Server Management Studio and make sure to include the execution plan. SQL will tell you the cost of the query components and reveal if an index would benefit that query.
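As a sketch of what acting on that looks like – say the execution plan shows a costly scan caused by filtering accounts on a custom attribute. The attribute name below (new_membernumber) is made up, and the table it lives on depends on your CRM version (the ExtensionBase table in CRM 2011, merged into the base table from CRM 2013 onwards):

  -- Hypothetical example: support frequent filtering/sorting on a custom attribute
  CREATE NONCLUSTERED INDEX ndx_account_new_membernumber
      ON dbo.AccountBase (new_membernumber)
      WITH (ONLINE = ON);   -- online index builds require Enterprise edition

Re-check the execution plan afterwards to confirm the new index is actually being used.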

What if I’m a CRM Online customer?

If you put a support call in to Microsoft you can have them add the index for you.

 

More Information

I’m yet to come across an update beyond the CRM 2011 version, but there’s an extensive performance and optimisation whitepaper from Microsoft and a lot of the same principles still apply.

 

Conclusion

There’s a lot of moving parts to Dynamics CRM. Every performance analysis will be different as context is everything for an on-premise installation. However, I hope you’ve found this a helpful overview of some important things to be aware of when it comes to Dynamics CRM performance optimisation and troubleshooting.

Nov 04

Rich powerful note taking with OneNote

I really love OneNote and have used it for years. Which is why I’ve just added a guide to OneNote on FreemiumTools.com.

 

Oct 27

Freemiumtools.com

A while ago I put together a website that explains popular cloud services in an introductory manner with the aim of letting people read about some of the tools that are out there and decide if they can use them for their business (or life). The content isn’t for the tech savvy, but hopefully it’s still helpful to people on the learning curve. So if you’re interested in finding out more about cloud tools and software for productivity and general business, why not check out freemiumtools.com.

Oct 21

Anonabox – online security, privacy and anonymity

The recent Kickstarter campaign for Anonabox proved to be insanely popular but ultimately unsuccessful after the campaign was suspended by Kickstarter. If you’re unfamiliar with it, it was a small network device that you could plug into an existing network connection and then either use the provided Wi-Fi hotspot or plug in another network cable, and all traffic routed through the device would go over Tor. The idea behind the device is to make it dead easy for people to use Tor holistically, preventing software from making direct requests to the internet. The other advantage is that it would work without particular applications having to be aware of Tor or configured to use Tor. Just plug it in and go.

I think the intent behind this device is great but there’s a bunch of things people forget and confuse when it comes to online privacy, anonymity and security. I think the danger with a device like this would have been lulling users into a false sense of security, or illusion that they couldn’t be traced / tracked / monitored / discovered / whatever it is they thought they were achieving by using this. This prompted me to think about a bunch of things and following is my brain dump on what some of the differences are between privacy, anonymity and security – why you might want to pursue any or all of these online. This is stuff I’ve been mulling over for months after reading books like Black Code, No Place To Hide and following the details of the Snowden revelations as well as the metadata collection debate currently happening in Australia.

Security

Let’s start with online security. This covers your basic approach to doing things securely on the internet – ensuring you use a password manager to enable strong, unique passwords on each and every site you use; using anti-virus software and personal firewalls; enabling two-factor authentication and using extensions such as HTTPS Everywhere to enforce the usage of https on websites. You make a conscious effort to avoid using insecure websites and applications that do dumb things like email you your password, don’t use https, have stupid password restrictions and so on.

More advanced approaches to security include encrypting communications (instant messaging, text messaging, emails), encrypting files and whole drives (computer, smartphone).

Why do we do these things? We do these things to avoid having our accounts compromised, money stolen, identity revealed (more on this later), personal information leaked – including nude selfies. The consequences of these things range from a pain in the arse to a major impact on our lives.

Privacy

Privacy is about controlling information about yourself – consenting to provide that information, understanding how it will be used, and knowing your rights regarding deleting that information and how long it is stored for. Clearly a major trend of the last decade is the erosion of our privacy in the online world through constant mishandling of our personal information leading to leaks.

This is not to be confused with scenarios where we opt in to applications to receive a benefit – providing personal information when there is a net benefit in applications like Facebook. These systems work because without opting in and providing your information you won’t be able to establish the connections with your friends. Essentially you get to use this application for free because you’re providing personal information. That the application aggregates all of this personal information and uses it for marketing purposes is something (most of us) are consciously aware of and acknowledge. The benefit we receive is worth it to us.

What complicates privacy is that there are many different facets of information about ourselves, and who we want to reveal each of them to is very granular.

Anonymity

Anonymity is about the right to pursue your life anonymously without having to provide identifying information. In an online world this means the ability to use pseudonyms and non-identifying information when interacting with applications and other users on the internet.

A disturbing angle relating to anonymity is the practice of having your online habits tracked across multiple sites over a period of time through advertising networks. While we can read and agree to privacy statements of individual sites and receive a pretty obvious benefit in return for providing some information to the Facebooks of the world, it’s less obvious the benefit we get from being tracked. ‘More targeted advertising’ is usually the result, but for most people that’s a pretty dubious benefit. It’s great for business, but not the individual.

Trying to be anonymous on the internet can include trying to opt out of, or actively block, this kind of tracking (if you’re interested, check out Disconnect.Me). Modern browsers have privacy modes that attempt to limit some of this, but they’re really only a quick and convenient way of browsing a few sites you don’t want to appear in your local history. They’re not known as porn mode for nothing. These browser modes do nothing to prevent your requests from being monitored by your ISP, tracked by the servers you’re requesting information from, and more.

So what about Anonabox?

Anonabox seemed to be popular because it makes using Tor easier. Tor enables you to cloak details about your web requests. Requests are routed through the Tor network rather than straight out from your ISP. The Tor network is a series of nodes around the internet that bounce requests between them. The idea is that you’re making it harder for people at the remote end to trace back to you, and you’re also disrupting people who may be monitoring your traffic (ISP, government, local Wi Fi snoop).

I think the main demand for this box (in the campaign) has come from people who are aghast at the Snowden revelations and want to stymie mass surveillance of the internet by governments. But I think that’s flawed – in that I don’t think the Anonabox is the panacea it seems.

The accusations against the five eyes governments involve mass surveillance of the internet with the (begrudging?) cooperation of telcos and internet companies who provide these services. Using Anonabox or Tor only scrambles your network traversal. They still have access to your information either straight from the pipe or from the company itself. Furthermore, using a normal browser with Anonabox still means you’re subject to the same advertising based tracking and so you’ve defeated nothing. (To be fair they recommended using the Tor Browser Bundle in conjunction with Anonabox).

Whistleblowers and people who face persecution in their country (for sexual orientation, political or other reasons) are pretty serious about security, privacy and anonymity. In order to keep themselves safe they have probably already researched effective ways to keep their identities hidden. While Anonabox is trying to make this easier, without the proper education of users there is still a risk that they make mistakes that reveal their identity or personal information.

I don’t think there’s a single easy solution to achieving anonymity and maintaining full privacy and security online. Anonabox looks like a step in the right direction but seems to be at risk of giving people a false sense of security that they are totally anonymous and private on the internet.

Oct 17

Driverless cars

As the technology for driverless cars continues to improve we inevitably approach the point where we want this amazing technology to go mainstream. There will be resistance to allow these cars on our roads on a number of fronts though.

At some point in the future people will die or be seriously injured as a result of a driverless car accident. To think that this will never happen is ridiculous. However this shouldn’t be the first thing we think about. People seem to be afraid of introducing driverless cars because they don’t know who to blame, or hold accountable when this happens.

At the moment we have a system where drivers are licensed and held responsible for their actions. Car owners must register their car and meet regulatory requirements to ensure their car is roadworthy. Car manufacturers are accountable for the quality of the cars they produce in regards to meeting safety standards and legislated requirements. And governments are accountable for the system of road rules and safety standards that cars must meet. Obviously the only thing that changes in the driverless car scenario is the removal of the driver.

A driverless car is still owned and registered by someone – they are the ones who would be accountable for the actions of their car. Manufacturers could provide some kind of surety / guarantee about the quality of their car, and perhaps provide liability insurance or protection on behalf of the owner in an attempt to sell the car and assure them it was safe.

The frustrating thing is that this ‘problem’ which will delay the introduction of driverless cars is a problem of skewed perception. I am confident that the introduction of driverless cars will dramatically reduce the number of deaths and injuries on our roads. We don’t need these cars to be perfect, they only need to be better than the current system of human driven cars.

Currently we measure the number of people who die on Australian roads each year in the hundreds. This is way too high, and doesn’t mention the thousands of people who are seriously injured as a result of road accidents. If the introduction of driverless cars cuts this in half – wouldn’t that be amazing?

 

Oct 15

Why use Dynamics CRM as a platform for xRM development?

For a few years now, COTS based solutions have been all the rage – taking an existing off the shelf product and configuring and customising it to meet your organisation’s needs rather than build from scratch. Stand on the shoulders of giants! This article does a pretty good job of articulating the pros and cons of using Microsoft Dynamics CRM 2011 / 2013 as that platform for your organisation’s needs.

Aug 19

Squeegee

What started as an exercise in skills refresh for me has finally grown into a side-project that went live today: Squeegee

A personal finance software SaaS product. Yes there’s a thousand of those out there already, but I used this as an exercise in learning how to take an idea all the way to fruition. There’s clearly more work to be done, but it’s a satisfying milestone to reach.

It would be great if you could check it out and let me know what you think!

 

Aug 05

Government announces Operation Sovereign Data

Prime Minister Tony Abbott and Attorney-General George Brandis today announced a new government initiative known as Operation Data Protect. This nationwide program will act as a data backup service for the nation, relieving the millions of Australians with connections to the internet from having to worry about safeguarding their data.

Mr Abbott revealed that the scheme would be rolled out later this year and would retain 2 years’ worth of data for every internet-connected Australian. “People are afraid. Their precious memories are only a dropped laptop or stolen mobile phone away from being lost forever.”

Senator Brandis told the media contingent that after reviewing commercial services currently available, the Government had decided to step in. “Some of these services aren’t even located in Australia. There was a real risk that data was being sent to ‘the cloud’. That’s not even a country.”

“We simply could not stand by while a generation of Australians lost their documents through a lack of a backup” added the Prime Minister. In addition to backup, the service will index all the data stored so that users can easily find their files when they need to. “Handily this helps copyright owners check their records of ownership against the backed up data so we can ensure that they have received every last cent they are owed” confirmed Senator Brandis.

A glossy brochure was distributed at the press conference, listing some of the other benefits of the service. In the future, people applying for public service positions will be able to let the government simply refer to their indexed backup instead of having to complete arduous selection criteria. When asked about the whereabouts of Communications Minister Malcolm Turnbull, Prime Minister Abbott informed the press conference that he was “out negotiating a great deal on the hard drives required. Malcolm practically invented hard drives in Australia and knows a fair price when he sees it.”

Questioned about the security of the data stored on behalf of Australians, Senator Brandis said he would use a really strong password – “with those funny characters and everything” and would keep this written inside the cover of a random book on his parliamentary bookshelf. “I can’t give away which one, but it rhymes with twine-teen gatey-floor” said the winking Senator.

Aug 04

Ditching GoDaddy

I have a number of domains I originally registered with GoDaddy and I’ve finally dumped them. I’ve transferred both the DNS hosting and the registration over to DNSimple. I got sick of being sold dumb shit, the piss-poor web interface, and the just plain sleazy marketing at GoDaddy. They were damn cheap, I’ll give them that.

 

If you want to do something similar, here’s an overview:

  • You can change just your DNS hosting
  • You can also move over your Registration (I recommend going the whole hog)
  • You can do it without any downtime

 

The general process is:

  • Create an account with DNSimple
  • ‘Add’ domains to them
  • Export your Zone file from GoDaddy
  • Import the Zone file into DNSimple
  • Verify that the DNSimple servers are resolving your site
  • Change over the root nameservers for your domain(s) to DNSimple and wait for propagation – this might take 24 hours
  • Cancel any domain privacy you have with GoDaddy, and ‘Unlock’ the domain (to allow transfer)
  • Transfer the Registration to DNSimple (using an Authorisation code from GoDaddy per domain)
  • Click ‘Confirm’ in a few email links
  • Delete your GoDaddy account
  • Crack open a beer and put on a smug smile.

 

It is that simple. But why DNSimple? Nice, clean UI. Simple pricing structure (yes, dearer than GoDaddy, but it’s worth it). Two-factor authentication. You can still have WHOIS privacy to obscure your details from the public registers.

 

DNSimple provide a guide to the process here. And if you are thinking about signing up, it’d be nice if you went via my referral link.

 

Jul 28

Why Free-to-Air TV matches are not HD

In the ultimate of first-world-problems, free to air NRL matches in Australia are broadcast in standard definition (SD), much to the annoyance of anyone with a decent pair of eyes. At the same time as a big game of footy is being broadcast, Channel 9 are bound to be utilising their High Definition (HD) channel for something really spectacular – like a re-run of an old Elizabeth Taylor movie from 19 diggity 8. No disrespect to Liz, but the fast paced action of sport is better suited for the HD channel. So why is it like this?

The main reasons are:

  • Government legislation
  • Government standards
  • Money

Firstly, the legislation. With the introduction of Pay TV in Australia, ‘anti-siphoning’ laws were introduced to prevent the major sporting events Australians love to watch from being sucked up and shown only on Pay TV. Ensuring these major sporting events (Rugby League, AFL, Cricket etc.) stayed on free-to-air TV was a great win for sports fans. The Pay TV providers can show additional events from these sports codes, but as long as there is a minimum shown on free to air, everything is cool.

Now for the standards. With the introduction of digital TV in Australia, and then the switchover which would see the end of analogue television, the government decided that the minimum digital TV standard would be Standard Definition (SD), or 576i. With all the short-sightedness of Mr Magoo, SD became the minimum standard digital TV broadcast experience for free-to-air TV. Additional HD channels came along as broadcasters were allowed to expand their free-to-air offerings, but these are essentially ‘bonus’ channels because broadcasters are not required to show anything in particular on them.

And of course, money. There are two factors to this – by offering to show sports in HD on Pay TV, there’s money to be made from people who want to cough up the cash to enjoy the sport they love in glorious HD. The problem is, the HD games they show are not the ones that are shown on free-to-air TV – or they are delayed if they are. The other money angle relates to ratings. Why don’t the TV stations just simulcast the games and show them in SD (as required) and in HD? The way TV ratings are calculated sees these as separate shows, so instead of a rating of “800,000”, you’d end up with ratings of “500,000” and “300,000”. TV stations love ratings and need them to be as high as possible so they can charge higher advertising rates.

So there we have it. The possible solutions are: change the way TV ratings and advertising rates are calculated; change the anti-siphoning laws and run the risk of losing access to these sports on free-to-air TV altogether; or change the minimum standard for digital TV broadcasts. I can’t see any of these changing any time soon, so I guess the situation we have now is one we’re stuck with. Well, at least you now know why it is the mess that it is.

 

 
