More RAM may not always be the right solution

Mar 14, 2012 20:21 GMT

One of the aspects that needs to be taken into consideration when it comes to ensuring good CRM performance is SQL Server sizing.

When setting up a new environment, there are simple rules to follow to make sure that the SQL environment will be able to handle the estimated data and user load.

The same applies to existing CRM deployments whose performance you would like to improve.

A recent post on the CRM Software Blog (authored by Joel Lindstrom, Customer Effective) offers a series of general recommendations related to the sizing of a CRM database server.

CRM customers will have to consider the expected user count and transaction levels, but should also take into account these simple principles to ensure that they get things right.

For example, Microsoft recommends installing only 80 percent of the RAM that the SQL Server supports, as performance reportedly degrades when the maximum amount of RAM is installed.

Disk I/O reportedly plays an even more important role than RAM, particularly when the database runs on virtual servers. While recommending the use of the fastest disk drives available, the blog post advises against the use of virtual partitions on a shared physical disk.

The post also notes that multiple lower-capacity disks can prove better than a single high-capacity disk, since they make it possible to split data, transaction logs, and tempdb out onto their own physical drives, as sketched below.
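To make that layout concrete, here is a minimal T-SQL sketch of the separation; the drive letters, paths, and database name are assumptions for illustration, not values from the blog post.

    -- Hypothetical layout: data, transaction log, and tempdb each on
    -- their own physical drive (D:, E:, and F: are assumed examples).
    CREATE DATABASE CRM_Example
    ON PRIMARY
        (NAME = CRM_Example_Data, FILENAME = 'D:\Data\CRM_Example.mdf')
    LOG ON
        (NAME = CRM_Example_Log,  FILENAME = 'E:\Logs\CRM_Example.ldf');

    -- tempdb files can be relocated the same way; the change takes
    -- effect after the SQL Server service is restarted.
    ALTER DATABASE tempdb
    MODIFY FILE (NAME = tempdev, FILENAME = 'F:\TempDB\tempdb.mdf');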

When it comes to existing servers, the recommended route is to optimize SQL performance first and only then add more RAM. CRM performance will be slow when tables are not optimized or indexes are missing, and in those cases database optimization does far more good than extra memory. One way to spot missing indexes is shown below.
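As one way to find such gaps, SQL Server's missing-index DMVs can be queried. This is a generic sketch rather than a query from the blog post, and its suggestions should be reviewed before any index is actually created.

    -- List the top candidate missing indexes, ranked by a rough
    -- estimate of how often and how much they would have helped.
    SELECT TOP 10
        mid.statement        AS table_name,
        mid.equality_columns,
        mid.inequality_columns,
        mid.included_columns,
        migs.user_seeks,
        migs.avg_user_impact
    FROM sys.dm_db_missing_index_details AS mid
    JOIN sys.dm_db_missing_index_groups AS mig
        ON mig.index_handle = mid.index_handle
    JOIN sys.dm_db_missing_index_group_stats AS migs
        ON migs.group_handle = mig.index_group_handle
    ORDER BY migs.user_seeks * migs.avg_user_impact DESC;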

That said, some index optimizations do require more RAM; 12 GB should be considered the minimum amount of RAM on the SQL Server for most CRM 2011 implementations. It is also recommended to put a good maintenance plan in place and to defragment indexes regularly, along the lines of the check below.
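A regular defragmentation pass might start with a fragmentation check like the following sketch; the 30 percent rebuild threshold is a common rule of thumb rather than a CRM-specific requirement, and the table and index names in the final comment are hypothetical.

    -- Find heavily fragmented indexes in the current database.
    SELECT
        OBJECT_NAME(ips.object_id)       AS table_name,
        i.name                           AS index_name,
        ips.avg_fragmentation_in_percent
    FROM sys.dm_db_index_physical_stats(DB_ID(), NULL, NULL, NULL, 'LIMITED') AS ips
    JOIN sys.indexes AS i
        ON i.object_id = ips.object_id AND i.index_id = ips.index_id
    WHERE ips.avg_fragmentation_in_percent > 30
    ORDER BY ips.avg_fragmentation_in_percent DESC;

    -- A flagged index can then be rebuilt, for example:
    -- ALTER INDEX IX_Example ON dbo.ExampleTable REBUILD;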

Some other recommendations include:

- Be sure to set the MaxServerMemory setting in SQL to reserve 20% of system memory for the OS and other applications. SQL Server likes to use as much memory as it can, and it does not always release that memory. That is why setting the maximum server memory available to SQL to a number lower than the total system memory is a good idea: it leaves enough memory available for the operating system (see the first sketch after this list).

- Your data partition should be approximately 2.5 times as big as the estimated size of your CRM database. This will give you room to grow, and also give you leeway should you ever want to set up a second copy of your environment for test purposes or to use as a replicated reporting environment (a quick way to check the current database size follows the list).

- Monitor performance regularly; don't set it and forget it. Data volumes and usage patterns will change over time. If you have tools like SCOM, you can automate the monitoring of standard OS and SQL performance metrics, and every one to three months consider running standard SQL queries and reports to check ongoing performance. The standard reports available in SQL Server Management Studio are a good starting point, such as memory consumption at the server level and Index Usage Statistics at the database level (the last sketch below shows one such query).
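To illustrate the first recommendation, the MaxServerMemory cap can be set with sp_configure; the numbers below assume a hypothetical host with 32 GB of RAM, of which roughly 20% is left to the OS.

    -- Assumed example: 32 GB total RAM, ~20% (about 6 GB) reserved
    -- for the OS, leaving roughly 26 GB (26624 MB) for SQL Server.
    EXEC sp_configure 'show advanced options', 1;
    RECONFIGURE;
    EXEC sp_configure 'max server memory (MB)', 26624;
    RECONFIGURE;

For the partition sizing recommendation, the current size of the CRM database can be checked with sp_spaceused; the organization database name below is an assumption (CRM 2011 organization databases are typically named <OrgName>_MSCRM).

    USE Contoso_MSCRM;  -- hypothetical organization database name
    EXEC sp_spaceused;
    -- e.g. a 40 GB database would call for a data partition of
    -- roughly 100 GB under the 2.5x guideline above.

Finally, as one example of a periodic check, index usage statistics can be pulled from a dynamic management view; again, this is a generic sketch rather than a query from the blog post.

    -- How often each index in the current database is read vs. written.
    SELECT
        OBJECT_NAME(ius.object_id) AS table_name,
        i.name                     AS index_name,
        ius.user_seeks, ius.user_scans, ius.user_lookups, ius.user_updates
    FROM sys.dm_db_index_usage_stats AS ius
    JOIN sys.indexes AS i
        ON i.object_id = ius.object_id AND i.index_id = ius.index_id
    WHERE ius.database_id = DB_ID()
    ORDER BY ius.user_updates DESC;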

CRM customers should also have a look at the recommendations for SQL Server performance optimization in the Optimizing and Maintaining Performance of a Microsoft Dynamics CRM 2011 Server Architecture white paper.