- Microsoft Dynamics CRM 2011 Cookbook
- Dipankar Bhattacharya
Optimizing Dynamics CRM 2011 Database Server performance
Microsoft Dynamics CRM 2011 is a resource-intensive, web-based application, and certain parameters of the system have to be tuned for optimum performance. Dynamics CRM 2011 Server performance tuning happens mainly at two levels: the Dynamics CRM 2011 database level and the Dynamics CRM 2011 Server component level. In this recipe, we will delve into how the Dynamics CRM 2011 database can be tuned for better performance.
Getting ready
Microsoft Dynamics CRM 2011 uses SQL Server. For better performance tuning on SQL Server, we need to identify the performance bottlenecks in the database server. SQL Server performance monitoring tools such as Activity Monitor (part of SQL Server 2008 Management Studio) should be used to check the performance of SQL Server.
Accurate performance tuning steps should be taken by analyzing the performance bottlenecks. Here we will explore a few common performance-tuning techniques for the Dynamics CRM 2011 database.
Note
Before applying these performance optimization techniques in a live environment, they should first be verified in a test environment that approximately matches the complexity, data volume, and integration points of the production environment.
How to do it…
The following points explain how to performance tune your Dynamics CRM 2011 database:
- Store data files and log files on separate physical disks: Keeping both data and log files on the same drive can cause contention for that device and result in poor performance. Storing the files on separate drives allows I/O activity to occur at the same time for both the data and log files. This is a particularly important consideration for organization databases, where data growth is frequent and usually high.
During a database CREATE operation, specify separate physical drive locations for the data and log files. If the database has already been created, move the files by performing the following steps:
- Take a backup of the database and then take the database offline: The database can be backed up by right-clicking on the database name in the Object Explorer pane within SQL Server Management Studio and then selecting Tasks | Back Up… as shown in the following screenshot:
- This will open a Back Up Database screen; enter a value for Backup type and the location information, and click on OK to back up. Wait until a backup successful message appears.
- Thereafter, to take the database offline, right-click on the database name inside SQL Server Management Studio and then select Tasks | Take Offline.
- Restore the database from the backup: Right-click on the Databases node inside SQL Server Management Studio and then select the Restore Database… option as shown in the following screenshot:
This will launch the Restore Database page. Here, select the Device option under the Source details on the General page and then click on the browse button beside it. This will launch the Select Backup Devices page. Choose Backup media type as File and click on the Add button to select the database backup file(s). Then, click on OK to save the selections.
The next step would be to specify separate locations for data and log files. Click on the Files node on the Restore Database page. In this page, check the Relocate all files to folder option and then use the folder browser to change the Data file folder and Log file folder values. Finally, click on OK to start the database restore operation and wait until it is successful.
Finally, restart the SQL Server instance.
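As a reference, the following T-SQL sketch shows how separate physical drive locations can be specified at database creation time. The database name, logical file names, paths, and drive letters (D: and E:) are placeholders for illustration only; Dynamics CRM itself creates the organization databases, so this pattern applies mainly when pre-creating or relocating supporting databases.
```sql
-- Illustrative sketch: database name, logical names, paths, and drive letters are placeholders.
CREATE DATABASE Contoso_MSCRM
ON PRIMARY
(
    NAME = Contoso_MSCRM_Data,
    FILENAME = 'D:\MSSQL\Data\Contoso_MSCRM.mdf',      -- data file on its own physical disk
    SIZE = 10240MB
)
LOG ON
(
    NAME = Contoso_MSCRM_Log,
    FILENAME = 'E:\MSSQL\Logs\Contoso_MSCRM_log.ldf',  -- log file on a separate physical disk
    SIZE = 2048MB
);
```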
- Set the right database file size for the organization databases: Organization databases contain data that changes frequently, and new data arrives frequently as well. If the database file size is small, the file will have to grow in frequent increments, and each increment carries some overhead. Hence, it is recommended to specify a reasonable initial size that accounts for the expected data growth over the next three to six months.
To change the initial database file size using SQL Server Management Studio, please follow these steps:
- Right-click on the organization database and then select Properties | Files.
- On the Files page, modify the Initial Size column values (as shown in the following screenshot).
- Restart the SQL Server service.
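The same change can also be scripted. The following is a minimal sketch, assuming a hypothetical organization database named Contoso_MSCRM whose logical data file name is Contoso_MSCRM_Data (use the logical name shown in the Logical Name column on the Files page):
```sql
-- Pre-size the data file to cover the next three to six months of growth.
-- Database and logical file names are placeholders; check the Files page for the real ones.
USE master
GO
ALTER DATABASE Contoso_MSCRM
MODIFY FILE (NAME = Contoso_MSCRM_Data, SIZE = 20480MB);
GO
```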
- Set the Autogrowth size of the organization databases (OrganizationName_MSCRM) considering the growth of the database for a period of three to six months: If the autogrowth size is not managed properly, the database might experience many small autogrow events. Every time an autogrowth event fires, SQL Server holds up database processing until the autogrowth operation is complete, which means slower response times for the SQL commands being processed against the growing database. The autogrowth size of a database can be modified by right-clicking on the database name and going to Properties | Files; then click on the button under the Autogrowth column and modify the autogrowth-related settings in the pop-up window.
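This setting can also be scripted. A minimal sketch, again assuming the hypothetical Contoso_MSCRM database and logical file name used above:
```sql
-- Grow in fixed 1 GB increments instead of small or percentage-based steps (illustrative value).
ALTER DATABASE Contoso_MSCRM
MODIFY FILE (NAME = Contoso_MSCRM_Data, FILEGROWTH = 1024MB);
```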
- Increase the Initial Size value and the Autogrowth size of the tempdb database: This database is used by SQL Server, and it grows as its usage increases. By default, the initial size of tempdb is 8 MB; this means that every time tempdb crosses the 8 MB size, an autogrowth event would fire. In addition to this, every time the SQL Server service is restarted, tempdb is set back to its initial size, which is 8 MB by default. Frequent tempdb file size growth slows down the performance of SQL Server. Hence, its initial size has to be increased to a larger value.
Note
The FileGrowth property of tempdb has to be adjusted based on the speed of the I/O subsystem on which the tempdb files are located. It is recommended that we put the tempdb database on a fast I/O subsystem and on disks that differ from those used by the Dynamics CRM 2011 databases.
The best way to estimate the size and a reasonable FileGrowth size for tempdb is to run the workload in a test environment. Once we have a good estimate of the size of tempdb, set its size with an appropriate safety factor. Autogrow should be used as a last resort, not as a strategy. Moreover, to avoid any potential latch timeouts, it is usually recommended to limit the autogrow operation to approximately 2 minutes.
Another general guideline for tempdb is to create one data file for each CPU or core on the server (accounting for any affinity mask settings) and then adjust the number of files up or down as appropriate. Using multiple files reduces tempdb storage contention and yields significantly better scalability. Make each data file the same size; this allows for optimal proportional-fill performance.
To change the tempdb database file sizes using SQL Server Management Studio, please follow these steps:
- Right-click on the tempdb database and then go to Properties | Files.
- On the Files page, modify the Initial Size and Autogrowth values to an appropriate size.
- Click on the Add button to add additional data files to the tempdb database. When additional data files are added, we should specify the following properties of the data file:
- Logical Name
- Initial Size
- Autogrowth size
- Path
- File Name
- Then restart the SQL Server service.
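The equivalent changes can be scripted as well. A minimal sketch, using the default tempdb logical file names (tempdev, templog) and illustrative sizes and paths:
```sql
-- Sizes, growth increments, and the additional file path are illustrative; tune them to your workload.
ALTER DATABASE tempdb
MODIFY FILE (NAME = tempdev, SIZE = 2048MB, FILEGROWTH = 512MB);
ALTER DATABASE tempdb
MODIFY FILE (NAME = templog, SIZE = 1024MB, FILEGROWTH = 256MB);
-- Add one data file per CPU core, all the same size, to reduce allocation contention.
ALTER DATABASE tempdb
ADD FILE (NAME = tempdev2, FILENAME = 'T:\MSSQL\Data\tempdb2.ndf', SIZE = 2048MB, FILEGROWTH = 512MB);
```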
We will also see how to allocate sufficient memory to SQL Server. By default, SQL Server adjusts its memory requirements dynamically based on the maximum resources available. The default setting for maximum server memory in SQL Server 2008 R2 and 2012 is 2147483647 MB. Raising the maximum server memory allocation might help SQL Server perform better, but if too much memory is given to SQL Server, other applications on the same machine will perform poorly.
To change the maximum server memory size of SQL Server, follow these steps:
- Inside SQL Server Management Studio, on the Object Explorer pane, right-click on the server and select Properties.
- Then, navigate to the Memory node and under Server memory options, enter the Maximum server memory value as shown in the following screenshot:
More information on SQL Server memory configuration options can be found at the following link:
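The same setting can also be changed with sp_configure. A minimal sketch follows; the 28672 MB (28 GB) value is purely illustrative and should be derived from the total RAM and the needs of other processes on the server:
```sql
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
-- Illustrative value: leave enough memory for the OS and any other services on the machine.
EXEC sp_configure 'max server memory (MB)', 28672;
RECONFIGURE;
```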
- Create indexes for the Extension and custom entity tables: Dynamics CRM 2011 creates the <EntityName>_Extension table for storing the custom attributes of a system entity, and custom entities are stored in custom tables in the organization databases. Monitor slow-performing queries for these tables and then create a custom index for the Extension and custom entity tables. This would assist in improving performance.
If the Quick Find View and Lookup View features contain many custom fields, it can have an adverse impact on the performance of these lookup queries. Either use very limited custom fields in these views or create a non-clustered index that includes these custom fields.
Note
The application of any update rollup might delete existing custom indexes on Dynamics CRM 2011 database tables.
Use the CREATE INDEX statement to create indexes in the Dynamics CRM 2011 tables, as sketched below.
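The following is a minimal sketch of such an index. The custom columns (new_region, new_segment, new_accountcode) are hypothetical; substitute the Extension or custom entity table and the fields that actually appear in your slow Quick Find or Lookup View queries:
```sql
-- Hypothetical custom columns on the account Extension table; adjust to your own schema.
CREATE NONCLUSTERED INDEX IX_AccountExtensionBase_new_region
ON AccountExtensionBase (new_region)
INCLUDE (new_segment, new_accountcode);
```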
- Set maximum degree of parallelism (MAXDOP) to 1: When SQL Server runs on a multicore machine, the number of processors that can be used to run a single statement for each parallel execution plan can be specified. This is called the maximum degree of parallelism. When MAXDOP is left at its default value of 0, SQL Server determines the maximum degree of parallelism itself. When we set the MAXDOP value to 1, parallel plan generation is suppressed. It is often observed that with a MAXDOP value of 1, SQL Server performs better. The MAXDOP value can be set inside SQL Server Management Studio by following these steps:
- Right-click on the server name and then go to Properties | Advanced.
- In the advanced settings page, we can set the Max Degree of Parallelism field as shown in the following screenshot:
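Alternatively, the setting can be applied with sp_configure; a minimal sketch:
```sql
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
-- Suppress parallel plan generation, as recommended above.
EXEC sp_configure 'max degree of parallelism', 1;
RECONFIGURE;
```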
- Modify certain deployment properties' parameters: Updating a few deployment properties in the Dynamics CRM 2011 database provides fixes around certain database-level issues and exceptions.
SQLCommandTimeout: It has been observed that large file imports fail inside Dynamics CRM 2011, throwing SQL Server-level timeout errors. SQLCommandTimeout is a field with a default value of 30 that is present in the MSCRM_CONFIG database.
Increase the value of the SQLCommandTimeout parameter to a larger value. Use the following SQL command to modify the value:
```sql
USE MSCRM_CONFIG
GO
UPDATE DeploymentProperties
SET IntColumn = 9000
WHERE ColumnName = 'SqlCommandTimeout'
```
AggregateQueryRecordLimit: This deployment setting controls the maximum number of records that are returned by aggregate queries (used in charts). Sometimes the default value of 50000 does not suffice, and exceptions such as AggregateQueryRecordLimit exceeded are thrown.
Increase the value of the AggregateQueryRecordLimit parameter to a larger value. Use the following SQL command to modify the value:
```sql
USE MSCRM_CONFIG
GO
UPDATE DeploymentProperties
SET IntColumn = 100000
WHERE ColumnName = 'AggregateQueryRecordLimit'
```
- Control the growth of the PrincipalObjectAccess (POA) table: As almost every access to Dynamics CRM 2011 data interacts with the POA table of the <OrganizationName>_MSCRM database, a large POA table would significantly slow down SQL query performance. Until Update Rollup 6 for Dynamics CRM 2011, when a record was deleted from the Dynamics CRM 2011 system, the corresponding POA entries were not deleted. This was causing significant growth of the POA table and slowness in SQL query execution.
Update Rollup 6 controls the growth of the PrincipalObjectAccess (POA) table by deleting its entries when a Dynamics CRM 2011 record gets deleted from the system. However, after installing Update Rollup 6, there is a SQL query that has to be executed on the Dynamics CRM 2011 database only once. The SQL query can be found at the following link:
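To judge whether the POA table has become a problem, a simple read-only size check such as the following can be run against the organization database (the database name is a placeholder):
```sql
USE Contoso_MSCRM
GO
-- Row count of the PrincipalObjectAccess table; a very large count suggests cleanup is overdue.
SELECT COUNT(*) AS PoaRowCount FROM PrincipalObjectAccess;
```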
- Control the growth of the AsyncOperationBase table: The AsyncOperationBase table of the Dynamics CRM 2011 organization database grows very large in no time if the organization executes asynchronous operations such as workflows. Unless controlled, the growth of this table can affect performance adversely.
A regular cleanup of the AsyncOperationBase table is recommended, although it will delete all the entries for every job run in a period. If completed system job entries have some business value and have to be stored for a longer period, cleaning up this table regularly is not possible.
The SQL script to clean up the AsyncOperationBase table can be found at http://support.microsoft.com/kb/968520.
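Before and after running the cleanup script, the volume of asynchronous operations can be checked with a read-only query such as this (run against the organization database; the database name is a placeholder):
```sql
USE Contoso_MSCRM
GO
-- Completed, cancelled, and failed system jobs are the main candidates for cleanup.
SELECT StateCode, StatusCode, COUNT(*) AS Jobs
FROM AsyncOperationBase
GROUP BY StateCode, StatusCode
ORDER BY Jobs DESC;
```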
- Rescheduling Dynamics CRM 2011 maintenance jobs: By default, Dynamics CRM 2011 creates and runs a few maintenance jobs on a daily basis. These jobs are executed by the Microsoft Dynamics CRM Asynchronous Processing Service (maintenance). The service schedules these jobs based on when the organization was created. Therefore, these jobs can very well run when there is peak user load on the system, which can slow down the system's performance or cause timeouts. Hence, it is recommended to reschedule these jobs to run when we expect very little user activity in the system. In the How it works… section of this recipe, we shall discuss these jobs in a little more detail.
Dynamics CRM stores scheduling information about these jobs in the ScaleGroupOrganizationMaintenanceJobs table of the MSCRM_CONFIG database. However, there is a GUI tool available to modify the settings of these jobs. The GUI tool is known as CRM 2011 Maintenance Job Editor and is available for download at the following link:
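If you prefer to inspect the current schedules directly before using the editor tool, a read-only query against the configuration database works; the exact column set varies by update rollup, so all columns are selected here:
```sql
USE MSCRM_CONFIG
GO
-- Read-only inspection of the maintenance job schedule table; do not modify it by hand
-- unless you understand the consequences (the GUI editor is the safer option).
SELECT * FROM ScaleGroupOrganizationMaintenanceJobs;
```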
Note
If you want to disable a job, please specify a future date (for example, 12/31/3000) under Next Run Time for that job.
How it works…
This recipe discusses the common performance tuning techniques performed on the Dynamics CRM 2011 database. These tuning techniques should be applied in conjunction with the specific performance bottlenecks identified in the Dynamics CRM 2011 environment.
In this recipe, we first checked the firing of the autogrowth event; frequent autogrowth events significantly impact performance. Indexing data that is not already covered by indexes is important to improve the performance of SQL queries, and setting MAXDOP to 1 often improves performance. Additionally, the best practices around tempdb are also recommended.
Finally, control the growth of the PrincipalObjectAccess and AsyncOperationBase tables along with adjusting the timings for the Dynamics CRM 2011 maintenance tasks. A list of maintenance jobs run by the Asynchronous Processing Service (maintenance) is as follows:
