Posts Categorized / Performance Tuning and Optimization

  • Jul 03 / 2008
  • 0
dbDigger, Performance Tuning and Optimization, Troubleshooting


Today I noticed that, while my system was apparently idle, CPU consumption began to cross 80%. When I tracked down the process, it was “helpsvc.exe”.
It is the Windows Help service, listed as “Help and Support” in Windows services. Some spyware may use it, so it is better to change its startup type to Manual, or disable it on your server, so that it does not consume resources.
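As a sketch, the startup type can be changed from the command line with the built-in `sc` utility (assuming the service name is `helpsvc`, as on Windows XP / Server 2003; note the required space after `start=`):

```
REM Set the Help and Support service to manual startup
sc config helpsvc start= demand

REM Or disable it entirely
sc config helpsvc start= disabled
```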

  • May 21 / 2008
  • 0
DBA best practices, dbDigger, Performance Tuning and Optimization

SQL Server T-SQL best practices

Transact-SQL, sometimes abbreviated T-SQL, is Microsoft’s and Sybase’s proprietary extension to the SQL language. To make it more powerful, SQL has been enhanced with additional features such as control-of-flow language, local variables, various support functions for string processing, date processing, and mathematics, and improvements to the DELETE and UPDATE statements.

Following are some tips for T-SQL scripts:

  • Use the SQL Server Profiler Create Trace Wizard to run the “T-SQL By Duration” trace. You can specify the length of the long running queries you are trying to identify (such as over 1000 milliseconds), and then have these recorded in a log for you to investigate later.
  • In your queries, don’t return column data you don’t need. For example, you should not use SELECT * to return all the columns from a table if you don’t need all the data from each column. In addition, using SELECT * prevents the use of covered indexes, further potentially hurting query performance.
  • You may have heard of a SET command called SET ROWCOUNT. Like the TOP operator, it is designed to limit how many rows are returned from a SELECT statement. In effect, SET ROWCOUNT and the TOP operator perform the same function. While in most cases either option works equally efficiently, there are some instances (such as rows returned from an unsorted heap) where the TOP operator is more efficient than SET ROWCOUNT. Because of this, using the TOP operator is preferable to using SET ROWCOUNT to limit the number of rows returned by a query.
  • In a WHERE clause, the various operators used directly affect how fast a query is run. This is because some operators lend themselves to speed over other operators. Of course, you may not have any choice of which operator you use in your WHERE clauses, but sometimes you do.
  • Don’t be afraid to make liberal use of in-line and block comments in your Transact-SQL code; they will not affect the performance of your application, and they will enhance your productivity when you or others come back to the code and try to modify it.
  • If possible, avoid using SQL Server cursors. They generally use a lot of SQL Server resources and reduce the performance and scalability of your applications. If you need to perform row-by-row operations, try to find another method to perform the task.
  • If you know that there are duplicate records, and this presents a problem for your application, then by all means use the UNION statement to eliminate the duplicate rows. But if not, use UNION ALL, which is less resource intensive.
  • Carefully evaluate whether your SELECT query needs the DISTINCT clause or not. Some developers automatically add this clause to every one of their SELECT statements, even when it is not necessary.
  • Always include a WHERE clause in your SELECT statement to narrow the number of rows returned. Only return those rows you need.
  • When you have a choice of using the IN or the BETWEEN clauses in your Transact-SQL, you will generally want to use the BETWEEN clause, as it is more efficient.
  • If you need to write a SELECT statement to retrieve data from a single table, don’t SELECT the data from a view that points to multiple tables. Instead, SELECT the data from the table directly, or from a view that only contains the table you are interested in. If you SELECT the data from the multi-table view, the query will experience unnecessary overhead, and performance will be hindered.
  • If your application allows users to run queries, but you are unable in your application to easily prevent users from returning hundreds, even thousands of unnecessary rows of data, consider using the TOP operator within the SELECT statement. This way, you can limit how many rows are returned, even if the user doesn’t enter any criteria to help reduce the number of rows returned to the client.

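A few of the tips above can be sketched in T-SQL (the table and column names here are hypothetical, for illustration only):

```sql
-- Prefer TOP over SET ROWCOUNT to limit the rows returned
SELECT TOP 100 OrderID, OrderDate
FROM dbo.Orders
WHERE OrderDate BETWEEN '2008-01-01' AND '2008-03-31'  -- BETWEEN is generally efficient
ORDER BY OrderDate;

-- Prefer UNION ALL over UNION when duplicate rows are not a concern,
-- since UNION ALL skips the duplicate-elimination step
SELECT CustomerID FROM dbo.ActiveCustomers
UNION ALL
SELECT CustomerID FROM dbo.ArchivedCustomers;
```
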
Chosen from SQL Server DBA best practices by Brad M. McGehee

  • May 21 / 2008
  • 0
Analysis Services SSAS, dbDigger, Hardware and Platform, Performance Tuning and Optimization

Memory Settings for SQL Server Analysis Services

Processes (such as Analysis Services) running in Windows 2000 Server or Windows Server 2003 Standard Edition can address a maximum of 2 gigabytes (GB) of RAM in the main process space. If you are working with large or complex cubes, Analysis Services may require more than 2 GB to load dimensions into memory, process dimensions, load replica dimensions, and still have sufficient memory for an effective query results cache. To allow Analysis Services to address more than 2 GB of RAM in a single process, you must install Windows 2000 Advanced Server, Windows 2000 Datacenter Server, Windows Server 2003 Enterprise Edition, or Windows Server 2003 Datacenter Edition.

Windows Server 2003 Enterprise Edition and Windows Server 2003 Datacenter Edition are available in 32-bit and 64-bit versions; the 64-bit versions support the 64-bit version of Analysis Services. Because Windows 2000 Advanced Server and Windows 2000 Datacenter Server are 32-bit operating systems, only the 32-bit version of Analysis Services can be installed on them. The 64-bit version of Analysis Services can address all available memory in the main process space without any special configuration (up to 64 GB with Enterprise Edition and up to 512 GB with Datacenter Edition).

The 32-bit version of Analysis Services can address up to 3 GB of memory in the main process space if you enable Application Memory Tuning; otherwise, no process can address more than 2 GB in the main process space. To enable Application Memory Tuning on the Analysis Services computer, set the /3GB switch in the boot.ini file and then use Analysis Manager to set an appropriate Memory conservation threshold value for Analysis Services. If you set the /3GB switch in boot.ini, the computer on which Analysis Services is running should have at least 4 GB of memory to ensure that the Windows operating system has sufficient memory for system services.
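For illustration, a boot.ini entry with the /3GB switch might look like the following (this is a config fragment only; the partition numbers and edition text vary by machine):

```
[operating systems]
multi(0)disk(0)rdisk(0)partition(1)\WINDOWS="Windows Server 2003, Enterprise" /fastdetect /3GB
```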
If you are running other applications on the same computer, you must factor in their memory requirements as well. For example, if the SQL Server service and Analysis Services are installed on the same computer, SQL Server can address memory above 4 GB because SQL Server supports Address Windowing Extensions (AWE). In this case, you could install and use 8 GB or more on the server. However, because Analysis Services does not support AWE, Analysis Services cannot access more than 3 GB of memory in the main process space unless the 64-bit version is used. Having sufficient memory for Analysis Services increases query responsiveness and processing performance. Properly configuring available memory will maximize the use of memory, limit the use of disk resources for processing, and prevent the cleaner thread from evicting cache entries too quickly. The amount of memory used by Analysis Services for various purposes is regulated by a number of memory settings:

  • High and low memory settings
  • Very Large Dimension Memory (VLDM) threshold setting
  • Process buffer settings

These settings are configured using default values or based on the amount of physical memory in the computer during installation. Changing some of these memory settings is generally recommended.

High and Low Memory Settings
Analysis Services employs a number of mechanisms to keep the amount of memory allocated to it within the range that is defined by two settings on the Environment tab of the Server Properties dialog box in Analysis Manager: the Memory conservation threshold and the Minimum allocated memory settings (the HighMemoryLimit and the LowMemoryLimit values in the registry). The default value for the Memory conservation threshold setting is the amount of physical memory on the computer at the time of installation. The default value for the Minimum allocated memory setting is half the amount of physical memory on the computer at the time of installation. If you change the amount of memory on the computer after installation, you must manually modify these values. Otherwise, Analysis Services will not properly utilize the actual amount of physical memory on the computer.
When the amount of memory used by Analysis Services exceeds the Memory conservation threshold setting, Analysis Services increases the priority of the cleaner thread to normal in order to quickly reduce the allocated memory to the Minimum allocated memory setting. If the total memory allocated to all Analysis Services tasks exceeds the memory conservation threshold by more than approximately 6.25 percent, Analysis Services immediately begins dropping the cache entries for entire cubes in order to quickly reduce the amount of memory used by Analysis Services. In this scenario, because Analysis Services is shedding memory extremely quickly, the total amount of allocated memory may drop below the Minimum allocated memory setting. If you set the Minimum allocated memory setting too low, the cleaner thread removes too many cached entries from the query results cache.
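As a worked illustration of these thresholds, assume a server with 8 GB of physical memory and the default settings (the numbers are hypothetical):

```
Memory conservation threshold (default)  = 8 GB  (physical memory at install time)
Minimum allocated memory (default)       = 4 GB  (half of physical memory)
Cleaner thread priority raised when      : allocated memory > 8 GB
Immediate cube-cache dropping begins when: allocated memory > 8 GB x 1.0625 = 8.5 GB
```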
