Wednesday, February 25, 2009

SQL Server 2005 Performance Dashboard

SQL Server Performance Dashboard is a handy set of reports that I would highly recommend to anyone who wants to dig a little deeper into SQL Server. It requires SQL Server 2005 SP2 or later.

You can download the Performance Dashboard here:
http://www.microsoft.com/downloads/details.aspx?FamilyId=1d3a4a0d-7e0c-4730-8204-e419218c1efc&displaylang=en

Here are a couple of screenshots showing the installation process:

[screenshots: installation wizard steps]

By default, the dashboard report files install to
c:\program files\Microsoft SQL Server\90\Tools\PerformanceDashboard\



To complete the setup process, locate the setup.sql script in the PerformanceDashboard folder and run it against the server you wish to monitor.
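If you prefer the command line, the setup script can be run with sqlcmd. The server name below is a placeholder; substitute your own server\instance:

```shell
REM Run the dashboard setup script against the instance you want to monitor
REM (-E uses Windows authentication; replace MYSERVER with your server\instance)
sqlcmd -S MYSERVER -E -i "c:\program files\Microsoft SQL Server\90\Tools\PerformanceDashboard\setup.sql"
```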



To open the dashboard in Management Studio, right-click on the server, go to Reports and then Custom Reports.
The best place to start is performance_dashboard_main.rdl. From this report you can navigate to all of the others.



So that's it for the installation - there are some really good reports to browse straight out of the box.
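Under the covers, the dashboard reports are built on the dynamic management views (DMVs) introduced in SQL Server 2005, so you can always query the same data directly. For example, the wait statistics shown in the reports ultimately come from a DMV like this:

```sql
-- Top 10 wait types on the instance: the raw data behind the dashboard's wait reports
SELECT TOP 10 wait_type, waiting_tasks_count, wait_time_ms
FROM sys.dm_os_wait_stats
ORDER BY wait_time_ms DESC;
```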
In my next post I will write about some of the more interesting metrics available through the performance dashboard.

Saturday, February 14, 2009

Using AMO to interface with SQL Server Analysis Services (SSAS)

In SQL Server 2000, working programmatically with SSAS was a real challenge. Performing a simple operation such as cube processing was a non-trivial process. One of the systems I have been working on recently was using the following technique to process a cube:

1. Client application calls a web service

2. The web service calls stored procedure

3. The stored procedure uses sp_OACreate to invoke a DTS package

4. The DTS package uses an Analysis Services Processing Task to process the desired cube

Needless to say, this chain of execution is more complicated than we would like, and there are many problems that can be encountered using this technique. For example, sp_OACreate has been known to cause memory fragmentation problems in the mem-to-leave area of SQL Server. This issue was resolved in SQL 2000 SP4; however, I am still not convinced that the process of building up and tearing down COM objects is rock solid.

Wouldn’t it be nice to have an API straight into SSAS?

In SQL 2005 and 2008, we can use AMO (Analysis Management Objects)!

http://msdn.microsoft.com/en-us/library/ms345089(SQL.90).aspx

AMO is a straightforward, simple to use object model which allows you to do almost anything with SSAS programmatically. In fact, SQL Server Management Studio (SSMS) is nothing more than a well presented SMO/AMO client.

The point is that it is possible to build extremely powerful management applications by hooking into the AMO API. How do you get started with AMO?

Here is a simple starting point:

· Create a new Visual Studio 2008 Console Application

· Add a reference to Microsoft.AnalysisServices.dll (I am using version 9.0.242.0)

· Add a using directive to the top of your code:

using Microsoft.AnalysisServices;

· Now you are ready to start using the AMO object model. Instantiate a server object and connect to SSAS by providing a connection string

Server s = new Server();

s.Connect("Data Source=localhost;Initial Catalog=FINANCIALS");

You can check the boolean value of s.IsConnected to ensure that you are ready to do some work, although I expect the Connect() method would throw an exception if it doesn’t find the designated server.

Once you have a reference to the server object, you can access other objects by traversing the hierarchy. The following two lines get a reference to a database by name and then process that database:

Database d = s.Databases.GetByName("MY_FINANCIALS");

d.Process();

So there you have it – programmatically processing an SSAS cube in four lines of code. We’ve come a long way since SQL 2000!
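Putting it all together, a minimal console application might look like the sketch below. The server and database names are placeholders carried over from the examples above; adjust them for your environment, and note that ProcessFull is just one of several ProcessType options.

```csharp
using System;
using Microsoft.AnalysisServices;

class CubeProcessor
{
    static void Main()
    {
        Server s = new Server();
        try
        {
            // Connect to the local SSAS instance
            s.Connect("Data Source=localhost");

            // Find the database by name and fully reprocess it
            // (this processes the cubes and dimensions it contains)
            Database d = s.Databases.GetByName("MY_FINANCIALS");
            d.Process(ProcessType.ProcessFull);

            Console.WriteLine("Processing complete.");
        }
        finally
        {
            if (s.Connected)
                s.Disconnect();
        }
    }
}
```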

Wednesday, January 21, 2009

Visual Studio 2008 Keyboard Shortcut Poster

This is a great reference poster which shows the most commonly used (and underused) keyboard shortcuts in VS 2008.
You can download it from Microsoft here:
http://www.microsoft.com/downloadS/thankyou.aspx?familyId=e5f902a8-5bb5-4cc6-907e-472809749973&displayLang=en


Monday, January 5, 2009

MCSD Upgrade Exams Expire March 31st

If you're in the same boat as me, then you probably use .NET almost every day but haven't upgraded your certifications from the 1.1 days.

It is worth pointing out that the upgrade exams expire at the end of March 2009.

http://www.microsoft.com/learning/mcp/upgrade/vs2005/default.mspx

Time to hit the books!

Friday, September 19, 2008

Microsoft Beta Exams

Recently I sat a beta exam for SQL Server 2008. The exam was
70-451 PRO: Designing Database Solutions and Data Access Using Microsoft SQL Server 2008

For anybody interested in sitting beta exams, they can be a little rough around the edges. Some of the questions can be ambiguous and this particular exam was quite long.

On the upside, they are free to sit and count in full if you pass. Unlike the regular exams, you will not receive your results until the exam goes mainstream.

Beta exam announcements are here:
http://blogs.technet.com/betaexams/

Ben

Monday, September 8, 2008

Writing Applications That Benefit From Database Mirroring

One of the most useful high-availability features introduced in SQL Server 2005 is without doubt database mirroring. Some of the features that stand out when you compare this technology with database clustering include:

  • Failover time can be sub-second depending on the configuration you are using. We are talking much faster than a cluster failover.
  • You have the option of using synchronous or asynchronous commits across the two servers:
    • When you use synchronous mirroring, control doesn't return to the client until a commit happens on both the principal and mirror databases
    • When you use asynchronous mirroring, SQL Server returns to the client in the usual fashion and commits on the mirror when it can. Of course, this can lead to some potential for data loss in the event of a failover
  • The buffer cache on the mirror server is kept "hot". That is to say, recently used pages on the principal server are also present in the buffer cache on the mirror server. The benefit here is that the redo time will be shortened if a failover occurs, and users are less likely to notice a performance hit.
  • You can use database snapshots on the mirror database for reporting purposes
  • You can implement mirroring with cheap hardware
  • It is easier to configure than log shipping
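The synchronous/asynchronous choice above corresponds to the mirroring session's SAFETY setting, which you change on the principal server (the database name here is illustrative):

```sql
-- High-safety (synchronous) mode: commits wait for the mirror
ALTER DATABASE MirrorTest SET PARTNER SAFETY FULL;

-- High-performance (asynchronous) mode: commits do not wait for the mirror
ALTER DATABASE MirrorTest SET PARTNER SAFETY OFF;
```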

So far I have seen quite a few implementations where customers are using asynchronous mirroring with manual failover. In my experience, using a witness server for automatic failover is yet to gain popularity.

Wiring up the Client

Perhaps the main drawback with mirroring when compared to clustering is that the client needs a little extra intelligence when it comes to reconnecting after a failover. The diagram below (from MSDN) shows how the client connects initially.

[diagram: initial client connection to the principal and mirror servers, from MSDN]

Take a look at the connection string below. You will notice that there is an extra bit which specifies the IP address and port of the Failover Partner.

"Server=148.182.197.186,1433; Failover Partner=148.182.197.186,1234; Database=MirrorTest; Network=dbmssocn; Integrated Security=SSPI";

The SQL Native Client or .NET Data Provider for SQL Server will initially try to connect to the server specified in the Server part of the connection string. If it can't connect, it will try the failover partner name. If the failover partner is in fact the current principal server, then the client recognises this and downloads the server/instance name of the current mirror server and stores it in memory, overwriting the value provided by the connection string.
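Rather than concatenating the string by hand, you can also build it with SqlConnectionStringBuilder, which exposes a FailoverPartner property. A small sketch using the same addresses as the example above:

```csharp
using System;
using System.Data.SqlClient;

class ConnectionStringDemo
{
    static void Main()
    {
        SqlConnectionStringBuilder csb = new SqlConnectionStringBuilder();
        csb.DataSource = "148.182.197.186,1433";      // current principal
        csb.FailoverPartner = "148.182.197.186,1234"; // mirror
        csb.InitialCatalog = "MirrorTest";
        csb.IntegratedSecurity = true;

        // Produces a connection string containing the Failover Partner clause
        Console.WriteLine(csb.ConnectionString);
    }
}
```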

Of course if the principal database is offline, and the mirror is yet to be promoted then the client will fail to connect. This situation is highly likely in the event that a witness server is not being used to facilitate automatic failover.


Client Behaviour during a Failover

If a client is connected when a failover takes place, the client unfortunately has to be smart enough to attempt to reconnect. This is a bit of a drawback for legacy applications, which might need a restart.

I wrote a few lines of C# to see how long a manual failover would knock my test client off the air for.

public string GetData()
{
    // conn and connstring are class-level members
    try
    {
        // Run a trivial query to prove the connection is alive
        DataSet ds = new DataSet();
        SqlDataAdapter da = new SqlDataAdapter("SELECT @@SERVERNAME", conn);
        da.Fill(ds);

        return DateTime.Now.ToString("hh:mm:ss") + ": " + ds.Tables[0].Rows[0][0].ToString();
    }
    catch (Exception)
    {
        // The query failed, so rebuild the connection from scratch
        string retVal = "Connection Failed. Retrying Connection...";

        conn = new SqlConnection();
        conn.ConnectionString = connstring;

        try
        {
            conn.Open();
        }
        catch (Exception)
        {
            retVal = "No Connection...";
        }
        return retVal;
    }
}

I just set up a timer that called this method every 100 milliseconds, and it outputs something like the following:

[console output: a timestamped @@SERVERNAME result on each tick]

Then I issued the following command on my principal server. (Note you have to be in the master database to issue this command):

ALTER DATABASE MIRRORTEST SET PARTNER FAILOVER

And voilà! My client reconnects to the new principal server within 100ms.
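To confirm that the roles really have swapped, you can query the mirroring catalog view on either server:

```sql
-- Shows each mirrored database's current role and state on this server
SELECT DB_NAME(database_id) AS database_name,
       mirroring_role_desc,
       mirroring_state_desc
FROM sys.database_mirroring
WHERE mirroring_guid IS NOT NULL;
```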


Database mirroring is an exciting technology, and I would encourage people to consider using it as an underlying high-availability platform when building applications.