
Archive for October, 2012

Yes, you can add multiple column-level CONSTRAINTs to the same column in a table.

All the CONSTRAINTs are checked on every INSERT and UPDATE operation, and all of them must be satisfied for the INSERT/UPDATE operation to succeed.

Following is an example (DBA.SampleTable) with a column (ID) that has 3 constraints.

  1. Default: DF_SampleTable_ID
  2. Foreign Key : To a column in table DBA.ForeignKeyTable: FK_SampleTable_ForeignKeyTable_ID
  3. Check: CK_SampleTable_ID
CREATE TABLE DBA.SampleTable
(
 ID INT NOT NULL CONSTRAINT DF_SampleTable_ID DEFAULT (1) -- First Constraint : DEFAULT
, Col2 VARCHAR(5) NOT NULL CONSTRAINT DF_SampleTable_Col2 DEFAULT 'AA'
)
GO

CREATE TABLE DBA.ForeignKeyTable
(
ID INT NOT NULL IDENTITY(1,1) CONSTRAINT PK_ForeignKeyTable_ID PRIMARY KEY CLUSTERED
, Col2 VARCHAR(5) NOT NULL CONSTRAINT DF_ForeignKeyTable_Col2 DEFAULT 'BB'
)
GO

-- Second Constraint: CHECK
ALTER TABLE DBA.SampleTable ADD CONSTRAINT CK_SampleTable_ID CHECK (ID < 5)

-- Third Constraint: Foreign Key
ALTER TABLE DBA.SampleTable ADD CONSTRAINT FK_SampleTable_ForeignKeyTable_ID FOREIGN KEY (ID) REFERENCES DBA.ForeignKeyTable(ID)
GO

Result:

  • The first INSERT into DBA.SampleTable throws an error. Since there are no records in DBA.ForeignKeyTable, the constraint ‘FK_SampleTable_ForeignKeyTable_ID’ does not allow any inserts. After inserting a record into DBA.ForeignKeyTable (with ID = 1), inserts into DBA.SampleTable succeed.
Multiple Constraints on a Single Column Foreign Key

  • But when I try to INSERT a record with ID = 6 into DBA.SampleTable, the CHECK constraint CK_SampleTable_ID throws an error. Even after a record with ID = 6 exists in the second table (DBA.ForeignKeyTable), no record with ID = 6 is allowed into the first table (DBA.SampleTable), as it conflicts with the CHECK constraint (ID < 5).
Multiple Constraints on a Single Column CHECK

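The behavior above can be reproduced with a few test INSERTs (a sketch against the tables defined earlier; the specific error numbers vary by constraint type):

```sql
-- Fails: DBA.ForeignKeyTable is empty, so the FOREIGN KEY constraint rejects ID = 1
INSERT INTO DBA.SampleTable (ID, Col2) VALUES (1, 'XX')
GO

-- Seed the parent table (the IDENTITY column generates ID = 1)
INSERT INTO DBA.ForeignKeyTable (Col2) VALUES ('YY')
GO

-- Succeeds now: the FK is satisfied (ID = 1 exists) and the CHECK passes (1 < 5)
INSERT INTO DBA.SampleTable (ID, Col2) VALUES (1, 'XX')
GO

-- Fails: CHECK constraint CK_SampleTable_ID rejects ID = 6 (must be < 5),
-- regardless of whether ID = 6 exists in DBA.ForeignKeyTable
INSERT INTO DBA.SampleTable (ID, Col2) VALUES (6, 'ZZ')
GO
```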
Hope this helps,

_SqlTimes


When you are executing long-running queries or a long batch, sometimes you want to send quick responses back to the client (i.e. Management Studio [SSMS], etc.) to show progress.

Usually I add PRINT statements at every block of statements to show the progress made from one block to the next. This gives me confirmation of the progress as it happens.

Sql Server buffers messages and sends them to the client at the end of the batch, but if you want them delivered at intermediate steps, use RAISERROR WITH NOWAIT to flush everything to the client immediately.

Example:

PRINT '-------------------------------------------------------------'
PRINT 'Starting Batch: ' + CONVERT(VARCHAR, GETDATE(), 109)

RAISERROR ('Start of Step 1', 10, 1) WITH NOWAIT
 --
 -- Step 1 Sql statements
 --
 WAITFOR DELAY '00:00:10'
 --
 -- More Step 1 Sql statements
 --
RAISERROR ('End of Step 1', 10, 1) WITH NOWAIT
 --
 -- Step 2 Sql statements
 --
 WAITFOR DELAY '00:00:10'
 --
 -- More Step 2 Sql statements
 --
RAISERROR ('End of Step 2', 10, 1) WITH NOWAIT
PRINT 'End of Batch: ' + CONVERT(VARCHAR, GETDATE(), 109)
PRINT '-------------------------------------------------------------'

RAISERROR with NOWAIT allows you to send messages to the client immediately. Refer to the MSDN article here.

Hope this helps,
_SqlTimes

Hat Tip: I learnt this from SqlTeam blog


“Performance Studio” is a new tool available natively starting with Sql Server 2008.

In the Sql Server 2005 days, we had a tool (plugin) called “Performance Dashboard” (SP2 and later only) that relied on DMVs and DMFs to show the performance statistics of Sql Server in near real time. Its reports would query the DMVs in real time and display the data in nice graphs and tables. It was quite useful, but one feature I noticed missing and very badly wanted was data retention. It does not retain data for trend analysis; it focuses only on how the server is performing right now. If you want to compare today’s statistics with yesterday’s or last month’s, you are out of luck. There is no built-in way to do that (or I did not know of one). But overall, ‘Performance Dashboard’ is a great tool. If anyone needs it for their Sql Server 2005 SP2 environment, it is available for download here.

Now Microsoft has made the next version much better. It’s not a plugin anymore; it’s part of the product itself, and it can monitor multiple servers at the same time (and the graphs are much prettier too). All data is stored in a dedicated database called the Management Data Warehouse (MDW), a relational database with three schemas: core, snapshots and custom_snapshots.

‘Performance Studio’ is a framework that ties together collection, analysis, troubleshooting and persistence of Sql Server diagnostics information. It has tools for:

  • Low-overhead data collection
  • Performance monitoring, troubleshooting and tuning
  • Persistence of diagnostics data
  • Reporting

In simple terms, this tool allows you to capture data from PerfMon counters, DMVs, and other T-SQL sources (i.e. top 10 queries by CPU, disk, etc.) and store it in one central location. The components that gather these metrics (BufferCacheHitRatio, Top 10 Queries, etc.) are called ‘Collection Sets’. Each collection set groups a bunch of such counters into one; all of them are captured and stored together. The central location where all the data is stored is called the Management Data Warehouse (MDW).

To enable this collection, go to ‘Management’ >> ‘Data Collection’ >> right click >> ‘Configure Management Data Warehouse’. First you want to select ‘Create or upgrade a Management Data Warehouse’, then follow the wizard. It’s fairly intuitive.

Performance Studio Setup

Again go to ‘Management’ >> ‘Data Collection’ >> right click >> ‘Configure Management Data Warehouse’, and this time select ‘Setup Data Collection’. This is where you set up the source server that you want to monitor. Cache is a temporary location on disk where transient data is stored until it is uploaded into the MDW database.

Collection sets provided by default are:

  1. Disk Usage
  2. Query Statistics
  3. Server Activity

Disk Usage Collection Set: Monitors disk usage for each database

  • Collects and uploads every 6 hours
  • Retains data for 90 days
  • Stores the collected data in these tables: ‘Snapshots.disk_usage’ & ‘Snapshots.log_usage’
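Once a collection has been uploaded, you can query these tables directly in the warehouse database (a sketch; ‘MDW’ here stands for whatever name you gave the Management Data Warehouse database during setup):

```sql
USE MDW  -- the Management Data Warehouse database created during setup
GO

-- Recent disk and log usage snapshots collected by the Disk Usage set
SELECT TOP (10) * FROM snapshots.disk_usage
SELECT TOP (10) * FROM snapshots.log_usage
GO
```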

Query Statistics Collection Set: Records interesting queries

  • Collects and uploads every 15 minutes
  • Gathers data for the top 3 interesting queries based on highest values for max_worker_time, max_physical_reads, max_logical_reads and many more factors [refer to the Webcast below for more info].
  • Gathered data is loaded into the ‘snapshots.query_stats’, ‘snapshots.notable_query_text’ and ‘snapshots.notable_query_plan’ tables

Server Activity Collection Set: Tracks wait states, memory and performance counters.

  • Collects data every 10/60 seconds and uploads that data to the MDW database every 15 minutes.
  • Retained for 14 days
  • Gathers data from:
    • Active Sessions and Requests (every 10 seconds)
    • Memory (every 60 seconds)
    • Performance Counters (every 60 seconds)
    • Wait Statistics (every 60 seconds)
    • Schedulers (every 60 seconds)

More points to note:

  • You can create your own “Collection Sets” to collect custom data.
  • You can have just one “Data Collector” to collect data from several servers.
  • Set up this collector (and MDW database) on a dedicated server or a server separate from the servers you are trying to monitor.
  • Plan for disk usage to grow at approximately 250 – 350 MB per day.
  • Collects only from Sql Server 2008 and beyond (does not collect data from Sql Server 2005 or 2000).
  • Cache location is a directory configured during set up. This is where all the data is stored until it is time to upload the data into MDW database.
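You can see the collection sets configured on an instance by querying the data-collector catalog views in msdb (a sketch; these views ship with Sql Server 2008, but verify the column names on your build):

```sql
-- List the collection sets and whether each one is currently running
SELECT collection_set_id
     , name
     , is_running
FROM msdb.dbo.syscollector_collection_sets
GO
```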

Reports:

To open reports, you want to go to ‘Management’ >> ‘Data Collection’ >> right click >>’Reports’ >> ‘Management Data Warehouse’ >> select a category.

Open Performance Studio Reports

A Sample report:

Performance Studio Sample Report

I would encourage everyone to watch this Technet webcast on this new feature.

https://msevents.microsoft.com/cui/WebCastEventDetails.aspx?culture=en-US&EventID=1032349947&CountryCode=US

Hope this helps,

_SqlTimes


A few days ago, I was setting up Change Data Capture (CDC) on a new database on my machine and I ran into this error.

Msg 22830, Level 16, State 1, Procedure sp_cdc_enable_db_internal, Line 186
Could not update the metadata that indicates database 'MyDB' is enabled for Change Data Capture.
The failure occurred when executing the command 'SetCDCTracked(Value = 1)'.
The error returned was 15404: 'Could not obtain information about Windows NT group/user 'AD\Sql1', error code 0x6e.'.
Use the action and error to determine the cause of the failure and resubmit the request.

Msg 266, Level 16, State 2, Procedure sp_cdc_enable_db_internal, Line 0
Transaction count after EXECUTE indicates a mismatching number of BEGIN and COMMIT statements.
Previous count = 0, current count = 1.

Msg 266, Level 16, State 2, Procedure sp_cdc_enable_db, Line 0
Transaction count after EXECUTE indicates a mismatching number of BEGIN and COMMIT statements. Previous count = 0, current count = 1.

Msg 3998, Level 16, State 1, Line 1
Uncommittable transaction is detected at the end of the batch. The transaction is rolled back.

The database ‘MyDB’ was created using the domain account ‘AD\Sql1’, and I was setting up CDC using the same domain account (Windows Authentication). But for some reason CDC could not be enabled on this database. If you check the properties of the database, you can see that its owner is still ‘AD\Sql1’.

I am not sure why it is unable to obtain information about this domain account, but once I changed the owner from the domain account to ‘sa’, it works!

EXEC sys.sp_changedbowner 'sa'

Then try this and it works:

EXECUTE sys.sp_cdc_enable_db
Enable_CDC_on_DB

I need to figure out why this error happens and why changing the owner to ‘sa’ works.
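To verify both the ownership change and the CDC flag, you can check sys.databases (standard catalog views; the is_cdc_enabled column is available starting with Sql Server 2008):

```sql
-- Confirm the new owner and that CDC is now enabled for the database
SELECT name
     , SUSER_SNAME(owner_sid) AS owner_name
     , is_cdc_enabled
FROM sys.databases
WHERE name = 'MyDB'
GO
```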

Hope this helps,

_SqlTimes


This is an interesting error that I ran into recently. After some research and gathering useful information, I stumbled upon this link from one of the Microsoft folks. It helped me appreciate the error message a bit more.

Login failures are common. We see them every day (hopefully not on production machines). But the error message is very generic. From an initial glance, all you can gather is that the login failed (Eureka!! I already knew that). But this article gives insight into why this seemingly simple (or obscure) message is that way by design. The key here is the State.

Let’s look at a sample error message from Management Studio (SSMS):

TITLE: Connect to Server
------------------------------
Cannot connect to xxx.xxx.xxx.xxx. (IP Address of Server)
------------------------------
ADDITIONAL INFORMATION:
Login failed for user ''. (Microsoft SQL Server, Error: 18456)
For help, click: http://go.microsoft.com/fwlink?ProdName=Microsoft+SQL+Server&EvtSrc=MSSQLServer&EvtID=18456&LinkId=20476
------------------------------
BUTTONS:
OK
------------------------------

As you can see, the first part states the obvious, albeit important: the name of the server you are trying to connect to. The “ADDITIONAL INFORMATION:” section gives the meat of the error message: it’s a “Login Failure” with error number 18456. Note the time when this error occurred (it is not displayed in the error message, but you will have an approximate idea of when it happened).

Now let’s find the real reason why the login attempt failed.

Go to the server and look at “Sql Server Error Logs”. Open the most recent error log and scroll down to the approximate time when the error occurred. You see something like this.

Open Sql Server Error Logs

Drill down to the Sql Server Error Logs

2012-10-09 11:14:25.21 Logon Error: 18456, Severity: 14, State: 8.

2012-10-09 11:14:25.21 Logon Login failed for user '<user name>'. Reason: Password did not match that for the login provided. [CLIENT: xxx.xxx.xxx.xxx]

This server error log provides the missing link to correctly understand what went wrong during login. Mostly it is “password mismatch”, “login disabled and password mismatch” or “invalid userid”. This is identified by the State in the error message.

Use the following table to map each state to its description.

ERROR STATE    ERROR DESCRIPTION
2 and 5        Invalid userid
6              Attempt to use a Windows login name with SQL Authentication
7              Login disabled and password mismatch
8              Password mismatch
9              Invalid password
11 and 12      Valid login but server access failure
13             SQL Server service paused
18             Change password required
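Instead of scrolling through the log in SSMS, you can also search it with xp_readerrorlog (undocumented, so treat this as a sketch; the parameters shown are log file number, 0 for the current log; log type, 1 for the Sql Server log; and a search string):

```sql
-- Find recent 18456 entries in the current Sql Server error log
EXEC master.dbo.xp_readerrorlog 0, 1, N'18456'
GO
```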

The State value in the error message from Management Studio (SSMS) always says 1; this is by design, to keep the message obscure and secure. The puzzle is solved only by looking at the detailed description in the server’s error log.

Thank you Il-Sung Lee (Program Manager, SQL Server Protocols) for your blog.

Update on December 04, 2012: For more Error States, please refer to this blog here. It saved me several times, recently, in figuring out what was going on.

Hope it helps,

_SqlTimes


This is a quick post to show some details on this undocumented stored procedure used to delete older backup files. The parameters for this stored procedure are not documented in BOL or readily available, maybe for a good reason. But I wanted to add some details to shed some light.


EXECUTE master.dbo.xp_delete_file
   0
 , N'S:\SqlMaintFolder\'
 , N'bak'
 , N'2012-10-10T01:05:34'
 , 1

Some points to note:

  1. The first parameter is ‘FileTypeSelected’; it specifies what kind of files to delete: 0 means backup files, 1 means report files.
  2. The second parameter is the path where the files are located. Make sure there is a trailing ‘\’.
  3. The third parameter is the file extension. Make sure there is no dot in the extension (use ‘bak’, not ‘.bak’).
  4. The fourth parameter is the cutoff: all files older than this date and time are deleted.
  5. The fifth parameter indicates whether to delete files from sub-folders as well.
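Putting those parameters together, a typical cleanup call might look like this (a sketch; the folder path is hypothetical, and since xp_delete_file is undocumented its behavior may vary between versions):

```sql
-- Delete backup (.bak) files older than 7 days, including sub-folders
DECLARE @CutoffDate VARCHAR(30)
SET @CutoffDate = CONVERT(VARCHAR(30), DATEADD(DAY, -7, GETDATE()), 126)

EXECUTE master.dbo.xp_delete_file
   0                        -- FileTypeSelected: 0 = backup files
 , N'S:\SqlMaintFolder\'    -- folder path (note the trailing '\')
 , N'bak'                   -- extension, without the dot
 , @CutoffDate              -- delete files older than this date/time
 , 1                        -- 1 = include sub-folders
GO
```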

Hope it helps,

_SqlTimes
