
Archive for March, 2014

EXCEPT is a great operator. More importantly, it is a set operator, so by design it is a bit more efficient. It belongs to the family of set operators:

  • INTERSECT
  • UNION
  • EXCEPT
  • UNION ALL

EXCEPT

EXCEPT does two things:

  • Returns values from the left query (or data set) that are not in the right query (or data set)
  • Returns distinct values (like UNION does)

It checks the data from both data sets and returns the distinct values from the left side that do not exist in the right side data set. This is a great operator to use when cleaning or comparing data to perform clean-up operations, i.e. de-duplication or data sanitization. See the Venn diagram below:

Except Operator in Venn Diagram

This is great for several reasons, but the best ones are based on its prerequisites:

  • Data types of corresponding columns must match
  • The number and order of columns must match
  • Not a prerequisite, but worth noting: EXCEPT can compare NULL values as if they were regular literals (two NULLs are treated as equal), unlike regular comparison predicates

It automatically verifies that the data types match, and you can pick and choose the data set you generate to compare.
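To illustrate the NULL behavior, here is a minimal sketch (a hypothetical one-row comparison, separate from the example that follows):

```sql
--
-- EXCEPT treats two NULLs as equal, so this returns zero rows
--
SELECT CAST(NULL AS INT) AS Val
EXCEPT
SELECT CAST(NULL AS INT) AS Val
GO
```

A regular equality predicate (WHERE a = b) never matches NULL to NULL; EXCEPT uses distinctness instead, which is why it can compare NULLs.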

Let’s take an example:

Step 1: Create tables

--
-- Create MainTable (left hand side table in comparison)
--
IF OBJECT_ID('dbo.MainTable') IS NOT NULL
    DROP TABLE dbo.MainTable
GO

CREATE TABLE dbo.MainTable
(
   ID             INT           NOT NULL IDENTITY(1,1) PRIMARY KEY CLUSTERED
 , Name           VARCHAR(20)   NULL
 , Age            INT           NULL
 , Date_Of_Birth  DATETIME      NULL
)
GO

--
-- Create 'Compare' table. Right hand side table
--
IF OBJECT_ID('dbo.Compare') IS NOT NULL
    DROP TABLE dbo.Compare
GO

CREATE TABLE dbo.Compare
(
   ID             INT           NOT NULL IDENTITY(1,1) PRIMARY KEY CLUSTERED
 , Name           VARCHAR(20)   NULL
 , Age            INT           NULL
 , Date_Of_Birth  DATETIME      NULL
)
GO

Step 2: Load some records

--
-- Load data into both the tables
--
INSERT INTO dbo.MainTable (Name, Age, Date_Of_Birth)
VALUES
   ('Jon', 20, NULL)
 , ('Ben', 40, '2012-02-02')
 , ('Test', 45, '2013-05-05')

GO

INSERT INTO dbo.Compare (Name, Age, Date_Of_Birth)
VALUES
   ('Jon', 20, GETDATE())
 , ('Ben', 40, '2012-02-01')
 , ('Test', 45, '2013-05-05')
GO

Step 3: Let’s compare the data

Looking at the data, you can see that two records differ between the tables (ID 1 & 2). Note that in dbo.MainTable, the record with ID = 1 has a NULL value in the Date_Of_Birth column. So, let’s see how Sql Server performs the comparison.

--
-- Use EXCEPT operator to compare
--
SELECT  M.ID
      , M.Name
      , M.Age
      , M.Date_Of_Birth
FROM dbo.MainTable AS M
EXCEPT
SELECT  C.ID
      , C.Name
      , C.Age
      , C.Date_Of_Birth
FROM dbo.Compare AS C
GO

Step 4: Results

As you can see, the results match our expectations. EXCEPT identifies that both ID 1 & 2 differ from the data set on the right: ID 1 because its NULL Date_Of_Birth does not match the GETDATE() value in dbo.Compare, and ID 2 because the dates differ by a day. This is a great tool in many ways.

 

Except Operator Results

Hope this helps,
_Sqltimes

Read Full Post »

Quick one today:

Lately, the Table Value Constructor has become commonplace in everyday usage. It is easy, awesome, and intuitive!

Let’s look at an example with dbo.MainTable


CREATE TABLE dbo.MainTable
(
      ID            INT         NOT NULL IDENTITY(1,1) PRIMARY KEY CLUSTERED
    , Name          VARCHAR(20) NULL
    , Age           INT         NULL
    , Date_Of_Birth DATETIME    NULL
)
GO

 

In the past, the way I would load data into this table was with individual INSERT statements, like this:

--
-- Traditional Insert statements
--
INSERT INTO dbo.MainTable (Name, Age, Date_Of_Birth) VALUES ('Jon', 20, GETDATE())
INSERT INTO dbo.MainTable (Name, Age, Date_Of_Birth) VALUES ('Ben', 40, '2012-02-02')
INSERT INTO dbo.MainTable (Name, Age, Date_Of_Birth) VALUES ('Test', 45, '2013-05-05')
GO

 

Starting with Sql Server 2008, we have more intuitive options.

 

Approach 1: Simple

--
-- Using Table Value constructor - Example 1
--
INSERT INTO dbo.MainTable (Name, Age, Date_Of_Birth)
VALUES
   ('Jon', 20, GETDATE())
 , ('Ben', 40, '2012-02-02')
 , ('Test', 45, '2013-05-05')
GO

Approach 2: Hybrid

--
-- Hybrid example : Table Value Constructor
--
INSERT INTO dbo.MainTable (Name, Age, Date_Of_Birth)
VALUES
   (  'Jon'           , 20           , GETDATE()            )
 , (  (SELECT 'Jim')  , (SELECT 34)  , (SELECT GETDATE())   )
GO
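As a side note, the same constructor can also be used as a derived table in a FROM clause. A quick sketch (the alias T and its column names are my own):

```sql
--
-- Table Value Constructor as a derived table
--
SELECT T.Name, T.Age
FROM (VALUES ('Jon', 20)
           , ('Ben', 40)
           , ('Test', 45)
     ) AS T (Name, Age)
WHERE T.Age > 30
GO
```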

Hope this helps,
_Sqltimes

Read Full Post »

T-Sql has evolved much over the years and offers several ways to write code to accomplish a task. As database people, our goal is to choose the most optimal way to write code. There are several points a database professional takes into consideration when writing optimal code; below are some of those points:

IF EXISTS

Sometimes, in procedural code, we write code like:

SET STATISTICS IO ON
SET NOCOUNT ON

DECLARE @RecCount INT

SELECT @RecCount = COUNT(*)        -- check if there are any records
FROM dbo.SampleTable WHERE ID < 23

IF @RecCount > 0                  -- do something if records exists
BEGIN
    --- do something
END

Result

As you can see below, the number of logical reads is 6. For a larger table, this will be even larger. We need to minimize this.

Table 'SampleTable'. Scan count 1, logical reads 6, physical reads 0, read-ahead reads 0, lob logical reads 0, lob physical reads 0, lob read-ahead reads 0.

(1 row(s) affected)

In this case, we are checking to see if there are any qualifying records before running business logic on them. This is a common approach and it could be improved.

Instead of checking for the “existence of records” in its entirety, there is a way to arrive at a decision sooner and with fewer resources. IF EXISTS just looks for a TRUE or FALSE value; if there are records it is TRUE, if not FALSE. Even if there is only one record, the return value from IF EXISTS is TRUE, so the query does not have to keep scanning for more records once it identifies even one qualifying record.

This code could be re-written in two ways:

Approach 1:

IF EXISTS ( SELECT 1 FROM dbo.SampleTable WHERE ID < 23)
BEGIN
 --- do something
END

Result:

As you can see below, the number of logical reads is reduced to 2.

Table 'SampleTable'. Scan count 1, logical reads 2, physical reads 0, read-ahead reads 0, lob logical reads 0, lob physical reads 0, lob read-ahead reads 0.

Approach 2:

Another way of writing the same code is using IF statement:

IF ( SELECT COUNT(*) FROM dbo.SampleTable WHERE ID < 23) > 0
BEGIN
 --- do something
END

Result: As you can see below, the number of logical reads is reduced to 2.

Table 'SampleTable'. Scan count 1, logical reads 2, physical reads 0, read-ahead reads 0, lob logical reads 0, lob physical reads 0, lob read-ahead reads 0.

Both of the above statements are optimal. In Approach 2, even though we used COUNT(*) inside the IF statement, Sql Server is smart enough to see that we are only checking for a boolean condition, so it stops scanning once at least one qualifying record is identified.

Question

What is the result of this query below? Please add your answers in comments below.

IF EXISTS (SELECT NULL)
BEGIN
    PRINT 'TRUE'
END
ELSE
BEGIN
    PRINT 'FALSE'
END

Hope this helps,
_Sqltimes

Read Full Post »

Another quick one today:

Sometimes, when we take a new application environment (with its database servers) into our scope of responsibility, there is a list of “sanity” checks we run on it to be sure things are configured correctly. It is a big list, but some of the items are listed here:

  • Appropriate backup jobs
  • Maintenance Jobs
  • System documentation
  • Optimal storage architecture configuration
  • Appropriate ownership of the database
  • Permissions
  • Health Checks
  • Monitors
  • …etc.

In the past, some blog posts covered a few of these. In today’s post, we’ll cover database ownership.

This is a common mistake that happens on my part sometimes: when a database is created or restored, Sql Server by default sets the owner of the database to my login name. So, after letting it run like that for a few days, I realize my mistake and then change the owner to ‘sa’ or some other dedicated account using this script.

USE [database name]
GO

EXEC sp_changedbowner 'sa'
GO
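Worth noting: sp_changedbowner is marked deprecated in the Sql Server documentation; the recommended replacement is ALTER AUTHORIZATION, which accomplishes the same thing:

```sql
--
-- Modern equivalent of sp_changedbowner 'sa'
--
ALTER AUTHORIZATION ON DATABASE::[database name] TO [sa]
GO
```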

Hope this helps,
_Sqltimes

Read Full Post »

Another quick one today:

Rare, but sometimes the need to remove data files from existing databases does arise. Data files (MDFs & NDFs) are not like other files on the filesystem that you can just delete; they are controlled by Sql Server, and these operations need to be executed from inside Sql Server.

Example: Recently on one of our production databases, we added new LUNs. After adding them, our operations people created dummy data files to test the “success” of the new drives by adding data files to an existing production database. I know… operations people do interesting things sometimes.

As these files were created without following the standards we have set for data files, we needed to delete them. But it had been a few days since they were added, so Sql Server had already started using them [they were added to existing filegroups].

So, this is what I did.

First, I ran the SHRINKFILE command on the dummy file to see how much of the new file actually had data in it (how full the file was).

-- script to check how full the file is
DBCC SHRINKFILE ('DBADB_Dummy_File_3')
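Another way to check how full the file is, without touching it with SHRINKFILE, is the FILEPROPERTY function. A sketch (sizes are reported in 8 KB pages, converted to MB here):

```sql
--
-- Allocated vs. used space for the dummy file
--
SELECT  name
      , size / 128                              AS Size_MB
      , FILEPROPERTY(name, 'SpaceUsed') / 128   AS Used_MB
FROM sys.database_files
WHERE name = 'DBADB_Dummy_File_3'
GO
```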

Once I saw that only a small number of pages actually had any data in them, it seemed emptying the file would not take too much time. The following command moves the data to other files in the same FILEGROUP.

-- script to empty the contents of the data file
DBCC SHRINKFILE ('DBADB_Dummy_File_3', EMPTYFILE)

Once successfully completed, run this command to remove the file from database and OS.

-- script to remove data file
ALTER DATABASE DBADB
REMOVE FILE DBADB_Dummy_File_3
GO

Hope this helps,

_Sqltimes

 

Read Full Post »

Another quick one today:

A few days ago, there was a need to restore a LiteSpeed database backup file into our lab environment. Since we do not use LiteSpeed in our environment, we were scrambling to figure out a way. Upon some light reading, it became apparent that there is a command-line utility to extract a native Sql Server backup file from a LiteSpeed backup file.


extractor.exe -F S:\MSSQL\Backups\PLAY\DBADB_LiteSpeed.bak -E S:\MSSQL\Backups\PLAY\DBADB_Native.bak

The LiteSpeed backup file is about 50 GB; after extraction it produced about 30 files of 4 GB each. So LiteSpeed compressed roughly 120 GB of native backup into less than 50 GB.
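Before running the restore, it can help to confirm the logical file names that the MOVE clauses need. RESTORE FILELISTONLY does that against the same set of extracted files (sketch shown with the first two files; the full list of files applies):

```sql
--
-- Inspect logical file names inside the extracted native backup
--
RESTORE FILELISTONLY
FROM DISK = N'S:\MSSQL\Backups\PLAY\DBADB_Native.bak0'
   , DISK = N'S:\MSSQL\Backups\PLAY\DBADB_Native.bak1'
   -- ...remaining DBADB_Native.bak2 through .bak30 files listed the same way
GO
```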

Using the RESTORE command below, we were able to restore this database.


RESTORE DATABASE DBADB
 FROM DISK = N'S:\MSSQL\Backups\PLAY\DBADB_Native.bak0'
 , DISK = N'S:\MSSQL\Backups\PLAY\DBADB_Native.bak1'
 , DISK = N'S:\MSSQL\Backups\PLAY\DBADB_Native.bak2'
 , DISK = N'S:\MSSQL\Backups\PLAY\DBADB_Native.bak3'
 , DISK = N'S:\MSSQL\Backups\PLAY\DBADB_Native.bak4'
 , DISK = N'S:\MSSQL\Backups\PLAY\DBADB_Native.bak5'
 , DISK = N'S:\MSSQL\Backups\PLAY\DBADB_Native.bak6'
 , DISK = N'S:\MSSQL\Backups\PLAY\DBADB_Native.bak7'
 , DISK = N'S:\MSSQL\Backups\PLAY\DBADB_Native.bak8'
 , DISK = N'S:\MSSQL\Backups\PLAY\DBADB_Native.bak9'
 , DISK = N'S:\MSSQL\Backups\PLAY\DBADB_Native.bak10'
 , DISK = N'S:\MSSQL\Backups\PLAY\DBADB_Native.bak11'
 , DISK = N'S:\MSSQL\Backups\PLAY\DBADB_Native.bak12'
 , DISK = N'S:\MSSQL\Backups\PLAY\DBADB_Native.bak13'
 , DISK = N'S:\MSSQL\Backups\PLAY\DBADB_Native.bak14'
 , DISK = N'S:\MSSQL\Backups\PLAY\DBADB_Native.bak15'
 , DISK = N'S:\MSSQL\Backups\PLAY\DBADB_Native.bak16'
 , DISK = N'S:\MSSQL\Backups\PLAY\DBADB_Native.bak17'
 , DISK = N'S:\MSSQL\Backups\PLAY\DBADB_Native.bak18'
 , DISK = N'S:\MSSQL\Backups\PLAY\DBADB_Native.bak19'
 , DISK = N'S:\MSSQL\Backups\PLAY\DBADB_Native.bak20'
 , DISK = N'S:\MSSQL\Backups\PLAY\DBADB_Native.bak21'
 , DISK = N'S:\MSSQL\Backups\PLAY\DBADB_Native.bak22'
 , DISK = N'S:\MSSQL\Backups\PLAY\DBADB_Native.bak23'
 , DISK = N'S:\MSSQL\Backups\PLAY\DBADB_Native.bak24'
 , DISK = N'S:\MSSQL\Backups\PLAY\DBADB_Native.bak25'
 , DISK = N'S:\MSSQL\Backups\PLAY\DBADB_Native.bak26'
 , DISK = N'S:\MSSQL\Backups\PLAY\DBADB_Native.bak27'
 , DISK = N'S:\MSSQL\Backups\PLAY\DBADB_Native.bak28'
 , DISK = N'S:\MSSQL\Backups\PLAY\DBADB_Native.bak29'
 , DISK = N'S:\MSSQL\Backups\PLAY\DBADB_Native.bak30'
 WITH MOVE 'DBADB' TO 'H:\MSSQL\Data\MSSQLSERVER\DBADB.MDF'
 , MOVE 'DBADB_Data1' TO 'I:\MSSQL\Data\MSSQLSERVER\DBADB_Data1.NDF'
 , MOVE 'DBADB_Data2' TO 'I:\MSSQL\Data\MSSQLSERVER\DBADB_Data2.NDF'
 , MOVE 'DBADB_log' TO 'L:\MSSQL\Log\MSSQLSERVER\DBADB_log.LDF'
 GO

Hope this helps,
_Sqltimes

Read Full Post »

Quick one today:

Another one of those tasks that keep re-occurring every few days; It is time to blog about it.

Database snapshots are a great feature in Sql Server. They are easy to create and bring immense value. In our lab environment, we use them a lot, and for the right reasons.

Let’s imagine a database DBADB with 3 data files (MDFs & NDFs) and one log file (LDF). Before making any DML changes, a snapshot is created to preserve the state of the database. A snapshot is the easiest and best way to do this, since restoring from FULL backups takes a long time.

Snapshot databases capture changes that happen on the main database by preserving the old version of each data page upon change. This sounds simple, and it is, but the way Microsoft implements it is remarkable. You may want to read all about it here. When you run a DML operation, a set of data pages is changed/updated. Those pages, before the changes, are copied to the snapshot database; the changes are then made to the main database.

So, for every data file (MDF & NDF) on the main database, we need a data file (.SS) for the snapshot database. For example, let’s take a sample database called DBADB with 3 data files. Below is a sample script to create snapshot for DBADB.

CREATE DATABASE DBADB_Snapshot
ON
    (     NAME      =   DBADB_Data1
        , FILENAME	=   'H:\SqlData\Sandbox\DBADB_Data1.ss'
    ),
    (	  NAME      =   DBADB_Data2
        , FILENAME	=   'H:\SqlData\Sandbox\DBADB_Data2.ss'
    ),
    (     NAME      =   DBADB_Data3
        , FILENAME	=  'H:\SqlData\Sandbox\DBADB_Data3.ss'
    )
AS SNAPSHOT OF DBADB
GO

There are several points that you want to keep in mind, when you create snapshots.

  • One of them is space.
    • Snapshot files (.SS) are sparse files: they start out small but can grow up to the full size of the original data file (MDF or NDF) as pages change. Make sure you have enough space in case all the data pages in a data file are changed; Sql Server is being smart and safe here.
  • Only available in Enterprise edition.
  • All recovery models are supported
  • Must drop snapshot databases before dropping the main database. If not, you’ll run into an error like the one below:

Msg 3709, Level 16, State 2, Line 1
Cannot drop the database while the database snapshot “DBADB_Snapshot” refers to it. Drop that database first.
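Because the .SS files are sparse, the space they actually consume on disk is usually far less than their reported size. A sketch to check the real footprint, using sys.dm_io_virtual_file_stats (which exposes size_on_disk_bytes):

```sql
--
-- Actual on-disk footprint of the snapshot's sparse files
--
SELECT  mf.name                            AS Logical_Name
      , vfs.size_on_disk_bytes / 1048576   AS Size_On_Disk_MB
FROM sys.dm_io_virtual_file_stats(DB_ID('DBADB_Snapshot'), NULL) AS vfs
INNER JOIN sys.master_files AS mf
    ON  mf.database_id = vfs.database_id
    AND mf.file_id     = vfs.file_id
GO
```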

Now, when you need to return to the previous state, you just RESTORE from the snapshot database as shown below. This is much faster than restoring from a FULL backup.

RESTORE DATABASE DBADB
FROM DATABASE_SNAPSHOT = 'DBADB_Snapshot'
GO

Hope this helps,
_Sqltimes

Read Full Post »

Another quick one today:

All too often, we find ourselves pulled in to support a new system or help troubleshoot a new database environment. There is a series of checks I prefer to run when a new system is added to our team’s list of database environments.

Often, when I get pulled into troubleshooting a production situation, among other things I perform a series of checks. Depending on the nature of the problem, sometimes checking the ‘application login’ permission details in the database helps. It is possible that the application login is assigned to the appropriate roles, but as an application grows and evolves it goes through a lot of changes, and some of those changes result in seemingly minor permission aspects falling through the cracks. Some, but not all.

In these situations, I use the script below to get a detailed listing of the permissions this account has. A quick glance at it helps me rule out a few things. Here is the script:

-- * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * *
--
-- Purpose : List permissions on all the objects in the given database for the current user
-- Author : SqlTimes on Oct 28, 2009
--
-- * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * *

--
-- Start of the script
--

--
-- Run as 'user'
--

    --EXECUTE AS USER = 'kelly'
    --SELECT SYSTEM_USER, SESSION_USER

--
-- Variable declaration
--
    DECLARE @v_Sql VARCHAR(MAX)
    DECLARE @Tbl_Perms TABLE (entity_name VARCHAR(100) NULL, Type VARCHAR(50) NULL, permission_name VARCHAR(50) NULL, Permission_value INT NULL)

--
-- Generate Sql Query to get permissions on all the objects
--
    SELECT @v_Sql = ''

    SELECT @v_Sql = @v_Sql + 'SELECT entity_name, ''' + type_desc + ''' AS [Type], permission_name, 1 AS [Permission_value] FROM fn_my_permissions(''' + s.name + '.' + o.name + ''',''object'') WHERE subentity_name IS NULL OR subentity_name = '''' UNION '
    FROM sys.objects AS O
    INNER JOIN sys.schemas AS S
        ON O.schema_id = S.schema_id
    WHERE O.type IN ( 'U' -- USER_TABLE
                    , 'V' -- VIEW
                    , 'P' -- SQL_STORED_PROCEDURE
                    , 'FN' -- SQL_SCALAR_FUNCTION
                    , 'IF' -- SQL_INLINE_TABLE_VALUED_FUNCTION
                    , 'TF' -- SQL_TABLE_VALUED_FUNCTION
                    , 'TR' -- SQL_TRIGGER
                    , 'S' -- SYSTEM_TABLE
                    , 'SN' -- SYNONYM
                    )
    ORDER BY O.type ASC, O.name ASC

--
-- Add Server and Database permissions
--
    SELECT @v_Sql = @v_Sql + ' SELECT entity_name, ''Database'' AS [Type], permission_name, 1 AS [Permission_value] FROM fn_my_permissions(NULL, ''Database'') UNION SELECT entity_name, ''Server'' AS [Type], permission_name, 1 AS [Permission_value] FROM fn_my_permissions(NULL, ''Server'')'

    PRINT @v_Sql

--
-- Insert into a temp table
--
    INSERT INTO @Tbl_Perms
    EXEC (@v_Sql)

--
-- Pivot into a matrix
--
    ; WITH Perms AS
    (
        SELECT entity_name, [Type], [SELECT], [INSERT], [UPDATE], [DELETE], [EXECUTE], [ALTER], [VIEW DEFINITION], [CONTROL], [REFERENCES], [TAKE OWNERSHIP], [CONNECT SQL], [ALTER TRACE], [VIEW ANY DATABASE], [CONTROL SERVER], [CONNECT]
        FROM (SELECT entity_name, [Type], permission_name, Permission_Value FROM @Tbl_Perms) AS S
        PIVOT
        (
            MAX(S.Permission_Value)
            FOR S.Permission_name IN ([SELECT], [INSERT], [UPDATE], [DELETE], [EXECUTE], [ALTER], [VIEW DEFINITION], [CONTROL], [REFERENCES], [TAKE OWNERSHIP], [CONNECT SQL], [ALTER TRACE], [VIEW ANY DATABASE], [CONTROL SERVER], [CONNECT])
        ) AS PVT
    )

--
-- Returning it all together
--
    SELECT *
    FROM Perms
    ORDER BY [Type] ASC
    GO

--
-- Execute as Identified User above
--
    --REVERT
    --SELECT SYSTEM_USER, SESSION_USER

--
-- End of the script
--

Hope this helps,
_Sqltimes

Read Full Post »

Another quick one today:

A few days ago, I wrote a post that illustrates how to generate dummy data; in it there are a couple of DATETIME columns that hold random datetime values. These random values are based entirely on the RAND() function in Sql Server. Subtracting a random number of days from GETDATE() results in a reasonably random DATETIME value within roughly the last 3 years, which is useful for a good percentage of daily test scenarios, but not all.

Here it goes:


SELECT DATEADD(DAY, -1 * CEILING(RAND()*1000) , GETDATE())

Result:


-----------------------
2012-11-22 06:02:44.563

(1 row(s) affected)
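If the scenario also needs a random time of day, the same idea extends by subtracting a random number of seconds as well (a sketch; 86400 is the number of seconds in a day):

```sql
-- Random date within the last ~1000 days, plus a random time-of-day offset
SELECT DATEADD(SECOND, -1 * CEILING(RAND() * 86400)
            , DATEADD(DAY, -1 * CEILING(RAND() * 1000), GETDATE()))
GO
```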

Hope this helps,
_Sqltimes

Read Full Post »

Quick post today:

Many times in the life of a DBA, we find ourselves needing to quickly create a sample table with dummy data to test some theory. Creating this from scratch every time takes some time (not much, but still some); it is a constant, repeating need, and over the years it adds up to a lot. So, I’ve created a sample table with immediately loadable dummy data.

--
-- Start of Script : Sample_Table_With_Data.sql
--

--
-- Sample Table Definition
--

USE DBADB
GO

IF OBJECT_ID('dbo.SampleTable') IS NOT NULL
    DROP TABLE dbo.SampleTable
GO

CREATE TABLE dbo.SampleTable
(
      ID             INT                  NOT NULL     IDENTITY(1,1)    CONSTRAINT PK_SampleTable_ID    PRIMARY KEY
    , GUID_Col1      UNIQUEIDENTIFIER     NOT NULL                      CONSTRAINT DF_GUID_Col1       DEFAULT         NEWID()
    , GUID_Col2      UNIQUEIDENTIFIER     NOT NULL                      CONSTRAINT DF_GUID_Col2       DEFAULT         NEWID()
    , GUID_Col3      UNIQUEIDENTIFIER     NOT NULL                      CONSTRAINT DF_GUID_Col3       DEFAULT         NEWID()
    , SeqGUID_Col1   UNIQUEIDENTIFIER     NOT NULL                      CONSTRAINT DF_SqeGUID_Col1    DEFAULT         NEWSEQUENTIALID()
    , SeqGUID_Col2   UNIQUEIDENTIFIER     NOT NULL                      CONSTRAINT DF_SqeGUID_Col2    DEFAULT         NEWSEQUENTIALID()
    , SeqGUID_Col3   UNIQUEIDENTIFIER     NOT NULL                      CONSTRAINT DF_SqeGUID_Col3    DEFAULT         NEWSEQUENTIALID()
    , IntCol1        INT                  NOT NULL                      CONSTRAINT DF_IntCol1         DEFAULT         CEILING(RAND() * 10000)
    , IntCol2        INT                  NOT NULL                      CONSTRAINT DF_IntCol2         DEFAULT         CEILING(RAND() * 10000)
    , IntCol3        INT                  NOT NULL                      CONSTRAINT DF_IntCol3         DEFAULT         CEILING(RAND() * 10000)
    , CharCol1       VARCHAR(50)          NOT NULL                      CONSTRAINT DF_CharCol1        DEFAULT         REPLICATE( CHAR((CEILING(RAND()*10) + 65)),CEILING((RAND()*10)))
    , CharCol2       VARCHAR(50)          NOT NULL                      CONSTRAINT DF_CharCol2        DEFAULT         REPLICATE( CHAR((CEILING(RAND()*10) + 65)),CEILING((RAND()*10)))
    , CharCol3       VARCHAR(50)          NOT NULL                      CONSTRAINT DF_CharCol3        DEFAULT         REPLICATE( CHAR((CEILING(RAND()*10) + 65)),CEILING((RAND()*10)))
    , DateCol1       DATETIME             NOT NULL                      CONSTRAINT DF_DateCol1        DEFAULT         DATEADD(DAY, -1 * CEILING(RAND()*1000) , GETDATE())
    , DateCol2       DATETIME             NOT NULL                      CONSTRAINT DF_DateCol2        DEFAULT         DATEADD(DAY, -1 * CEILING(RAND()*1000) , GETDATE())
    , Created_On     DATETIME             NOT NULL                      CONSTRAINT DF_Created_On      DEFAULT         GETDATE()
    , Created_By     VARCHAR(50)          NOT NULL                      CONSTRAINT DF_Created_By      DEFAULT         SYSTEM_USER
)
GO

As you can see, this has a good variety of columns:

  • Integers
  • Variable length characters
  • Unique identifiers
  • Datetime

All of them are populated with random data via DEFAULT column constraints, even the DATETIME columns.

There are cases when I need the table to have some indexes and statistics. The scripts below add some of these:

CREATE INDEX IX_IntCol123_SampleTable         ON dbo.SampleTable(IntCol1, IntCol2, IntCol3)
CREATE INDEX IX_IntCol3_SampleTable           ON dbo.SampleTable(IntCol3)
CREATE INDEX IX_IntCol2_SampleTable           ON dbo.SampleTable(IntCol2)
CREATE INDEX IX_SqeGUID_Col1_SampleTable      ON dbo.SampleTable(SeqGUID_Col1)
CREATE INDEX IX_SqeGUID_Col2_SampleTable      ON dbo.SampleTable(SeqGUID_Col2)
CREATE INDEX IX_SqeGUID_Col3_SampleTable      ON dbo.SampleTable(SeqGUID_Col3)
GO

CREATE STATISTICS USX_GUID_Col1_SampleTable   ON dbo.SampleTable(GUID_Col1)
CREATE STATISTICS USX_GUID_Col2_SampleTable   ON dbo.SampleTable(GUID_Col2)
CREATE STATISTICS USX_GUID_Col3_SampleTable   ON dbo.SampleTable(GUID_Col3)
CREATE STATISTICS USX_CharCol3_SampleTable    ON dbo.SampleTable(CharCol3)
GO

Now it is time to load the data. The GO 2000 batch separator repeats the INSERT statement 2000 times, allowing us to load as many records as needed.

INSERT INTO dbo.SampleTable DEFAULT VALUES
GO 2000
SampleTable with data

Hope this helps,
_Sqltimes

Read Full Post »
