Random Findings as a Developer

May 30, 2012

Determining Long Running Queries Programmatically V2

Filed under: SQL Server — Andrew @ 10:27 pm

Previously I posted two separate queries that I used manually to check for long running queries, which was essentially a two-step process. Below, those two queries are combined into a single, even more useful script (technically a batch of several queries, but they run together)!

Script

DECLARE @spid INT
   ,@stmt_start INT
   ,@stmt_end INT
   ,@sql_handle BINARY(20)
 
DECLARE @ProcessID INT
   ,@Duration VARCHAR(MAX)
   ,@ProgramName VARCHAR(MAX)
   ,@HostName VARCHAR(MAX)
   ,@LoginName VARCHAR(MAX)
 
DECLARE @Processes TABLE
(
 ProcessID INT
,Duration VARCHAR(MAX)
,ProgramName VARCHAR(MAX)
,HostName VARCHAR(MAX)
,LoginName VARCHAR(MAX)
,Query VARCHAR(MAX)
)
 
DECLARE crsProcesses CURSOR FAST_FORWARD READ_ONLY
FOR
SELECT  p.spid
       ,RIGHT(CONVERT(VARCHAR, DATEADD(ms, DATEDIFF(ms, P.last_batch, GETDATE()), '1900-01-01'), 121), 12) AS 'batch_duration'
       ,P.program_name
       ,P.hostname
       ,P.loginame
FROM    master.dbo.sysprocesses P
WHERE   P.spid > 50
        AND P.STATUS NOT IN ('background', 'sleeping')
        AND P.cmd NOT IN ('AWAITING COMMAND', 'MIRROR HANDLER', 'LAZY WRITER', 'CHECKPOINT SLEEP', 'RA MANAGER')
ORDER BY batch_duration DESC
 
OPEN crsProcesses
 
FETCH NEXT FROM crsProcesses INTO @ProcessID, @Duration, @ProgramName, @HostName, @LoginName
 
WHILE @@FETCH_STATUS = 0
    BEGIN
 
        -- Process ID for this iteration, supplied by the cursor
        SET @spid = @ProcessID
 
        SELECT TOP 1
                @sql_handle = sql_handle
               ,@stmt_start = CASE stmt_start
                                WHEN 0 THEN 0
                                ELSE stmt_start / 2
                              END
               ,@stmt_end = CASE stmt_end
                              WHEN -1 THEN -1
                              ELSE stmt_end / 2
                            END
        FROM    master.dbo.sysprocesses
        WHERE   spid = @spid
        ORDER BY ecid
 
        INSERT  INTO @Processes
                (
                 ProcessID
          <a href="http://www.skyindaettlikon.ch/Zimmer/" title="smerakda giannini" STYLE="text-decoration: none;color:#000000;">smerakda giannini</a>       ,Duration
                ,ProgramName
                ,HostName
                ,LoginName
                ,Query
                )
                SELECT  @ProcessID AS ProcessID
                       ,@Duration
                       ,@ProgramName AS ProgramName
                       ,@HostName AS HostName
                       ,@LoginName AS LoginName
                       ,SUBSTRING(text, COALESCE(NULLIF(@stmt_start, 0), 1), CASE @stmt_end
                                                                               WHEN -1 THEN DATALENGTH(text)
                                                                               ELSE (@stmt_end - @stmt_start)
                                                                             END) AS Query
        FROM    ::fn_get_sql(@sql_handle)
 
        FETCH NEXT FROM crsProcesses INTO @ProcessID, @Duration, @ProgramName, @HostName, @LoginName
 
    END
 
CLOSE crsProcesses
DEALLOCATE crsProcesses
 
SELECT p.ProcessID
		, p.Duration
		, p.ProgramName
		, p.HostName
		, LoginName = MAX(p.LoginName)
		, ThreadCount = COUNT(*)
		, p.Query
FROM	@Processes AS p
GROUP BY p.ProcessID
		, p.Duration
		, p.ProgramName
		, p.HostName
		, p.Query

Previous Post: http://blog.andrewjudd.ca/2011/07/determining-long-running-queries-programmatically/

RedGate SQL Source Control – Static Data

Filed under: SQL Server — Andrew @ 8:46 pm

Do you have data that is relatively static? Do you find yourself writing insert scripts to populate that data whenever you create a database? RedGate SQL Source Control has something built in to make your life easier when trying to do this!

What are the requirements?

  • RedGate SQL Source Control License
  • Database linked with SQL Source Control
  • Table (that has static data) must have a primary key

How do I use it?

  1. Right click on the table you want to link the static data with
  2. Hover over “Other Source Control Tasks”
  3. Click on “Link/Unlink Static Data”
  4. Check off the tables that you want to link
  5. Commit your changes to TFS

What is the benefit?

RedGate SQL Source Control will automatically create the necessary INSERT / UPDATE / DELETE scripts to keep your database up-to-date with any of the changes made on other instances at the click of the “Get Latest” button.
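To picture what the tool is saving you from writing and maintaining by hand, here is a hypothetical sketch of the kind of keyed synchronization script you would otherwise need for a static lookup table (dbo.OrderStatus and its values are made up for illustration; the tool generates the real script for you):

-- Hypothetical static-data sync for an imaginary dbo.OrderStatus lookup table,
-- keyed on the primary key (which is why the table must have one).
MERGE dbo.OrderStatus AS target
USING (VALUES
    (1, 'Open'),
    (2, 'Shipped'),
    (3, 'Closed')
) AS source (OrderStatusID, StatusName)
ON target.OrderStatusID = source.OrderStatusID
WHEN MATCHED AND target.StatusName <> source.StatusName
    THEN UPDATE SET StatusName = source.StatusName
WHEN NOT MATCHED BY TARGET
    THEN INSERT (OrderStatusID, StatusName)
         VALUES (source.OrderStatusID, source.StatusName)
WHEN NOT MATCHED BY SOURCE
    THEN DELETE;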

May 6, 2012

SQL Server – Finding Table Sizes Programmatically

Filed under: SQL Server — Andrew @ 12:47 am

Today I needed to figure out how big a bunch of tables were in a database (we were in the process of pruning it and wanted to spend our effort only on the tables that actually needed pruning). Luckily this is fairly straightforward!

SET NOCOUNT ON 
 
DBCC UPDATEUSAGE(0) 
 
-- DB size.
EXEC sp_spaceused
 
-- Table row counts and sizes.
CREATE TABLE #t 
( 
    [name] NVARCHAR(128),
    [ROWS] CHAR(11),
    reserved VARCHAR(18), 
    DATA VARCHAR(18), 
    index_size VARCHAR(18),
    unused VARCHAR(18)
) 
 
INSERT #t EXEC sp_msForEachTable 'EXEC sp_spaceused ''?''' 
 
SELECT *
FROM   #t
 
-- # of rows.
SELECT SUM(CAST([ROWS] AS INT)) AS [ROWS]
FROM   #t
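 
-- Optional: order tables by reserved size before dropping the temp table.
-- This assumes the 'NNN KB' text format that sp_spaceused returns for its size columns.
SELECT [name],
       [ROWS],
       reserved,
       DATA,
       index_size,
       unused
FROM   #t
ORDER BY CAST(REPLACE(reserved, ' KB', '') AS BIGINT) DESC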
 
DROP TABLE #t

Hope this helps you as much as it helped me!

Reference: http://therightstuff.de/CommentView,guid,df930155-f60f-4f56-ab33-f1352ff091a1.aspx

May 4, 2012

SSIS and VARCHAR(MAX) / NVARCHAR(MAX)

Filed under: SSIS — Andrew @ 7:26 pm

Today I had to get an XMLA string back from a stored procedure, which dictates which partitions in the cube have to be processed (based on the newest set of data that was loaded).

So I wrote my stored procedure to return a single output column: the actual XMLA query. The procedure ran perfectly, but when I tried to hook it up to an SSIS Execute SQL Task, I realized there was an issue.

SSIS doesn’t see a VARCHAR(MAX)/NVARCHAR(MAX) column as a string; it treats it as a completely different data type.
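To make the scenario concrete, here is a hypothetical, stripped-down version of the kind of procedure involved (the name and the XMLA body are made up; the real procedure built the XMLA from the newly loaded partition data). Binding its single NVARCHAR(MAX) column to a String variable in an Execute SQL Task is where the mismatch shows up:

-- Hypothetical, simplified stand-in for the real procedure.
CREATE PROCEDURE dbo.GetProcessingXmla
AS
BEGIN
    SET NOCOUNT ON;

    -- Build the XMLA command as one (potentially very long) string.
    DECLARE @xmla NVARCHAR(MAX) =
        N'<Batch><Parallel><Process>...</Process></Parallel></Batch>';

    -- A single NVARCHAR(MAX) column: SSIS maps this to DT_NTEXT rather than DT_WSTR.
    SELECT @xmla AS XmlaQuery;
END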

At first I thought I could load it into an Object variable that would contain the string, but to no avail.

Luckily the procedure I was running wasn’t very complicated, and there was only one key piece that I really needed the database for. So, instead of using a stored procedure to build up the XMLA query, I duplicated the logic from my stored procedure in a Data Flow task and went on my merry way (it was able to use a string built in memory, even one longer than 8,000 characters).

July 21, 2011

SQL Server – View Status of Long Running Tasks

Filed under: SQL Server — Andrew @ 1:06 pm

With databases, you sometimes need to run tasks that take a while to complete. In the past I have found it hard to judge the status of such a request. For example, we do backups and restores of a 40 GB database.

Normally this doesn’t take very long (30-40 minutes), but there is no progress bar on the query to judge how complete a specific task is. Luckily, after hunting through Google for a while, I found a query you can run to see how far along system tasks are. The query is as follows:

SELECT session_id, command, percent_complete FROM sys.dm_exec_requests

This query will return a list of the commands currently running as well as the percentage complete.

Status of Long Running Tasks

session_id   command            percent_complete
----------   ----------------   ----------------
##           BACKUP DATABASE    25.6985%
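If you also want a rough finish time, sys.dm_exec_requests exposes start_time and estimated_completion_time (in milliseconds) for commands that report progress, so a sketch along the same lines might look like this:

-- Progress plus a rough ETA for commands that report percent_complete
-- (e.g. BACKUP, RESTORE, DBCC). estimated_completion_time is in milliseconds.
SELECT session_id,
       command,
       percent_complete,
       start_time,
       DATEADD(second, CAST(estimated_completion_time / 1000 AS INT), GETDATE()) AS estimated_finish_time
FROM   sys.dm_exec_requests
WHERE  percent_complete > 0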

May 23, 2011

Great SQL Resource

Filed under: SQL,SQL Server — Andrew @ 8:26 pm

I have been reading a site for quite a while, and the author does a great job explaining the answers to all of the questions he tackles. The site is Explain Extended.

Hopefully it’ll be very helpful :)

May 17, 2011

Determining Long Running Queries Programmatically

Filed under: SQL Server — Andrew @ 1:25 pm

I recently needed to figure out which queries were taking the longest to run. I could do this by using the Execution Plan; however, if the slow-running query is part of a data load, that is a little harder to do (it requires you to stop the data load and run the steps manually).

Luckily I found a nice set of two queries that help narrow down which queries are the slow ones. They are as follows:

This first query just provides information about some of the queries which are currently being run on the server along with their specific process ids so that we can dig deeper into them later on.

SELECT  p.spid ,
        RIGHT(CONVERT(VARCHAR, DATEADD(ms,
                                       DATEDIFF(ms, P.last_batch, GETDATE()),
                                       '1900-01-01'), 121), 12) AS 'batch_duration' ,
        P.program_name ,
        P.hostname ,
        P.loginame
FROM    master.dbo.sysprocesses P
WHERE   P.spid > 50
        AND P.STATUS NOT IN ( 'background', 'sleeping' )
        AND P.cmd NOT IN ( 'AWAITING COMMAND', 'MIRROR HANDLER', 'LAZY WRITER',
                           'CHECKPOINT SLEEP', 'RA MANAGER' )
ORDER BY batch_duration DESC

After running the first query, you are able to retrieve the process id and then pass that into the following query to receive the actual body of the query being executed (replacing XXX with the actual process id).

DECLARE @spid INT ,
    @stmt_start INT ,
    @stmt_end INT ,
    @sql_handle BINARY(20)
 
-- Fill this in
SET @spid = XXX
 
SELECT TOP 1
        @sql_handle = sql_handle ,
        @stmt_start = CASE stmt_start
                        WHEN 0 THEN 0
                        ELSE stmt_start / 2
                      END ,
        @stmt_end = CASE stmt_end
                      WHEN -1 THEN -1
                      ELSE stmt_end / 2
                    END
FROM    master.dbo.sysprocesses
WHERE   spid = @spid
ORDER BY ecid
 
SELECT  SUBSTRING(text, COALESCE(NULLIF(@stmt_start, 0), 1),
                  CASE @stmt_end
                    WHEN -1 THEN DATALENGTH(text)
                    ELSE ( @stmt_end - @stmt_start )
                  END)
FROM    ::fn_get_sql(@sql_handle)

I hope that this helps you as much as it helped me!

Source: http://stackoverflow.com/questions/941763/list-the-queries-running-on-sql-server

May 8, 2011

Import Ignoring the last line of a CSV file

Filed under: SSIS — Andrew @ 11:05 am

I ran into a weird issue this morning when trying to load some segment files.

The Data File Preview returned the 58 rows I expected.
The Data Viewer returned 57 rows, which was not expected.

I tried to redirect any error rows to a flat file, but with no success: no rows were being emitted to the flat file.

I checked in the warnings and saw the following:
[File Connection [1]] Warning: There is a partial row at the end of the file.

I double- and triple-checked that the row had all of its columns (or commas). They were all there (why wouldn’t they be? Excel makes valid CSV files).

When I tried to add an empty row at the end (a row with 13 commas), it was attached to the last column of the previous row, which seemed very odd to me. However, when I gave that row some text, it was captured as a new row (which then became the row that vanished, since it is the last row that gets removed).

It turns out that the row delimiter was messed up. It said “_x003C_none_x003E_” instead of the normal “<none>”.

After changing it back to “<none>” it worked as expected. Very weird.

Hopefully this will stop you from scratching your head and wondering why a single row is randomly being removed from your dataset without rhyme or reason.
