
DBCC CHECKDB WITH DATA_PURITY Command

As I mentioned in my last post, many people may be considering upgrading their SQL Server instance(s) now that SQL Server 2012 has been released and SP1 is on the way. With that in mind, I thought it would be a good idea to highlight the importance of the DATA_PURITY option for anyone planning to upgrade from SQL Server 2000.

Before I get into the details of the command, let me just highlight that DBCC CHECKDB checks the logical and physical integrity of all the objects in the specified database and should be run on a regular basis. If you are a SQL Server DBA but have never used or heard of DBCC CHECKDB, you need to do some reading about it and start using it!

In versions prior to SQL Server 2005, it was possible to import invalid data into databases. This was resolved for databases created in SQL Server 2005 and later by adding column-value integrity checks. That is a great solution going forward, but how do you deal with invalid data in older databases that are being upgraded?

Solution -
DBCC CHECKDB ([DATABASENAME]) WITH DATA_PURITY

When this command is executed, it will report any column-value (data purity) issues in the specified database. Once it completes without errors, DBCC CHECKDB will check column-value integrity by default from then on, so you no longer have to specify DATA_PURITY.
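As a quick illustration (the database name below is just a placeholder, not from the original post), this is how you would run the check against a database that was upgraded from SQL Server 2000:

-- Run the column-value integrity check against an upgraded database.
DBCC CHECKDB ('MyUpgradedDatabase') WITH DATA_PURITY;

Any invalid column values it reports have to be corrected manually (for example by updating or deleting the offending rows); the repair options cannot fix data purity errors.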

For more information about this command, I recommend Paul Randal's post on the subject and, of course, MSDN: DBCC CHECKDB.
 
 
Continue Reading →

Script To Transfer Logins From SQL Server 2000/2005

A few years back, when I was working on a project with some Dynamics databases, I found a script on a Microsoft site that captures SQL logins from SQL Server 2000 and 2005 instances. The script is very useful for transferring SQL logins not only to 2000 and 2005 instances but also to 2008, 2008 R2 and 2012. Since I can no longer find the link, I decided to upload the script – Capture_Logins.sql.

With the launch of SQL Server 2012 and the upcoming release of SP1, many people may be considering upgrading their 2000 and/or 2005 instances, and this script might save you a great deal of time. Remember, it isn't my script: I didn't write it and I don't own it. Enjoy!

(Note: even though the script mentions that it works on SQL Server 7.0, I didn't test it on that version, so I can't guarantee that it works.)
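To give a rough idea of what such a script does, here is a simplified sketch of the approach (not the Microsoft script itself, and it only covers SQL logins on SQL Server 2008 and later): generate CREATE LOGIN statements that preserve each login's original password hash and SID, so the logins can be recreated on the target instance without knowing the passwords.

-- Simplified sketch: script out SQL logins with their original password hash and SID.
-- The full Capture_Logins.sql script also handles SQL Server 2000/2005 and Windows logins.
SELECT 'CREATE LOGIN ' + QUOTENAME(name)
     + ' WITH PASSWORD = '
     + CONVERT(varchar(514), CAST(LOGINPROPERTY(name, 'PasswordHash') AS varbinary(256)), 1)
     + ' HASHED, SID = '
     + CONVERT(varchar(185), sid, 1)
     + ', CHECK_POLICY = OFF;'          -- hashed passwords cannot be checked against policy
FROM sys.sql_logins
WHERE name <> 'sa';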

 
Continue Reading →

T-SQL Tuesday #33 - Monitoring Log File Size Without Logging On To The Server

This month’s T-SQL Tuesday is hosted by Mike Fal (Blog | Twitter). The topic is Trick Shots, and the idea is to highlight a cool trick or process and explain what you learned from it.

I encountered my first log file issue in my early days as a DBA working with SQL Server 2005. While I was on vacation, I received a phone call from the company's system administrator because he had noticed that free space was shrinking on one of the drives of the database server. When I logged on to the server I immediately noticed large log files that had not been there a few days earlier. After fixing the issue, the main question on my mind was: "How can I know the log file size without logging on to the server to check the logs folder?"

I did some research and came across the T-SQL Statement:

DBCC SQLPERF (LOGSPACE)

This provides transaction log space usage statistics for all the databases on the instance (for more information see MSDN: DBCC SQLPERF).

I found the information very interesting and thought it would be a good idea to store the data every day and look for a trend so that I could predict the size of the log files. The following code was used to store it in a table:

INSERT INTO Information(DatabaseName,LogSize,LogSpaceUsed,Status)
EXEC ('DBCC SQLPERF (LOGSPACE)')
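The original post doesn't show the definition of the 'Information' table, so the following is just one possible version of it (the column names match the INSERT above; the types and the CaptureDate default are my assumptions). DBCC SQLPERF(LOGSPACE) returns four columns – Database Name, Log Size (MB), Log Space Used (%) and Status – which map to the first four columns, while the CaptureDate default records when each row was captured and is what makes trending over time possible:

CREATE TABLE dbo.Information
(
    DatabaseName  sysname        NOT NULL,
    LogSize       decimal(18, 5) NOT NULL,  -- log size in MB
    LogSpaceUsed  decimal(18, 5) NOT NULL,  -- percentage of the log in use
    Status        int            NOT NULL,
    CaptureDate   datetime       NOT NULL DEFAULT (GETDATE())
);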

Around the time I was working on this issue, I was also getting familiar with SQL Server 2005 Reporting Services, and anything stored in a table was suitable test data for a report. The result was a report based on the data in the 'Information' table, but I still had to run the script and the report manually.

Using the little knowledge I had of SQL Server 2005, as well as the excitement of trying out new things, I decided to use a stored procedure together with the subscription capability of Reporting Services to automate the process. The stored procedure executed the insert script and also returned the inserted data every time the report ran. The subscription capability allowed me to schedule when the report would be executed, how it would be delivered and the format it would be delivered in. The end result was that I got my log size information in an Excel file via email. A great advantage of this was that I could receive the email on my phone and check the file even if I was out of the office or away from my machine.
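The original procedure isn't shown in the post, but a minimal sketch of the idea could look like this (the procedure name is made up, and it assumes the 'Information' table has a CaptureDate column with a GETDATE() default, as above):

CREATE PROCEDURE dbo.usp_CaptureLogSpace
AS
BEGIN
    SET NOCOUNT ON;

    DECLARE @CaptureTime datetime;
    SET @CaptureTime = GETDATE();

    -- Capture the current log space figures for every database on the instance.
    INSERT INTO dbo.Information (DatabaseName, LogSize, LogSpaceUsed, Status)
    EXEC ('DBCC SQLPERF (LOGSPACE)');

    -- Return the rows just captured so the subscribed report always shows the latest snapshot.
    SELECT DatabaseName, LogSize, LogSpaceUsed, Status, CaptureDate
    FROM dbo.Information
    WHERE CaptureDate >= @CaptureTime;
END;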

Subset of the Log File Information in the Excel File

While this process could have been done more easily back then, and much more easily now, it was a great way for me to learn and test the new features of SQL Server 2005. Another benefit was that the information returned by the DBCC statement encouraged me to learn and understand more about log files.

Continue Reading →

T-SQL Tuesday #025 – Reporting on Reports

This month's T-SQL Tuesday blog party is being hosted by Allen White (Blog | Twitter) and the theme for the party is 'Sharing your T-SQL Tricks'.

This sounds like an easy topic to blog about, but when I sat down and thought about it, the first thing that had me stumped was what actually counts as a 'trick'. What I consider a trick might not be a trick to someone else, so I decided to stop thinking about tricks and start focusing on what makes someone's life or job easier. Discovering information I didn't know existed, and learning how to take advantage of information that already exists, always creates an 'ah-ha' moment for me. That brings me to my T-SQL Tuesday focus: Reporting Services.

Continue Reading →

Recovery Models: Selecting and Switching

A recovery model is a database configuration option that controls transaction log maintenance. It specifies how SQL Server manages log files and determines which backup and restore operations are available for a database. The three recovery models you can choose from in SQL Server are Simple, Full and Bulk-Logged.
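As a quick taste (this snippet isn't from the post, and the database name is just a placeholder), you can check a database's current recovery model and switch it with T-SQL:

-- Check the current recovery model.
SELECT name, recovery_model_desc
FROM sys.databases
WHERE name = 'AdventureWorks';

-- Switch it; FULL, SIMPLE and BULK_LOGGED are the valid options.
ALTER DATABASE AdventureWorks SET RECOVERY FULL;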
Continue Reading →

Require Information About Your SQL Server Instance?

Once you are responsible for an instance of SQL Server, whether it is an Enterprise edition or an Express edition, you should know some basic information about the instance. There are various reasons why this information may be needed, so it is important that you know how and where to find it.

Some of the basic information such as the instance name, SQL Server edition, SQL Server version, etc. can be found on the General Page of the Server Properties dialog box.
To access this dialog box, perform the following steps:
  1. Start SQL Server Management Studio (SSMS).
  2. Connect to your instance using the necessary Authentication.
  3. Right-click the server name in Object Explorer and select 'Properties' (see diagram below).
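The same details can also be retrieved with T-SQL instead of the dialog box (this snippet isn't from the post; it simply queries a few of the standard SERVERPROPERTY values):

SELECT SERVERPROPERTY('MachineName')    AS MachineName,
       SERVERPROPERTY('InstanceName')   AS InstanceName,   -- NULL for a default instance
       SERVERPROPERTY('Edition')        AS Edition,
       SERVERPROPERTY('ProductVersion') AS ProductVersion,
       SERVERPROPERTY('ProductLevel')   AS ProductLevel;   -- RTM, SP1, SP2, ...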

Continue Reading →

Back to Basics

In life there are some tasks that seem difficult at first but become easier over time thanks to knowledge, understanding and, of course, practice. The easier something is for us to complete, the easier it is to forget how difficult it once was. Many people take it for granted that what is easy or simple for them should be easy or simple for other people as well.
Continue Reading →
