Friday, June 26, 2015

Auto Growth Option in SQL Server Database Part-2

Good Morning Friends,

This post is the continuation of Auto Growth Part 1.

In Part 1 we saw the Auto Growth options available in a SQL Server database and their parameters. Here we will see how to choose values for those parameters.

For "Parameter 2" easily we can set to "Unlimited" because no one will stop the growth of there Database. So it is set to "Unlimited" (It will grow until your disk get full or in simple terms you can say limited to your Drive size).

For "Parameter 1" best I will select "Growth in MB". Following are the cases I'll try to explain while selecting this option:

Case 1:

1. Suppose I have a database Clis3, which has one data file and one log file, sized 50 GB and 10 GB respectively.
2. Growth for the log file is set to 100 MB. The recovery model is Full.
3. Suppose there is a bulk transaction in the Clis3 database. The log file will capture each and every transaction, so the log file starts growing.
4. First it utilizes all the available VLFs (Virtual Log Files) up to 10 GB. Once they are full, SQL Server adds 100 MB to the log file to keep writing transactions.
5. Now the total log size is 10.10 GB.
6. Once this 100 MB is also full, it adds another 100 MB, and so on until the transaction completes.
7. So the total log size is now 10.20 GB.

Here SQL Server adds a fixed, defined amount of space (100 MB) to the log file each time, which indirectly controls the growth of the log file.
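As a sketch, the fixed-growth setting from Case 1 can also be applied with T-SQL instead of the GUI (I am assuming the log file's logical name is Clis3_log here; check sys.database_files for the actual name in your database):

```sql
-- Set a fixed 100 MB growth increment for the Clis3 log file
-- (logical file name Clis3_log is an assumption; verify it first)
ALTER DATABASE Clis3
MODIFY FILE (NAME = Clis3_log, FILEGROWTH = 100MB);
```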

Case 2:

1. Suppose the same database Clis3, which has one data file and one log file, sized 50 GB and 10 GB respectively.
2. Growth for the log file is set to 10 percent. The recovery model is Full.
3. Suppose there is a bulk transaction in the Clis3 database. The log file will capture each and every transaction, so the log file starts growing.
4. First it utilizes all the available VLFs up to 10 GB. Once they are full, SQL Server adds 1024 MB (10% of 10 GB) to the log file to keep writing transactions.
5. Now the total log size is 11 GB.
6. Once this 1 GB is also full, it adds roughly 1126 MB more (10% of 11 GB), and so on until the transaction completes.
7. So the total log size is now about 12.1 GB.

Here SQL Server adds 10 percent of the current log size, which means each growth increment gets larger as the log file itself grows.
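Because each increment is a percentage of the current size, the growth compounds. The arithmetic from Case 2 can be sketched in T-SQL (sizes in MB):

```sql
-- Illustration of compounding 10% auto growth (sizes in MB)
DECLARE @log_mb numeric(18,2) = 10240;   -- starting log size: 10 GB
SET @log_mb = @log_mb * 1.10;            -- first growth:  11264.00 MB (11 GB)
SET @log_mb = @log_mb * 1.10;            -- second growth: 12390.40 MB (about 12.1 GB)
SELECT @log_mb AS total_log_size_mb;
```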


Auto Growth
Ideal Setting for Auto Growth
As the image above suggests, this is the ideal Auto Growth setting for large databases.

Case study comparing the two cases above:

1. The log file in Case 1 ends at 10.20 GB, whereas
2. the log file in Case 2 ends at 12.10 GB.
3. That means the Case 2 log file consumes 1.9 GB of unnecessary disk space.
4. This shows that for larger databases it is good to set "Parameter 1" to "Growth in MB", as it avoids unnecessary utilization of disk space.


Keep in touch... and Happy Learning....
I appreciate and thank you for reading this post. :) 

Thursday, June 25, 2015

Auto Growth Option in SQL Server Database Part-1

Hi Guys,

Thanks a lot, friends, for your wishes on my certification completion blog here.
A couple of days back I had a discussion with a fellow DBA colleague about the "Auto Growth" option in SQL Server databases.

This topic came up when one of my re-indexing jobs failed due to low disk space on the log drive. As we know, a rebuild requires a sufficient amount of disk space for log file growth (we will discuss this in a separate post, "Behind the Scenes - While Re-indexing"). This will be a two-part series.

We can check the "Auto growth" option under 
Right Click on Database, Next Properties ,Next Files, Next Autogrowth / Maxsize Column

Auto Growth GUI
Auto- Growth Properties
The rows you see here depend on the total number of data and log files in your database.
By default, the values are inherited from the Model database, but log growth is limited to 2,097,152 MB (2 TB). So now the question is: should we keep this default value, or should we change the setting?

The answer is "It Depends" (Just like most of the answers in SQL Server). Well, it depends up to scenario to scenario. There are many factors come into consideration before changing the value in Production environment.

1. How frequently is my log growing?
2. What operations are happening in my database?
3. I have multiple log files on different drives. Will my log file still grow?
4. Do we take log backups?

What triggers the log file to grow rapidly:
a. If the recovery model is "Full" and there is a bulk operation such as BCP, every transaction is captured in the log.
b. Rebuild operations also make the log file grow rapidly.
c. The recovery model is "Full" and log backups are not taken regularly.

So, as you can see, it depends on many factors. Before changing anything in a production environment, you should be well aware of all these scenarios.

Following are the parameters we need to set under "Auto Growth" for a database:

1. We can set this option for both data and log files in two ways:
a. In Percent
b. In Megabytes

2. We can set the maximum file size to:
a. Limited
b. Unlimited
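Both parameters can also be set with T-SQL instead of the GUI. A minimal sketch, assuming a database named Clis3 with a log file whose logical name is Clis3_log (verify the name in sys.database_files first):

```sql
-- Parameter 1: grow in fixed MB steps; Parameter 2: no upper size limit
-- (database and logical file names are assumptions for illustration)
ALTER DATABASE Clis3
MODIFY FILE (NAME = Clis3_log, FILEGROWTH = 100MB, MAXSIZE = UNLIMITED);
```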

With the help of the query below, we can find this setting for all databases:

-- Auto growth setting (percent or MB) for all data and log files
SELECT
    DB_NAME(files.database_id) AS database_name,
    files.name AS logical_name,
    CONVERT(numeric(15,2), (CONVERT(numeric, size) * 8192) / 1048576) AS [file_size (MB)],
    [next_auto_growth_size (MB)] = CASE is_percent_growth
        WHEN 1 THEN CONVERT(numeric(18,2), (((CONVERT(numeric, size) * growth) / 100) * 8) / 1024)
        WHEN 0 THEN CONVERT(numeric(18,2), (CONVERT(numeric, growth) * 8) / 1024)
    END,
    is_read_only = CASE is_read_only
        WHEN 1 THEN 'Yes'
        WHEN 0 THEN 'No'
    END,
    is_percent_growth = CASE is_percent_growth
        WHEN 1 THEN 'Yes'
        WHEN 0 THEN 'No'
    END,
    physical_name
FROM sys.master_files AS files
WHERE files.type IN (0, 1)   -- 0 = data file, 1 = log file
  AND files.growth != 0;

Now, as a DBA, you need to set these parameters after understanding the behavior of the database in the given environment. By default, it will inherit the values from your Model database.

In the next post, Part 2, we will continue with the cases showing which option best suits Parameter 1.

Just to share with you: I started blogging on 22nd June 2014, so it has now been one year. Special thanks to Akhilesh Humbe for the guidance, and thank you, friends, for reading my posts.
If you have any suggestions to improve this blog, feel free to comment. Suggestions are always most welcome.

Keep in touch... And Happy Learning....

I appreciate and thank you for reading this post.

Friday, June 19, 2015

MS SQL 2012 Certification

For the past couple of weeks I was preparing for the 70-461 (Querying Microsoft SQL Server 2012) certification exam, after completing 70-462 (Administering Microsoft SQL Server 2012 Databases), and I got a breakthrough last week when I cleared the exam. So YES!!!!! I've completed two levels of the MCSA (Microsoft Certified Solutions Associate) and I am now an MCP (Microsoft Certified Professional).

Your encouragement boosted me to go ahead and complete this certification. A biggggg thank you to you all, and special thanks to Sanjeevi K for his guidance.

Since I maintain this blog for SQL Server, and I now have a fair idea about these exams, I thought of sharing this with you so that I can guide you if you are planning for these certifications.

For those who don't have much idea about how to take a certification exam, here is the simple procedure:

1. Go to the Microsoft site.
2. Create a Live ID.
3. The cost of the certification is 4500 INR (as of now).
4. Enter your complete address. Based on your address, it will automatically display your nearest Prometric center.
5. Select the exam date and time that best suits you.
6. Once the booking is confirmed, reach the center 15 minutes before the exam on the day.

Here is a brief note on the exam:

Total Questions: 42
Total Marks: 1000
Duration: 120 Minutes
Passing Score: 700 (you must score a minimum of 70% to pass)

1. Most of the questions, roughly 32-35, are MCQs.
2. A few questions involve query design, i.e. designing views, stored procedures, or functions.
3. The rest are also query-design questions, but with answer options.
4. The weightage of the questions is not the same, and each question is independent. Marks per question are not displayed; they are calculated internally, and the total score is displayed at the end of the exam.

You can follow the official Microsoft link below for more details:

https://www.microsoft.com/learning/en-in/sql-certification.aspx

I hope this helps you at least a bit with exam 70-461. You can leave your comments here or contact me for more information.

Keep in touch... And Happy Learning....

I appreciate and thank you for reading this post. :)

Wednesday, June 17, 2015

Apart From SQL Server...

Hi Friends,

Here is a topic that you could say is not directly related to SQL Server, but it is related to the database world, and it is going to be the hottest upcoming topic in the IT sector (or let me correct myself: it already is!).


Any wild guesses what the topic would be?


I expect some of you guessed it right. The topic I am referring to is "BIG DATA".


How did I come to know about this?


Frankly speaking, I was not aware of the data storage units above Terabyte (TB), so I just Googled around, and at least now I know these units: Petabyte, Exabyte, Zettabyte, and Yottabyte. That's it to date.


After learning the data storage units, I was curious about the companies that store data beyond the TB scale. Of course, it is a challenge to store such huge amounts of data; rather, I'd say it is equally challenging to maintain and process it. According to one site, the most famous social network, Facebook, generates approximately 500 TB of data on a daily basis. As I said, storing and maintaining such data is challenging for companies.


Nowadays the internet is flooded with videos about big data; some of them are worth watching if you are interested in learning. Once you start learning, you will come across many terms such as Hadoop, HBase, NoSQL, Pig, Hive, Cassandra, and many more. Even today these are just terms for me, as I am a beginner in this topic. As I progress and feed myself with these topics, I will regularly update you through this blog. This is just the beginning of the new technology of "BIG DATA". In the coming days I will post new topics and links about it.


According to one research company, there will be a crunch of these skill sets in the upcoming years. From a job perspective, you can find ample designations under this topic, along with lots of job opportunities. One just has to select what suits them best.


Happy learning!!!