Tuesday, March 14, 2017

SQL Server SSL Connectivity Issue



I'm sure many of you have faced different types of errors while connecting to SQL Server via Management Studio or ODBC (or other methods). Exactly that happened at my client's site when we encountered the following:

SSL Security Error
 
Problem Description:

After the monthly scheduled restart of one Application Server, the .NET client was unable to establish a connection to our Database Server. In addition, we tried connecting through ODBC, which showed the error message I mentioned above.

Application Server: Windows Server 2012 R2
Database Server: SQL Server 2005

Troubleshooting / Solution:

As the error states "SSL Security Error", it pointed to something in the registry-level encryption settings. After hours of troubleshooting (almost a day) and visiting 50+ blogs, we were able to solve this connectivity issue.

Some observations while troubleshooting are as follows:

      a. This problem was specific to SQL Server 2005 only; we tried connecting from the Application Server to other versions of SQL Server, such as 2008 and 2012, and those connections succeeded.

      b. The TLS 1.0 entry was not found in the registry. You can check here for this error in detail.

      c. No traces were found in the Windows Event Viewer or in the SQL Server Error Log.

      d. Tried playing with the dbnetlib.dll files too (for some, this solution may have worked).

      e. Installed the SQL Server 2005-specific drivers and tried connecting through ODBC.
       
      f. Raised this concern with Microsoft. Since extended support for SQL Server 2005 ended in April 2016, Microsoft declined to provide support.

Last week, there was some patching at the Windows level. Due to this, two new entries (Triple DES & AES) were found under Ciphers at the below location:

Registry
After this discovery, we checked the connectivity after disabling both of these algorithms. However, the issue persisted. Finally, we tried the connectivity after deleting these entries, and this time we were lucky: we were able to establish connectivity to SQL Server 2005 again.
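For reference, the SCHANNEL cipher entries normally live under HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\SecurityProviders\SCHANNEL\Ciphers. Below is a minimal sketch of how you could check one of these entries from T-SQL using the undocumented xp_regread extended procedure; the exact subkey name ('Triple DES 168') is an assumption, so verify the names in regedit and export the key as a backup before disabling or deleting anything.

-- A sketch only: reads the 'Enabled' value of one SCHANNEL cipher entry.
-- The subkey name 'Triple DES 168' is an assumption; check regedit for
-- the exact names on your server.
DECLARE @enabled INT;
EXEC master.dbo.xp_regread
     @rootkey    = 'HKEY_LOCAL_MACHINE',
     @key        = 'SYSTEM\CurrentControlSet\Control\SecurityProviders\SCHANNEL\Ciphers\Triple DES 168',
     @value_name = 'Enabled',
     @value      = @enabled OUTPUT;
SELECT TripleDES_Enabled = @enabled;   -- NULL means the value (or key) is absent

A value of 0 means the cipher is disabled; in our case connectivity only came back once the entries were removed entirely, which we did in regedit after backing up the key.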

Thank you for reading. You can leave feedback in the comments section below, and stay tuned for more articles on SQL Server.

Thursday, February 4, 2016

Edition Upgrade in SQL Server

Hi Friends,

Check here for my last post of 2015, on Hadoop. With that, let's start this post on how to perform an in-place Edition upgrade. This procedure works well with SQL Server 2005 and above.

Requirement:

There was a requirement from the client to change the Edition from Enterprise to Standard for one of my SQL Server 2008 R2 instances.
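Before starting, it is worth capturing the current edition and version so you can confirm the change afterwards; a quick standard T-SQL check:

-- Record the edition and build before the upgrade.
SELECT SERVERPROPERTY('Edition')        AS Edition,        -- e.g. Enterprise Edition (64-bit)
       SERVERPROPERTY('ProductVersion') AS ProductVersion, -- build number
       SERVERPROPERTY('ProductLevel')   AS ProductLevel;   -- RTM / SP level

Run the same query again once Step 4 completes to verify that the new edition is in effect.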

So let's see how simple and time-efficient the in-place edition upgrade is:

Step 1: Launch the SQL Server setup file => go to the "Maintenance" tab on the left-hand side.



Step 2: Click the "Edition Upgrade" link on the right-hand side. This will first check the rules.



(Next)



(Next)



Step 3: Very important step: verify the new edition here (which I have highlighted below).



Step 4: From the drop-down, select the instance that needs to be upgraded.



(Next)



(Next)



Hold on for three minutes, and your in-place edition upgrade is finished.

As you have seen, this is simple and time-efficient to perform, but it comes with a cost. And the cost is: if there is any error during the upgrade phase, there is no procedure to roll back.

The only option left is a complete uninstallation followed by a fresh installation. So before starting this procedure, make sure you have taken the necessary precautions.

Have you ever faced this error?


Thursday, January 21, 2016

Error - Instant File Initialization Failed

Hi Friends,

Here comes the first post of the wonderful year 2016. Last year, we ended with a post on Introduction to Hadoop. Let's start our first post with an error in SQL Server.

Description:

One of the errors I face very frequently nowadays, as you can see in the below snapshot, is "File initialization failed", because of which my restoration activity failed.

This error occurred when I was in the middle of a migration activity, where I was supposed to back up and restore a database from one server to another. After executing the restore command with STATS = 1, I was waiting for 1 percent to complete (after which I could have a nap, because the backup file was huge and the activity was at midnight), but I kept waiting and waiting for that 1 percent. It was getting suspicious, because it should not take that long to complete even one percent.
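For context, the restore was of this shape; the database name, backup path, and file locations below are hypothetical placeholders:

-- A sketch of the restore command used; names and paths are hypothetical.
RESTORE DATABASE SalesDB
FROM DISK = N'D:\Backup\SalesDB_Full.bak'
WITH MOVE N'SalesDB_Data' TO N'E:\Data\SalesDB.mdf',
     MOVE N'SalesDB_Log'  TO N'F:\Log\SalesDB.ldf',
     STATS = 1;   -- report progress after every 1 percent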

So, I decided to stop the restoration. Once the session was stopped, I found the below error message:

Instant File Initialization Failed
I will try to explain the behind-the-scenes of what exactly SQL Server does when we create or restore a database in a different post. In this post, let's just look at the solution for the error.

Solution:

1. Run => Secpol.msc;

Open Local Security Policy => Local Policies => User Rights Assignment => "Perform volume maintenance tasks" => right-click => add the account under which the SQL Server service runs, as you can see in the below snapshot.

2. No need to restart the server itself, though the SQL Server service must be restarted to pick up the new privilege.

3. Now, start the restoration process again; this time it will work well.

Perform Volume Maintenance Task

So from now on, SQL Server will skip zero initialization of data files whenever we create or restore a database. We will see what exactly this means, probably in a different post.
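If you want to confirm that instant file initialization is actually in effect, one common approach is to use trace flags 3004 and 3605, which write the file-zeroing messages to the error log. A sketch (the test database name is hypothetical):

-- Trace flag 3004 exposes zeroing messages; 3605 routes them to the error log.
DBCC TRACEON(3004, 3605, -1);
CREATE DATABASE IFI_Test;               -- throwaway database for the test
EXEC xp_readerrorlog 0, 1, N'Zeroing';  -- search the current error log for 'Zeroing'
-- With the privilege granted, only the log (.ldf) file should report
-- zeroing; the data file is skipped. Clean up afterwards:
DROP DATABASE IFI_Test;
DBCC TRACEOFF(3004, 3605, -1);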

Hope this saves you time and helps you. Don't forget to drop a comment below, and do vote below on whether it is Interesting, Informative, or Boring.

Facing trouble while switching a database from single-user mode to multi-user mode? Check here for the solution.

(It's been so long; I was away from my blog for more than a couple of weeks. I'm afraid this could continue for a few more, due to multiple projects on weekdays as well as over the weekends, so my blogging might be affected. Stay tuned; soon we will learn many things about SQL Server as well as Hadoop administration.)

Thanks,
Vikas B Sahu
Keep Learning and Enjoy Learning!!!

Tuesday, December 15, 2015

Introduction to Hadoop

Dear Friends,

A couple of months back, I published a post on SQL Server 2016's new features here.

Meanwhile, let me introduce you to Hadoop. We will learn this as a series of inter-related posts, so don't miss any post in between, and read them serially. Let's make learning Hadoop fun and interesting.

So, let's understand the formats of data that we handle in the real world:
  • Flat File
  • Rows and Columns
  • Images and Document
  • Audio and Video
  • XML Data and many more......
Big Data is the ocean of data which an organization stores. This data comes with three V's, i.e. Volume, Velocity, and Variety.

Nowadays, a huge Volume of data is generated by many sources such as Facebook, WhatsApp, e-commerce sites, and so on. This huge volume of data is generated with high Velocity; you could say it is multiplying every second, every minute, every hour. Along with the huge Volume and high Velocity, a wide Variety of data is generated in different forms.

This data can be in any format: structured, semi-structured, as well as unstructured. Data stored in the form of rows and columns is well defined as structured data, whereas data in the form of documents, images, SMS, video, audio, etc. can be categorized as unstructured, and data in HTML or XML format can be called semi-structured data.

Q. I am sure you must be thinking: how does an RDBMS handle these kinds of unstructured or semi-structured data in its database?

A. Well, to handle these kinds of data we have special data types such as VARBINARY(MAX) and XML. The drawback is that, if we store an image, it is stored in binary format within the database, whereas with FILESTREAM the actual image is stored on the server itself. Hence there is a performance impact when storing and retrieving petabytes of data.
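As a small illustration of the point above, here is a minimal sketch of how such content is typically modelled in SQL Server (the table and column names are just examples):

-- Binary payloads are stored inside the database with VARBINARY(MAX);
-- with the FILESTREAM attribute they would live on the file system instead.
CREATE TABLE dbo.Documents
(
    DocId   INT IDENTITY(1,1) PRIMARY KEY,
    DocName NVARCHAR(260) NOT NULL,
    DocXml  XML NULL,             -- semi-structured content
    Content VARBINARY(MAX) NULL   -- images, audio, video, etc.
);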

Moreover, Big Data is not just about maintaining and growing the data year over year; it is also about how you manage this data to make informed decisions. Data in Big Data can also come in various complex formats; to manage and process this type of data, we need a large cluster of servers.
BIG DATA & HADOOP

With this introduction to Big Data, now let me introduce you to Hadoop. 

Hadoop is a framework for large clusters of servers, built to process large sets of data. It has two main core components: 'Hadoop MapReduce' (the processing part) and 'Hadoop Distributed File System' (the storage part). The Hadoop project comes under Apache, and that is why it is called 'Apache Hadoop'. The idea behind these two core components came into existence when Google released its white papers on the 'Google File System (GFS)' (2003) and 'MapReduce' (2004).
Hadoop was created by Doug Cutting in 2005. Cutting was working at Yahoo! at the time he built the software, which he named after his son's toy elephant.

Wikipedia defines Hadoop as "an open-source software framework written in Java for distributed storage and distributed processing of very large data sets on computer clusters built from commodity hardware"

Hadoop is an open-source framework, available for free as well as commercial use under the Apache license. It allows distributing and processing of datasets across large clusters. On top of this, there are many more applications built by other organisations that use Hadoop or continuously work on this product, all of which come under the 'Hadoop Ecosystem'. Check here to find the list of projects under the Hadoop Ecosystem. As we move further, we will see posts on the important projects under the Hadoop Ecosystem.

The Apache Hadoop architecture consists of the following components:
  • Hadoop Common: It contains the libraries and other utilities needed by the other Hadoop modules.
  • Hadoop Distributed File System (HDFS): It is a cluster of servers with commodity storage, used for data storage across the cluster.
  • Hadoop YARN: This component is used for job scheduling and resource management in the cluster.
  • Hadoop MapReduce: The processing of the data is done by this component.
By now, I can assume you have a fair idea about this technology. But how, and who is the best person, to get into Hadoop?

The technical answer would be: those who are interested in learning this technology can get in two ways:
  • As a Developer
  • As an Administrator
  1. As a Developer: Hadoop is a framework built in the Java language, so having a Java background gives you an easy path to becoming a Hadoop Developer. Given the growing popularity of Hadoop, nowadays this is one of the most common designations you can find on job sites.
  2. As an Administrator: Most organisations with a Hadoop installation prefer a part-time or full-time administrator to manage their Hadoop clusters. It is not compulsory for the admin to know Java to learn this technology; indeed, they should just have some basics for troubleshooting. Candidates with a database admin background (SQL Server, Oracle, etc.) who already have troubleshooting, server maintenance, and disaster recovery knowledge are preferred, and anyone with network, storage, or server admin (Windows/Linux) skills can be the other best choice. Here in this post, it is explained in detail who suits Hadoop best.
The following might be the questions in your mind if you want to get started as a Hadoop admin:

  1. Do we need any DBA skills? Of course yes, if we need to administer the Hadoop cluster (maintenance, monitoring, configuration, troubleshooting, etc.).
  2. Do we need to learn Java? Yes; at least some basics, to understand Java errors while troubleshooting any issue.
  3. Do we need to understand non-RDBMS systems? Yes; Hadoop works with both SQL and NoSQL ('Not only SQL'), so having knowledge of a non-RDBMS product is very important.
  4. Do we need to learn Linux too? Yes; at least the basics.
In our next post, we will see the concept of HDFS (Hadoop Distributed File System).

Interested in learning SQL Server clustering? Check here. Stay tuned for many more updates...

Keep Learning and Enjoy Learning!!!