Data & AI at Microsoft | Former MS MVP | CKA & CKAD
Author: Mohammad Darab
My name is Mohammad Darab. I’m a speaker and blogger with over 18 years of professional IT experience, 10+ of those years working with SQL Server. My areas of focus include the database engine, high availability / disaster recovery, and security. I’m an MCITP and MCP, as well as an IDERA ACE, Class of 2019.
I wanted to publish a quick note about something a little near and dear to me. Today, February 25th, 2022, Microsoft officially announced that it is retiring SQL Server Big Data Clusters. You can read the full statement here.
“Support for SQL Server 2019 Big Data Clusters will end on January 14, 2025.”
I am wrapping up my 4th week at Microsoft, and part of my “ramp up training” is becoming familiar with Power BI. (Apparently it’s in high demand :) The only experience I have with a BI / reporting tool is setting up SSRS (SQL Server Reporting Services) and creating a couple of basic reports using Report Builder.
In July of 2020 I was awarded Microsoft MVP in the Data Platform category for my community contributions on SQL Server Big Data Clusters. This was a huge accomplishment for me and I am truly honored. The rest of 2020 was spent studying for and passing Kubernetes certifications and dedicating time to introspection. I dug a little deeper and tackled the good ol’ question, “Where do I see myself in 5-10 years?” After some time pondering certain paths and outcomes, I came to a pretty solid answer, one which involved giving up my MVP award.
I don’t recall how I came across this Kubernetes IDE called Lens, but all I know is it’s cool as heck! It connects to a Kubernetes cluster (using the kubeconfig file) and gives you an in-depth view of all the different Kubernetes objects, their associated YAML files, health/metrics, etc. In this blog post I will show you how to look into a Big Data Cluster’s Kubernetes infrastructure using Lens.
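If you want to follow along, Lens simply reads the same kubeconfig file that kubectl uses. As a rough sketch (the resource group, cluster, and namespace names here are placeholders, not from this post), pulling down the credentials for an AKS-hosted BDC looks something like this:

```shell
# Merge the AKS cluster's credentials into ~/.kube/config (Lens reads this file)
az aks get-credentials --resource-group myResourceGroup --name myAKSCluster

# Confirm the context is available -- this is the cluster Lens will show
kubectl config get-contexts

# The BDC pods live in their own namespace (often "mssql-cluster")
kubectl get pods -n mssql-cluster
```

Once the context is in your kubeconfig, Lens can pick up the cluster from that file and browse the same objects kubectl sees.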
I remember when I first started deploying Big Data Clusters, they were on Azure Kubernetes Service, utilizing the $200 credit for first-time sign-ups. By the time I got around to figuring out how to deploy the BDC, not only was my $200 credit gone, but I had started to incur costs out of pocket.
If only there were a feature that would allow me to stop the VMs in AKS whenever I wasn’t using them. Well, I’m excited to share that Microsoft AKS (Azure Kubernetes Service) came out with a neat feature (in preview at the time of publishing this post) that allows you to stop and start your AKS cluster by running a simple command. Of course I had to try it out on BDCs, and to my surprise it worked. Well, sort of. Let me explain…
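For reference, here is a sketch of the stop/start workflow with hypothetical resource names; since the capability was in preview at the time, it shipped through the aks-preview CLI extension:

```shell
# The stop/start capability was in the preview extension at the time
az extension add --name aks-preview

# Stop the cluster -- deallocates the node VMs so you stop paying for compute
az aks stop --name myAKSCluster --resource-group myResourceGroup

# Start it back up when you need the BDC again
az aks start --name myAKSCluster --resource-group myResourceGroup

# Verify the cluster's power state
az aks show --name myAKSCluster --resource-group myResourceGroup --query powerState
```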
Are you a data professional and curious about Kubernetes but not quite sure what type of opportunities are available? Maybe you’re hesitant because you think Kubernetes is a “fad”? Or perhaps you’re just starting out in IT and don’t know what path to take?
Whether you are new to IT, or a seasoned IT professional, pondering these questions can be exhausting. In this blog post I will go over the importance of learning Kubernetes and how it can massively level up your career!
About 2 years ago I started coming across a lot of online chatter about “containers” and “Kubernetes”. This was back in 2018, and at the time I had no interest in learning about them because they had no direct connection to SQL Server or my daily job duties as a DBA. Up until that point I had been working with SQL Server for about ten years. So, like most people, “containers and Kubernetes” went in one ear and out the other.
That all changed with the hype, and eventual release, of SQL Server 2019. SQL Server 2019 includes a feature called “Big Data Clusters”. This new feature really intrigued me because it was something completely different. I started to hear those terms again (containers and Kubernetes) because they are the technologies behind Big Data Clusters. Over the next year, I heavily blogged, spoke, and created video content on Big Data Clusters. As a result of my deep passion for and promotion of the product, I was awarded Microsoft MVP. My journey didn’t stop there, as I have a natural “thirst for knowledge” and had to learn more about the underlying technology that makes Big Data Clusters feasible in the first place: Kubernetes.
So I started to study for the Certified Kubernetes Administrator (CKA) exam.
This whole “social distancing” period is a perfect time to spruce up the resume. What better way than to add a new certification? I recently took the Microsoft AZ-900 Azure Fundamentals exam and want to share the two-pronged approach I took to pass.
Before I go into that, I want to talk about the importance of the AZ-900 exam. This is my first Azure certification. I have some experience working with Azure through deploying Big Data Clusters on Azure Kubernetes Service, so not every concept was new to me, but I still decided it would be a good idea to expand my fundamental knowledge of Azure.
There are a few server-wide configurations that you cannot set up during the initial Big Data Cluster deployment. One of those is enabling the SQL Server Agent service. That’s right: if you have deployed a Big Data Cluster, you will notice the SQL Server Agent is disabled by default (see screenshot below):
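The fix boils down to flipping the sqlagent.enabled setting with mssql-conf inside the master instance’s container. A sketch, assuming the default master-0 pod name and a namespace of mssql-cluster (yours may differ):

```shell
# Enable the Agent inside the master instance's SQL Server container
kubectl exec -it master-0 -n mssql-cluster -c mssql-server -- \
  /opt/mssql/bin/mssql-conf set sqlagent.enabled true

# SQL Server must restart to pick up the change; deleting the pod lets
# its StatefulSet recreate it with the new setting applied
kubectl delete pod master-0 -n mssql-cluster
```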
In my previous posts, I showed you how to deploy a single-node cluster and a multi-node cluster. That’s fine and dandy, but how do you upgrade to the newest SQL Server CU? This blog post will show you how to easily upgrade a SQL Server Big Data Cluster. The method applies to both single-node and multi-node clusters; no matter how many nodes your BDC has, this upgrade process will work.
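At a high level, the upgrade is a single azdata command pointed at the new CU’s container image tag. A sketch with a hypothetical cluster name and tag (exact flag names vary by azdata version):

```shell
# Log in to the BDC controller endpoint first
azdata login --namespace mssql-cluster

# Roll the cluster to the new cumulative update image
azdata bdc upgrade --name mssql-cluster --tag 2019-CU5-ubuntu-16.04

# Watch the pods cycle through the rolling upgrade
kubectl get pods -n mssql-cluster --watch
```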
In my previous post, I talked about deploying a Big Data Cluster on a single-node Kubernetes cluster. That’s cool and all, but what if you’re a business or organization that cannot have your data in the cloud for whatever reason? Is there a way to deploy a Big Data Cluster on-premises? Absolutely! In this blog post I will walk you through deploying a 3-node Kubernetes cluster, then deploying a Big Data Cluster on top of it.
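In outline, the on-premises path is kubeadm for the Kubernetes layer and azdata for the BDC layer. A compressed sketch (IPs, tokens, and profile names are placeholders; flag names may vary by version):

```shell
# --- On the control-plane node ---
sudo kubeadm init --pod-network-cidr=10.244.0.0/16
mkdir -p $HOME/.kube
sudo cp /etc/kubernetes/admin.conf $HOME/.kube/config
sudo chown $(id -u):$(id -g) $HOME/.kube/config

# Install a pod network add-on (flannel shown here as one option)
kubectl apply -f https://raw.githubusercontent.com/flannel-io/flannel/master/Documentation/kube-flannel.yml

# --- On each of the two worker nodes, run the join command kubeadm init printed ---
# sudo kubeadm join <control-plane-ip>:6443 --token <token> \
#   --discovery-token-ca-cert-hash sha256:<hash>

# --- Back on a machine with azdata installed: deploy the BDC on top ---
azdata bdc config init --source kubeadm-dev-test --target custom-bdc
azdata bdc create --config-profile custom-bdc --accept-eula yes
```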
One of my 2020 goals was to start presenting more. The easiest, and most cost-efficient, way to do that is by presenting remotely. Another goal I had was to start creating Big Data Cluster videos for my YouTube channel. These two goals required a different set of tools. I also had no idea how complicated video recording could get. I finally got everything set up the way I want, and a few friends asked me about my setup. This blog post lists everything I use and the lessons learned.