Is it worth getting certified in Microsoft's big data solution?

22 September 2014

Between the large volumes of unstructured data produced by social media users and the digital information created by an increasing number of Internet-connected devices, enterprises across the world are wondering whether managing it all is even worth it.

Experts and bloggers have maintained that scrutinizing big data is imperative for modern businesses to survive. This may not necessarily be true, but C-suite decision-makers and their subordinates can't help but wonder what fresh insight they'll receive from untapped information.

With further scrutiny, they've realized Microsoft training and other certification courses provide those in IT with the wherewithal needed to efficiently harness and learn from big data.

What's to be learned?
"Managing big data" can be an ambiguous phrase. Tom's IT Pro contributor Ed Tittel brought some clarity to the situation, asserting those looking for accreditation in big data architectures will learn how to conduct the following tasks:

  • Data loading and removal
  • Cluster management
  • Information backup and restoration
  • Analytics and performance tuning
  • Data mining
  • Application development

Other database-related skills in disaster recovery, replication and provisioning are a part of many programs, depending on which one a person chooses to enroll in. Microsoft also recently released an Azure add-on built on top of Hadoop.

The elephant in the room
Hadoop is a platform that comes up often among those interested in big data, but many are confused as to what it actually does. O'Reilly Radar spoke with Cloudera CEO Mike Olson, who explained that the open source software runs on a large number of servers that don't share memory or disks. When a person loads a large or complex data set into Hadoop, the platform splits the data into pieces and spreads them across those machines.

One would think this would make finding the data incredibly difficult, but Hadoop keeps track of where each piece resides, making it easy for data analysts to pull the information whenever it's needed. In addition, processing demands less horsepower from any single central processing unit, because each server operates only on its own slice of a given data set.
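The split-then-process idea described above can be sketched in a few lines of plain Python. This is only an illustration of the concept, not Hadoop itself; the function names and the sample data are invented for the example:

```python
from collections import Counter
from functools import reduce

# Illustrative sketch of the Hadoop idea: split a data set into
# chunks, let each "machine" process its own chunk independently,
# then merge the partial results into one answer.

def split_into_chunks(records, num_machines):
    """Distribute records across machines, like blocks in distributed storage."""
    chunks = [[] for _ in range(num_machines)]
    for i, record in enumerate(records):
        chunks[i % num_machines].append(record)
    return chunks

def process_chunk(chunk):
    """Each machine counts words only in its own slice of the data."""
    counts = Counter()
    for line in chunk:
        counts.update(line.split())
    return counts

def merge(results):
    """Combine the per-machine partial counts into the final result."""
    return reduce(lambda a, b: a + b, results, Counter())

log_lines = ["error disk full", "ok", "error timeout", "ok ok"]
partial = [process_chunk(c) for c in split_into_chunks(log_lines, 3)]
totals = merge(partial)
print(totals)
```

Because each chunk is counted independently, no single worker ever needs the whole data set in memory, which is the property that lets Hadoop scale across many cheap servers.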

Microsoft's role
Although Microsoft didn't invent Hadoop, it's certainly made use of the program. According to the Azure website, the company recently developed Azure HDInsight, which allows users to perform several functions:

  • Create Hadoop clusters within minutes and delete them when work is completed
  • Combine HDInsight with the analytical capabilities of Excel using Power Query, PowerPivot and Power View
  • Operate big data applications on Windows Server or Linux
  • Develop applications for HDInsight in familiar languages such as Java and .NET

In the near future, a select few Azure certification courses will concentrate on how to use HDInsight. The main appeal of HDInsight is that it makes Apache Hadoop available in a cloud environment, enabling businesses to move workloads off-premises. This gives a company's application developers and data scientists around the globe convenient access to the same information.

Analysis add-ons such as Hive and Pig are also featured as a part of the package. While the former uses an SQL-like language to impose structure on data sets residing in distributed storage, the latter provides a high-level scripting language, Pig Latin, for analyzing large data sets.
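To make the Hive comparison concrete, a HiveQL query like `SELECT status, COUNT(*) FROM logs GROUP BY status` computes a grouped count over stored records. A rough plain-Python equivalent of that effect, with an invented table and column name, looks like this:

```python
from collections import Counter

# Hypothetical rows as they might sit in distributed storage;
# the "logs" table and "status" column are invented for illustration.
logs = [
    {"status": "ok"}, {"status": "error"},
    {"status": "ok"}, {"status": "ok"},
]

# Roughly what a HiveQL query such as
#   SELECT status, COUNT(*) FROM logs GROUP BY status;
# would compute over these records.
by_status = Counter(row["status"] for row in logs)
print(by_status)
```

The point of Hive is that analysts write the declarative query and the platform handles distributing this aggregation across the cluster.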

Microsoft's customization of the Hadoop engine enables more enterprises with fewer physical resources to take advantage of big data.
