According to the survey, 60% of respondents agreed that data analysts in their organizations lack the technical skills to analyze data on Hadoop. Seventy percent agreed on the need for self-service access to Hadoop, defined as the ability to grab raw, unstructured, detailed data and then create ad hoc queries and find insights, and nearly 37% of those strongly agreed with this need.
Fifty-two percent of respondents either have Hadoop in production or have a Hadoop cluster running. These professionals indicated an even more acute need for self-service access to Hadoop, with 81% expressing this need.
The survey also revealed that Big Data teams are typically cross-functional, composed of data analysts, IT staff, BI specialists and line-of-business analysts. Respondents reported an average of 60 business users assigned to Big Data teams, along with an average of 17 data analysts, 21 system administrators and 8 BI specialists.
The survey shows that nearly three quarters of respondents, 74%, chose SQL as the most pervasive skill set for Big Data analytics, with Java coming in second at 58%.
The survey also confirmed that vast amounts of data are being held in Hadoop clusters. Thirteen percent (13%) of respondents indicated they are storing 100 or more Terabytes of data in their Hadoop cluster, 6% indicated 41-100 Terabytes, nearly 50% said they are storing between 2 and 40 Terabytes, and 32% indicated they are storing 1 Terabyte of data.