The term Big Data has a hazy genealogy but is generally considered to have come into use in the 1990s. Broadly speaking, the main attributes used to determine Big Data have been Volume, Velocity and Variety. As vendors have joined the party, the original 3 V's have been extended to include Veracity, Value and various other V's!
With the expected explosion of data arising from IoE³ (the Internet of Everything, Everywhere, Everyone) we are now going beyond Big Data and heading into the era of Mega Data.
Each of the topics within these subject areas is worthy of an article in its own right. In future blogs I hope to focus on each of the key areas:
Communicate - Analytics is forecast to become a $9.83 Billion market by 2020. The power of Data Visualisation continues to grow, with many mainstream BI vendors now providing toolsets with comprehensive visualisation capabilities.
Calculate - The emphasis is moving from traditional statistical models towards advanced Machine Learning. Several technology vendors have stepped into this market, and universities are also promoting courses in the field (a minimal sketch of this shift appears at the end of this post).
Curate - This is what replaces the Extract - Transform - Load phase of traditional Data Warehousing. There will still be a need for some ETL, but with concepts such as Data Federation and Schema on Read, the amount of data physically moved from source to target may change radically (a schema-on-read sketch also appears at the end of this post).
Capture - The starting point of the data journey. Estimates vary on how many devices there will be, but forecasts in excess of 50 Billion devices, proposed by Cisco, don't seem unrealistic.
With this exponential growth of devices to capture data, it will be interesting to see how our networks keep pace.
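To make the "Calculate" shift a little more concrete, here is a minimal sketch in Python using scikit-learn (my choice of library, not something prescribed above): a traditional statistical model and a machine-learning model are fitted to the same data. The synthetic dataset, model choices and parameters are purely illustrative.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

# Synthetic data standing in for whatever captured data is being analysed.
X, y = make_classification(n_samples=1000, n_features=20, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# A traditional statistical model...
baseline = LogisticRegression(max_iter=1000).fit(X_train, y_train)

# ...and a machine-learning model trained on the same data.
learner = GradientBoostingClassifier(random_state=0).fit(X_train, y_train)

print("logistic regression:", accuracy_score(y_test, baseline.predict(X_test)))
print("gradient boosting:  ", accuracy_score(y_test, learner.predict(X_test)))
```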
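To illustrate the "Curate" point, here is a minimal schema-on-read sketch, assuming a PySpark environment (again my choice): raw files are landed untransformed, and a schema is applied only at query time rather than through an up-front ETL step. The landing path and field names are hypothetical.

```python
from pyspark.sql import SparkSession
from pyspark.sql.types import StructType, StructField, StringType, DoubleType, TimestampType

spark = SparkSession.builder.appName("schema-on-read-sketch").getOrCreate()

# The schema is declared when the data is read, not when the raw files were landed.
reading_schema = StructType([
    StructField("device_id", StringType()),
    StructField("reading", DoubleType()),
    StructField("captured_at", TimestampType()),
])

# "raw/sensor_events/" is a hypothetical landing zone of untransformed JSON files.
events = (spark.read
          .schema(reading_schema)
          .json("raw/sensor_events/"))

# Query the raw data directly, without first loading it into a target warehouse.
events.groupBy("device_id").avg("reading").show()
```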