Big data is a term for data sets that are so large or complex that traditional data processing applications are inadequate to deal with them. Challenges include analysis, capture, data curation, search, sharing, storage, transfer, visualization, querying, updating and information privacy. The term "big data" often refers simply to the use of predictive analytics, user behavior analytics, or certain other advanced data analytics methods that extract value from data, and seldom to a particular size of data set. "There is little doubt that the quantities of data now available are indeed large, but that's not the most relevant characteristic of this new data ecosystem."
Big data can be described by the following characteristics (5Vs):
- Volume: The quantity of generated and stored data. The size of the data determines the value and potential insight, and whether it can be considered big data at all.
- Variety: The type and nature of the data. Knowing this helps analysts use the resulting insight effectively.
- Velocity: The speed at which the data is generated and processed to meet the demands and challenges of growth and development.
- Variability: Inconsistency in the data set can hamper the processes that handle and manage it.
- Veracity: The quality of captured data can vary greatly, affecting the accuracy of analysis.
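To make these characteristics concrete, here is a minimal sketch of how a few of the Vs could be measured over a batch of records. The function name `profile_records` and the simple proxies (record count for volume, distinct fields for variety, inconsistently present fields for variability, fraction of complete records for veracity) are illustrative assumptions, not a standard metric; real big-data profiling runs over distributed systems, not an in-memory list.

```python
from collections import Counter

def profile_records(records):
    """Rough, illustrative proxies for some of the 5Vs over a list of
    dict records. Velocity is omitted since it requires timestamps
    observed over time."""
    n = len(records)
    # Union of all field names seen across records.
    expected = set().union(*(r.keys() for r in records)) if records else set()
    field_counts = Counter()
    complete = 0
    for r in records:
        field_counts.update(r.keys())
        if set(r.keys()) == expected:
            complete += 1
    return {
        "volume": n,                         # record count
        "variety": len(expected),            # distinct fields seen
        # fields that are missing from at least one record
        "variability": sum(1 for k in expected if field_counts[k] < n),
        # fraction of records carrying every expected field
        "veracity": complete / n if n else 0.0,
    }

records = [
    {"id": 1, "name": "a", "ts": 100},
    {"id": 2, "name": "b"},              # missing "ts"
    {"id": 3, "name": "c", "ts": 102},
]
print(profile_records(records))
# → {'volume': 3, 'variety': 3, 'variability': 1, 'veracity': 0.666...}
```

In practice each proxy would be far more elaborate (e.g. schema validation for veracity, type profiling for variety), but the sketch shows how the abstract Vs map onto measurable properties of a data set.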
Here at SMA, we have the know-how and follow best practices in data archiving, handling, and indexing to deliver secure, up-to-date, and reliable data to end users as quickly as possible.