The definition of Big Data is, to put it mildly, contentious. If you were to ask ten different experts in the field, it is highly likely that you would receive ten different answers. Is it a tool, a process or a result?
Even within the HICX office the answer is not a clear-cut one.
Personally, I take the view that Big Data is none of the above. The clue is in the two words: big and data. That makes things fairly obvious, so I will take the first part of the Gartner definition as correct: ‘Big Data is high-volume, high-velocity and/or high-variety information assets’.
Simple. Gartner do continue the definition: ‘that demand cost-effective, innovative forms of information processing that enable enhanced insight, decision making, and process automation’. This is fine too, but it starts to veer into how one should use Big Data, not what it is. That said, my world is marketing, not the deeper aspects of technology.
So, in short, Big Data:
- Is too big to analyse with standard tools
- Shows trends at a high level
- Is at its best when silos are avoided
- Provides a broad overview that supports business decisions
Now, onto Master Data.
Master data, by contrast, is organised data: it follows a preordained process and is controlled. As such it is easy to analyse and can be examined to the nth degree.
Master data management (MDM) is a technology-enabled discipline in which business and IT work together to ensure the uniformity, accuracy, stewardship, semantic consistency and accountability of the enterprise’s official shared master data assets. Master data is the consistent and uniform set of identifiers and extended attributes that describes the core entities of the enterprise including customers, prospects, citizens, suppliers, sites, hierarchies and chart of accounts.
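To make the idea of ‘a consistent and uniform set of identifiers and extended attributes’ concrete, here is a minimal sketch of what a supplier master record might look like. The class and field names are hypothetical, chosen purely for illustration; the point is that every system refers to the entity through one uniform identifier, with the extended attributes attached to it.

```python
from dataclasses import dataclass
from typing import Optional

# A hypothetical supplier master record: one uniform identifier
# plus the extended attributes that describe the core entity.
@dataclass(frozen=True)
class SupplierMasterRecord:
    supplier_id: str              # the consistent identifier shared across systems
    legal_name: str
    country: str                  # two-letter ISO country code
    site: str
    parent_id: Optional[str] = None  # supports supplier hierarchies

record = SupplierMasterRecord(
    supplier_id="SUP-00042",
    legal_name="Acme Components Ltd",
    country="GB",
    site="Manchester",
)
# Every downstream system joins on the same supplier_id,
# which is what gives master data its structure and integrity.
print(record.supplier_id)
```

Freezing the dataclass reflects the stewardship idea: a master record is changed through a controlled process, not mutated casually in place.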
In relation to Big Data, master data is vital because it provides structure and integrity. As such, these two variants of data work well together: MDM provides the structure, and Big Data provides the additional information.
In short, Master Data is:
- Well structured
- Easily analysed (in theory)
- Process driven
It’s important to note the role of high-quality data here too. ‘Garbage in, garbage out’ still rings true. Well, ‘rubbish in, rubbish out’ to me, but let’s not quibble over the foibles of the English language. However you choose to say it, it remains the truth, and it is linked directly to Master Data Management.
It’s all very well having a lot of data, but if you don’t ensure it is uploaded accurately and then managed appropriately until the moment it is required, you will run into significant issues when trying to make decisions based on data that is questionable at best. As such, Master Data starts with the onboarding process. For us, this happens within a supplier portal in which suppliers provide the requisite information and the buyer checks that everything is accurate. The data is then managed carefully from that point on.
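The onboarding check described above can be sketched in a few lines. This is a minimal illustration, assuming hypothetical field names rather than any real portal’s schema: garbage is caught at the door, before a questionable record ever enters the master data.

```python
import re

# Fields a supplier record must carry before it can be accepted.
REQUIRED_FIELDS = ("supplier_id", "legal_name", "country")

def validate_supplier(record: dict) -> list:
    """Return a list of problems; an empty list means the record is clean."""
    problems = []
    for field in REQUIRED_FIELDS:
        if not record.get(field):
            problems.append("missing " + field)
    country = record.get("country", "")
    if country and not re.fullmatch(r"[A-Z]{2}", country):
        problems.append("country must be a two-letter ISO code")
    return problems

# A clean record passes; an incomplete one is rejected with reasons.
clean = {"supplier_id": "SUP-1", "legal_name": "Acme", "country": "GB"}
dirty = {"supplier_id": "", "country": "United Kingdom"}
print(validate_supplier(clean))   # []
print(validate_supplier(dirty))
```

Rejecting the record with explicit reasons, rather than silently accepting it, is what lets the buyer-side check in the portal correct problems while the supplier is still in the loop.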
In order for any of this to work though, a cultural change is required within an organisation in which data quality is a priority.
If this is done, there are huge opportunities in analysing the Master Data and the Big Data together, with one supporting the other: detailed analysis of the former alongside a more holistic view of the trends that exist within the latter.
To summarise: get your data input, and the processes related to that input, right, and you are very close to achieving your goal of great business insights and a competitive advantage.