Data continues to breed like rabbits, just as it has always done. The variety of data types that must be stored dependably is exploding, and so are the generations of data (and their associated storage tiers). Enterprises today face a data explosion driven both by internal demands and by ever more extensive legislative and regulatory requirements.
While enterprises of all sizes do not need to go as far as introducing an equivalent of myxomatosis (the disease released in Australia to try to control its excess of rabbits), they do need to start understanding the problem at hand. Organizations need to know their Data Stored Factor (DSF); this opens the door to examining systematically what data they really need to store. Constellation’s DSF looks at a combination of cost and storage necessity (as opposed to what can be placed on storage).
Enterprises need to start understanding:
- What is their existing DSF?
- What should their DSF be for the future?
- What aspects of data are causing the greatest volume growth or management issues?
- What fundamental rethinking can be undertaken to contain data stored without compromising the business?
This Constellation Quark examines the pressures and requirements of managing this data explosion, and what this means for enterprises. Permitting ‘data stored’ to expand out of control is no longer an economic option for organizations. Yet denying the problem, or subsuming it within ‘storage’, is all too often the expensive reality (‘stored’ and ‘storage’ are different: the former is about what needs retaining, while the latter is about the technologies used to accomplish this). This is why Constellation has developed its Data Stored Factor (DSF) methodology – a tool to obtain first insights and then control.
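Constellation does not publish a closed formula for the DSF here, but the idea of weighing cost against storage necessity can be sketched. In the hypothetical illustration below, each class of data carries a retention-necessity weight and a per-terabyte cost, and a DSF-style score compares the spend on data that genuinely needs retaining against the spend on everything currently stored. All names, weights, and figures are illustrative assumptions, not Constellation's published methodology.

```python
# Hypothetical sketch of a DSF-style calculation (NOT Constellation's
# published formula): compare the cost of data that genuinely needs
# retaining against the cost of everything currently stored.
from dataclasses import dataclass

@dataclass
class DataClass:
    name: str
    stored_tb: float    # terabytes currently stored
    cost_per_tb: float  # annual cost per TB on its current tier
    necessity: float    # 0.0 (disposable) .. 1.0 (must retain)

def dsf(classes: list[DataClass]) -> float:
    """Ratio of necessary storage spend to total storage spend.
    A value well below 1.0 suggests money is going to data
    that does not need to be kept."""
    total_cost = sum(c.stored_tb * c.cost_per_tb for c in classes)
    needed_cost = sum(c.stored_tb * c.cost_per_tb * c.necessity
                      for c in classes)
    return needed_cost / total_cost if total_cost else 1.0

# Illustrative figures only.
portfolio = [
    DataClass("regulatory records", 200, 30.0, 1.0),
    DataClass("active transactions", 50, 120.0, 0.9),
    DataClass("stale backups/copies", 400, 25.0, 0.2),
]
print(f"DSF-style score: {dsf(portfolio):.2f}")  # ~0.61 under these assumptions
```

Under this toy reading, a score well below 1.0 flags spend going to data that need not be retained, which is the kind of first insight the DSF methodology aims to surface before any control measures are applied.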
