I came across an article in the Journal of Product Innovation Management regarding big data and its value to a firm.
I find big data and data science interesting, and the author's effort to prove big data's value to a firm is timely. It seems the term "big data" is sexy, a topic businesses are budgeting for or, at a minimum, wondering if they should. I agree that big data and its associated processes should be valued like any other project in a firm. Especially if a 3-terabyte database can cost $1MM per month, my goodness!! No wonder big data is considered a competitive advantage; the companies that can spend that kind of capital are a subset in themselves. The framework of valuing volume, variety, and veracity is helpful in trying to understand whether undertaking a big data project is worthwhile for a firm.
In some ways, learning that volume has a negative effect is surprising, but it does make sense: if we collect and store all the data but do nothing with it, what was the point? The authors suggest variety and veracity have a positive effect. They chose not to include velocity, as other articles have suggested, and I can appreciate why. What I wonder about in regard to velocity is the velocity of the analyst. Cassie Kozyrkov has suggested (https://towardsdatascience.com/what-makes-a-data-analyst-excellent-17ee4651c6db) that the speed of the analyst, and thereby of the analytics, is critical. I have felt this in my own work: speed of delivery feels limited by my own capacity and speed of learning. If I am limited to Excel, then analysis happens at a certain speed; if I can program in R or Python, I can increase that speed. Further, what about creativity? I would argue insights are created by equal parts science and art. What if the analyst is only good at the science? Do we even consider artists analysts? Would you want an artist, with limited science, performing the analysis? On top of all this, how do we teach both? Is there time enough to teach both? I don't know.