Lots of thoughtful guidance for engineers, designers, scientists, and businesses to focus on as they take up the exciting, interdisciplinary work that lies ahead.
From my initial read, this extremely thorough study makes a few major recommendations:
- A deeply interdisciplinary engineering approach, influenced by design and systems thinking
- “Middleware” that links high-level analysis with distributed computational resources and maximizes reuse [Consider the parallel computing reference by Asanovic and team]
- Foundational statistical and computational training for students and the workforce [Think of driver's permits becoming ubiquitous after the mass adoption of automobiles]
They allude to the possibility of an FFT-like standard tool for massive data analysis (anyone who has done signal processing knows how ubiquitous the FFT is for getting almost anything done more efficiently). However, they are pessimistic, given the scope and variety of this domain.
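To make the FFT analogy concrete, here is a minimal illustrative sketch (my own, not from the report) of the radix-2 Cooley–Tukey FFT. Its appeal as a "standard tool" is that one reusable algorithm collapses the direct O(n²) discrete Fourier transform to O(n log n), the kind of once-and-for-all leverage the report wishes existed for massive data analysis.

```python
import cmath

def fft(x):
    """Recursive radix-2 Cooley-Tukey FFT; len(x) must be a power of two."""
    n = len(x)
    if n == 1:
        return list(x)
    # Split into even- and odd-indexed halves, transform each recursively
    even = fft(x[0::2])
    odd = fft(x[1::2])
    out = [0j] * n
    for k in range(n // 2):
        # Twiddle factor combines the two half-size transforms
        t = cmath.exp(-2j * cmath.pi * k / n) * odd[k]
        out[k] = even[k] + t
        out[k + n // 2] = even[k] - t
    return out

def dft(x):
    """Direct O(n^2) DFT, used only as a correctness reference."""
    n = len(x)
    return [sum(x[j] * cmath.exp(-2j * cmath.pi * j * k / n) for j in range(n))
            for k in range(n)]

signal = [1.0, 2.0, 3.0, 4.0, 0.0, 0.0, 0.0, 0.0]
# The fast and direct transforms agree to floating-point precision
assert all(abs(a - b) < 1e-9 for a, b in zip(fft(signal), dft(signal)))
```

The report's pessimism amounts to doubting that massive data analysis has a comparably small, universal kernel: the FFT works because the DFT's structure is fixed, while data analysis pipelines vary wildly in structure.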
Perhaps these principles can be uncovered once and for all, so that each successive generation of researchers does not need to reconsider the massive data problem afresh.
Will we create a platform that stands for decades (think Shannon and Shockley), or take shortcuts that lead to yet another disappointing bubble of frenetic activity?
Time will tell – Metonymy Labs seeks to participate productively.