I hold a master's degree in computer science from the University of Zagreb. Over the years, I have worked at Vrbovec Wireless, Calyx, Styria Media Group, and Memgraph (my current endeavor), building and managing software products and teams. On the tech side, I mostly build C++, Python, and Rust software. Networking is my hobby. In general, I like computer science and software engineering. And last but not least, I enjoy building teams and figuring out where the value is.
Is there a scale of how much data one can effectively process, something similar to a "Kardashev scale for data"? What would such a thing be called? During Memgraph's Community Call (https://youtu.be/ygr8yvIouZk?t=1307), the point was made that AgenticRuntimes + GraphRAG move you up the "Kardashev scale for data" because you can suddenly extract much more insight from any dataset, and everyone can use it (it is not controlled by a large corporation). I found something similar at https://adamdrake.com/from-enterprise-decentralization-to-tokenization-and-beyond.html#productize, but the definition/example there looks very narrow.