
Real-Time Data Analytics and The Metaverse

One of tech's biggest news stories coming out of 2021 would be Facebook's rebrand to "Meta." Much like Google's rebrand to Alphabet, it's likely that nobody will adhere to the new name, but everyone sure took note of the strategic change in direction towards the metaverse. The Web 3.0 / creator economy / DeFi crowd quickly attached themselves to the theme, and everyone had a good group wank on Twitter over what visionaries they are. Truth be told, the metaverse is an idea that's been around for decades.

Let's get back to basics for a moment. Four years ago, we published a piece titled How to Invest in the Singularity – It's Near, which summarized a keynote given by Masayoshi Son, CEO and Founder of SoftBank, about the importance of ARM.

He says that the chip maker he acquired last year, ARM Holdings, is expected to ship over 1 trillion IoT chips in the next 20 years with ARM IoT chips commanding an 80% market share.

Nanalyze

The metaverse we want to invest in is exactly that. It’s the same vision that Jensen Huang of NVIDIA has – one built from scratch where the physical world is connected to the virtual world. All these IoT chips and sensors are turning our entire planet into a digital twin – a metaverse.

Slide showing how IoT is shaping AI, which in turn is accelerating human evolution
Credit: Masayoshi Son presentation

Then there’s Facebook’s vision of everyone going into a virtual reality (VR) world where they show off their NFTs and desperately try not to offend anyone.

Digital twins produce a tremendous amount of big data exhaust that can be analyzed for insights. The old factoid that "90% of the world's data was created in the last two years" is probably just as true today, as the volume of data being produced has exploded and will continue to grow in the coming years.

Bar chart showing the explosion in the volume of data being produced today, a trend that will continue in the coming years
Credit: Statista

It's why Warren Buffett set aside his aversion to tech stocks and invested in Snowflake, a cloud-based data management platform that helps companies do more with less. That value proposition will sell equally well in times of economic turmoil, and investors have clambered on board. The stock remains extremely richly valued – probably for good reason – and now has a market capitalization of over $100 billion.

  • Market cap / annualized revenues
    = $100 billion / $1.336 billion ≈ 75
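
For anyone who wants to sanity-check that math, here's a minimal sketch in Python of how the simple valuation ratio works – market cap divided by annualized revenues, where annualized revenues are simply the most recent quarterly revenues multiplied by four. The figures below come straight from the numbers above.

    # Minimal sketch of the simple valuation ratio: market cap divided
    # by annualized revenues (most recent quarterly revenues times four).
    # Figures come from the text above: a ~$100 billion market cap and
    # $1.336 billion in annualized revenues for Snowflake.

    def simple_valuation_ratio(market_cap_billions, quarterly_revenues_billions):
        """Return market cap divided by annualized revenues."""
        annualized_revenues = quarterly_revenues_billions * 4
        return market_cap_billions / annualized_revenues

    ratio = simple_valuation_ratio(100, 1.336 / 4)
    print(f"Snowflake's simple valuation ratio: {ratio:.0f}")  # ~75
    print(f"Above our cutoff of 40? {ratio > 40}")             # True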

We won’t invest in any firm with a simple valuation ratio over 40, no matter how grand the value proposition might be. We might never be able to invest in Snowflake, but we’re fine with that, especially when there are other big data companies out there we might find equally compelling. One area that sparked our interest is real-time data analytics, something that’s particularly applicable to “the metaverse.”

At the Speed of Business

How do tech visionaries stay relevant when technologies change so quickly? In two ways. First, they learn exceptionally fast. Back in the day, rock star programmers could fake it until they made it by learning languages and platforms at the speed of projects. Second, they focus on things that don't change. More processing power and lower latency are always desirable attributes. NVIDIA (NVDA) became the biggest semiconductor company in the world because they built better processors. As for latency, it all comes down to one thing: how fast can we analyze the big data exhaust spewing forth from our growing population of digital twins? One answer comes in two words – Apache Kafka.

Open Source vs. DAOs

Web 3.0 wankers are always droning on about how decentralized autonomous organizations (DAOs) are the future. ConstitutionDAO was supposed to show the world this grand vision of technological socialism, but it only managed to burn millions in gas fees and fail miserably at what it set out to do. Truth be told, the DAO ethos has been around for a while. It's called open-source software. Developers have a special place in their hearts for technology that is shared in a community where everyone works to make it better. Few understand just how strong this allegiance is in the software development community.

Some of the smartest nerds on this planet will gladly dedicate every iota of their spare time to helping out the open-source community. It's not because they're losers, it's because open-source communities are very effective at harnessing resources to collaborate and produce great things that wouldn't be possible if a single entity owned the software in question. The world's largest open-source foundation is Apache. And if the name offends you, they couldn't care less. They're too busy making great things happen instead of complaining on Twitter to get attention.

About Apache

Anyone who slings code for a living has probably used at least some of the $22 billion worth of Apache open-source software products made available to the public at no cost. Apache projects are overseen by self-selected teams of active volunteers who contribute to their respective projects for no compensation except street cred. Projects are self-governing with a heavy slant towards driving consensus to maintain momentum and productivity. This incredibly successful decentralized autonomous organization has managed to produce over 300 projects, some of which probably sound familiar even to the general population – like Hadoop.

One of those projects is Apache Kafka, one of the most successful projects Apache has ever delivered, and one being used by over 100,000 organizations globally.

About Apache Kafka


Apache Kafka is the most popular open-source stream-processing software for collecting, processing, storing, and analyzing data at scale. Best known for its excellent performance, low latency, fault tolerance, and high throughput, it's capable of handling thousands of messages per second. It's now being used by 80% of the Fortune 500.

Infographic showing Apache Kafka being used by 80% of the Fortune 500
Credit: Apache Kafka

Example use cases include the following:

  • To process payments and financial transactions in real time, such as in stock exchanges, banks, and insurance companies.
  • To track and monitor cars, trucks, fleets, and shipments in real time, such as in logistics and the automotive industry.
  • To continuously capture and analyze sensor data from IoT devices or other equipment, such as in factories and wind farms.
  • To collect and immediately react to customer interactions and orders, such as in retail, the hotel and travel industry, and mobile applications.
  • To monitor patients in hospital care and predict changes in condition to ensure timely treatment in emergencies.

The ability to glean insights from data while it is being generated is the final frontier when it comes to predictive analytics for business decision making.
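
To make that concrete, here's a minimal sketch of Kafka's publish/subscribe model in Python, assuming a Kafka broker running locally and the third-party kafka-python client (pip install kafka-python). The topic name and sensor payload are hypothetical placeholders, but the pattern – publish an event the moment it happens, react to it moments later – is the essence of real-time stream processing.

    # Hypothetical example: streaming IoT sensor readings through Kafka.
    # Assumes a broker at localhost:9092 and the kafka-python client.
    import json
    from kafka import KafkaProducer, KafkaConsumer

    # Producer: publish a sensor reading the moment it's generated.
    producer = KafkaProducer(
        bootstrap_servers="localhost:9092",
        value_serializer=lambda v: json.dumps(v).encode("utf-8"),
    )
    producer.send("sensor-readings", {"device_id": "turbine-42", "rpm": 1180.5})
    producer.flush()

    # Consumer: react to each reading while the data is still fresh.
    consumer = KafkaConsumer(
        "sensor-readings",
        bootstrap_servers="localhost:9092",
        value_deserializer=lambda v: json.loads(v.decode("utf-8")),
        auto_offset_reset="earliest",
    )
    for message in consumer:
        reading = message.value
        if reading["rpm"] > 1000:  # act on the event as it arrives
            print(f"High RPM alert on {reading['device_id']}: {reading['rpm']}")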

The Real-Time Data Thesis

Snowflake was the largest software initial public offering (IPO) ever, coming out of the gates with a $68 billion valuation. With a current market cap of over $100 billion (IBM's market cap is $114 billion), it's already grown large enough to fall into our "mega" size bucket.

Nanalyze Growth funnel
Credit: Nanalyze

Another big player in this space, Databricks, is private but said to command a valuation of $100 billion or more in a possible IPO. Both of these companies are likely to maintain their lofty valuations over time, which means we need to think about other opportunities to invest in the exponential growth of big data. There's one pure-play company a subscriber brought to our attention that somehow slipped past our daily IPO screening process. It's a $16 billion real-time data company founded by the creators of Apache Kafka. From what we've seen so far, it may find a place in our own tech stock portfolio. Coming soon: a deep dive into a firm with a very compelling value proposition.

Conclusion

If we think of the big data opportunity holistically, we can divide it into two components – static data and real-time data. Everyone understands that information becomes more valuable when it helps businesses make decisions faster, and the most responsive way to provide data analytics is while the data is still being generated. Real-time data analytics is the new paradigm for organizations of all kinds as we slowly build up the metaverse. Next up, we'll look at how we might be able to invest in this theme.
