Those decisions are particularly relevant in the context of artificial intelligence, which the company nodded at with its “generation AI” theme this year. So how might the company plot its route ahead? And how did it perform during the Data + AI Summit? In the view of industry analyst John Furrier, the event hit a “home run” with analysts, demonstrating that generative AI is touching everything from the physical storage layer to compute, all the way up to the application.
“To me, that’s the big story here … generative AI isn’t just a fad or any specific thing in the stack,” Furrier said on a recent episode of theCUBE podcast. “It’s going to enable innovation up and down the stack, from the physical layer, all the way up to the application layer.”
Furrier, host of theCUBE, SiliconANGLE Media’s livestreaming studio, provided more insights during the conference. Along with company executives and industry analysts, he discussed the latest news while taking a look at how Databricks might seek to capitalize on those critical strategic decisions. (* Disclosure below.)
Here are three key insights you may have missed:
1. There’s hope to end the format wars.
Databricks, with an eye on open-source adoption, has been moving forward with an effort to change how data management and analytics are approached. With that in mind, a keynote by Databricks Chief Executive Officer Ali Ghodsi brought with it a big promise: Databricks plans to end the format wars by introducing a uniform layer that brings different formats together on a common metadata foundation.
For months now, theCUBE has been on record saying that open source will win, according to Furrier.
“It’s very clear that open source is a big theme of their history, and their future, and their present. And what’s more exciting is, he even dropped a few bombs out there by saying, ‘We’re going to end the format wars,’” Furrier said. “Bold move.”
Such a move would put Databricks in direct competition with industry giant Snowflake Inc. But how all of this will ultimately play out remains to be seen, according to theCUBE analysts.
“I think a lot of it was around how is generative AI going to develop and how do you make it easier?” said theCUBE industry analyst Rob Strechay. “Also, it was unified governance and how you can bring that together — and democratization of analytics as well. I think that’s a big piece of it. Those were kind of the three big themes that they’re taking with them.”
For Furrier and Strechay, the event was also an opportunity to assess the latest when it came to Delta Lake, the optimized storage layer for storing data and tables in the Databricks Lakehouse Platform. All told, the growth of Delta Lake indicated that data is moving from a cottage industry to a full-blown architectural enterprise.
“Large language model engineering is a term we heard yesterday from some of these hot startups,” Furrier said. “What that means is data is going to be a completely different industry. It’s not about the database; it’s about the data corpus and some of the new techniques to get these LLM and foundation models built into applications.”
Here’s the complete keynote analysis panel with John Furrier and Rob Strechay, part of SiliconANGLE’s and theCUBE’s coverage of the Data + AI Summit event:
2. There’s an effort to position AI for the generative AI future.
When it comes to engines such as ChatGPT, some tech industry leaders have speculated that massive datasets and common access to information will give way to new forms of customization in enterprises. Enterprises will indeed need solutions geared toward the problems at hand, which is something Databricks is focusing on, according to Matei Zaharia, co-founder and chief technologist of Databricks.
“In the enterprise … you need a level of precision and reliability that’s quite a bit higher. We think enterprises would want to control it in very domain-specific ways, and we’re building the governance tools for AI based on the rich governance tools we already have for data,” Zaharia said.
With AI considered the universal enabler, Databricks has sought to empower its users through its lakehouse ethos by seamlessly unifying the AI and data platform, according to Joel Minnick, vice president of marketing at Databricks. That’s because the company’s goal has always been the democratization of data and AI, according to Minnick.
“When we thought about Delta Lake and the 10 years that it’s taken for the Lakehouse to mature to the place it is today, it was always about … there are all these proprietary data platforms out there, and that restricts what people can do. We should disrupt that,” he said. “We should find a new way for folks to use data. And that was what led to Delta Lake and to knock those silos down.”
Here’s theCUBE’s complete video interview with Matei Zaharia:
Here’s theCUBE’s complete video interview with Joel Minnick:
3. Data is going to be everywhere.
Data is going through a big shift right now, being influenced by broader trends in generative AI and large language models. In that new world, Databricks is back in its wheelhouse, according to Doug Henschen (pictured, right), VP and principal analyst at Constellation Research Inc.
“I think generative AI for the last three years, they’ve been building up the warehouse side of their Lakehouse and making a case,” he said. “All this time data science has been their wheelhouse, and their strength and their customers are here, while others are making announcements of previews that’ll help eventually down the road on AI. This is where it’s really happening, and they’re building generative models today.”
With its unified Lakehouse Platform, Databricks has got the ball rolling, according to Tony Baer (pictured), principal at dbInsight LLC. That’s because it combines the best elements of data warehouses and data lakes for better AI, analytics and data initiatives.
“When I did my market landscape research on the lakehouses, I was saying, ‘Look, right now they each have various different capabilities, but given the amount of community and involvement in each of these projects, they’re all essentially going to be … a level playing field,’” he said. “I think UniForm was a very mature move on Databricks’ part.”
Meanwhile, amid the big shift in data, some companies, such as Vast Data Inc., are rethinking their approaches. These days, people are starting to architect strategies to collect much more data, because, for the first time, they can actually go and process it, according to Jeff Denworth, co-founder of Vast Data.
“Big-time investors in the field and successful companies are realizing the value of data hoarding. This shift from real-time data to historical data is a trend that is not hard to achieve anymore, and democratizing access to this valuable resource is the true game-changer,” he said.
Here’s theCUBE’s complete video interview with Doug Henschen and Tony Baer:
Here’s theCUBE’s complete video interview with Jeff Denworth:
To watch more of theCUBE’s coverage of the Data + AI Summit event, here’s our complete event video playlist:
(* Disclosure: TheCUBE is a paid media partner for the Data + AI Summit event. Neither Databricks Inc., the sponsor of theCUBE’s event coverage, nor other sponsors have editorial control over content on theCUBE or SiliconANGLE.)