
From theCUBE Studios in Palo Alto and Boston, bringing you data-driven insights from theCUBE and ETR. This is "Breaking Analysis" with Dave Vellante. The AI Gold Rush is on. The paths to monetization are seemingly endless, but the most obvious converge on making humans more productive or supercharging existing business models, like search advertising, and maybe some others that we haven't thought of. Much of AI adoption in enterprise IT is hidden. Our research shows a very high overlap, around 40 to 60%, between AI adoption in enterprise tech and embedded AI inside software from the likes of Salesforce, ServiceNow, Workday, SAP, Oracle, and other major players. But a rapidly emerging group of independent AI firms is gaining traction, catalyzed, of course, by OpenAI and Microsoft partnerships. These pure plays are positioning themselves to ride the AI wave, while new AI startups are being formed daily with much less capital than in previous cycles.
Hello and welcome to this week's "Wikibon Cube Insights, powered by ETR." In this Breaking Analysis, we review the state of AI spending in the enterprise and look at the positions of several key players in the space that offer AI tools and platforms. And to do this, we invite Andy Thurai, who is a CUBE contributor and Vice President and Principal Analyst at Constellation Research; he's an AI expert. Andy's going to help us unpack the hits and misses from this past week's Google I/O conference, and give us his perspectives on what it takes to catch the AI wave and avoid becoming driftwood. Andy, hello, thanks for coming on theCUBE. >> Thanks for having me on. Always good to collaborate with you. >> Before we get started, I want to set the context on the overall IT spending environment, and then we'll get into it. Let's show a chart here. This is ETR survey data going back to January 2021. The survey reaches more than 1,500 IT decision makers; the last one, which ended in April, had almost 1,700. This is a quarterly survey. The lime green bars show the percent of customers adding new platforms. The forest green shows the percent of customers spending 6% or more relative to the previous year. The gray is flat spend, and the pinkish area is spending down 6% or worse. And that bright red is the percentage of customers that are retiring platforms, or churning. You subtract the reds from the greens and you get net score, which measures spending momentum, and which is shown in the blue line.
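[Editor's note: for readers who want to see the arithmetic behind net score, here is a minimal sketch of the calculation from the five survey buckets just described. The bucket percentages below are made-up illustrative values, not ETR data, and the function name is just for this example.]

```python
# Illustrative net score calculation from ETR-style survey buckets.
# The bucket percentages used below are hypothetical, not actual ETR data.

def net_score(adding_new, spending_up_6pct, flat, spending_down_6pct, churning):
    """Net score = greens minus reds: (new adoption + spend up 6%+)
    minus (spend down 6%+ + churn). Flat spend is ignored."""
    greens = adding_new + spending_up_6pct
    reds = spending_down_6pct + churning
    return greens - reds

# Hypothetical AI-sector breakdown that sums to 100%.
score = net_score(adding_new=20, spending_up_6pct=45, flat=25,
                  spending_down_6pct=7, churning=3)
print(f"Net score: {score}%")  # 65 - 10 = 55%, above the elevated 40% line
```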
And you can see coming into 2022, there was a lot of optimism that, you know, deteriorated throughout 2022. And while the entry into 2023 brought a more optimistic outlook, we got hit by interest rate rises and earnings revisions. That put a damper on the enthusiasm, and the trend line has continued to decline. So no surprise there, but it sets the backdrop. And now let's take a look at the same data for AI, and we'll bring Andy in. Same colored bars, but the pace of decline of the blue line is less steep. And the other really important point is the blue line has consistently stayed above the 40% mark, shown by that red dotted line. 40% is considered a highly elevated level.
So we're talking about spending momentum in AI being 22 percentage points higher than overall enterprise tech spending. Now, the other key point is the yellow line, which shows how pervasive AI is in the dataset. That's a measure of AI's relative share of the total survey. And you can see it peaked in the April 2022 survey and then declined last summer, but since the launch of ChatGPT, it's been steadily increasing, reaching new highs. Andy, I wonder if you have any thoughts on this? >> Absolutely. So that's a great survey, by the way. My sampling set is not as big, but based on my conversations with some of the C-level executives, it's almost in line with what your survey finds. If you could put the slide back on, I want to point something out. If you look at that, before ChatGPT became wildly famous, even though ChatGPT and the GPT models have been around for a while, the successful POC that Microsoft has been doing started early this year, so it went wild. So as you can see in the chart over there, it started ramping up somewhere around January. And then you can also see the adoption of the AI curve; it went much faster. So there are two things I see happening in here. One, because, as I tell people, AI adoption, or ChatGPT, is the most wildly successful POC anybody could ever dream of.
So everybody, you know, jumped on the bandwagon, that's one. Two, because of that, enterprises are looking for ways to minimize their costs using any AI use case they can, whether it's a productivity increase from writing more efficient code, having AI increase developers' productivity by writing two or three times more code, or even introducing AI into IT operations and whatnot. Right, so in operations, reducing incident management. So in other words, the overall spending on IT is reducing, but the overall spend on AI is not reducing. They're trying to make up for that by putting more money into AI. That's what I'm seeing. >> Yeah, you're right on. I mean, it's definitely getting greater mindshare. Nobody's really figured it out yet, but virtually a hundred percent of the people I talk to are trying to figure it out. I know I use ChatGPT virtually every day. I mean, our developers were able to develop a CUBE AI demo in less than a week, and it's actually very good, and it keeps getting better and better. Okay, it's Triple Crown season, so we've got to line up some of the AI horses on the track. This next chart shows ETR survey data; like I say, almost 1,700 IT decision makers from the April survey. The vertical axis is net score, or spending momentum. And again, that red dotted line at 40% marks highly elevated spending velocity; anything over that is considered highly elevated. The horizontal axis is pervasiveness in the data set.
And the first thing to note is the big three hyperscalers are dominant in this space, along with Databricks, which, as we've reported, has very strong market momentum. And you'll also notice how the position of Microsoft has changed since its announcement with OpenAI and the Bing integration. You know, Anaconda is an AI platform for data scientists, and there's a pack of companies including H2O.ai, DataRobot, Dataiku and some others that we'll talk about in a moment. And you can see both Oracle and IBM Watson show up. First, Andy, do you have any comments on this? Anything that surprises you here at a high level? >> So as you can see in the chart, and as we discussed with the numbers, the jump for Microsoft, as one would expect after ChatGPT, is phenomenal, while the other two either lost a little share or went up a little bit. Microsoft, as you can see, went from the lower left all the way straight up to the upper right, which is really- >> It's dramatic. >> A dream chart for anybody, for that matter. So, you know, that's all riding on the ChatGPT hype. And also, I actually sat in and listened to Satya Nadella's Q3 earnings call, and there are some things that jumped out at me, right? One is he's talking about the new AI wave of solutions expanding the TAM, and he called it a new wave, which means this is a workload that never existed in Microsoft Cloud before. Now they are trying to bring this AI workload to them. And Microsoft was actually an afterthought here, because it never used to happen for them before; when people thought of AI workloads, it was Google, or AWS, that was number one or two for the longest time. Now they're throwing Microsoft into the mix: you know what, maybe I should do that. That's one.
And two, if you listen to the call again, you will see he claims he has the largest, most powerful AI infrastructure, and he wants people to train their models on it: moving beyond ChatGPT, now that you know what it can do for you, train your own large models there. So that's another market they're going after. And the third thing from the conference call that surprised me is, you know how many customers they have for OpenAI right now, in spite of all this hype? 2,500. That's nothing, that's a rounding error, right? But if you look at Azure Arc, which is a hybrid management, deployment and security solution, that has 15,000 customers. So they're barely scratching the surface of AI, and they're hoping that by throwing in things like Cosmos DB, which is an AI database, and the combination thereof, when people think of AI they will think of Microsoft first instead of looking at other companies. >> Well, they've completely changed the way people think about technology, think about interacting with it. I've often said that AWS turned the data center into an API, and now with ChatGPT, people are rethinking how we interact with technology. Let's dig into some of these players in a bit more detail. But before going there- >> Can I make another comment? >> Yeah, please. >> So to the point about Microsoft AI, in spite of all the hype, people are thinking that Microsoft wants to get this newer AI workload, but its initial goal is not to get that workload, it's to make a dent in Google's search business. If you are able to prove that AI can do all this for you, Microsoft is hoping to revive its search business. You know what a huge market that is; Google has pretty much owned it for a number of years. They're hoping that by doing AI-powered search, Microsoft can get some of that. That's what they're going after, more than AI workloads. >> I mean, it's the most profitable business there is. >> It is, it is. >> Look at the pie chart. >> It's a huge market. >> If you take the pie chart of search advertising, maybe a sliver is "other," and Google has- >> Pretty much the whole market. >> You know, 98% of it, it's unbelievable. >> All right, bring up those talking points, because I want to first talk about IBM. IBM, Andy, they could have had this thing sewn up. I mean, they had all the mindshare back at the beginning of last decade. Alex, I wonder if you could play the clip. >> Now we come to Watson. We're looking for Bram Stoker, and we find, "Who is Bram Stoker?" And the wager? (audience cheers) $17,973, $41,413, and a two-day total of $77,147. >> So, I mean, I remember that moment thinking, wow, Ken Jennings was such an accomplished Jeopardy! player, and this is incredible, a machine beats a human at Jeopardy! It beat, you know, Garry Kasparov, I dunno, 10 years earlier at chess; this is like that times a hundred. What happened? Why did IBM Watson fail? >> Well, so I remember that moment. I was at IBM at that time, so it was a very proud moment for all of us IBMers. >> Amazing. >> It is, it is. But the problem is, and this goes to prove a point, no knock on IBM, but good research companies may or may not be able to figure out a way to go to market at times. So they could have owned it, but there were limitations. There were technology limitations at the time; usability-wise, cloud was not a big thing at that time.
There were limitations on how you could use the technology: the implementation, the cost, the value you'd get out of it for the money you were spending at that time; it wasn't feasible. So IBM kind of delayed a little bit. And also, again, I'm not speaking from internal knowledge, I'm talking about overall observation, the other thing is that AI-related regulation, governance, ethical and responsible AI and all of that were not mature by then, right? Even now there are companies just starting to do it, and it's still not mature, in my view.
But if you look at the announcements they made at Think 2023, some of them were pretty good. They caught my attention. There are a lot of things like, you know, an AI studio, a data store, governance, and even things like GPU as a service; with all of those, they're just catching up with the market. They are so far behind, and they're catching up with the market. A couple of things did stand out to me, though. One, they have this center of excellence of AI experts, over a thousand experts. Imagine that, that's huge. None of those other companies have that. And these guys are in front of customers all the time. So if you're able to run this long-cycle, long-tail sales motion with customers, handholding them as customer advocates, IBM has a way to maybe gain some traction with customers.
That's one. And two, they also have what's called an environmental intelligence suite that will provide you information about the carbon footprint: how much it takes to train your models, how expensive it is, the whole calculation. So for companies that are very environmentally sensitive, because training large models is not cheap, as you know, we're talking about tens of millions of dollars, and it's coming down at a faster pace, toward a few hundred thousand now, but still, there's a cost involved. So if a company is very worried about the environment, it'll give you an idea of how much it'll cost, and where, and all kinds of things. So a couple of those stood out to me, but overall, they're still catching up. >> I feel like IBM's greatest strength is its biggest hindrance, in that it's a very services-oriented company. When they brought out Watson, you know, Ginni Rometty was the CEO; she was the understudy of Sam Palmisano, both very services-minded. They've got a phenomenal services organization, and I think a lot of IBM said, hey, we can use services and drive services revenue to implement all this AI. And it's just the wrong scale model. And now we're sort of. (no audio) (laughs) >> Yes and no. I mean, what other brand have they got? What are you going to say, IBM AI? Watson is a well-known brand, right? In spite of it not gaining that much traction. >> Right, so you would've gone with it. All right, how about Microsoft? I've said a number of times that they went from third place in AI technology, relative to Google and Amazon, and then all of a sudden they cut the line with OpenAI and that deal they cut. And Satya, he's got that sort of Cheshire cat grin, we were talking about this before, in terms of, he's only got a fraction of the search ad market, and now even a small share gain is going to both hurt Google and drop to Microsoft's bottom line. But, you know, word is Microsoft is actually talking to Firefox about embedding Bing Chat into Firefox. They're probably going to pay a bunch of dough to do that. And it's because Bing Chat really hasn't moved the needle. What are your thoughts on this? >> Well, there it goes, we're talking about search again, right? I mean, at the end of the day, right now the money is not in the AI workload. That's innovative, futuristic; I'm thinking 10 years down the road, and again, things could change fast, but I'm thinking 10, 20 years down the road that that's my workload. Right now, the major workload I can get is search, search advertisement, and Bing almost never existed. So now they think they can revamp that with a combination of Bing and Firefox, you know, AI-infused. I've got to tell you, before this, I mean, have you ever used Bing? Hardly anybody ever uses it, right? >> No, and even now, you know, I've used it, and I'm like, eh, it's okay. I'd rather use ChatGPT, honestly. >> So that's the point. By infusing ChatGPT into Bing, they're hoping to revive their search. But again, we'll talk about Google in a minute; by doing certain things, Google either caught up, in my view, or catapulted ahead of that. So it's going to be a two- or three-horse race when it comes to AI, and Microsoft and Google are at it. >> I actually think Bard's really good.
I've used Bard a bunch, and I find its accuracy is sometimes better, but anyway- >> I agree. >> You wrote a piece called "Google's Generative AI Strategy From Google I/O 2023 Hits and Misses." We'll put it in the show notes so that people can get access to it. I thought it was very good. I mean, it was a fire hose of announcements at Google I/O, it was unbelievable, and this piece, like I say, laid it out really nicely. And I agree with you: Workspace and email that help me write, the photo editing is pretty cool, we've seen that before, you see the commercials for the Google Pixel, the phone. The Bard uplift I thought was really impressive, and I've said I've personally had better experiences with Bard than I have with ChatGPT from a quality standpoint, believe it or not. But you also had some misses in there too. You didn't think Codey went far enough, didn't have enough scope in terms of the languages it supported. You were disappointed with the industry breadth, even though they did have, like, a Med-PaLM or PaLM-Med, I can't remember exactly what they called it, for medical. And it doesn't sound like you were overly impressed, nor was I, with the Google Cloud platform announcements and integration. But maybe you could explain what Google did, you know, PaLM, and your take on those announcements. >> Okay, so let me talk about a couple of items in the hits, which I thought were pretty good. One, which people probably didn't get, is the AI-enabled cognitive search that Google introduced. In my view, that's going to be a game changer for the search market, snatching it back from Microsoft again. >> Yeah, that's the Bing competitor, right? >> It's a Bing competitor, big time, but here's the thing, though. It's the ChatGPT equivalent, but if you do a search, ChatGPT models are not real time; we are talking about training the model in an incremental way. I mean, when the first ChatGPT hype came in, it was basically a two-year-old model. People were wowed even with data that is about two to three years old. But Google claims, and again, it needs to be proven, the hearsay is that it's real-time cognitive search, which means if I'm searching, imagine ChatGPT giving you answers in real time, versus two-year-old data. That's what Google is going after, right? That's one thing I thought was pretty good. >> Versus 2021, right? >> So that's one. The other thing is the Bard-synthesized AI content it shows. They don't want to miss out, because there are a lot of people who love the original way of Google showing you all the documents and letting you pick what you want. If you look at the demo they did at the Google I/O show, they give you both: the Bard synthesized search results on the top, which is equal to ChatGPT, you can see those synthesized results, and then at the bottom of it, the regular Google results, right? So I think they don't want to lose this market and they want to get the other market, so they kind of mix it up. >> Innovator's dilemma. >> So that's one. And the other one I thought was pretty good, that actually could be useful, is that it's not just search; now we could move things into more of an e-commerce market. You know, when I'm building stores and stuff, I could search using that, which ChatGPT cannot do, by the way. So I could search for pictures, images, a combination thereof, and if I'm looking for something, I search for it.
And then that enables the e-commerce stores, which is not an option with Microsoft search right now, with Bing, I mean with ChatGPT right now. So those two actually stood out to me, in addition to the Google Workspace stuff we talked about, photo and image editing, multiple LLMs and all that, right? >> I'll bring this up later on, but I think that point you're making about e-commerce and shopping is really important. I mean, you can do some basic shopping at Google, at least from a search standpoint, but you really don't do the transaction. You'd rather go to Amazon or some other site- >> For now, for now. >> Maybe you go directly to the site. But what if you could bypass Amazon's warehouses, go directly to the site, and then have them drop ship it, no warehouse needed, okay? And that's kind of the model that Alibaba uses. So that could be very disruptive. >> Yeah, it could be. That's why that stood out to me; I was watching that, and I don't know if people got it. And the other one they were actually talking about was, you know, when it comes to maps, it's Google, that's it, nowhere else you'd go, right? And they were talking about, not only am I able to map the coordinates to see a 3D view of where I want to go, a total view, for example if I want to go running or walking, I want to see that, but on top of it, they could also add things like how the traffic is going to be, how the air quality is going to be, what the temperature is going to be, into the future, into tomorrow morning, tomorrow afternoon kind of thing. So it's like the Back to the Future movies. That wowed me. >> But the flip side of that is, if the Google people who do Maps are watching this, make it better. I mean, they should be using AI already, and I'm sure they are. But if I go into, say, Waze, which is owned by Google, and I want to pick a time, like choose a better time to go, and let's say I live, you know, out in the sticks, so if I want to go to Boston and there's no traffic, it's 45 minutes. If there's traffic, it's an hour and a half, an hour and 45 minutes. And it will tell me, for an eight o'clock meeting, leave at whatever, seven o'clock, and I'm like, there's no way I'm going to get there in an hour. And it'll say, yep, yep, yep. But then when I go, I can just see the traffic building up, building up, building up, and I can predict it, as a human, better than the machine can. So they should be able to do a better job. >> That's the demo they're showing. I'm hoping they'll be able to predict, and I'm hoping this comes out. That's what got me excited. >> I'm just skeptical, because I'm like, why doesn't that intelligence already get in there? The only answer I can give, and I'm ranting, is that they're optimizing for ads. You know, maybe they want to keep you in the car longer. All right, let's talk about AWS's Lego block approach. I'm a fan of targeting builders, which both GCP and Microsoft will do, but those two definitely have other consumer and advertising aspirations, whereas AWS does not. Of course, there's always Alexa. So you've got Titan, you've got Bedrock, which is large language models as a service, you've got CodeWhisperer. What do you think of AWS's Lego block approach and their chances?
So each one has their own approach, as we discussed, right? And Amazon's approach, as we've talked about many times, is: I'll give you all the building blocks you need, the Legos. And it used to be they gave them to you mostly at the infrastructure level, right? Now they're moving things up a little bit because of the competition and, you know, the pressure they're getting. So Bedrock obviously is going to include the foundation models, as they call them, multiple models, and they have them both for text and for images and stuff. And they've got Anthropic in there, and they've got AI21 Labs, Stability AI and Amazon's own Titan. They're all accessible via APIs. That's all the base, you know, table stakes. But the thing that impressed me is they're also making a claim, I don't think it's available yet, but whenever it's available, that you can privately customize their foundation models using your own organizational data.
That could be a differentiator, because basically you take whatever model is offered to you, train it on your own data, and keep a private instance of it. So what that means is, if I want to do a support chat, you take the knowledge corpus you have and train the model using your data, and then all of a sudden you have a super support person available that you can talk to at any given time, based on the data that's available. That I'd really like. And the other thing I like, well, I wouldn't say like, Amazon CodeWhisperer is decent, but it's not at the level of Copilot yet, because Copilot has been around for a while now, so they are up and running much better than these guys are. And then I think they hit it out of the park with the Hugging Face partnership, because if I'm going to give you the building blocks, then I've got to give people the option to get the models from Hugging Face as well and start using them.
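[Editor's note: to make the "ground a foundation model on your own knowledge corpus for support chat" idea above concrete, here is a minimal, vendor-neutral sketch of the retrieval-grounded pattern many teams use: pull the most relevant internal document and pass it to a model as context. The embed() helper and call_foundation_model() stub are hypothetical placeholders for this example, not Bedrock or any other specific vendor API.]

```python
# Minimal retrieval-grounded support-bot sketch (illustrative, vendor-neutral).
# embed() and call_foundation_model() are toy stubs, not a real API.
import math

SUPPORT_DOCS = [
    "To reset your password, open Settings > Security and choose Reset.",
    "Refunds are processed within 5 business days of an approved request.",
    "Enterprise plans include SSO and a dedicated support channel.",
]

def embed(text: str) -> list[float]:
    # Toy bag-of-letters embedding; a real system would use a trained model.
    vec = [0.0] * 26
    for ch in text.lower():
        if ch.isalpha():
            vec[ord(ch) - ord("a")] += 1.0
    norm = math.sqrt(sum(v * v for v in vec)) or 1.0
    return [v / norm for v in vec]

def most_relevant(question: str, docs: list[str]) -> str:
    # Pick the document whose embedding is closest to the question's.
    q = embed(question)
    return max(docs, key=lambda d: sum(a * b for a, b in zip(q, embed(d))))

def call_foundation_model(prompt: str) -> str:
    # Placeholder for whichever hosted foundation model you choose.
    return f"[model response based on prompt of {len(prompt)} chars]"

def answer(question: str) -> str:
    context = most_relevant(question, SUPPORT_DOCS)
    prompt = f"Answer using only this context:\n{context}\n\nQuestion: {question}"
    return call_foundation_model(prompt)

print(answer("How long do refunds take?"))
```

In practice you would swap the toy embedding for a real embedding model and the stub for your chosen hosted model; the point of the pattern is that the private corpus stays in your own retrieval layer and only relevant snippets go into the prompt.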
I thought that was huge. And then also, obviously, as you know, they have their own customized EC2 instances. I mean, both Microsoft and Google came and talked about it multiple times, saying I'm going to have customized or optimized infrastructure, like Amazon. >> Yeah, they follow; they're five years on. >> But also on the EC2 side, you know, they'll have Trainium-based instances for training the models, specific EC2 instance types and, you know, chips, and then the same thing for Inferentia, you have Inferentia-based instances. >> Yeah, so you've got, so Titan is the new chip. >> Titan is actually the LLM, the FM model, for- >> Oh, I see, okay, and so Inferentia, and Trainium was for training of the models. >> It was Trainium and Inferentia. >> Which are ARM-based silicon from the Annapurna acquisition. So I mean, Amazon's probably five years at least ahead of those guys. >> In certain areas, yeah. I mean, you know, look, again, going back to this, Microsoft has set up a thing saying, I'll give it to you at the top level: you ask for what you want, and like a chef I'll cook the whole thing and give you the final finished product. But Amazon is like, you know what, I'll give you all the ingredients, the best of the ingredients, and you can build what you want. And there's nothing wrong with either model; you've got to choose what you want. >> But aren't Google and Microsoft going to do the same? The difference is they're also competing on the full stack, with the completely integrated, I mean, Google's going to be going after search, right? They're going to go after, you know, actually, some e-commerce models potentially. Microsoft's going to be infusing it into their application software. Whereas Amazon, I think, is generally saying, here, go build it, and then go compete. Whether it's, you know, up the stack, they're not today, anyway, getting deeply into building application software. They've got certain verticals when you think about their call centers. >> But that's the thing. It's not like I just give you the infrastructure or the chip level; I'll go up to the foundation models, I'll go up even to applications if you want, I'll integrate with the full corpus of knowledge, data, whatever you want. So they can go up the stack a little bit, one step at a time. But again, their model is, I'll give you the components you want, and Amazon's primary goal is to drive traffic to their cloud. It's not about, you know, selling AI tools and whatnot. That's my model, you know. >> All right, let's take a look at some of the emerging companies that aren't yet public. This is data from ETR's ETS, the Emerging Technology Survey. It exclusively tracks privately held companies and plots net sentiment, which is a measure of the net percent of customers that intend to engage, i.e., either evaluate or adopt, on the Y axis, and it's plotted against mindshare, which is measured based on the presence in the survey of these 1,200 IT decision makers. And note that OpenAI wasn't even in the data until last fall, when ChatGPT was announced, and now they have the number one position on both dimensions, by far.
Databricks is very prominent and always has been with its ML and AI tool chain, but also some of its other products may be seeping into the sector. Anyway, this survey is intended to be clean for AI/ML; there's a whole other data platforms, database and data warehouse section, and Lakehouse should fit in there, but as I say, they also show up in that section.
So it's intended to be AI only. And you can see the other emerging companies like Anaconda, which is a platform for data science, DataRobot, Dataiku, and Hugging Face, which we've had at our AWS Startup Showcase doing partnerships with AWS, and then this month they announced one with IBM, and others. So what do you make of this data? Are there any surprises in here for you? >> Yeah, a few surprises, right? So one is, well, obviously OpenAI is not a surprise; that, everybody should know. I'd be surprised if it were not rated that high. You know, it came from left field, as they say, and took everybody by surprise. So that one is fine, but the one that's actually most surprising to me is Hugging Face, because Hugging Face has been all over the place. You know, if you know OpenAI, pretty much, you know, Hugging Face- >> All over the place in a good way, you mean? >> In a good way. They have been forging relationships; they actually have a relationship with Azure, with Amazon, with IBM recently with what they released, and with Google. They're all over the place in partnerships and relationships. You know, they have become the de facto model repository for all the model distribution. So essentially, when you're a data scientist, if you create a model, you need a place to share it, and Hugging Face has become that. And it's not just that; they also provide a platform for you to train models as well. But again, it's well known in the model management, model training, model repository area. I'm surprised that it's that low. I don't know why, or how this is measured, right? So that's a surprise to me. >> Well, there's a bias toward large companies, large US companies. I don't know if that makes a difference. >> Potentially, because Hugging Face is much smaller comparatively, size-wise. That's possible. >> Yeah, so that could be it. You know, this is hardcore enterprise IT, right? It's big banks, it's insurance companies, it's large manufacturers, it's the Global 2000, that's kind of where it's at. So maybe they're the fat middle and the later adopters; they're probably not so much the early adopters, although some of the financial sector is going to be early adopters, so. >> Yeah, and in certain use cases it's useful. The other thing is Databricks. So Databricks has become more of a company that's providing all the tools, almost taking the Amazon way. Remember, they had their own employees collect a dataset and they showed people how to train an LLM with it. Basically, they're trying to showcase things the same way Amazon is doing: hey, you can build a model with me, why are you going somewhere else, right? Because at the end of the day- >> Wasn't that GPT washing by Databricks, a little bit? >> To an extent, yeah. Actually, I wrote an article about that as well; we talked about that. So at the end of the day, if you're going to pick everything up and move to another cloud, if you have no need for me, then I'll cease to exist. So I'll show you how to do it with mine. I mean, yes, that's valuable, and I'll show you how to do it with mine, and then you stay with me, right? So. (laughs) >> All right, okay, so it's one thing to have mindshare, but to really get ROI, you've got to have adoption and show real business value. So this next data aims to do just that.
It plots the mindshare data that we showed earlier, that's the blue line, against the percent of those customers that are familiar with the AI platform, so they're aware of it, but that have also evaluated the platform and intend to use it or expand their existing usage. So it's a measure of adoption. Databricks has the highest adoption rate at 26%, as I just defined it, followed by Anaconda, then OpenAI at 13%. We could talk about that: everyone is using ChatGPT personally, but what about adoption in the enterprise? Then you see Hugging Face, to your point, at 13%; I mean, it's not off the charts, but it's still very solid. DataRobot and Dataiku are in the low double digits, and then the rest. Does anything here surprise you? What do you make of this? >> So pretty much everything in that list is in line, except one, two, and four, right? The first one, Databricks, as we talked about, they're trying to have companies stay with them rather than going somewhere else. So I don't think Databricks will ever be in the business of providing AI tools. They are more of an AI platform: I will figure out a way to help you create models using my platform, that's the goal. And I think they're succeeding in that, which your data shows. And the second one, Anaconda, is a little bit surprising to me, because Anaconda is not necessarily top of mind for a lot of people to use as an AI platform. I mean, it's a data science platform, and a very good one at that, but as an AI platform I'm not sure, because it's missing a lot of components. And of course, Hugging Face is there, as I was telling you earlier, right? >> All right, let's close with some final thoughts to get your thinking on this. So enterprise AI is different; that's really the first point we want to make here. IT organizations are rigorous, they need guardrails. They're concerned about IP leakage, I'm concerned about IP leakage. They want super strong governance, privacy, security, transparency, explainability, bias controls, et cetera. These are hard things to do. And the second point here is most enterprises are not going to build AI; rather, they're going to buy it as embedded within their enterprise apps. It's Salesforce, it's, we didn't talk about Oracle, we kind of skipped over Oracle, but they're embedding AI into everything. It's AI-powered infrastructure, which could be physical infrastructure, whether it's Dell or HPE, they're infusing AI, but up the stack is really where you're going to touch it. Einstein, ServiceNow, they're going to be big in this. What do you think about those two points, Andy: that first of all, you've got to have more rigor in AI, and that it's going to be largely purchased through your application vendor, embedded, as opposed to, I'm going to go build it? >> Yeah, so there are two schools of thought there, right? One is, if you're a vendor, in order to drive efficiency, you have to infuse AI into all of your applications. We talked about AI increasing productivity. We talked about email: when I'm writing an email, do I want a short email, a long email, or a mid-length email? I just have to choose that, then boom, it'll give you what you need. So that kind of productivity, or coding efficiency, all those things need to be built in as part of the applications to make the applications more efficient, right?
So in that sense, if a vendor does it, one, the vendor probably takes the responsibility and liability for it, right? All of the issues that come with that. But if you were to do it on your own, we talked about the governance, we talked about the security aspect of it, the ethical aspect of it, the responsibility aspect of it.
There are so many issues in there that CEOs and CIOs need to start thinking about. As a matter of fact, in all of the advisory calls I've had over the last few months, since the beginning of this year, obviously, as you know, everyone is talking about ChatGPT: how do I use it? How do I use ChatGPT in whatever I have? Am I liable for it if something were to happen? More importantly, how do I reduce my risk? So when it comes down to it, if you are an enterprise and you're going to use this, my recommendation would be to start with your business case first. Yes, there's so much hype; don't get caught in that. Start with your business use case first. What is the problem you have? What do you want to solve? Work backwards from that. And once you've figured out the use case that's going to work, build all the rigor and governance and controls that we talked about onto it. Then you'll have a good application. >> So, I mean, I'm basically putting forth the premise that it's going to be embedded. At the same time, I think a lot of companies will think, okay, we have this corpus of data, how can we apply AI to it? Most certainly they're going to be using open source tools; they're not going to be building AI tools themselves, but they're going to be using AI tools. And, I'm kind of contradicting what I said earlier, I can see specific use cases for companies: searching documentation would be an obvious one. Again, at theCUBE we apply it to our corpus of data. Now we're doing that with, again, open source tooling, certain APIs, but people, as I say, are just trying to figure this out. I mean, the last time Andy came on "Breaking Analysis," in December 2022, the premise was AI goes mainstream, but ROI remains elusive.
And I think, Andy, we got that right. ChatGPT is very intriguing and has kicked off responses from all sorts of competitors. We're seeing AI washing everywhere, we kind of just talked about that, but we've also seen Google's Code Red, IBM has responded, AWS, everybody's announcing AI. As well, everybody's thinking about, okay, how can I apply it to my own corpus of data? We're hearing a lot about, oh, we're not going to need BDRs anymore; analysts who are building dashboards, they're going to disappear. Everyone's trying to figure out the ROI and the right business model. And it's pretty clear people see a path to better productivity with humans in the loop, but radical business models are not as clear. We're going to pave the cow path in search and ads and the subscription and license models that we know, but I think people should expect new, radical business models to emerge.
Imagine a disruptive, we were talking about, an e-commerce model that goes after Amazon's massive warehouse infrastructure, which has been a competitive advantage. What if Google enables better shopping and direct shipping from a manufacturer, more along the lines of what Alibaba does? And what about industry-specific use cases and business models that could emerge? You know, something new is going to come out of this that's going to surprise a lot of people, kind of the iPhone moment. It's like ChatGPT was the new software iPhone, and now it's like, here, figure out how to apply it, and everybody's scrambling to do so. I think in addition to ChatGPT and OpenAI, everybody's learning what to do with this. What are your thoughts? >> You're exactly right. Again, I don't know if it's fair to compare ChatGPT to the iPhone, because ChatGPT is a much bigger moment than the iPhone moment, in my view. >> Wow, wow. >> Think about that for a second, right? >> Yeah, it is. I tend to agree; it's like bigger than the internet, if I can even go that far. >> It is very big. So the bottom line is this: just because you have a Swiss Army knife, you know, as I say, if you've got a hammer, don't go looking for the nail, you know? Yes, all these capabilities are built in, but again, go back to your original point. What are the use cases? What are my problematic areas? Where am I inefficient? For example, in the IT operations area, I see that it's not necessarily ChatGPT, but a combination of AI, some of the chatbots, and ChatGPT-equivalent LLM models; there are a lot of applicable use cases. All of these AIOps companies, IT ops companies, service ticket companies are trying to reduce that time or reduce the incidents, so your systems are up and running all the time. People don't realize that an incident can cost you upwards of millions of dollars for every hour it's down. So if you're able to efficiently manage that, manage your AI, optimize your infrastructure, you could have your business running all the time. >> So, let's see. So, Andy, you should follow Andy on LinkedIn. I mean, you're constantly posting on LinkedIn, you do a lot of great research, you talk to a ton of people. So I want to thank you for coming in here. Really nice job today. >> Thank you, appreciate it, thanks for having me. >> All right, that's a wrap, folks. Many thanks to Andy Thurai for his outstanding collaboration and input to today's episode. He and I talk all the time; we brainstorm, and he's just a great friend and a wonderful collaborator and a super AI mind. I want to thank Alex Myerson, who's on production and manages the podcast, and Ken Schifman as well. Kristen Martin and Cheryl Knight help get the word out on social media and in our newsletters. And Rob Hof is our editor-in-chief over at SiliconAngle.com; he does some great editing, thank you. Remember, all these episodes are available as podcasts. Wherever you listen, just search "Breaking Analysis Podcast." I publish each week on wikibon.com and SiliconAngle.com. And don't forget to check out theCUBE.net for all our events and videos, and where theCUBE's going to be next. You want to get in touch? Email me at David.Vellante@SiliconAngle.com or DM me @DVellante. You can comment, reach me on LinkedIn, pitch me, love to hear ideas.
If you've got a good one, I'll respond; if not, don't take offense, we just get a lot of inbounds. Please do check out ETR.ai; they've got great survey data, the best, I think, in the enterprise tech business. This is Dave Vellante for "theCUBE Insights, powered by ETR." Thanks for watching, and we'll see you next time on "Breaking Analysis." (upbeat music)