Companies increasingly see the value in mining their data for deeper insights. According to a NewVantage survey, 97.6% of major worldwide organizations are investing in big data and AI.
But challenges stand in the way of executing big data analytics. One recent poll found that 65% of organizations feel they have “too much” data to analyze.
Google’s proposed solution is BigQuery Studio, a new service within BigQuery, its fully managed serverless data warehouse. BigQuery Studio provides a single interface for writing and running SQL, Python and Spark code to power analytics and machine learning workloads at “petabyte scale.”
BigQuery Studio is available in preview as of this week.
“BigQuery Studio is a new experience that really puts people who are working on data on the one side and people working on AI on the other side in a common environment,” Gerrit Kazmaier, VP and GM of data and analytics at Google, told TechCrunch in a phone interview. “It basically provides access to all of the services that those people need to work — there’s an element of simplification on the user experience side.”
BigQuery Studio is designed to let users discover, explore and analyze data, and run predictions on it. Users can start in a programming notebook to validate and prep data, then open that notebook in other services, including Vertex AI, Google’s managed machine learning platform, to continue their work with more specialized AI infrastructure and tooling.
With BigQuery Studio, teams can directly access data wherever they’re working, Kazmaier says. And they gain added controls for “enterprise-level” governance, regulation and compliance.
“[BigQuery Studio shows] how data is being generated to how it’s being processed and how it’s being used in AI models, which sounds technical, but it’s really important,” he added. “You can push down code for machine learning models directly into BigQuery as infrastructure, and that means that you can evaluate it at scale.”
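Kazmaier’s point about pushing model code “directly into BigQuery” refers to the kind of in-warehouse training that BigQuery ML supports: a model is defined and evaluated with SQL statements, so the data never has to leave the warehouse. As a rough sketch (the dataset, model and table names below are placeholders, not from Google’s announcement):

```sql
-- Hypothetical example: train a logistic regression model inside
-- BigQuery itself, rather than exporting data to separate ML tooling.
-- `my_dataset.churn_model` and `my_dataset.customer_features` are
-- placeholder names.
CREATE OR REPLACE MODEL `my_dataset.churn_model`
OPTIONS (
  model_type = 'logistic_reg',    -- a built-in BigQuery ML model type
  input_label_cols = ['churned']  -- the column the model should predict
) AS
SELECT * FROM `my_dataset.customer_features`;

-- Evaluation then runs at warehouse scale as another query:
SELECT * FROM ML.EVALUATE(MODEL `my_dataset.churn_model`);
```

Because both statements execute inside BigQuery, evaluating a model “at scale,” as Kazmaier puts it, is just another query against the warehouse.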
BigQuery Studio can be seen as a natural progression of Google’s overarching strategy to move organizations adopting AI to the cloud. With worldwide spending on public cloud services set to grow about 21% to roughly $592 billion this year, according to one estimate, the tech giant is clearly intent on capturing as large a slice of that expenditure as possible — as are its rivals.
It’s not an ill-informed game plan. Gartner predicts that through 2023, AI will be one of the top workloads driving IT infrastructure decisions. And tech market research firm Tractica forecasts that AI will account for as much as 50% of total public cloud services revenue by 2025.
“Generative AI really has the potential to unlock all of these hidden insights,” Kazmaier said. “What we tend to see is that AI really makes sense when you can combine it with [a company’s] data. AI is a method if you will — a way of working with the data … to drive the most value.”