During the AI & Big Data Expo, AI News interviewed Piero Molino, CEO and co-founder of Predibase, about the significance of low-code development in machine learning and the trends surrounding Large Language Models (LLMs).
Predibase is a declarative machine learning platform that aims to simplify and democratize the development and deployment of machine learning models. The company’s mission is to make machine learning accessible to both expert organizations and developers new to the field.
The platform enables organizations with in-house experts to enhance their capabilities and reduce development time from months to just days. It also caters to developers who want to integrate machine learning into their products but lack specialist expertise.
Predibase eliminates the need for extensive low-level machine learning code by replacing it with a simple YAML configuration file. This file, which can be as short as 10 lines specifying the data schema, spares developers the complexity of writing model code by hand.
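As a rough illustration of the idea, a declarative config of this kind might look something like the sketch below. The field names and keys are hypothetical, chosen only to show how a developer describes inputs and outputs rather than writing model code; this is not Predibase’s exact file format.

```yaml
# Illustrative sketch of a schema-style declarative config.
# Column names and keys are hypothetical, not Predibase's actual format.
input_features:
  - name: review_text        # a text column from the dataset
    type: text
  - name: product_category   # a categorical column
    type: category
output_features:
  - name: sentiment          # the label to predict
    type: category
trainer:
  epochs: 10
```

The point of the declarative approach is that the file describes *what* the data looks like and *what* should be predicted, while the platform decides *how* to build, train, and serve the model.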
At the expo, Predibase announced the general availability of its platform. One of its key features is the abstraction of infrastructure provisioning complexities. Users can effortlessly run training, deployment, and inference jobs on a single CPU machine or scale up to 1000 GPU machines with a few clicks. The platform also facilitates easy integration with various data sources, regardless of the data structure.
Molino emphasized the importance of low-code development in driving machine learning adoption. Simplifying the process reduces development time, lowering barriers for organizations to experiment with new use cases and unlock value.
Molino also discussed the growing interest in Large Language Models, acknowledging their transformative power and the shift they represent for AI and machine learning. Because an LLM can be queried directly for predictions, it eliminates the need for extensive data collection and labeling.
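To illustrate the contrast Molino describes: instead of gathering and labeling thousands of examples to train a classifier, a developer can send a single prompt to a hosted LLM and receive a prediction back. The sketch below uses the OpenAI Python client purely as one example of a third-party API; the model name, prompt, and review text are placeholders.

```python
# Illustrative sketch: obtaining a prediction by querying an LLM directly,
# rather than training a model on a labeled dataset.
# Uses the OpenAI Python client as one example of a third-party API;
# the model name and prompt below are placeholders.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

review = "The battery died after two days. Very disappointed."

response = client.chat.completions.create(
    model="gpt-4o-mini",  # placeholder model name
    messages=[
        {
            "role": "system",
            "content": "Classify the sentiment of the review as positive, "
                       "negative, or neutral. Reply with a single word.",
        },
        {"role": "user", "content": review},
    ],
)

print(response.choices[0].message.content)  # e.g. "negative"
```

This convenience is exactly what gives rise to the trade-offs discussed next: each prediction is a billed API call, and the data leaves the organization’s environment.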
However, Molino highlighted some challenges, such as the cost and scalability of per-query pricing models, slow inference speeds, and concerns over data privacy when using third-party APIs. Predibase addresses these challenges by allowing customers to deploy their models in a virtual private cloud, ensuring data privacy and control.
Molino shared insights into common mistakes made by businesses venturing into machine learning. He emphasized the importance of understanding the data, use case, and business context before diving into development. Predibase’s platform enables hypothesis testing and integrates data understanding with model training to validate the suitability of models for specific tasks.
The general availability launch of Predibase’s platform marks a significant milestone in its mission to democratize machine learning. By simplifying the development process, Predibase aims to unlock the full potential of machine learning for organizations and developers alike.