Yonatan Geifman is the CEO and co-founder of Deci, a company that turns AI models into production solutions on any hardware. Deci has been recognized as a technology innovator for Edge AI by Gartner and included in the CB Insights AI 100 list. The performance of its proprietary technology set new records at MLPerf with Intel.
What initially attracted you to machine learning?
From an early age, I’ve always been fascinated with cutting-edge technologies – not just using them, but really understanding how they work.
This lifelong fascination paved the way for my eventual PhD studies in Computer Science, where my research focused on Deep Neural Networks (DNNs). As I came to understand this critical technology in an academic setting, I began to truly appreciate how AI can positively impact the world around us. From smart cities that can better monitor traffic and reduce accidents, to autonomous vehicles that require little or no human intervention, to life-saving medical devices – there are endless applications where AI could improve society. I always knew that I wanted to be part of this revolution.
Could you share the genesis story behind Deci AI?
It’s not hard to recognize – as I did during my PhD studies – how beneficial AI can be in almost any use case. Yet many companies are struggling to realize the full potential of AI, as developers continually face an uphill battle to develop production-ready deep learning models. In other words, it remains very difficult to productize AI.
These challenges can largely be attributed to the inefficiencies the AI industry is facing. Algorithms are becoming more powerful and require more computing power, but at the same time they need to be deployed cost-effectively, often on resource-constrained edge devices.
My co-founders, Professor Ran El-Yaniv, Jonathan Elial, and I founded Deci to meet this challenge. And we did it the only way we saw possible – using AI itself to create the next generation of deep learning. We have taken an algorithm-first approach, working to improve the efficiency of AI algorithms in the early stages of development, which allows developers to build and work with models that deliver the highest levels of accuracy and efficiency for any given inference hardware.
Deep learning is at the heart of Deci AI, could you define it for us?
Deep learning, like machine learning, is a subfield of AI intended to empower a new era of applications. Deep learning draws heavily from the structure of the human brain, which is why when we talk about deep learning, we talk about “neural networks.” This is extremely relevant for edge applications (think cameras in smart cities, sensors on autonomous vehicles, analytics solutions in healthcare), where locally deployed deep learning models are crucial for generating such insights in real time.
What is Neural Architecture Search?
Neural Architecture Search (NAS) is a technological discipline aimed at obtaining better deep learning models.
Google’s pioneering work on NAS in 2017 helped bring the topic into the mainstream, at least in research and academic circles.
The goal of NAS is to find the best neural network architecture for a given problem. It automates the design of DNNs, ensuring higher performance and lower loss than manually designed architectures. It is a process by which an algorithm searches through a space of millions of possible architectures to produce one uniquely suited to solving that particular problem. To put it simply, it uses AI to design a new AI, based on the specific needs of a given project.
It is used by teams to simplify the development process, reduce trial and error iterations, and ensure they end up with the ultimate model that can best serve application accuracy and performance goals.
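To make the search loop described above concrete, here is a minimal sketch – not Deci's AutoNAC, just plain random-search NAS over a made-up two-parameter search space, with a toy proxy score standing in for the expensive train-and-evaluate step. All names and the latency budget are illustrative assumptions:

```python
import random

# Hypothetical toy search space: network depth and layer width.
SEARCH_SPACE = {
    "depth": [2, 4, 6, 8],
    "width": [32, 64, 128, 256],
}

def sample_architecture(rng):
    """Sample one candidate architecture from the search space."""
    return {
        "depth": rng.choice(SEARCH_SPACE["depth"]),
        "width": rng.choice(SEARCH_SPACE["width"]),
    }

def proxy_score(arch, latency_budget=1000):
    """Toy stand-in for training and evaluating a candidate:
    reward capacity (depth * width), but reject architectures
    whose rough 'cost' exceeds the deployment latency budget."""
    cost = arch["depth"] * arch["width"]
    if cost > latency_budget:
        return float("-inf")  # violates the deployment constraint
    return cost  # more capacity within budget scores higher

def random_search(n_trials=100, seed=0):
    """Random-search NAS: sample candidates, keep the best scorer."""
    rng = random.Random(seed)
    best_arch, best_score = None, float("-inf")
    for _ in range(n_trials):
        arch = sample_architecture(rng)
        score = proxy_score(arch)
        if score > best_score:
            best_arch, best_score = arch, score
    return best_arch, best_score

best, score = random_search()
print(best, score)
```

Real NAS systems replace both the sampler (with reinforcement learning, evolution, or gradient-based methods) and the proxy score (with actual training runs or learned predictors), but the structure – sample, evaluate against accuracy and hardware constraints, keep the best – is the same.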
What are some of the limitations of Neural Architecture Search?
The main limitations of traditional NAS are accessibility and scalability. Today, NAS is used primarily in research environments, typically only by tech giants like Google and Facebook or academic institutes like Stanford, because traditional NAS techniques are complicated to implement and operate, and require a lot of compute resources.
That’s why I’m so proud of our accomplishments in developing Deci’s groundbreaking AutoNAC (Automated Neural Architecture Construction) technology, which democratizes NAS and enables businesses of all sizes to easily build custom model architectures with accuracy and speed superior to the state of the art for their applications.
How does training a model to detect objects differ depending on the type of image?
Surprisingly, the image domain does not significantly affect the object detection model training process. Whether you’re looking for a pedestrian on the street, a tumor in a medical scanner, or a weapon concealed in an x-ray taken by airport security, the process is pretty much the same. The data you use to train your model should be representative of the task at hand, and the size and structure of the model can be affected by the size, shape, and complexity of the objects in your image.
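The last point – that object size and shape shape the model while the pipeline stays the same – can be sketched with a small, hypothetical helper that derives a representative anchor-box size from ground-truth annotations. The function, its name, and the box coordinates below are all invented for illustration; production pipelines (e.g. YOLO-style detectors) typically cluster boxes with k-means rather than taking a single median:

```python
from statistics import median

def suggest_anchor(boxes):
    """Pick a representative anchor (width, height) from
    ground-truth boxes given as (x1, y1, x2, y2) tuples."""
    widths = [x2 - x1 for (x1, y1, x2, y2) in boxes]
    heights = [y2 - y1 for (x1, y1, x2, y2) in boxes]
    return median(widths), median(heights)

# Hypothetical annotations from two domains, same format.
pedestrian_boxes = [(10, 20, 40, 120), (50, 10, 90, 130), (0, 0, 35, 110)]
tumor_boxes = [(5, 5, 25, 25), (40, 40, 70, 68), (12, 30, 30, 50)]

# The same routine runs unchanged on both datasets; only the
# resulting anchors differ, reflecting the objects' shapes
# (tall, narrow pedestrians vs. roughly round tumors).
print(suggest_anchor(pedestrian_boxes))  # tall anchor
print(suggest_anchor(tumor_boxes))       # square-ish anchor
```

The training code is identical across domains; the data alone determines the anchor shapes, which is exactly the sense in which the process is "pretty much the same."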
How does Deci AI provide an end-to-end platform for deep learning?
Deci’s platform enables developers to build, train, and deploy accurate and fast deep learning models in production. By doing so, teams can leverage the most advanced research and engineering best practices with a single line of code, shorten time to market from months to weeks, and ensure production success.
You started with a team of 6 people, and you are now serving large companies. Could you tell us about the growth of the business and some of the challenges you have faced?
We’re thrilled with the growth we’ve achieved since the start of 2019. Now, with over 50 employees and over $55 million in funding to date, we’re confident we can continue to help developers realize and harness the true potential of AI. Since launch, we’ve been included in CB Insights’ AI 100, reached significant technical milestones – such as our family of models that deliver breakthrough deep learning performance on CPUs – and solidified meaningful collaborations, including with big names like Intel.
Is there anything else you would like to share about Deci AI?
As I mentioned before, the AI efficiency gap continues to pose major barriers to the productization of AI. “Shifting left” – taking production constraints into account early in the development cycle – reduces the time and cost spent addressing potential roadblocks when deploying deep learning models in production, across the entire lifecycle. Our platform has proven itself capable of doing just that by providing businesses with the tools to successfully develop and deploy world-changing AI solutions.
Our goal is simple: to make AI widely accessible, affordable and scalable.
Thank you for this great interview; readers who want to learn more should visit Deci.