Google DeepMind announced the release of a new open-source AI model, Gemma 3, aimed at developers building AI services. The release came during a busy week of announcements from Google and underscores the company's push to broaden access to advanced AI technology.
The Gemma 3 family was released on Hugging Face, Ollama, and Kaggle in sizes of 1, 4, 12, and 27 billion parameters, and the 27B version is available for free in AI Studio. According to Google, the models can run on devices ranging from smartphones to workstations, accept images as well as text, support more than 140 languages, and handle a context window of up to 128,000 tokens.
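The easiest way to try one of the smaller checkpoints is through a local Ollama server. Below is a minimal Python sketch that queries Ollama's REST API on its default port; the model tag `gemma3:4b` is an assumption based on Ollama's usual naming, so check the library page for the exact tag before pulling it.

```python
import requests

# Minimal sketch: query a locally running Ollama server (default port 11434).
# Assumes the model has already been pulled, e.g. `ollama pull gemma3:4b`
# (the exact tag is an assumption - verify it in the Ollama model library).
resp = requests.post(
    "http://localhost:11434/api/generate",
    json={
        "model": "gemma3:4b",
        "prompt": "Explain in one sentence what a 128,000-token context window means.",
        "stream": False,
    },
    timeout=120,
)
resp.raise_for_status()
print(resp.json()["response"])
```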
In the Chatbot Arena, the largest model, Gemma 3 27B, beat DeepSeek v3 and o3-mini, losing only to DeepSeek R1. According to Google representatives, the model combines performance with simplicity, making it attractive to startups and independent developers looking to integrate AI into their products.
This lightweight neural network is designed for developers who need a powerful yet economical tool to build applications and automate their workflows. Gemma 3 can run on a single GPU, making it well suited to text, image, and video analysis without complex infrastructure.
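To illustrate the single-GPU claim, here is a rough sketch of loading one of the smaller instruction-tuned checkpoints with the Hugging Face `transformers` pipeline. The model id `google/gemma-3-1b-it` and the exact memory footprint are assumptions; the checkpoint may also require accepting the license on Hugging Face before downloading.

```python
import torch
from transformers import pipeline

# Sketch: run a small Gemma 3 checkpoint on a single GPU in bfloat16.
# The model id is an assumption - confirm the exact name on Hugging Face.
generator = pipeline(
    "text-generation",
    model="google/gemma-3-1b-it",
    device_map="auto",
    torch_dtype=torch.bfloat16,
)

messages = [{"role": "user", "content": "Summarize: the invoice is due in 30 days."}]
result = generator(messages, max_new_tokens=128)

# With chat-style input, the pipeline returns the full conversation;
# the last turn is the model's reply.
print(result[0]["generated_text"][-1]["content"])
```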
You can also use Gemma 3 directly through the Google GenAI SDK on Latenode: connect your API key to an automation workflow with an HTTP request node and integrate it with any other tool.
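As a rough sketch of what that HTTP request node would send, the Python snippet below calls the Gemini API's `generateContent` endpoint with an AI Studio key. The model id `gemma-3-27b-it` is an assumption; check AI Studio for the exact name exposed through the API.

```python
import requests

API_KEY = "YOUR_AI_STUDIO_KEY"  # obtained from Google AI Studio

# Assumption: Gemma 3 is served through the Gemini API's generateContent
# endpoint under a model id like "gemma-3-27b-it".
url = (
    "https://generativelanguage.googleapis.com/v1beta/models/"
    f"gemma-3-27b-it:generateContent?key={API_KEY}"
)
payload = {
    "contents": [
        {"parts": [{"text": "Draft a polite follow-up email to a new sales lead."}]}
    ]
}

resp = requests.post(url, json=payload, timeout=60)
resp.raise_for_status()
print(resp.json()["candidates"][0]["content"]["parts"][0]["text"])
```

The same request body can be pasted into a Latenode HTTP request node, with the key stored as a workflow credential instead of hard-coded.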
While we at Latenode are working on adding direct integration with Gemma 3, you can already try other popular models on our platform. Experiment with Claude, Gemini, ChatGPT, DeepSeek, or other AIs – these tools are available right now for your automation projects! Stay tuned for updates on Discord to be among the first to test Gemma 3 once the integration is complete.
Here is a simple, ready-made idea: