Google Releases Veo 3.1 Lite, a Cost-Effective AI Model for Developers
Google launched Veo 3.1 Lite, its latest artificial intelligence (AI) model for video generation, on Tuesday (31/3/2026). Veo 3.1 Lite is a more cost-effective member of the Veo 3.1 family aimed at developers. The text-to-video model was built by Google's AI division, DeepMind. The "Lite" name reflects its design goal: lowering operating costs without sacrificing processing speed or the core features of the Veo line.

Google says Veo 3.1 Lite was introduced to round out the Veo family, letting developers choose the model that best fits their budget and needs. In video generation, for example, the standard Veo 3.1 model costs $0.15 per second (approximately Rp 2,549), while Veo 3.1 Lite costs only $0.05 per second (approximately Rp 850). Despite the lower price, the model offers the same processing speed as the other Veo variants; a worked cost comparison appears in the sketch at the end of this article.

Veo 3.1 Lite also retains the main capabilities of the Veo line, namely generating videos with accompanying audio that matches the on-screen content. Like the other Veo models, it supports generation from text (text-to-video) and from images (image-to-video). The model handles two aspect ratios, 16:9 and 9:16, at resolutions up to 1080p. It does not yet support 4K output, which may make it less suitable for professional video production.

As for duration, developers can set the generated clip length to 4, 6, or 8 seconds, with usage costs scaling accordingly.

Veo 3.1 Lite began rolling out on Tuesday (31/3/2026) and is now accessible to developers through the paid Gemini API and Google AI Studio, as sketched below. The model is also available on Flow, Google's AI-based filmmaking platform.

Google has also updated pricing for the Veo 3.1 Fast model, which developer documentation indicates will be available soon.
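To make the pricing concrete, here is a minimal sketch in Python of what the per-second rates reported above imply for each supported clip length. The rates and durations are taken from the article; the model labels used as dictionary keys are informal shorthand, not official API identifiers.

```python
# Cost comparison based on the per-second rates reported above
# ($0.15/s for standard Veo 3.1, $0.05/s for Veo 3.1 Lite) and the
# three selectable clip durations. Labels are informal shorthand.

RATES_USD_PER_SECOND = {
    "veo-3.1": 0.15,       # standard model, per the article
    "veo-3.1-lite": 0.05,  # Lite model, per the article
}

SUPPORTED_DURATIONS_S = (4, 6, 8)  # clip lengths the article says are selectable

for model, rate in RATES_USD_PER_SECOND.items():
    for seconds in SUPPORTED_DURATIONS_S:
        print(f"{model}: {seconds}s clip -> ${rate * seconds:.2f}")
```

At these rates, an 8-second Lite clip would cost $0.40 versus $1.20 for the standard model, a third of the price for the same duration.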
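For developers reaching the model through the Gemini API, a request would plausibly follow the video-generation flow of the google-genai Python SDK. The sketch below is an illustration under stated assumptions: the model ID "veo-3.1-lite" is a guess at the naming (confirm against the official model list), and whether the SDK's duration_seconds field accepts the 4-, 6-, or 8-second options described above is unverified.

```python
import time
from google import genai
from google.genai import types

# The client reads the API key from the GEMINI_API_KEY environment variable.
client = genai.Client()

operation = client.models.generate_videos(
    model="veo-3.1-lite",  # hypothetical ID; confirm the real identifier in the docs
    prompt="A timelapse of a harbor at dusk, gulls circling overhead",
    config=types.GenerateVideosConfig(
        aspect_ratio="16:9",  # the article says 16:9 and 9:16 are supported
        duration_seconds=8,   # assumption: 4, 6, or 8 seconds per the article
    ),
)

# Video generation runs as a long-running operation; poll until it completes.
while not operation.done:
    time.sleep(10)
    operation = client.operations.get(operation)

# Download and save the first generated clip.
video = operation.response.generated_videos[0]
client.files.download(file=video.video)
video.video.save("veo_lite_clip.mp4")
```

Under the pricing described above, this 8-second request would bill at roughly $0.40.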