Google has introduced updates to its vibe-coding app Opal, shifting from fixed, model-based workflows to what it describes as agent-driven automation aimed at handling tasks with less manual configuration.
Users can now select an agent in the ‘generate’ step instead of manually choosing a specific model. The agent determines its approach based on the user’s goal and can invoke different tools and models, such as web search for research or video-generation tools, to complete tasks.
Previously, users creating projects such as digital storybooks had to define details like page counts and prompts in advance. With the updated system, the agent can determine what information is required, suggest plot points and adjust the narrative flow, allowing for more flexible and dynamic outputs.
The changes also affect interactive use cases. In an interior design workflow, for example, users previously uploaded an image and selected a style to receive a redesigned version. With the agent system, the workflow can ask follow-up questions, request clarifications and adjust results based on user input during the process.
The update introduces additional features intended to expand agent capabilities. A new memory function allows workflows to retain information such as user preferences across sessions.
Dynamic routing enables users to define multiple paths in a workflow based on specified conditions, allowing the agent to choose the appropriate next step. An interactive chat feature allows the agent to seek clarification or offer options before proceeding.
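To make the dynamic-routing idea concrete, here is a minimal, purely hypothetical sketch in Python. None of the function or path names below come from Opal's actual API; they simply illustrate the general pattern of an agent choosing the next workflow step from user-defined conditional paths, with a fixed step as the fallback.

```python
# Hypothetical sketch of condition-based routing in an agent workflow.
# The paths and step names are illustrative assumptions, not Opal's API.

def route(task: str) -> str:
    """Pick the next workflow step based on user-defined conditions."""
    paths = [
        (lambda t: "research" in t, "web_search"),
        (lambda t: "video" in t, "video_generation"),
    ]
    for condition, step in paths:
        if condition(task):
            return step  # agent follows the matching path
    return "fixed_model_step"  # fall back to a fixed step

print(route("research market trends"))  # web_search
print(route("make a short video"))      # video_generation
print(route("summarise this document")) # fixed_model_step
```

In this toy version each path pairs a condition with a next step; an agent-driven system would evaluate richer signals than substring checks, but the branching structure is the same.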
Users can continue to use fixed-step workflows for tasks that require more precise or rigid logic, while the agent-based approach is designed to provide greater automation and flexibility.