Win flex-bison is a port of the Flex & Bison tools to the Windows platform
Win flex-bison is a Windows port of Flex (the fast lexical analyser) and Bison (the GNU parser generator). win_flex is based on the Flex version 2.6.3 source code and win_bison is based on Bison version 2.7; both depend on system libraries only.
Git repository: https://github.com/lexxmark/winflexbison
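As a quick orientation for new users, here is a minimal sketch of driving the ports from the command line. The file name tokens.l and its contents are purely illustrative; only the standard Flex/Bison options (-o for the output file, -d for a generated header) are used, which the ports accept since they are direct ports of the upstream tools.

    /* tokens.l - hypothetical minimal Flex input: print every number found on stdin */
    %option noyywrap
    %%
    [0-9]+      { printf("NUMBER: %s\n", yytext); }
    [ \t\r\n]+  { /* skip whitespace */ }
    .           { /* ignore anything else */ }
    %%
    int main(void) { return yylex(); }

Running win_flex -o tokens.c tokens.l writes the generated C scanner; a grammar file would be handled the same way with win_bison -d -o parser.c parser.y. The generated .c files are then compiled like any other C source.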
UPDATE1: Bison version 3.x.x is available in the Files section in the win_flex_bison3-latest.zip package.
UPDATE2: "winflexbison" is now available as a Chocolatey package (https://chocolatey.org/packages/winflexbison and https://chocolatey.org/packages/winflexbison3).
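With Chocolatey installed, grabbing either package is a one-liner (package names taken from the links above; the winflexbison3 package is presumably the one carrying the Bison 3.x.x build mentioned in UPDATE1):

    choco install winflexbison
    choco install winflexbison3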
UPDATE3: You can use VS custom build rules to simplify working with winflexbison in Visual Studio 2010 and later (https://sourceforge.net/p/winflexbison/wiki/Visual%20Studio%20custom%20build%20rules/).