Access Google’s most capable multimodal models. Train, test, and deploy AI with 200+ foundation models on one platform.
Vertex AI gives developers access to Gemini 3—Google’s most advanced reasoning and coding model—plus 200+ foundation models including Claude, Llama, and Gemma. Build generative AI apps with Vertex AI Studio, customize with fine-tuning, and deploy to production with enterprise-grade MLOps. New customers get $300 in free credits.
Try Vertex AI Free
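For orientation, here is a minimal sketch of calling a Gemini model through the Vertex AI Python SDK; the project ID, region, model ID, and prompt are placeholder assumptions, not recommendations, and should be swapped for whatever is available in your environment.

```python
import vertexai
from vertexai.generative_models import GenerativeModel

# Placeholder project and region; replace with your own values.
vertexai.init(project="my-project", location="us-central1")

# Illustrative model ID; pick any Gemini model available in your region.
model = GenerativeModel("gemini-1.5-pro")
response = model.generate_content("Summarize the benefits of managed MLOps in two sentences.")
print(response.text)
```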
Run Any Workload on Compute Engine VMs
From dev environments to AI training, choose preset or custom VMs with 1–96 vCPUs and industry-leading 99.95% uptime SLA.
Compute Engine delivers high-performance virtual machines for web apps, databases, containers, and AI workloads. Choose from general-purpose, compute-optimized, or GPU/TPU-accelerated machine types—or build custom VMs to match your exact specs. With live migration and automatic failover, your workloads stay online. New customers get $300 in free credits.
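As an illustration (not an official quickstart), the sketch below uses the `google-cloud-compute` Python client to create a single general-purpose VM; the project, zone, machine type, disk size, and image family are placeholder assumptions.

```python
from google.cloud import compute_v1

def create_vm(project_id: str, zone: str, name: str) -> None:
    """Create one e2-standard-4 VM with a Debian boot disk (illustrative defaults)."""
    boot_disk = compute_v1.AttachedDisk(
        boot=True,
        auto_delete=True,
        initialize_params=compute_v1.AttachedDiskInitializeParams(
            source_image="projects/debian-cloud/global/images/family/debian-12",
            disk_size_gb=10,
        ),
    )
    nic = compute_v1.NetworkInterface(network="global/networks/default")
    instance = compute_v1.Instance(
        name=name,
        machine_type=f"zones/{zone}/machineTypes/e2-standard-4",
        disks=[boot_disk],
        network_interfaces=[nic],
    )
    client = compute_v1.InstancesClient()
    # insert() returns a long-running operation; wait for it to complete.
    client.insert(project=project_id, zone=zone, instance_resource=instance).result()
```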
Cross-platform code editor with syntax highlighting for 300+ languages. Lightweight tabbed interface. Configured via JSON files instead of an options dialog. Supports Python extensions. Documentation wiki: http://wiki.freepascal.org/CudaText
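As a rough sketch of the Python extension mechanism (assuming CudaText's plugin API, where `ed` is the active editor object and `msg_status` writes to the status bar): a plugin is a Python package exposing a `Command` class whose methods are mapped to menu items by an `install.inf` file. The `show_stats` method name below is our own choice.

```python
# __init__.py of a minimal CudaText plugin; install.inf (not shown) would map
# a Plugins-menu item to Command.show_stats.
from cudatext import ed, msg_status

class Command:
    def show_stats(self):
        # 'ed' refers to the currently focused editor tab.
        lines = ed.get_line_count()
        chars = len(ed.get_text_all())
        msg_status('Document: {} lines, {} characters'.format(lines, chars))
```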
Yet another scripting language
Main features:
- Pointers
- Structures
- Object-oriented programming (yet to come)
- Code in plain English
- Easy to learn (based on C, C++, Java, Pascal, Python, Lua)
$300 in Free Credit for Your Google Cloud Projects
Build, test, and explore on Google Cloud with $300 in free credit. No hidden charges. No surprise bills.
Launch your next project with $300 in free Google Cloud credit—no hidden charges. Test, build, and deploy without risk. Use your credit across the Google Cloud platform to find what works best for your needs. After your credits are used, continue building with free monthly usage products. Only pay when you're ready to scale. Sign up in minutes and start exploring.
RUG is a universal machine-friendly binary file generator. Together with suitable macro packages, it can be used to assemble executables for almost any known processor. The main goal is to build a backend for a future Scheme translator.