An SOA infrastructure initially developed by the NICT Language Grid Project
Service Grid is an infrastructure for accumulating and sharing Web services. Resources with complicated intellectual property constraints are wrapped as Web services so that they can be shared on the Service Grid.
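For illustration only, the sketch below shows the general idea of wrapping a language resource as a Web service, assuming a plain JAX-WS runtime (Java 8 or an external JAX-WS implementation). The class, method, and endpoint names are hypothetical and are not part of the Service Grid API; on an actual deployment, such a service would be registered to and invoked through a Service Grid node.

```java
import javax.jws.WebMethod;
import javax.jws.WebService;
import javax.xml.ws.Endpoint;

// Hypothetical example: a toy translation resource exposed as a SOAP Web
// service. A real wrapper would delegate to the underlying resource
// (dictionary, MT engine, etc.) whose usage conditions the grid manages.
@WebService
public class SimpleTranslationService {

    @WebMethod
    public String translate(String sourceLang, String targetLang, String text) {
        // Placeholder behavior for illustration.
        return "[" + sourceLang + "->" + targetLang + "] " + text;
    }

    public static void main(String[] args) {
        // Publish the wrapped resource at a local SOAP endpoint.
        Endpoint.publish("http://localhost:8080/services/SimpleTranslation",
                         new SimpleTranslationService());
        System.out.println("Published at http://localhost:8080/services/SimpleTranslation");
    }
}
```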
If you release software that uses the software of this project, please include the following statement in your documentation or on your website.
* This software uses [SOFTWARE] by the Language Grid Project (http://langrid.org/).