This project offers a Python binding to the libindi library. It uses the Simplified Wrapper and Interface Generator (SWIG) to generate a PyIndi module, which mainly defines an IndiClient class. This class can be used to build Python scripts that interact with INDI servers by calling its sendNew* methods and by implementing the new* methods of the BaseMediator class, as sketched below.
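A minimal sketch of such a client, assuming an INDI server is listening on the default localhost:7624 and that the generated module exposes the usual PyIndi.BaseClient interface; the exact callback signatures can vary between libindi/PyIndi versions, so treat the names here as illustrative rather than definitive.

```python
import PyIndi

class IndiClient(PyIndi.BaseClient):
    """Subclass of the SWIG-generated BaseClient implementing a few
    new* mediator callbacks, which the client invokes as the server
    announces devices and properties."""

    def __init__(self):
        super(IndiClient, self).__init__()

    def newDevice(self, d):
        print("New device:", d.getDeviceName())

    def newProperty(self, p):
        print("New property:", p.getName(), "for device", p.getDeviceName())

    def removeProperty(self, p):
        print("Removed property:", p.getName())

    def serverConnected(self):
        print("Connected to INDI server")

    def serverDisconnected(self, code):
        print("Disconnected from INDI server (exit code %d)" % code)

# Point the client at the server and connect; after that, the callbacks
# above fire as devices and properties are reported. Property changes
# would then be pushed back with the sendNew* methods (e.g. sendNewNumber).
client = IndiClient()
client.setServer("localhost", 7624)
if not client.connectServer():
    print("No INDI server running on localhost:7624")
```

To drive a device, you would typically wait for the relevant property to arrive in newProperty, modify its values, and hand it back with the matching sendNew* call (sendNewNumber, sendNewSwitch, or sendNewText).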