CANopen-based stack for communication in embedded control systems.
The project has moved to https://github.com/canopennode
----
CANopenNode is an open source software stack for serial communication between multiple devices over a CAN/CANopen network. It runs on different microcontrollers, is reliable, simple, and powerful, and is suitable for industrial or home automation. CANopen is the internationally standardized (EN 50325-4, CiA DS-301) CAN-based higher-layer protocol for embedded control systems.
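For orientation, the sketch below shows the kind of raw CAN frame that CANopen protocols are built on: an NMT "start remote node" command (COB-ID 0x000, command specifier 0x01) sent to an example node-ID. This is not CANopenNode's own API; it is a minimal illustration using the Linux SocketCAN interface, and the interface name "can0" and node-ID 4 are assumptions for the example.

    /* Illustrative sketch only: sends a CANopen NMT "start remote node"
     * command over Linux SocketCAN. CANopenNode provides a full stack
     * (SDO, PDO, heartbeat, ...) on top of frames like this.
     * Assumed: interface "can0", target node-ID 4. */
    #include <stdio.h>
    #include <string.h>
    #include <unistd.h>
    #include <net/if.h>
    #include <sys/ioctl.h>
    #include <sys/socket.h>
    #include <linux/can.h>
    #include <linux/can/raw.h>

    int main(void)
    {
        int s = socket(PF_CAN, SOCK_RAW, CAN_RAW);
        if (s < 0) { perror("socket"); return 1; }

        struct ifreq ifr;
        strcpy(ifr.ifr_name, "can0");                 /* assumed interface name */
        if (ioctl(s, SIOCGIFINDEX, &ifr) < 0) { perror("ioctl"); return 1; }

        struct sockaddr_can addr = {0};
        addr.can_family = AF_CAN;
        addr.can_ifindex = ifr.ifr_ifindex;
        if (bind(s, (struct sockaddr *)&addr, sizeof(addr)) < 0) { perror("bind"); return 1; }

        /* NMT messages use COB-ID 0x000: byte 0 = command specifier
         * (0x01 = start remote node), byte 1 = target node-ID (0 = all nodes). */
        struct can_frame frame = {0};
        frame.can_id  = 0x000;
        frame.can_dlc = 2;
        frame.data[0] = 0x01;   /* start remote node */
        frame.data[1] = 0x04;   /* example node-ID */

        if (write(s, &frame, sizeof(frame)) != sizeof(frame)) { perror("write"); return 1; }

        close(s);
        return 0;
    }

In a real CANopen network the stack handles such NMT state transitions, heartbeats, and SDO/PDO traffic for you; the example only shows how little is on the wire for a single command.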