Showing 378 open source projects for "python compile source code"

  • 1
    Curated Transformers

    PyTorch library of curated Transformer models and their components

    State-of-the-art transformers, brick by brick. Curated Transformers is a transformer library for PyTorch. It provides state-of-the-art models that are composed of a set of reusable components. Supports state-of-the-art transformer models, including LLMs such as Falcon, Llama, and Dolly v2. Implementing a feature or bugfix benefits all models. For example, all models support 4/8-bit inference through the bitsandbytes library and each model can use the PyTorch meta device to avoid unnecessary...
  • 2
    imodelsX

    Interpretable prompting and models for NLP

    Interpretable prompting and models for NLP (using large language models). It can generate a prompt that explains patterns in data, explain the difference between two distributions, find a natural-language prompt using input gradients, fit a better linear model using an LLM to extract embeddings, fit better decision trees using an LLM to expand features, and finetune a single linear layer on top of LLM embeddings. Use these just like a scikit-learn model. During training, they fit better...
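
    A rough sketch of the scikit-learn-style usage described above. ExampleClassifier is a hypothetical placeholder, not an actual imodelsX class; the real estimator names are documented in the imodelsX repository.

      # Hypothetical sketch of the fit/predict pattern; ExampleClassifier is a
      # placeholder name, not a real imodelsX estimator.
      from imodelsx import ExampleClassifier  # placeholder import

      X_train = ["the plot was gripping", "a dull and tedious film"]
      y_train = [1, 0]

      clf = ExampleClassifier()                  # construct like any scikit-learn estimator
      clf.fit(X_train, y_train)                  # fit directly on raw text and labels
      print(clf.predict(["a wonderful movie"]))  # predict on new text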
  • 3
    SSD in PyTorch 1.0

    High quality, fast, modular reference implementation of SSD in PyTorch

    This repository implements SSD (Single Shot MultiBox Detector). The implementation is heavily influenced by the projects ssd.pytorch, pytorch-ssd and maskrcnn-benchmark. This repository aims to be the code base for research based on SSD. Multi-GPU training and inference: we use DistributedDataParallel; you can train or test with arbitrary GPU(s), and the training schema will change accordingly. Add your own modules without pain. We abstract backbone, Detector, BoxHead, BoxPredictor, etc. You can...
  • 4
    Sonnet

    TensorFlow-based neural network library

    ... ship with Sonnet, making it quite powerful and yet simple at the same time. Users are also encouraged to build their own modules. Sonnet is designed to be extremely unopinionated about your use of modules. It is simple to understand, and offers clear and focused code.
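
    As a brief illustration of the module-based style described above, a minimal sketch assuming Sonnet 2's snt.Module / snt.nets.MLP API; the layer sizes and input shapes are arbitrary.

      import sonnet as snt
      import tensorflow as tf

      # A built-in MLP module; Sonnet modules are plain callables holding their variables.
      mlp = snt.nets.MLP([128, 10])
      logits = mlp(tf.random.normal([8, 784]))  # forward pass on a batch of 8 inputs

      # Custom modules simply subclass snt.Module and define __call__.
      class Scale(snt.Module):
          def __init__(self, factor, name=None):
              super().__init__(name=name)
              self.factor = factor

          def __call__(self, x):
              return x * self.factor

      print(Scale(2.0)(logits).shape)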
  • 5
    Taipy

    Turns Data and AI algorithms into production-ready web applications

    .... Struggle with sluggish performance and excessive memory usage, as every data point demands processing. Large datasets become cumbersome, complicating the user experience and data analysis. Scenarios are made easy with Taipy Studio. A powerful VS Code extension that unlocks a convenient graphical editor. Get your methods invoked at a certain time or intervals. Enjoy a variety of predefined themes or build your own.
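
    A minimal sketch of turning Python variables into a web page with Taipy's GUI, assuming the taipy.gui.Gui entry point and its augmented-Markdown syntax; the page content is illustrative only.

      from taipy.gui import Gui

      text = "Hello, Taipy"

      # <|{text}|> displays the bound variable; <|{text}|input|> edits it live in the browser.
      page = """
      # My first Taipy page

      Value: <|{text}|>

      <|{text}|input|>
      """

      Gui(page).run()  # serves the app locally in a browser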
  • 6
    EvaDB

    Database system for building simpler and faster AI-powered application

    Over the last decade, AI models have radically changed the world of natural language processing and computer vision. They are accurate on various tasks ranging from question answering to object tracking in videos. To use an AI model, the user needs to program against multiple low-level libraries, like PyTorch, Hugging Face, OpenAI, etc. This tedious process often leads to a complex AI app that glues together these libraries to accomplish the given task. This programming complexity prevents...
  • 7
    Norfair

    Lightweight Python library for adding real-time multi-object tracking

    Norfair is a customizable lightweight Python library for real-time multi-object tracking. Using Norfair, you can add tracking capabilities to any detector with just a few lines of code. Any detector expressing its detections as a series of (x, y) coordinates can be used with Norfair. This includes detectors performing tasks such as object or keypoint detection. It can easily be inserted into complex video processing pipelines to add tracking to existing projects. At the same time...
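
    A minimal sketch of the "few lines of code" workflow described above, assuming Norfair's Detection/Tracker interface with a built-in Euclidean distance; the toy per-frame detections stand in for any detector's (x, y) output.

      import numpy as np
      from norfair import Detection, Tracker

      # Toy detections: one object drifting right, one point per frame (any detector works).
      frames = [[(10.0 + t, 50.0)] for t in range(10)]

      tracker = Tracker(distance_function="euclidean", distance_threshold=30)
      for points in frames:
          detections = [Detection(points=np.array(p)) for p in points]
          for obj in tracker.update(detections=detections):
              print(obj.id, obj.estimate)  # stable track id and filtered position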
  • 8
    ktrain

    ktrain is a Python library that makes deep learning and AI more accessible

    ktrain is a Python library that makes deep learning and AI more accessible and easier to apply. ktrain is a lightweight wrapper for the deep learning library TensorFlow Keras (and other libraries) to help build, train, and deploy neural networks and other machine learning models. Inspired by ML framework extensions like fastai and ludwig, ktrain is designed to make deep learning and AI more accessible and easier to apply for both newcomers and experienced practitioners. With only a few lines...
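
    As a rough illustration of the "few lines of code" workflow mentioned above, a sketch of ktrain's text-classification API; the dataset folder path, class names, and hyperparameters are assumptions for illustration.

      import ktrain
      from ktrain import text

      # Load texts from a folder with train/ and test/ subfolders, one subfolder per class.
      (x_train, y_train), (x_test, y_test), preproc = text.texts_from_folder(
          "data/aclImdb", maxlen=500, preprocess_mode="bert", classes=["pos", "neg"]
      )

      model = text.text_classifier("bert", train_data=(x_train, y_train), preproc=preproc)
      learner = ktrain.get_learner(
          model, train_data=(x_train, y_train), val_data=(x_test, y_test), batch_size=6
      )
      learner.fit_onecycle(2e-5, 1)  # one epoch with the 1cycle learning-rate policy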
  • 9
    Jittor

    Jittor is a high-performance deep learning framework

    Jittor is a high-performance deep learning framework based on JIT compiling and meta-operators. The whole framework and meta-operators are compiled just in time. A powerful op compiler and tuner are integrated into Jittor, allowing it to generate high-performance code specialized for your model. Jittor also contains a wealth of high-performance model libraries, including image recognition, detection, segmentation, generation, differentiable rendering, geometric learning, reinforcement...
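
    A minimal sketch of defining and running a model in Jittor, assuming its Module/execute convention and jt.array; the toy network is illustrative only.

      import numpy as np
      import jittor as jt
      from jittor import nn, Module

      class Net(Module):
          def __init__(self):
              self.fc1 = nn.Linear(1, 10)
              self.relu = nn.Relu()
              self.fc2 = nn.Linear(10, 1)

          def execute(self, x):  # Jittor modules define execute() rather than forward()
              return self.fc2(self.relu(self.fc1(x)))

      model = Net()
      x = jt.array(np.random.rand(4, 1).astype("float32"))
      y = model(x)  # meta-operators are JIT-compiled into fused kernels on first use
      print(y.numpy())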
  • 10
    LangChain Apps on Production with Jina

    LangChain Apps on Production with Jina & FastAPI

    Jina is an open-source framework for building scalable multi-modal AI apps in production. LangChain is another open-source framework for building applications powered by LLMs. langchain-serve helps you deploy your LangChain apps on Jina AI Cloud in a matter of seconds. You can benefit from the scalability and serverless architecture of the cloud without sacrificing the ease and convenience of local development. And if you prefer, you can also deploy your LangChain apps on your own...
  • 11
    Ludwig AI

    Low-code framework for building custom LLMs, neural networks

    Declarative deep learning framework built for scale and efficiency. Ludwig is a low-code framework for building custom AI models like LLMs and other deep neural networks. A declarative YAML configuration file is all you need to train a state-of-the-art LLM on your data. Support for multi-task and multi-modality learning. Comprehensive config validation detects invalid parameter combinations and prevents runtime failures. Automatic batch size selection, distributed training (DDP, DeepSpeed...
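
    A small sketch of the declarative-config idea described above, using Ludwig's Python API with the config given as a dict (the YAML file is equivalent); the column names and toy data are assumptions for illustration.

      import pandas as pd
      from ludwig.api import LudwigModel

      # The config only declares input and output features; Ludwig builds and trains the model.
      config = {
          "input_features": [{"name": "review", "type": "text"}],
          "output_features": [{"name": "sentiment", "type": "category"}],
      }

      df = pd.DataFrame({
          "review": ["great product", "terrible support", "works as advertised", "broke in a week"] * 5,
          "sentiment": ["positive", "negative", "positive", "negative"] * 5,
      })

      model = LudwigModel(config)
      train_stats, _, output_dir = model.train(dataset=df)
      predictions, _ = model.predict(dataset=df)
      print(predictions.head())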
  • 12
    FastChat

    Open platform for training, serving, and evaluating language models

    FastChat is an open platform for training, serving, and evaluating large language model-based chatbots. If you do not have enough memory, you can enable 8-bit compression by adding --load-8bit to the commands above. This can reduce memory usage by around half with slightly degraded model quality. It is compatible with the CPU, GPU, and Metal backend. Vicuna-13B with 8-bit compression can run on a single NVIDIA 3090/4080/T4/V100(16GB) GPU. In addition to that, you can add --cpu-offloading to...
  • 13
    IVY

    The Unified Machine Learning Framework

    Take any code that you'd like to include. For example, an existing TensorFlow model, and some useful functions from both PyTorch and NumPy libraries. Choose any framework for writing your higher-level pipeline, including data loading, distributed training, analytics, logging, visualization etc. Choose any backend framework which should be used under the hood, for running this entire pipeline. Choose the most appropriate device or combination of devices for your needs. DeepMind releases...
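
    A minimal sketch of Ivy's backend-agnostic style, assuming the ivy.set_backend and ivy.array functions; the same lines run unchanged whichever framework is selected underneath.

      import ivy

      ivy.set_backend("torch")  # could equally be "tensorflow", "jax", or "numpy"

      x = ivy.array([1.0, 2.0, 3.0])
      y = ivy.mean(x) + ivy.std(x)  # executed by the chosen backend under the hood
      print(ivy.to_numpy(y))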
  • 14
    SageMaker Inference Toolkit

    Serve machine learning models within a Docker container

    Serve machine learning models within a Docker container using Amazon SageMaker. Amazon SageMaker is a fully managed service for data science and machine learning (ML) workflows. You can use Amazon SageMaker to simplify the process of building, training, and deploying ML models. Once you have a trained model, you can include it in a Docker container that runs your inference code. A container provides an effectively isolated environment, ensuring a consistent runtime regardless of where...
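
    A minimal sketch of the handler-function convention used by SageMaker framework containers built on this toolkit (model_fn / input_fn / predict_fn / output_fn); the joblib model file name is an assumption for illustration.

      # inference.py, packaged into the container alongside the trained model artifact.
      import json
      import os
      import joblib

      def model_fn(model_dir):
          # Called once at container start-up to load the model from the model directory.
          return joblib.load(os.path.join(model_dir, "model.joblib"))  # file name assumed

      def input_fn(request_body, content_type="application/json"):
          return json.loads(request_body)["instances"]

      def predict_fn(instances, model):
          return model.predict(instances).tolist()

      def output_fn(prediction, accept="application/json"):
          return json.dumps({"predictions": prediction})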
  • 15
    PyTorch Ignite

    Library to help with training and evaluating neural networks

    High-level library to help with training and evaluating neural networks in PyTorch flexibly and transparently. Less code than pure PyTorch while ensuring maximum control and simplicity. Library approach and no program's control inversion. Use ignite where and when you need. Extensible API for metrics, experiment managers, and other components. The cool thing with handlers is that they offer unparalleled flexibility (compared to, for example, callbacks). Handlers can be any function: e.g. lambda...
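
    A short sketch of the Engine/handler pattern described above; the toy regression model and data are arbitrary, and the lambda shows that any callable can be attached to an event.

      import torch
      from ignite.engine import Engine, Events

      model = torch.nn.Linear(10, 1)
      optimizer = torch.optim.SGD(model.parameters(), lr=0.01)
      loss_fn = torch.nn.MSELoss()

      def train_step(engine, batch):
          x, y = batch
          optimizer.zero_grad()
          loss = loss_fn(model(x), y)
          loss.backward()
          optimizer.step()
          return loss.item()

      trainer = Engine(train_step)

      # Handlers are plain callables, lambdas included, attached to built-in events.
      trainer.add_event_handler(
          Events.EPOCH_COMPLETED,
          lambda engine: print(f"epoch {engine.state.epoch}: last loss {engine.state.output:.4f}"),
      )

      data = [(torch.randn(4, 10), torch.randn(4, 1)) for _ in range(8)]
      trainer.run(data, max_epochs=2)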
  • 16
    TensorFlow.NET

    .NET Standard bindings for Google's TensorFlow for developing models

    ... with a powerful Machine Learning tool set without reinventing the wheel. Since the APIs are kept as similar as possible you can immediately adapt any existing TensorFlow code in C# or F# with a zero learning curve. Take a look at a comparison picture and see how comfortably a TensorFlow/Python script translates into a C# program with TensorFlow.NET.
  • 17
    aisuite

    Simple, unified interface to multiple Generative AI providers

    Simple, unified interface to multiple Generative AI providers. aisuite makes it easy for developers to use multiple LLMs through a standardized interface. Using an interface similar to OpenAI's, aisuite makes it easy to interact with the most popular LLMs and compare the results. It is a thin wrapper around Python client libraries and allows creators to seamlessly swap out and test responses from different LLM providers without changing their code. Today, the library is primarily focused on chat...
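
    A brief sketch of the standardized interface described above, assuming aisuite's OpenAI-style client; the provider:model identifiers are examples, and the corresponding API keys are expected in the environment.

      import aisuite as ai

      client = ai.Client()

      messages = [
          {"role": "system", "content": "You are a concise assistant."},
          {"role": "user", "content": "Explain what a tokenizer does in one sentence."},
      ]

      # The "provider:model" string is the only thing that changes between providers.
      for model in ["openai:gpt-4o", "anthropic:claude-3-5-sonnet-20240620"]:
          response = client.chat.completions.create(model=model, messages=messages)
          print(model, "->", response.choices[0].message.content)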
  • 18
    Mosec

    A high-performance ML model serving framework, offers dynamic batching

    Mosec is a high-performance and flexible model-serving framework for building ML model-enabled backend and microservices. It bridges the gap between any machine learning models you just trained and the efficient online service API.
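
    A minimal sketch of a Mosec service, assuming its Worker/Server API where forward() handles one decoded request; a real model would replace the echo logic.

      from mosec import Server, Worker

      class Echo(Worker):
          # forward() receives the decoded request payload and returns a serializable response.
          def forward(self, data: dict) -> dict:
              return {"echo": data.get("text", "")}

      if __name__ == "__main__":
          server = Server()
          server.append_worker(Echo, num=2)  # two worker processes behind the HTTP endpoint
          server.run()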
  • 19
    Face Alignment

    2D and 3D face alignment library built using PyTorch

    Detect facial landmarks from Python using the world's most accurate face alignment network, capable of detecting points in both 2D and 3D coordinates. Built using FAN's state-of-the-art deep learning-based face alignment method. For numerical evaluations, it is highly recommended to use the Lua version, which uses identical models to the ones evaluated in the paper. More models will be added soon. By default, the package will use the SFD face detector. However, the users can alternatively use...
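
    A short sketch of the landmark-detection call described above; the image path is an assumption, and the LandmarksType member name has varied slightly across releases, so check the installed version.

      import face_alignment
      from skimage import io

      # 2D landmarks with the default SFD face detector.
      fa = face_alignment.FaceAlignment(face_alignment.LandmarksType.TWO_D, flip_input=False)

      image = io.imread("face.jpg")        # input image path assumed for illustration
      landmarks = fa.get_landmarks(image)  # list with one (68, 2) array per detected face
      print(len(landmarks or []), "face(s) found")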
  • 20
    The Operator Splitting QP Solver

    The Operator Splitting QP Solver

    OSQP uses a specialized ADMM-based first-order method with custom sparse linear algebra routines that exploit structure in problem data. The algorithm is absolutely division-free after the setup and it requires no assumptions on problem data (the problem only needs to be convex). It just works. OSQP has an easy interface to generate customized embeddable C code with no memory manager required. OSQP supports many interfaces including C/C++, Fortran, Matlab, Python, R, Julia, Rust.
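
    A small sketch of the Python interface mentioned above, solving a tiny convex QP of the form minimize 0.5 x'Px + q'x subject to l <= Ax <= u; the problem data are illustrative.

      import numpy as np
      import osqp
      from scipy import sparse

      P = sparse.csc_matrix([[4.0, 1.0], [1.0, 2.0]])  # quadratic cost (sparse CSC)
      q = np.array([1.0, 1.0])
      A = sparse.csc_matrix([[1.0, 1.0], [1.0, 0.0], [0.0, 1.0]])
      l = np.array([1.0, 0.0, 0.0])
      u = np.array([1.0, 0.7, 0.7])

      prob = osqp.OSQP()
      prob.setup(P, q, A, l, u, verbose=False)
      res = prob.solve()
      print(res.x)  # optimal primal solution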
  • 21
    Basaran

    Basaran, an open-source alternative to the OpenAI text completion API

    Basaran is an open-source alternative to the OpenAI text completion API. It provides a compatible streaming API for your Hugging Face Transformers-based text generation models. The open source community will eventually witness the Stable Diffusion moment for large language models (LLMs), and Basaran allows you to replace OpenAI's service with the latest open-source model to power your application without modifying a single line of code. Stream generation using various decoding strategies...
  • 22
    Xorbits Inference

    Replace OpenAI GPT with another LLM in your app

    Replace OpenAI GPT with another LLM in your app by changing a single line of code. Xinference gives you the freedom to use any LLM you need. With Xinference, you're empowered to run inference with any open-source language models, speech recognition models, and multimodal models, whether in the cloud, on-premises, or even on your laptop. Xorbits Inference (Xinference) is a powerful and versatile library designed to serve language, speech recognition, and multimodal models. With Xorbits Inference...
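
    A sketch of the "change a single line" idea described above: point an OpenAI-style client at a locally running Xinference server instead of api.openai.com. The endpoint URL, port, and model name are assumptions for illustration.

      from openai import OpenAI

      # Only the base_url (and model name) differ from a stock OpenAI setup.
      client = OpenAI(base_url="http://127.0.0.1:9997/v1", api_key="not-used-locally")

      response = client.chat.completions.create(
          model="my-local-llm",  # name of a model already launched on the Xinference server
          messages=[{"role": "user", "content": "One sentence on why open models matter."}],
      )
      print(response.choices[0].message.content)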
  • 23
    'Lightweight' GAN

    Implementation of 'lightweight' GAN, proposed in ICLR 2021

    ... they pass into a neural network (if you use augmentation). The general recommendation is to use suitable augs for your data and as many as possible, then after some time of training disable the most destructive (for image) augs. You can turn on automatic mixed precision with one flag --amp. You should expect it to be 33% faster and save up to 40% memory. Aim is an open-source experiment tracker that logs your training runs, and enables a beautiful UI to compare them.
  • 24
    SageMaker Hugging Face Inference Toolkit

    Library for serving Transformers models on Amazon SageMaker

    SageMaker Hugging Face Inference Toolkit is an open-source library for serving Transformers models on Amazon SageMaker. This library provides default pre-processing, prediction, and post-processing for certain Transformers models and tasks. It utilizes the SageMaker Inference Toolkit for starting up the model server, which is responsible for handling inference requests. For the Dockerfiles used for building SageMaker Hugging Face Containers, see AWS Deep Learning Containers. The SageMaker Hugging...
  • 25
    SWE-agent

    SWE-agent takes a GitHub issue and tries to automatically fix it

    SWE-agent turns LMs (e.g. GPT-4) into software engineering agents that can resolve issues in real GitHub repositories. On SWE-bench, SWE-agent resolves 12.47% of issues, achieving state-of-the-art performance on the full test set. We accomplish our results by designing simple LM-centric commands and feedback formats to make it easier for the LM to browse the repository, and view, edit, and execute code files. We call this an Agent-Computer Interface (ACI).