Showing 164 open source projects for "transformers"

  • 1
    gulp-babel

    Gulp plugin for Babel

    Use next-generation JavaScript, today, with Babel. Install gulp-babel@next if you want the pre-release of the next version of gulp-babel. All Babel options are supported, except for sourceMaps and filename, which are handled for you. Also keep in mind that options will be loaded from any config files that apply to each file. Files in the stream are annotated with a babel property, which contains the metadata from babel.transform(). If you're attempting to use features such as generators, you'll need to...
    Downloads: 0 This Week
    Last Update:
    See Project
  • 2
    BlockSparse

    Efficient GPU kernels for block-sparse matrix multiplication

    ...The idea is to exploit block-level sparsity — i.e. treat matrices or weight tensors as composed of blocks, many of which may be zero or unused — to save compute and memory when sparsity patterns are structured. This is particularly useful in models like Sparse Transformers, where attention matrices or intermediate layers may adopt block-sparse patterns to scale better. The repo implements both blocksparse and blockwise convolution/transpose-convolution primitives, with support for preparing, executing, and verifying those ops on NVIDIA GPUs. In addition to low-level kernels, it includes wrapper code for integrating with TensorFlow, example scripts (e.g. a transformer on the enwik8 dataset), transformer logic that uses blocksparse operations, and debugging helpers.
    Downloads: 0 This Week
    Last Update:
    See Project
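    A minimal sketch of the block-sparse matmul wrapper described above, adapted from the usage shown in the project's README. The sizes and the random sparsity layout are illustrative assumptions, and the code targets TensorFlow 1.x.

    ```python
    # Illustrative sketch of the BlocksparseMatMul wrapper (TensorFlow 1.x style);
    # the sizes and the random sparsity layout below are assumptions for demonstration.
    import numpy as np
    import tensorflow as tf
    from blocksparse.matmul import BlocksparseMatMul

    hidden_size = 4096
    block_size = 32

    # Block-level sparsity layout: a (hidden/block) x (hidden/block) 0/1 matrix.
    layout = np.random.randint(2, size=(hidden_size // block_size,
                                        hidden_size // block_size))

    bsmm = BlocksparseMatMul(layout, block_size=block_size)

    x = tf.placeholder(tf.float32, shape=[None, hidden_size])
    w = tf.get_variable("w", bsmm.w_shape, dtype=tf.float32)

    y = bsmm(x, w)  # block-sparse matrix multiply on the GPU

    with tf.Session() as sess:
        sess.run(tf.global_variables_initializer())
        out = sess.run(y, feed_dict={x: np.random.rand(64, hidden_size).astype(np.float32)})
        print(out.shape)
    ```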
  • 3
    Groot

    From JSON to Core Data and back

    Groot provides a simple way of serializing Core Data object graphs from or into JSON. It uses annotations in the Core Data model to perform the serialization.
    Downloads: 1 This Week
    Last Update:
    See Project
  • 4

    HL7 Soup Custom Transformers

    Create Custom Transformers for HL7 Soup tutorial

    HL7 Soup allows the creation of custom transformers for use in your HL7 workflows. Transformers are used to change the structure of a message to suit the user's needs; for instance, they can adjust an incoming HL7 value so it is ready to be exported in another HL7 message. In this tutorial, we show you how to create a simple name concatenation transformer in C#. With it, we can append a patient's first and last names together, placing a comma between them.
    Downloads: 0 This Week
    Last Update:
    See Project
  • 5
    AST explorer

    A web tool to explore the ASTs generated by various parsers

    ...Depending on the parser settings, it supports not only ES5/CSS3 but also future syntax, which makes the AST explorer a useful tool for developers who want to create AST transforms. In fact, transformers are included so you can prototype your own plugins. Save and fork code snippets, and copy the URL to share them. Pasting an AST or dropping a file containing an AST into the window will parse the AST and update the code using escodegen. Otherwise, the content of the text editor will be replaced with the content of the file (i.e. you can drag and drop JS files). ...
    Downloads: 0 This Week
    Last Update:
    See Project
  • 6
    PEG.js

    PEG.js is a parser generator for JavaScript

    PEG.js is a simple parser generator for JavaScript that produces fast parsers with excellent error reporting. You can use it to process complex data or computer languages and easily build transformers, interpreters, compilers, and other tools. PEG.js is still very much a work in progress; there are no compatibility guarantees until version 1.0. Based on the parsing expression grammar formalism, it is more powerful than traditional LL(k) and LR(k) parsers. Usable from your browser, from the command line, or via the JavaScript API.
    Downloads: 0 This Week
    Last Update:
    See Project
  • 7

    Edcalc-english

    Electrical and electronic circuits calculator

    Edcalc is a program for calculating and designing basic electrical and electronic circuits; it can also be used to design transformers, frequency dividers (crossovers), coils, and basic amplifiers. It is a useful tool for teachers, students, and technicians working in electronics and electrical engineering.
    Downloads: 0 This Week
    Last Update:
    See Project
  • 8

    javastim

    An e-stim signal generator in java

    ...Output the signal via a soundcard to an amplifier that can produce e-stim-compliant signals (100+ V, current limited to about 100 mA, high impedance, short-circuit proof). Take care to have a potential-free output for every electrode (use separating NF transformers). The author claims no responsibility for physical damage from misuse or from routing the signal through unsuitable circuitry. Any use happens at the sole risk of the user.
    Downloads: 1 This Week
    Last Update:
    See Project
  • 9
    Transformers-CG
    An easy and efficient application. Unsure whether your computer graphics calculations are really correct? TransformersCG1.0(C) can help with just a few clicks. Simple and intuitive interface. Get the most out of it!
    Downloads: 0 This Week
    Last Update:
    See Project
  • 10
    trafo

    trafo is a console-based "E-I" transformer calculator

    ...It allows you to easily calculate all the information necessary to build an "E-I" transformer by providing only a few input parameters. This program is for all those who don't want to spend time calculating transformers manually and would like some automation.
    Downloads: 4 This Week
    Last Update:
    See Project
  • 11
    Annotate your POJOs to create a fully-functional application, with table support, collection filters, and persistence hooks.
    Downloads: 0 This Week
    Last Update:
    See Project
  • 12
    PyFlo is a simple, Python-based power flow solver based on the Newton-Raphson method (a generic sketch of the iteration follows this entry). Network elements and their topology are passed via a set of input files representing nodes, lines, non-linear loads, and transformers.
    Downloads: 0 This Week
    Last Update:
    See Project
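    The sketch below is not PyFlo's API; it is just the textbook Newton-Raphson update x_{k+1} = x_k - J(x_k)^{-1} f(x_k) applied to a small, hypothetical nonlinear system, the same kind of iteration a power flow solver applies to the mismatch equations at each bus.

    ```python
    # Generic Newton-Raphson iteration (not PyFlo's API): solve f(x) = 0
    # for a small nonlinear system chosen only for illustration.
    import numpy as np

    def f(x):
        # Hypothetical two-variable system with a root at (1, 2).
        return np.array([x[0]**2 + x[1] - 3.0,
                         x[0] + x[1]**2 - 5.0])

    def jacobian(x):
        return np.array([[2.0 * x[0], 1.0],
                         [1.0, 2.0 * x[1]]])

    x = np.array([1.0, 1.0])  # initial guess
    for _ in range(20):
        dx = np.linalg.solve(jacobian(x), -f(x))  # Newton step
        x += dx
        if np.linalg.norm(dx) < 1e-10:  # converged
            break

    print(x, f(x))
    ```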
  • 13
    rSIGNAL - ruby Simple Inter-Interface Generic Aggregation Layer. It allows external data sources to be aggregated, stores the aggregated data, and prints it out in a transformed format to a file (or the integrated HTTP server) on demand.
    Downloads: 0 This Week
    Last Update:
    See Project
  • 14
    RDF-compatible Wiki Data Model contains: Wiki Event Model (like SAX for XML) and Wiki Object Model (like DOM for XML). Tools: serializers and parsers for WikiSyntax/WikiXML, serializer to XHTML, transformers "events to objects", "objects to events"...
    Downloads: 0 This Week
    Last Update:
    See Project
  • 15
    The Cocoon Adaption Kit is a NetKernel module which enables the use of Cocoon Components (Generators, Transformers, Serializers, Actions) from within the NetKernel XML Application Server.
    Downloads: 0 This Week
    Last Update:
    See Project
  • 16
    XML and Python-based webserving framework, featuring XSLT, embedded Python evaluation, pipeline processing model and caching. Works with several XSL transformers and webserver interfaces.
    Downloads: 0 This Week
    Last Update:
    See Project
  • 17
    ZopeXMLMethods provides methods to apply to Zope objects for XML/XSLT processing. XSLTMethod associates XSLT transformers with XML documents. ZopeXMLMethods succeeds XMLTransform. It features file-system caching and works with many XML/XSLT libraries.
    Downloads: 0 This Week
    Last Update:
    See Project
  • 18
    XML Processing Benchmark for Java (XPB4J) is a Java based performance measurement and comparison program for XML processing software such as parsers, transformers, processing pipelines etc.
    Downloads: 0 This Week
    Last Update:
    See Project
  • 19
    bge-small-en-v1.5

    Compact English sentence embedding model for semantic search tasks

    ...The model is optimized for speed and efficiency, making it suitable for resource-constrained environments. It is compatible with popular libraries such as FlagEmbedding, Sentence-Transformers, and Hugging Face Transformers. The model achieves competitive results on the MTEB benchmark, especially in retrieval and classification tasks. With only 33.4M parameters, it provides a strong balance of accuracy and performance for English-only use cases.
    Downloads: 0 This Week
    Last Update:
    See Project
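    A minimal sketch of encoding sentences with this model through Sentence-Transformers, one of the libraries named above; the example sentences are placeholders.

    ```python
    # Minimal sketch: sentence embeddings with BAAI/bge-small-en-v1.5 via Sentence-Transformers.
    from sentence_transformers import SentenceTransformer

    model = SentenceTransformer("BAAI/bge-small-en-v1.5")

    sentences = ["A quick brown fox.", "Semantic search with compact embeddings."]
    embeddings = model.encode(sentences, normalize_embeddings=True)

    print(embeddings.shape)               # (2, 384) -- 384-dimensional vectors
    print(embeddings[0] @ embeddings[1])  # cosine similarity, since vectors are normalized
    ```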
  • 20
    translategemma-4b-it

    Lightweight multimodal translation model for 55 languages

    ...TranslateGemma uses a structured chat template that enforces explicit source and target language codes, ensuring consistent, deterministic behavior and reducing ambiguity in multilingual pipelines. It integrates seamlessly with Hugging Face Transformers through pipelines or direct model initialization, supporting GPU acceleration and scalable deployment.
    Downloads: 0 This Week
    Last Update:
    See Project
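    A hedged sketch of calling the model through the Hugging Face pipeline API mentioned above. The model id and the plain-text way of stating the language codes are assumptions; the model card's structured chat template is authoritative.

    ```python
    # Hedged sketch: translation via the Hugging Face text-generation pipeline.
    # The model id and the "Translate from en to de" phrasing are assumptions;
    # consult the model card for the structured template with explicit language codes.
    from transformers import pipeline

    pipe = pipeline("text-generation",
                    model="google/translategemma-4b-it",  # assumed model id
                    device_map="auto")

    messages = [
        {"role": "user",
         "content": "Translate from en to de: The library ships a structured chat template."}
    ]
    result = pipe(messages, max_new_tokens=128)
    print(result[0]["generated_text"][-1]["content"])
    ```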
  • 21
    gpt-oss-20b

    OpenAI’s compact 20B open model for fast, agentic, and local use

    ...Like its larger sibling (gpt-oss-120b), it offers adjustable reasoning depth and full chain-of-thought visibility for better interpretability. It’s released under a permissive Apache 2.0 license, allowing unrestricted commercial and research use. GPT-OSS-20B is compatible with Transformers, vLLM, Ollama, PyTorch, and other tools. It is ideal for developers building lightweight AI agents or experimenting with fine-tuning on consumer-grade hardware.
    Downloads: 0 This Week
    Last Update:
    See Project
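    A minimal sketch of local text generation with the Transformers integration named above; the model id follows OpenAI's Hugging Face release, and the prompt is a placeholder.

    ```python
    # Minimal sketch: running gpt-oss-20b locally with the Transformers pipeline.
    from transformers import pipeline

    pipe = pipeline("text-generation",
                    model="openai/gpt-oss-20b",
                    torch_dtype="auto",
                    device_map="auto")

    messages = [
        {"role": "user",
         "content": "Summarize what block-sparse attention is in two sentences."}
    ]
    result = pipe(messages, max_new_tokens=256)
    print(result[0]["generated_text"][-1]["content"])
    ```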
  • 22
    gpt-oss-120b

    OpenAI’s open-weight 120B model optimized for reasoning and tooling

    ...The model supports fine-tuning, chain-of-thought reasoning, and structured outputs, making it ideal for complex workflows. It operates in OpenAI’s Harmony response format and can be deployed via Transformers, vLLM, Ollama, LM Studio, and PyTorch. Developers can control the reasoning level (low, medium, high) to balance speed and depth depending on the task. Released under the Apache 2.0 license, it enables both commercial and research applications. The model supports function calling, web browsing, and code execution, streamlining intelligent agent development.
    Downloads: 0 This Week
    Last Update:
    See Project
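    The adjustable reasoning level mentioned above is typically selected through the system message. A hedged sketch follows; the "Reasoning: high" phrasing is an assumption to be checked against the model card and the Harmony format documentation.

    ```python
    # Hedged sketch: selecting a reasoning level for gpt-oss-120b via the system message.
    # The "Reasoning: high" convention is an assumption; see the model card for the
    # exact Harmony-format wording.
    from transformers import pipeline

    pipe = pipeline("text-generation", model="openai/gpt-oss-120b",
                    torch_dtype="auto", device_map="auto")

    messages = [
        {"role": "system", "content": "Reasoning: high"},  # assumed reasoning-level switch
        {"role": "user", "content": "Plan the steps to benchmark a retrieval pipeline."},
    ]
    print(pipe(messages, max_new_tokens=512)[0]["generated_text"][-1]["content"])
    ```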
  • 23
    bge-large-en-v1.5

    BGE-Large v1.5: High-accuracy English embedding model for retrieval

    ...It is recommended for use in document retrieval tasks, semantic search, and passage reranking, particularly when paired with a reranker like BGE-Reranker. The model supports inference through multiple frameworks, including FlagEmbedding, Sentence-Transformers, LangChain, and Hugging Face Transformers. It accepts English text as input and returns normalized 1024-dimensional embeddings suitable for cosine similarity comparisons.
    Downloads: 0 This Week
    Last Update:
    See Project
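    A small retrieval-style sketch with Sentence-Transformers, one of the supported frameworks listed above. The query and passages are placeholders; because the 1024-dimensional embeddings are normalized, cosine similarity reduces to a dot product.

    ```python
    # Small retrieval sketch with BAAI/bge-large-en-v1.5: rank passages against a query
    # by cosine similarity (dot product of normalized embeddings).
    from sentence_transformers import SentenceTransformer

    model = SentenceTransformer("BAAI/bge-large-en-v1.5")

    query = "How do I serialize Core Data objects to JSON?"
    passages = [
        "Groot provides a simple way of serializing Core Data object graphs from or into JSON.",
        "PEG.js is a simple parser generator for JavaScript.",
    ]

    q_emb = model.encode([query], normalize_embeddings=True)
    p_emb = model.encode(passages, normalize_embeddings=True)

    scores = (q_emb @ p_emb.T)[0]  # cosine similarities
    for passage, score in sorted(zip(passages, scores), key=lambda t: -t[1]):
        print(f"{score:.3f}  {passage}")
    ```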
  • 24
    Qwen2-7B-Instruct

    Instruction-tuned 7B language model for chat and complex tasks

    ...It shows strong performance across benchmarks such as MMLU, MT-Bench, GSM8K, and Humaneval, often surpassing similarly sized open-source models. Designed for conversational use, it integrates with Hugging Face Transformers and supports long-context applications via YARN and vLLM for efficient deployment.
    Downloads: 0 This Week
    Last Update:
    See Project
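    A minimal chat sketch using the Hugging Face Transformers integration mentioned above; the prompt is a placeholder.

    ```python
    # Minimal chat sketch for Qwen/Qwen2-7B-Instruct with Hugging Face Transformers.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    model_id = "Qwen/Qwen2-7B-Instruct"
    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype="auto", device_map="auto")

    messages = [
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Explain the difference between LL(k) and PEG parsers briefly."},
    ]
    inputs = tokenizer.apply_chat_template(messages, add_generation_prompt=True,
                                           return_tensors="pt").to(model.device)

    output_ids = model.generate(inputs, max_new_tokens=256)
    print(tokenizer.decode(output_ids[0][inputs.shape[-1]:], skip_special_tokens=True))
    ```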
  • 25
    NuMarkdown-8B-Thinking

    Reasoning-powered OCR VLM for converting complex documents to Markdown

    ...Thinking token usage can range from 20% to 500% of the final answer, depending on task difficulty. NuMarkdown-8B-Thinking is released under the MIT license and supports vLLM and Transformers for deployment.
    Downloads: 0 This Week
    Last Update:
    See Project