AI Text Generators for Mac


Browse free open source AI Text Generators and projects for Mac below. Use the toggles on the left to filter open source AI Text Generators by OS, license, language, programming language, and project status.

  • 1
    AI Atelier

    A Chinese & English version of the AI art creation software based on Disco Diffusion

    Based on Disco Diffusion, we have developed a Chinese & English version of the AI art creation software "AI Atelier". It offers both Text-To-Image models (Disco Diffusion and VQGAN+CLIP) and Text-To-Text models (GPT-J-6B and GPT-NeoX-20B) as options. It can create 2D and 3D animations, not only still frames (from Disco Diffusion v5 and VQGAN Animations), and can take audio and images as generation input instead of just text. The project also aims to simplify the tool setup process on Colab, enable one-click sharing of the generated link to other users, and experiment with the possibilities for multi-user access to the same link.
    Downloads: 6 This Week
    Last Update:
    See Project
  • 2
    gpt-2-simple

    Python package to easily retrain OpenAI's GPT-2 text-generating model

    A simple Python package that wraps existing model fine-tuning and generation scripts for OpenAI's GPT-2 text generation model (specifically the "small" 124M and "medium" 355M hyperparameter versions). Additionally, this package makes it easier to generate text, write generated text to a file for easy curation, and supply prefixes that force the text to start with a given phrase. For fine-tuning, it is strongly recommended to use a GPU, although you can generate using a CPU (albeit much more slowly). If you are training in the cloud, using a Colaboratory notebook or a Google Compute Engine VM with the TensorFlow Deep Learning image is strongly recommended, as the GPT-2 model is hosted on GCP. You can use gpt-2-simple to retrain a model using a GPU for free in this Colaboratory notebook, which also demos additional features of the package. Note: development on gpt-2-simple has mostly been superseded by aitextgen, which has similar AI text generation capabilities with more efficient training time. A minimal usage sketch follows this entry.
    Downloads: 5 This Week
    Last Update:
    See Project
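    A minimal sketch of the workflow described above, assuming the package is installed and a plain-text corpus is available; the file name corpus.txt, the step count, and the prefix are illustrative placeholders, not values prescribed by the project.

        import gpt_2_simple as gpt2

        # Download the "small" 124M checkpoint (only needed once).
        gpt2.download_gpt2(model_name="124M")

        # Fine-tune on a local plain-text corpus; a GPU is strongly recommended.
        sess = gpt2.start_tf_sess()
        gpt2.finetune(sess, dataset="corpus.txt", model_name="124M", steps=500)

        # Generate text, forcing it to start with a given phrase.
        gpt2.generate(sess, prefix="Once upon a time", length=100)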
  • 3
    BNFGen

    Generates random text based on context-free grammars defined in BNF

    BNFGen generates random text based on context-free grammars. You give it a file with your grammar, defined using a BNF-like syntax, and it gives you a string that follows that grammar. BNFGen is a CLI tool and an OCaml library; there are also official JS bindings available via NPM. The project's goals are to make it easy to write and share grammars and to give the user total control of, and insight into, the generation process. BNFGen provides a "DSL" for grammar definitions: a familiar BNF-like syntax with a few additions. One problem with using straight BNF for driving language generators is that you have no control over the process, and BNFGen adds two features to fix that. The canonical way to express repetition in BNF is a self-referential recursive rule; in classic BNF, that can easily lead to the process terminating too early, since there's a 50% chance that it will take the non-recursive alternative. An illustrative grammar sketch follows this entry.
    Downloads: 2 This Week
    Last Update:
    See Project
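    As a rough illustration only (the syntax below is an approximation of BNFGen's BNF-like DSL; consult the project's documentation for the authoritative form), a grammar with weighted alternatives might look like this, where the numbers bias which branch is taken so a recursive or terminating rule is not chosen too early:

        <start>    ::= <greeting> ", " <object> "!" ;
        <greeting> ::= 9 "hello" | 1 "goodbye" ;
        <object>   ::= "world" | "stranger" ;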
  • 4
    Node.js Client For NLP Cloud

    NLP Cloud serves high performance pre-trained or custom models

    This is the Node.js client (with TypeScript types) for the NLP Cloud API. NLP Cloud serves high-performance pre-trained or custom models for NER, sentiment analysis, classification, summarization, dialogue summarization, paraphrasing, intent classification, product description and ad generation, chatbot, grammar and spelling correction, keywords and keyphrases extraction, text generation, image generation, blog post generation, question answering, automatic speech recognition, machine translation, language detection, semantic search, semantic similarity, tokenization, POS tagging, embeddings, and dependency parsing. It is ready for production, and served through a REST API. You can either use the NLP Cloud pre-trained models, fine-tune your own models, or deploy your own models.
    Downloads: 2 This Week
    Last Update:
    See Project
  • 5
    Text Generation Web UI

    A gradio web UI for running Large Language Models like LLaMA

    A gradio web UI for running Large Language Models like LLaMA, llama.cpp, GPT-J, Pythia, OPT, and GALACTICA. Dropdown menu for switching between models. Notebook mode that resembles OpenAI's playground. Chat mode for conversation and role playing. Instruct mode compatible with Alpaca and Open Assistant formats. Nice HTML output for GPT-4chan. Markdown output for GALACTICA, including LaTeX rendering. Custom chat characters. Advanced chat features (send images, get audio responses with TTS). Very efficient text streaming. Parameter presets, 8-bit mode. Layers splitting across GPU(s), CPU, and disk. CPU mode, FlexGen, DeepSpeed ZeRO-3, API with streaming and without streaming. LLaMA model, including 4-bit GPTQ. RWKV model, LoRA (loading and training), Softprompts, and extensions.
    Downloads: 2 This Week
    Last Update:
    See Project
  • 6
    abstract2paper

    Auto-generate an entire paper from a prompt or abstract using NLP

    Enter your abstract into the little doohicky here, and quicker'n you can blink your eyes, a shiny new paper'll come right out for ya! What are you waiting for? Click the "doohicky" link above to get started, and then click the link to open the demo notebook in Google Colaboratory. To run the demo as a Jupyter notebook (e.g., locally), use this version instead. Note: to compile a PDF of your auto-generated paper (when you run the demo locally), you'll need a working LaTeX installation on your machine (e.g., so that pdflatex is a recognized system command). The notebook will also automatically install the transformers library if it's not already available in your local environment. In their unmodified state, the demo notebooks use the abstract from the GPT-3 paper as the "seed" for a new paper. Each time you run the notebook you'll get a new result.
    Downloads: 2 This Week
    Last Update:
    See Project
  • 7
    gpt2-client

    Easy-to-use TensorFlow Wrapper for GPT-2 117M, 345M, 774M, etc.

    GPT-2 is a Natural Language Processing model developed by OpenAI for text generation. It is the successor to the GPT (Generative Pre-trained Transformer) model, trained on 40GB of text from the internet. It features the Transformer architecture brought to light by the Attention Is All You Need paper in 2017. The model has 4 versions - 124M, 345M, 774M, and 1558M - that differ in terms of the amount of training data fed to them and the number of parameters they contain. Finally, gpt2-client is a wrapper around the original gpt-2 repository that offers the same functionality but with more accessibility, comprehensibility, and utility. You can play around with all four GPT-2 models in less than five lines of code. Install the client via pip. The generation options are highly flexible; you can mix and match based on what kind of text you need generated, be it multiple chunks or one at a time with prompts. A minimal usage sketch follows this entry.
    Downloads: 2 This Week
    Last Update:
    See Project
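    A minimal sketch based on the package's documented interface (the model size and interactive flag are illustrative, and exact parameter names may vary between releases):

        from gpt2_client import GPT2Client

        # Choose one of the released model sizes and fetch its weights.
        gpt2 = GPT2Client('117M')
        gpt2.load_model(force_download=False)

        # Generate text interactively from a prompt typed at the console.
        gpt2.generate(interactive=True)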
  • 8
    Dickinson

    Text generation language

    Dickinson is a text-generation language. You can try out the language on the web without installing anything. Binaries for some platforms are available on the releases page. There is an install script that will try to download the right release for your computer.
    Downloads: 1 This Week
    Last Update:
    See Project
  • 9
    Intelligent Java

    Integrate with the latest language models, image generation and speech

    Intelligent Java (IntelliJava) is the ultimate tool for integrating with the latest language models and deep learning frameworks from Java. The library provides intuitive functions for sending input to models like ChatGPT and DALL·E and receiving generated text, speech, or images. With just a few lines of code, you can easily access the power of cutting-edge AI models to enhance your projects. Access ChatGPT and GPT-3 to generate text and DALL·E to generate images; OpenAI is preferred for quality results without tuning. Generate text with Cohere, which allows you to build a language model suited to your specific needs. Generate audio from text by accessing DeepMind's speech models. The only dependency is GSON, which must be added manually when using the IntelliJava jar; if you import the project through Maven, the dependency is handled automatically.
    Downloads: 1 This Week
    Last Update:
    See Project
  • 10
    node-red-contrib-custom-chatgpt
    A Node-RED node that interacts with OpenAI machine learning models like ChatGPT. Install it with the built-in Node-RED Palette manager. When editing the node's properties, get your OPENAI_API_KEY by logging in to ChatGPT, creating a new secret key, and then copying and pasting the API key into the node's API_KEY property value. msg.payload should be a well-written prompt that provides enough information for the model to know what you want and how it should respond. Its success generally depends on the complexity of the task and the quality of your prompt. A good rule of thumb is to think about how you would write a word problem for a middle schooler to solve.
    Downloads: 1 This Week
    Last Update:
    See Project
  • 11
    AI Chatbots based on GPT Architecture

    Training & Implementation of chatbots leveraging GPT-like architecture

    Training & implementation of chatbots leveraging GPT-like architecture with the aitextgen package to enable dynamic conversations. It sure seems like there are a lot of text-generation chatbots out there, but it's hard to find a Python package or model that is easy to tune around a simple text file of message data. This repo is a simple attempt to help solve that problem. ai-msgbot covers the practical use case of building a chatbot that sounds like you (or some dataset/persona you choose) by training a text-generation model to generate conversation in a consistent structure. This structure is then leveraged to deploy a chatbot that is a "free-form" model that consistently replies like a human. Some of the trained models can be interacted with through the HuggingFace spaces and model inference APIs on the ETHZ Analytics Organization page on huggingface.co. A rough sketch of the underlying aitextgen workflow follows this entry.
    Downloads: 0 This Week
    Last Update:
    See Project
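    ai-msgbot builds on the aitextgen package mentioned above. As a rough sketch of that underlying package (not ai-msgbot's own training scripts), fine-tuning on a plain-text file of message data and generating a reply looks roughly like this; messages.txt, the step count, and the prompt are placeholders:

        from aitextgen import aitextgen

        # Load the default 124M GPT-2 model.
        ai = aitextgen()

        # Fine-tune on a simple text file of message data.
        ai.train("messages.txt", num_steps=500)

        # Generate a reply-like continuation from a prompt.
        ai.generate(prompt="hey, are you free later?", max_length=64)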
  • 12
    Accelerated Text

    Accelerated Text is a no-code natural language generation platform

    A picture is worth a thousand words. Or is it? Tables, charts, and pictures are all useful for understanding our data, but often we need a description – a story that tells us what we are looking at. Accelerated Text is a natural language generation tool which allows you to define data descriptions and then generates multiple versions of those descriptions, varying in wording and structure. Accelerated Text is a no-code natural language generation platform. It will help you construct document plans which define how your data is converted to textual descriptions. With Accelerated Text you can use such data to generate text for your business reports, your e-commerce platform, or your customer support system. Data descriptions require precision, and Accelerated Text follows the principle of strict adherence to data-bound text generation. Via its user interface, it provides instruments to define how the data should be translated into a descriptive text.
    Downloads: 0 This Week
    Last Update:
    See Project
  • 13
    Ad Generator

    Professional text randomizer and ad generator by Airat Khalitov

    Professional text randomizer and ad generator by Airat Khalitov. To install, go to 'Plugins > Add New', click 'Upload Plugin', upload the file 'ad-generator.zip', and activate Ad Generator from your Plugins page. Then create a new WordPress page, add the [ad_generator] shortcode, and save; go to the page and use the ad generator. This is a program for the industrial creation of pseudo-unique content, used, for example, when registering a site in multiple directories, so that in each directory the site is described by text that is unique from the point of view of search engines. Unlike similar tools (synonymizers, doorway-page generators), it allows you to maximize the readability of the resulting texts.
    Downloads: 0 This Week
    Last Update:
    See Project
  • 14
    Aida Lib

    Aida is a language agnostic library for text generation

    Aida is a language-agnostic library for text generation. When using Aida, you first compose a tree of operations on your text that includes conditions via branches and other control flow. Later, you fill the tree with data and render the text. A basic building block is the variable class, Var; use it to represent a value that you want to control later. A variable can hold numbers (e.g. float, int) or strings. You can create branches and complex logic with Branch. The context, represented by the class Ctx, is useful for creating rules that depend on what has been written before. Each object or literal that is passed to Aida is remembered by the context. Creating a reference expression is a common use case, so there is a helper function called create_ref. You can compose operations on your text with some handy operators.
    Downloads: 0 This Week
    Last Update:
    See Project
  • 15
    Basaran

    Basaran, an open-source alternative to the OpenAI text completion API

    Basaran is an open-source alternative to the OpenAI text completion API. It provides a compatible streaming API for your Hugging Face Transformers-based text generation models. The open source community will eventually witness the Stable Diffusion moment for large language models (LLMs), and Basaran allows you to replace OpenAI's service with the latest open-source model to power your application without modifying a single line of code. Stream generation using various decoding strategies. Support for both decoder-only and encoder-decoder models. Detokenizer that handles surrogates and whitespace. Multi-GPU support with optional 8-bit quantization. Real-time partial progress using server-sent events. Compatible with OpenAI API and client libraries. Comes with a fancy web-based playground. Docker images are available on Docker Hub and GitHub Packages. A minimal client-side sketch follows this entry.
    Downloads: 0 This Week
    Last Update:
    See Project
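    Because Basaran exposes an OpenAI-compatible completion API, an existing OpenAI client can simply be pointed at it. The sketch below assumes a Basaran server is already running locally and uses the legacy (pre-1.0) openai Python client; the address, model name, prompt, and key-handling comment are assumptions, not the project's own example.

        import openai

        # Point the standard OpenAI client at the local Basaran server.
        openai.api_base = "http://127.0.0.1/v1"
        openai.api_key = "placeholder"  # assumed: the key is not validated by Basaran itself

        completion = openai.Completion.create(
            model="bigscience/bloomz-560m",  # whichever model the server was started with
            prompt="Once upon a time,",
            max_tokens=32,
        )
        print(completion.choices[0].text)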
  • 16
    CPT

    CPT: A Pre-Trained Unbalanced Transformer

    A pre-trained unbalanced Transformer for both Chinese language understanding and generation. We replace the old BERT vocabulary with a larger one of size 51271 built from the training data, in which we 1) add 6800+ missing Chinese characters (most of them traditional Chinese characters); 2) remove redundant tokens (e.g. Chinese character tokens with the ## prefix); and 3) add some English tokens to reduce OOV. Position embeddings: we extend max_position_embeddings from 512 to 1024. We initialize the new version of the models from the old checkpoints with vocabulary alignment: token embeddings found in the old checkpoints are copied, and other newly added parameters are randomly initialized. We further train the new CPT & Chinese BART for 50K steps with batch size 2048, max-seq-length 1024, peak learning rate 2e-5, and warmup ratio 0.1. Aiming to unify both NLU and NLG tasks, we propose a novel Chinese Pre-trained Unbalanced Transformer (CPT).
    Downloads: 0 This Week
    Last Update:
    See Project
  • 17
    CRSLab

    CRSLab is an open-source toolkit

    CRSLab is an open-source toolkit for building Conversational Recommender Systems (CRS). It is developed based on Python and PyTorch. CRSLab has the following highlights. Comprehensive benchmark models and datasets: we have integrated 6 commonly used datasets and 18 models, including graph neural network and pre-training models such as R-GCN, BERT, and GPT-2; we have preprocessed these datasets to support these models and released them for download. Extensive and standard evaluation protocols: we support a series of widely adopted evaluation protocols for testing and comparing different CRS. General and extensible structure: we design a general and extensible structure to unify various conversational recommendation datasets and models, integrating various built-in interfaces and functions for quick development. Easy to get started: we provide simple yet flexible configuration for new researchers to quickly get started with our library. Human-machine interaction interfaces.
    Downloads: 0 This Week
    Last Update:
    See Project
  • 18
    GPT-2 FR

    GPT-2 French demo | Démo française de GPT-2

    An OpenAI GPT-2 model trained on four different French datasets: books in French, French film scripts, reports of parliamentary debates, and tweets by Emmanuel Macron, allowing French text to be generated. TensorFlow and gpt-2-simple are required in order to fine-tune GPT-2; create an environment, then install the two packages with pip install tensorflow==1.14 gpt-2-simple. A script and a notebook are available in the src folder to fine-tune GPT-2 on your own datasets. The output of each training run, i.e. the folder checkpoint/run1, should be placed in gpt2-model/model1, model2, model3, etc. You can run the script deploy_cloudrun.sh to deploy all your different models (in gpt2-model) at once; however, you must have already initialized the gcloud CLI tool (Cloud SDK).
    Downloads: 0 This Week
    Last Update:
    See Project
  • 19
    GPT2 for Multiple Languages

    GPT2 for Multiple Languages, including pretrained models

    With just 2 clicks (not including the Colab auth process), the 1.5B pretrained Chinese model demo is ready to go. The contents of this repository are for academic research purposes, and we do not provide any conclusive remarks. Research supported with Cloud TPUs from Google's TensorFlow Research Cloud (TFRC). Simplified GPT2 training scripts (based on Grover, supporting TPUs). Ported BERT tokenizer, multilingual corpus compatible. 1.5B GPT2 pretrained Chinese model (~15 GB corpus, 100k steps). Batteries-included Colab demo. 1.5B GPT2 pretrained Chinese model (~30 GB corpus, 220k steps).
    Downloads: 0 This Week
    Last Update:
    See Project
  • 20
    Insert Text

    Extends the media pool with an effect for outputting text in the image

    This addon extends the media manager with the "Bild" (image) text-output effect. The effect can be used, for example, to display a copyright notice, creation date, image title, etc. on images. The values set for the effect are treated as defaults and can be changed individually for each image, if necessary, via the effect parameters in its metadata. The text source of the effect can be selected here: either input for the Textausgabe (text output) field, or any meta field from the media pool. A text area can also be selected from the media pool, which opens up even more possibilities for this effect. The values set in the effect are treated as defaults for all images to which this media type is applied. If a text area from the metadata has been selected as the text source, an individual setting for the effect can be applied to each image.
    Downloads: 0 This Week
    Last Update:
    See Project
  • 21
    Market Reporter

    Automatic Generation of Brief Summaries of Time-Series Data

    Market Reporter automatically generates short comments that describe time-series data such as stock prices and FX rates. This is an implementation of Murakami et al. The tool stores data in Amazon S3; ask the administrator to give you AmazonS3FullAccess and issue a credentials file (for details, please read AWS Identity and Access Management). Install Docker and Docker Compose, edit envs/docker-compose.yaml according to your environment, then launch the containers with docker-compose. We recommend using pipenv to create a Python environment for this project. Suppose you have a database named master on your local machine. The prediction submodule generates a single comment for a financial instrument at a specified time by loading a trained model.
    Downloads: 0 This Week
    Last Update:
    See Project
  • 22
    Minimal text diffusion

    A minimal implementation of diffusion models for text generation

    A minimal implementation of diffusion models for text: it learns a diffusion model of a given text corpus, allowing you to generate text samples from the learned model. The main idea was to retain just enough code to allow training a simple diffusion model and generating samples, remove image-related terms, and make it easier to use. To train a model, run scripts/train.sh. By default, this will train a model on the simple corpus; however, you can change this to any text file using the --train_data argument. Note that you may have to increase the sequence length (--seq_len) if your corpus is longer than the simple corpus. The other default arguments are set to match the best setting I found for the simple corpus.
    Downloads: 0 This Week
    Last Update:
    See Project
  • 23
    PHP Client For NLP Cloud

    NLP Cloud serves high performance pre-trained or custom models for NER

    NLP Cloud serves high performance pre-trained or custom models for NER, sentiment-analysis, classification, summarization, dialogue summarization, paraphrasing, intent classification, product description and ad generation, chatbot, grammar and spelling correction, keywords and keyphrases extraction, text generation, image generation, blog post generation, code generation, question answering, automatic speech recognition, machine translation, language detection, semantic search, semantic similarity, tokenization, POS tagging, embeddings, and dependency parsing. It is ready for production, served through a REST API. You can either use the NLP Cloud pre-trained models, fine-tune your own models, or deploy your own models. Pass the model you want to use and the NLP Cloud token to the client during initialization. If you are making asynchronous requests, you will always receive a quick response containing a URL.
    Downloads: 0 This Week
    Last Update:
    See Project
  • 24
    PerlPP

    Perl preprocessor - embed Perl source in any file

    Translates Text+Perl to Text. It can be used for any kind of text templating, e.g. code generation. No external modules are required, just a single file. Requires Perl 5.10.1+. PerlPP runs in two passes: it generates a Perl script from your input, and then it runs the generated script. If you see an error at (eval ##) (for some number ##), it means there was an error in the generated script. The -D switch defines elements of %D; if you do not specify a value, the value true (a constant in the generated script) will be used. The following commands work mostly analogously to their C preprocessor counterparts, but $fn can be determined programmatically. Note that defines set with -D or -s do not take effect until after the script has been generated, which is after the macro code runs.
    Downloads: 0 This Week
    Last Update:
    See Project
  • 25
    Pipeline for training Language Models

    Pipeline for training Language Models using PyTorch.

    Pipeline for training language models using PyTorch, inspired by the Yandex Data School NLP Course (week 03: Language Modeling). Prepare a text file with space-separated words on each line.
    Downloads: 0 This Week
    Last Update:
    See Project