Business Software for Unsloth

Top Software that integrates with Unsloth as of July 2025


Compare business software, products, and services to find the best solution for your business or organization. Use the filters on the left to drill down by category, pricing, features, organization size, organization type, region, user reviews, integrations, and more. View and sort the products and solutions that match your needs in the results below.

  • 1
    ChatGPT

    OpenAI

    ChatGPT is an AI-powered conversational assistant developed by OpenAI that helps users with writing, learning, brainstorming, coding, and more. It is free to use with easy access via web and apps on multiple devices. Users can interact through typing or voice to get answers, generate creative content, summarize information, and automate tasks. The platform supports various use cases, from casual questions to complex research and coding help. ChatGPT offers multiple subscription plans, including Free, Plus, and Pro, with increasing access to advanced AI models and features. It is designed to boost productivity and creativity for individuals, students, professionals, and developers alike.
    Starting Price: Free
  • 2
    Mistral AI

    Mistral AI is a pioneering artificial intelligence startup specializing in open-source generative AI. The company offers a range of customizable, enterprise-grade AI solutions deployable across various platforms, including on-premises, cloud, edge, and devices. Flagship products include "Le Chat," a multilingual AI assistant designed to enhance productivity in both personal and professional contexts, and "La Plateforme," a developer platform that enables the creation and deployment of AI-powered applications. Committed to transparency and innovation, Mistral AI positions itself as a leading independent AI lab, contributing significantly to open-source AI and policy development.
    Starting Price: Free
  • 3
    Google Colab
    Google Colab is a free, hosted Jupyter Notebook service that provides cloud-based environments for machine learning, data science, and education. It offers no-setup access to computational resources such as GPUs and TPUs, making it ideal for data-intensive projects. Colab lets users run Python code in an interactive, notebook-style environment, share and collaborate on projects, and access extensive pre-built resources for efficient experimentation and learning. Colab also offers a Data Science Agent that automates analysis, from understanding the data to delivering insights in a working Colab notebook (the agent may make mistakes).
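    A minimal, illustrative sketch (not part of this listing) of how a Colab cell might confirm that a GPU runtime is attached before running GPU-heavy work such as model fine-tuning:

```python
# Illustrative Colab cell: check whether a GPU runtime is attached.
# Nothing here is specific to this listing; the T4 mention reflects
# Colab's commonly offered free-tier GPU.
import torch

if torch.cuda.is_available():
    print("GPU runtime:", torch.cuda.get_device_name(0))
else:
    print("No GPU attached. Select Runtime > Change runtime type and pick a GPU (e.g. T4).")
```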
  • 4
    Llama 3
    We’ve integrated Llama 3 into Meta AI, our intelligent assistant, which expands the ways people can get things done, create, and connect with Meta AI. You can see the performance of Llama 3 first-hand by using Meta AI for coding tasks and problem solving. Whether you're developing agents or other AI-powered applications, Llama 3 in both its 8B and 70B parameter sizes offers the capabilities and flexibility you need to develop your ideas. With the release of Llama 3, we’ve updated the Responsible Use Guide (RUG) to provide the most comprehensive information on responsible development with LLMs. Our system-centric approach includes updates to our trust and safety tools: Llama Guard 2, optimized to support the newly announced MLCommons taxonomy that expands coverage to a more comprehensive set of safety categories, plus Code Shield and CyberSec Eval 2.
    Starting Price: Free
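    Since this listing covers software that integrates with Unsloth, a minimal sketch of loading Llama 3 8B through Unsloth and attaching LoRA adapters may be useful; the checkpoint name and LoRA settings below are assumptions, not details from this entry:

```python
# Illustrative sketch: load Llama 3 8B via Unsloth and attach LoRA adapters
# for parameter-efficient fine-tuning. The checkpoint name and LoRA
# hyperparameters are assumptions for the example.
from unsloth import FastLanguageModel

model, tokenizer = FastLanguageModel.from_pretrained(
    model_name="unsloth/llama-3-8b-bnb-4bit",  # assumed Unsloth-hosted 4-bit build
    max_seq_length=4096,
    load_in_4bit=True,
)

model = FastLanguageModel.get_peft_model(
    model,
    r=16,                 # LoRA rank (assumed)
    lora_alpha=16,
    target_modules=["q_proj", "k_proj", "v_proj", "o_proj",
                    "gate_proj", "up_proj", "down_proj"],
    use_gradient_checkpointing="unsloth",
)
```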
  • 5
    Devstral

    Mistral AI

    Devstral is an open source, agentic large language model (LLM) developed by Mistral AI in collaboration with All Hands AI, specifically designed for software engineering tasks. It excels at navigating complex codebases, editing multiple files, and resolving real-world issues, outperforming all open source models on the SWE-Bench Verified benchmark with a score of 46.8%. Devstral is fine-tuned from Mistral-Small-3.1 and features a long context window of up to 128,000 tokens. It is optimized for local deployment on high-end hardware, such as a Mac with 32GB RAM or an Nvidia RTX 4090 GPU, and is compatible with inference frameworks like vLLM, Transformers, and Ollama. Released under the Apache 2.0 license, Devstral is available for free and can be accessed via Hugging Face, Ollama, Kaggle, Unsloth, and LM Studio.
    Starting Price: $0.1 per million input tokens
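    The entry above notes that Devstral can be served locally through Ollama; a minimal sketch using the Ollama Python client follows. The "devstral" model tag and the prompt are assumptions about a local setup, not details from this listing:

```python
# Illustrative sketch: query a locally served Devstral model through the Ollama
# Python client. Assumes Ollama is running and the model has already been
# pulled (the "devstral" tag is an assumption about the local setup).
import ollama

response = ollama.chat(
    model="devstral",
    messages=[{
        "role": "user",
        "content": "Which files would you edit to fix a failing unit test in src/utils/?",
    }],
)
print(response["message"]["content"])
```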
  • 6
    Llama 2
    The next generation of our open source large language model. This release includes model weights and starting code for pretrained and fine-tuned Llama language models, ranging from 7B to 70B parameters. Llama 2 pretrained models are trained on 2 trillion tokens and have double the context length of Llama 1. Its fine-tuned models have been trained on over 1 million human annotations. Llama 2 outperforms other open source language models on many external benchmarks, including reasoning, coding, proficiency, and knowledge tests. Llama 2 was pretrained on publicly available online data sources. The fine-tuned model, Llama-2-chat, leverages publicly available instruction datasets and over 1 million human annotations. We have a broad range of supporters around the world who believe in our open approach to today's AI, companies that have given early feedback and are excited to build with Llama 2.
    Starting Price: Free
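    As an illustrative sketch only, the fine-tuned Llama-2-chat model can be run locally with Hugging Face Transformers; the checkpoint id below assumes access has been granted on Hugging Face and is not a detail from this entry:

```python
# Illustrative sketch: generate text with the fine-tuned Llama-2-chat model via
# the Transformers text-generation pipeline. Assumes the gated checkpoint has
# been approved for your Hugging Face account.
from transformers import pipeline

generator = pipeline("text-generation", model="meta-llama/Llama-2-7b-chat-hf")
out = generator("Explain what a context window is, in two sentences.", max_new_tokens=128)
print(out[0]["generated_text"])
```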
  • 7
    Kaggle

    Kaggle offers a no-setup, customizable Jupyter Notebooks environment. Access free GPUs and a huge repository of community-published data and code. Inside Kaggle you’ll find all the code and data you need to do your data science work. Use over 19,000 public datasets and 200,000 public notebooks to conquer any analysis in no time.
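    A minimal sketch of pulling one of Kaggle's public datasets into a notebook session with the kagglehub client; the dataset slug below is just a well-known public example, not part of this listing:

```python
# Illustrative sketch: download a public Kaggle dataset with kagglehub.
# The dataset slug is an assumed example of a public dataset.
import kagglehub

path = kagglehub.dataset_download("uciml/iris")
print("Dataset files downloaded to:", path)
```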
  • 8
    Gemma

    Google

    Gemma is a family of lightweight, state-of-the-art open models built from the same research and technology used to create the Gemini models. Developed by Google DeepMind and other teams across Google, Gemma is inspired by Gemini, and the name reflects the Latin gemma, meaning “precious stone.” Accompanying our model weights, we’re also releasing tools to support developer innovation, foster collaboration, and guide the responsible use of Gemma models. Gemma models share technical and infrastructure components with Gemini, our largest and most capable AI model widely available today. This enables Gemma 2B and 7B to achieve best-in-class performance for their sizes compared to other open models. And Gemma models are capable of running directly on a developer laptop or desktop computer. Notably, Gemma surpasses significantly larger models on key benchmarks while adhering to our rigorous standards for safe and responsible outputs.
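    A minimal sketch of running the small Gemma checkpoint locally with Hugging Face Transformers follows; the model id assumes the Gemma license has been accepted on Hugging Face and is not a detail from this entry:

```python
# Illustrative sketch: run the 2B instruction-tuned Gemma checkpoint locally.
# Assumes the Gemma license has been accepted for your Hugging Face account.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "google/gemma-2b-it"  # assumed instruction-tuned 2B checkpoint
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

inputs = tokenizer("Write a haiku about precious stones.", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=40)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```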
  • 9
    NVIDIA DRIVE
    Software is what turns a vehicle into an intelligent machine. The NVIDIA DRIVE™ Software stack is open, empowering developers to efficiently build and deploy a variety of state-of-the-art AV applications, including perception, localization and mapping, planning and control, driver monitoring, and natural language processing. The foundation of the DRIVE Software stack, DRIVE OS is the first safe operating system for accelerated computing. It includes NvMedia for sensor input processing, NVIDIA CUDA® libraries for efficient parallel computing implementations, NVIDIA TensorRT™ for real-time AI inference, and other developer tools and modules to access hardware engines. The NVIDIA DriveWorks® SDK provides middleware functions on top of DRIVE OS that are fundamental to autonomous vehicle development. These consist of the sensor abstraction layer (SAL) and sensor plugins, data recorder, vehicle I/O support, and a deep neural network (DNN) framework.
  • 10
    Llama

    Meta

    Llama (Large Language Model Meta AI) is a state-of-the-art foundational large language model designed to help researchers advance their work in this subfield of AI. Smaller, more performant models such as Llama enable others in the research community who don’t have access to large amounts of infrastructure to study these models, further democratizing access in this important, fast-changing field. Training smaller foundation models like Llama is desirable in the large language model space because it requires far less computing power and resources to test new approaches, validate others’ work, and explore new use cases. Foundation models train on a large set of unlabeled data, which makes them ideal for fine-tuning for a variety of tasks. We are making Llama available at several sizes (7B, 13B, 33B, and 65B parameters) and also sharing a Llama model card that details how we built the model in keeping with our approach to Responsible AI practices.