Showing 2 open source projects for "code block installer"

  • 1
    Superagent

    Superagent protects your AI applications

    Superagent is an open-source AI safety platform built to protect applications from prompt injections, data leaks, and harmful outputs. It embeds real-time safety directly into AI workflows, helping teams secure models before threats cause damage. Superagent provides guardrails that block jailbreaks, prompt manipulation, and sensitive data exfiltration. It includes redaction tools that automatically remove PII, PHI, and secrets from text. The platform also scans code repositories to detect AI-specific attack vectors such as repo poisoning. Superagent is designed for low-latency production environments and works with any major LLM provider. ... (A conceptual redaction sketch follows this list.)
    Downloads: 1 This Week
    See Project
  • 2
    ChatLLM Web

    Chat with an LLM like Vicuna entirely in your browser with WebGPU

    Chat with an LLM like Vicuna entirely in your browser with WebGPU, safely, privately, and with no server. Powered by web-llm. To use this app, you need a browser that supports WebGPU, such as Chrome 113 or Chrome Canary; Chrome versions ≤ 112 are not supported. You will need a GPU with about 6.4 GB of memory; if your GPU has less, the app will still run, but responses will be slower. The first time you use the app, you will need to download the model. For the Vicuna-7b model that... (A WebGPU feature-detection sketch follows this list.)
    Downloads: 0 This Week
    See Project
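
The Superagent listing above mentions guardrails and automatic redaction of PII and secrets. Superagent's own API is not shown on this page, so what follows is only a minimal, self-contained TypeScript sketch of the general technique the description refers to: a regex-based redaction pass applied to text before it reaches an LLM. The rule names, patterns, and `redact` function are illustrative assumptions, not Superagent code.

```typescript
// Conceptual sketch only: a naive regex-based redaction pass of the kind a
// guardrail layer might run before a prompt reaches an LLM provider.
// This is NOT Superagent's API; the rules and names are illustrative assumptions.

const REDACTION_RULES: Array<{ label: string; pattern: RegExp }> = [
  // Secrets are matched first so long digit runs inside API keys are not
  // split up by the broader phone-number rule below.
  { label: "SECRET", pattern: /\b(?:sk|api|key)[-_][A-Za-z0-9]{16,}\b/g },
  { label: "EMAIL", pattern: /[\w.+-]+@[\w-]+\.[\w.-]+/g },
  { label: "PHONE", pattern: /\+?\d[\d\s().-]{8,}\d/g },
];

// Replace every match with a typed placeholder so the prompt keeps its
// structure but leaks no sensitive values.
function redact(text: string): string {
  return REDACTION_RULES.reduce(
    (acc, rule) => acc.replace(rule.pattern, `[REDACTED_${rule.label}]`),
    text,
  );
}

// Example: guard a prompt before forwarding it to any LLM provider.
const userPrompt =
  "Reach jane.doe@example.com and use key sk-abcdef1234567890abcd";
console.log(redact(userPrompt));
// -> "Reach [REDACTED_EMAIL] and use key [REDACTED_SECRET]"
```

A production guardrail would go well beyond fixed regexes (entity recognition, policy checks, jailbreak detection), but the shape is the same: transform or block text before it leaves your application.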
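
The ChatLLM Web entry depends on WebGPU being available (Chrome 113+). As a hedged illustration of that requirement, and not code taken from the ChatLLM Web project, here is a small TypeScript feature-detection sketch: it checks whether the browser exposes `navigator.gpu` and whether a GPU adapter can be obtained before any in-browser model is loaded.

```typescript
// Minimal feature-detection sketch for the WebGPU requirement mentioned above.
// Runs in the browser; `navigator.gpu` only exists where WebGPU is available
// (e.g. Chrome 113+). Typed loosely so it compiles without @webgpu/types.

interface MinimalGPU {
  requestAdapter(): Promise<object | null>;
}

async function hasUsableWebGPU(): Promise<boolean> {
  const gpu = (navigator as unknown as { gpu?: MinimalGPU }).gpu;
  if (!gpu) {
    console.warn("WebGPU is not exposed by this browser (Chrome 113+ or Canary is needed).");
    return false;
  }
  // requestAdapter() resolves to null when no suitable GPU adapter exists.
  const adapter = await gpu.requestAdapter();
  if (!adapter) {
    console.warn("WebGPU is exposed, but no compatible GPU adapter was found.");
    return false;
  }
  return true;
}

// Usage: gate the model download/initialization on the check.
hasUsableWebGPU().then((ok) => {
  if (ok) {
    console.log("WebGPU available: safe to start downloading the in-browser model.");
  }
});
```

The memory requirement (about 6.4 GB of GPU memory for the Vicuna-7b model) cannot be probed as directly; per the description, the app still runs with less memory, just with slower responses.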