Showing 11 open source projects for "aes-256"

  • 1
    IronClaw

    IronClaw is inspired by OpenClaw but focused on privacy and security

    IronClaw is a security-first, open-source personal AI assistant built in Rust and designed to keep your data fully under your control. It operates on the principle that your AI should work for you, not external vendors, ensuring all data is stored locally, encrypted, and never shared. The platform emphasizes transparency, offering auditable code with no hidden telemetry or data harvesting. IronClaw runs untrusted tools inside isolated WebAssembly (WASM) sandboxes with strict capability-based...
    Downloads: 9 This Week
  • 2
    Stake Crash Predictor

    Stake Crash Predictor is a toolkit for Stake mines prediction and Plinko.

    The Stake Crash Predictor is a focused toolkit that combines statistical analysis, optional server-fairness seed-hash verification helpers, and AI-assisted summaries to help you study rounds on Stake.us. The project centers on the stake mines predictor and stake predictor workflows: a demo-focused stake crash predictor app with seed-inspection helpers (SHA-512 / SHA-256), AI-assisted summaries, and demo bot templates for the mines predictor as well. Start in demo mode to test safely. Disclaimer: for educational and testing purposes only; no predictive or gameplay guarantees.
    Downloads: 168 This Week
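The "seed hash decrypt helpers" mentioned above are better understood as commitment verification: hashes cannot be decrypted, but a revealed server seed can be checked against its pre-published hash. A minimal sketch using Python's standard hashlib (the seed value and function name are illustrative, not this project's API):

```python
import hashlib

def verify_seed(revealed_seed: str, published_hash: str) -> bool:
    """Check that a revealed server seed matches its pre-published SHA-256 commitment."""
    return hashlib.sha256(revealed_seed.encode()).hexdigest() == published_hash

# The casino publishes the hash before the round, then reveals the seed after.
seed = "example-server-seed"
commitment = hashlib.sha256(seed.encode()).hexdigest()

print(verify_seed(seed, commitment))        # True
print(verify_seed("tampered", commitment))  # False
```

This verifies fairness after the fact; it does not (and cannot) predict outcomes in advance.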
  • 3
    DALL-E in Pytorch

    Implementation / replication of DALL-E, OpenAI's text-to-image model

    ...Currently only the VAE with a codebook size of 1024 is offered, with the hope that it may train a little faster than OpenAI's, which has a size of 8192. In contrast to OpenAI's VAE, it also has an extra layer of downsampling, so the image sequence length is 256 instead of 1024 (this leads to a 16x reduction in training costs, when you do the math).
    Downloads: 0 This Week
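The cost arithmetic in the description above can be spelled out: one extra downsampling layer quarters the number of image tokens, and since self-attention cost scales quadratically with sequence length, that works out to the quoted 16x:

```python
# Self-attention cost grows quadratically with sequence length, so cutting the
# sequence from 1024 tokens to 256 reduces attention cost by (1024/256)**2 = 16x.
openai_seq_len = 1024  # OpenAI's VAE latent grid (32x32)
this_seq_len = 256     # 16x16 grid after one extra downsampling layer

token_ratio = openai_seq_len // this_seq_len                  # 4x fewer tokens
attention_cost_ratio = (openai_seq_len / this_seq_len) ** 2   # 16x cheaper attention

print(token_ratio, attention_cost_ratio)  # 4 16.0
```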
  • 4
    BytePS

    A high performance and generic framework for distributed DNN training

    ...BytePS outperforms existing open-source distributed training frameworks by a large margin. For example, on BERT-large training, BytePS can achieve ~90% scaling efficiency with 256 GPUs (see below), which is much higher than Horovod+NCCL. In certain scenarios, BytePS can double the training speed compared with Horovod+NCCL. We show our experiment on BERT-large training, which is based on the GluonNLP toolkit. The model uses mixed precision. We use Tesla V100 32GB GPUs and set the batch size to 64 per GPU. Each machine has 8 V100 GPUs (32GB memory) with NVLink enabled. ...
    Downloads: 0 This Week
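Scaling efficiency as quoted above is measured throughput divided by ideal linear throughput. A quick illustration with hypothetical per-GPU numbers (the 60 samples/s figure is invented for the example, not a BytePS benchmark):

```python
def scaling_efficiency(throughput_n_gpus: float, throughput_1_gpu: float, n_gpus: int) -> float:
    """Fraction of ideal linear speedup achieved when scaling to n GPUs."""
    return throughput_n_gpus / (throughput_1_gpu * n_gpus)

# Illustrative numbers: if one V100 processes 60 samples/s, perfect scaling with
# 256 GPUs would give 15,360 samples/s; measuring 13,824 samples/s means ~90%.
print(scaling_efficiency(13824, 60, 256))  # 0.9
```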
  • 5
    PyTorch pretrained BigGAN

    PyTorch implementation of BigGAN with pretrained weights

    An op-for-op PyTorch reimplementation of DeepMind's BigGAN model with the pre-trained weights from DeepMind. This repository contains an op-for-op PyTorch reimplementation of DeepMind's BigGAN that was released with the paper Large Scale GAN Training for High Fidelity Natural Image Synthesis. This PyTorch implementation of BigGAN is provided with the pretrained 128x128, 256x256 and 512x512 models by DeepMind. We also provide the scripts used to download and convert these models from the...
    Downloads: 0 This Week
  • 6
    VedaViz

    Text to graphics (3D, animation), (semi-)automatically

    "When students are creating themselves, learning is taking place. And teachers will be at the epicenter of this. Anyone who believes differently has never had a good teacher. I would trade all of my technology for an afternoon with Socrates." (Steve Jobs, http://www.thedailybeast.com/newsweek/2001/10/28/the-classroom-of-the-future.html)
    Downloads: 1 This Week
  • 7
    ANts P2P
    ANts P2P realizes a third-generation P2P network. It protects your privacy while you are connected and makes you untraceable, hiding your identity (IP) and encrypting everything you send to or receive from others.
    Downloads: 10 This Week
  • 8
    The purpose of this project is to provide a biometric security solution using voice print, fingerprint, and/or facial recognition, along with a password and/or smart card support, using AES to protect data. Please read the forums if interested.
    Downloads: 0 This Week
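Projects like this typically derive the AES-256 key from a password or biometric-gated secret rather than storing it directly. A minimal sketch of the key-derivation step using Python's standard-library PBKDF2 (the password, salt handling, and iteration count are illustrative, not this project's configuration):

```python
import hashlib
import os

def derive_aes256_key(password: str, salt: bytes) -> bytes:
    """Derive a 32-byte (256-bit) AES key from a password with PBKDF2-HMAC-SHA256."""
    return hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 600_000, dklen=32)

salt = os.urandom(16)  # store alongside the ciphertext; never reuse across users
key = derive_aes256_key("correct horse battery staple", salt)
print(len(key))  # 32
```

The derived key would then feed an AES cipher from a crypto library; the standard library itself provides only the key derivation, not AES.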
  • 9
    Ministral 3 14B Base 2512

    Powerful 14B-base multimodal model — flexible base for fine-tuning

    ...The model remains efficient enough for on-prem or local deployment: it fits in ~32 GB of VRAM in BF16 and requires under ~24 GB when quantized. It supports dozens of languages, making it suitable for multilingual applications around the world. With a large 256k-token context window, Ministral 3 14B Base 2512 can handle very long inputs, complex documents, or large contexts.
    Downloads: 0 This Week
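The quoted memory figures follow from simple parameter arithmetic (params x bytes per param, plus overhead). A rough sketch; the 10% overhead factor is an assumption, and real quantized footprints depend on the quantization scheme:

```python
def model_vram_gb(n_params_billion: float, bits_per_param: int, overhead: float = 1.1) -> float:
    """Rough weight-memory estimate in GB; overhead approximates runtime buffers."""
    return n_params_billion * (bits_per_param / 8) * overhead

print(round(model_vram_gb(14, 16), 1))  # BF16: ~30.8 GB, consistent with "fits in ~32 GB"
print(round(model_vram_gb(14, 8), 1))   # 8-bit: ~15.4 GB, under the ~24 GB quantized figure
```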
  • 10
    Ministral 3 3B Reasoning 2512

    Compact 3B-param multimodal model for efficient on-device reasoning

    Ministral 3 3B Reasoning 2512 is the smallest reasoning-capable model in the Ministral 3 family, yet it delivers a surprisingly capable multimodal and multilingual base for lightweight AI applications. It pairs a 3.4B-parameter language model with a 0.4B-parameter vision encoder, enabling it to understand both text and image inputs. This reasoning-tuned variant is optimized for tasks like math, coding, and other STEM-related problem solving, making it suitable for applications that require...
    Downloads: 0 This Week
  • 11
    Ministral 3 8B Base 2512

    Versatile 8B-base multimodal LLM, flexible foundation for custom AI

    Ministral 3 8B Base 2512 is a mid-sized, dense model in the Ministral 3 series, designed as a general-purpose foundation for text and image tasks. It pairs an 8.4B-parameter language model with a 0.4B-parameter vision encoder, enabling unified multimodal capabilities out of the box. As a “base” model (i.e., not fine-tuned for instruction or reasoning), it offers a flexible starting point for custom downstream tasks or fine-tuning. The model supports a large 256k token context window, making...
    Downloads: 0 This Week