We're thrilled to present OneLLM, a framework that ensembles Meta-Transformer with Multimodal Large Language Models. It performs multimodal joint training, supports additional modalities including fMRI, depth, and normal maps, and achieves strong results on 25 benchmarks.
Features
- A Single Foundation Model Supporting a Wide Range of Applications
- Meta-Transformer Handles up to 12 Modalities
- Shared Encoder, Unpaired Data, More Modalities
- Open-Source Modality-Agnostic Models
- Usage Demo for the Pretrained Encoder
- This project is released under the Apache 2.0 license
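The core idea behind the "shared encoder, more modalities" design is that each modality gets its own lightweight tokenizer that maps raw inputs into a common embedding space, after which a single shared backbone processes every modality identically. The sketch below illustrates this pattern with NumPy; the tokenizer shapes, embedding width, and `encode` helper are hypothetical stand-ins for illustration, not the project's actual API.

```python
import numpy as np

rng = np.random.default_rng(0)
EMBED_DIM = 64  # hypothetical shared embedding width

# Hypothetical modality-specific tokenizers: each projects raw
# patches/voxels into the common token embedding space.
tokenizers = {
    "image": rng.standard_normal((768, EMBED_DIM)) * 0.02,   # e.g. 16x16x3 patches
    "depth": rng.standard_normal((256, EMBED_DIM)) * 0.02,   # e.g. 16x16x1 patches
    "fmri":  rng.standard_normal((1024, EMBED_DIM)) * 0.02,  # e.g. flattened voxels
}

# A single weight matrix stands in for the shared, modality-agnostic
# transformer backbone (toy simplification).
shared_encoder = rng.standard_normal((EMBED_DIM, EMBED_DIM)) * 0.02

def encode(modality: str, tokens: np.ndarray) -> np.ndarray:
    """Project modality-specific tokens into the shared space,
    pass them through the shared encoder, and mean-pool."""
    embedded = tokens @ tokenizers[modality]       # (n_tokens, EMBED_DIM)
    features = np.tanh(embedded @ shared_encoder)  # shared backbone (toy)
    return features.mean(axis=0)                   # (EMBED_DIM,) pooled feature

# Every modality ends up as a fixed-size vector in the same space,
# so the same downstream head can consume any of them.
img_feat = encode("image", rng.standard_normal((196, 768)))
depth_feat = encode("depth", rng.standard_normal((196, 256)))
fmri_feat = encode("fmri", rng.standard_normal((8, 1024)))
print(img_feat.shape, depth_feat.shape, fmri_feat.shape)  # all (64,)
```

Because only the tokenizers differ per modality, adding a new modality means training one small projection rather than a whole new encoder, which is what makes training on unpaired data across many modalities tractable.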
Categories
- Machine Learning
License
- Apache License V2.0