Roadmap
This section outlines the exciting milestones and future developments planned for Lumo Labs.
The Lumo-8B-DS-Instruct dataset [Completion: 15th January, 2025], comprising 5,502 high-quality question-answer pairs, has been successfully launched on the Hugging Face Hub as an open-source resource.
This dataset provides a valuable foundation for researchers and developers interested in training and fine-tuning AI models specifically for the Solana ecosystem.
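For developers who want to start experimenting right away, here is a minimal sketch of loading the dataset with the Hugging Face datasets library. The repository id and split name below are assumptions based on the lumolabs-ai organisation naming used for the later Lumo-Iris-DS-Instruct release, so check the Hub listing for the exact values.

```python
# Minimal sketch: loading the Lumo-8B-DS-Instruct dataset from the Hugging Face Hub.
# The repository id and split name are assumptions, not confirmed by this roadmap.
from datasets import load_dataset

dataset = load_dataset("lumolabs-ai/Lumo-8B-DS-Instruct", split="train")

print(len(dataset))   # expected to be on the order of 5,502 question-answer pairs
print(dataset[0])     # inspect one record to see the actual field names
```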
The Lumo-8B-Instruct model [Completion: 15th January, 2025], trained on the Lumo-8B-DS-Instruct dataset and leveraging the powerful Llama 3.1 8B parameter foundation, has been successfully launched on the Hugging Face Hub as an open-source resource.
This marks a significant achievement for the Lumo Labs community, making Lumo-8B-Instruct readily accessible for developers and researchers to experiment with and build upon.
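As a starting point, here is a minimal sketch of running the model with the transformers library (recent versions of transformers accept chat messages directly in the text-generation pipeline). The repository id "lumolabs-ai/Lumo-8B-Instruct" is an assumption based on the organisation naming used for the Lumo datasets and should be confirmed on the Hub.

```python
# Minimal sketch: running Lumo-8B-Instruct via the transformers text-generation pipeline.
# The repository id is an assumption; verify it on the Hugging Face Hub.
import torch
from transformers import pipeline

generator = pipeline(
    "text-generation",
    model="lumolabs-ai/Lumo-8B-Instruct",
    torch_dtype=torch.bfloat16,
    device_map="auto",
)

messages = [{"role": "user", "content": "How do I create an SPL token on Solana?"}]
result = generator(messages, max_new_tokens=256)
# The pipeline returns the full chat, with the assistant reply as the last message.
print(result[0]["generated_text"][-1]["content"])
```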
Lumo-8B-Instruct has been natively listed on Ollama, enabling users to run inference on the model directly and use it locally out of the box. [Completion: 15th January, 2025] https://ollama.com/lumolabs/Lumo-8B-Instruct
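For local use, a minimal sketch of querying the Ollama listing through the Ollama Python client, assuming the Ollama service is running and the model has already been pulled from the page above:

```python
# Minimal sketch: local inference against the Ollama listing via the ollama Python client.
# Assumes Ollama is running locally and the model tag matches the library page
# (https://ollama.com/lumolabs/Lumo-8B-Instruct).
import ollama

response = ollama.chat(
    model="lumolabs/Lumo-8B-Instruct",
    messages=[{"role": "user", "content": "Explain rent-exempt accounts on Solana."}],
)
print(response["message"]["content"])
```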
https://try-lumo8b.lumolabs.ai [Completion: 16th January, 2025]
While Lumo-8B-Instruct can be readily accessed through the Hugging Face Hub and run locally or on servers, we have developed a chatbot interface so that users can try out Lumo without having to host the model themselves.
This chatbot provides an intuitive and accessible way to interact with the model, enabling seamless exploration of its capabilities.
Lumo has set a milestone by launching the largest dataset ever released for a Solana-focused model. [Completion: 18th January, 2025] [Knowledge cut-off date: 17th January, 2025]
Lumo-Iris-DS-Instruct is the ultimate powerhouse of Solana-related knowledge, featuring a groundbreaking 28,518 high-quality question-answer pairs. This dataset is 5x larger, more comprehensive, and meticulously refined compared to its predecessor, Lumo-8B-DS-Instruct. With cutting-edge enhancements and superior coverage, it sets the gold standard for fine-tuning large language models for Solana. https://huggingface.co/datasets/lumolabs-ai/Lumo-Iris-DS-Instruct
Solana's largest-ever language model was launched on 21st January, 2025: Lumo-70B-Instruct, fine-tuned on the Lumo-Iris-DS-Instruct dataset on top of Meta Llama 3.3 70B Instruct. [Completion: 21st January, 2025]
Lumo-70B-Instruct develops Solana code more capably than any previous Lumo model, converses more fluently, and is the most suitable model for building agents on Solana.
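Given the 70 billion parameter count, a common way to experiment with the model on a single high-memory GPU is 4-bit quantization. The sketch below assumes the repository id "lumolabs-ai/Lumo-70B-Instruct" and illustrative quantization settings; it is not an official recommendation.

```python
# Minimal sketch: loading Lumo-70B-Instruct with 4-bit quantization via bitsandbytes.
# The repository id and quantization settings are assumptions for illustration only.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig

model_id = "lumolabs-ai/Lumo-70B-Instruct"
quant_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_compute_dtype=torch.bfloat16,
)

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    quantization_config=quant_config,
    device_map="auto",
)

# Build a chat-formatted prompt and generate a response.
prompt = tokenizer.apply_chat_template(
    [{"role": "user", "content": "Write an Anchor instruction that transfers SOL."}],
    tokenize=False,
    add_generation_prompt=True,
)
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=256)
# Decode only the newly generated tokens, skipping the prompt.
print(tokenizer.decode(output[0][inputs["input_ids"].shape[-1]:], skip_special_tokens=True))
```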
We are committed to continuous improvement and are actively working on training a larger, more powerful model with over 70 billion parameters.
This ambitious project will involve the creation of an expanded dataset comprising over 25,000 high-quality question-answer pairs, further enhancing the model's understanding and capabilities within the Solana ecosystem.
Lumo Labs envisions creating the most comprehensive open-source library of AI agents specifically designed for the Solana ecosystem.
This toolkit will empower developers to build sophisticated AI-powered applications on Solana, including:
Decentralized AI agents
Autonomous market makers
Predictive analytics tools
And much more.