Roadmap
This section outlines the exciting milestones and future developments planned for Lumo Labs.
The Lumo-8B-DS-Instruct dataset [Completion: 15th January, 2025], comprising 5,502 high-quality question-answer pairs, has been successfully launched on the Hugging Face Hub as an open-source resource.
This dataset provides a valuable foundation for researchers and developers interested in training and fine-tuning AI models specifically for the Solana ecosystem.
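As a sketch of how such question-answer pairs could be rendered into instruction-tuning text, assuming the dataset exposes `question` and `answer` fields (an assumption about its schema) and using the Llama 3 chat token conventions:

```python
# Sketch: format one QA pair into a Llama 3-style instruction-tuning string.
# The field names "question" and "answer" are assumptions about the dataset
# schema; adjust them to the actual column names on the Hub.

def format_qa_pair(pair: dict) -> str:
    """Render a single question-answer pair as one training example."""
    return (
        "<|begin_of_text|><|start_header_id|>user<|end_header_id|>\n"
        f"{pair['question']}<|eot_id|>"
        "<|start_header_id|>assistant<|end_header_id|>\n"
        f"{pair['answer']}<|eot_id|>"
    )

example = {
    "question": "What is an SPL token?",
    "answer": "An SPL token is a token created with Solana's Token Program.",
}
print(format_qa_pair(example))
```

In practice the dataset itself would be pulled with `datasets.load_dataset` and this function mapped over it before tokenization.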
The Lumo-8B-Instruct model [Completion: 15th January, 2025], trained on the Lumo-8B-DS-Instruct dataset and leveraging the powerful Llama 3.1 8B parameter foundation, has been successfully launched on the Hugging Face Hub as an open-source resource.
This marks a significant achievement for the Lumo Labs community, making Lumo-8B-Instruct readily accessible for developers and researchers to experiment with and build upon.
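A minimal sketch of running the model locally with the `transformers` pipeline API follows; the repo id used here is an assumption, so check the Lumo Labs organisation on the Hugging Face Hub for the exact name:

```python
# Sketch: query Lumo-8B-Instruct locally via the transformers pipeline API.
# The repo id "lumolabs-ai/Lumo-8B-Instruct" is an assumption, not a
# confirmed identifier -- verify it on the Hugging Face Hub before use.

def ask_lumo(question: str, model_id: str = "lumolabs-ai/Lumo-8B-Instruct") -> str:
    """Generate an answer from the model for a single user question."""
    from transformers import pipeline  # requires `pip install transformers torch`

    chat = pipeline("text-generation", model=model_id, device_map="auto")
    messages = [{"role": "user", "content": question}]
    result = chat(messages, max_new_tokens=256)
    # The pipeline returns the full chat history; the last turn is the reply.
    return result[0]["generated_text"][-1]["content"]

# Example (downloads the model weights on first run):
# print(ask_lumo("How do I create an SPL token on Solana?"))
```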
[Completion: 16th January, 2025]
While Lumo-8B-Instruct can be downloaded from the Hugging Face Hub and run locally or on servers, we have also developed a chatbot interface so that users can try Lumo without running inference themselves.
This chatbot will provide an intuitive and accessible way for users to interact with the model, enabling seamless exploration of its capabilities.
Lumo set a milestone by launching the largest dataset ever built for a Solana model. [Completion: 18th January, 2025] [Knowledge cut-off date: 17th January, 2025]
Lumo-Iris-DS-Instruct is the ultimate powerhouse of Solana-related knowledge, featuring a groundbreaking 28,518 high-quality question-answer pairs. This dataset is 5x larger, more comprehensive, and meticulously refined compared to its predecessor, Lumo-8B-DS-Instruct. With cutting-edge enhancements and superior coverage, it sets the gold standard for fine-tuning large language models for Solana.
Solana's largest-ever language model launched on 21st January, 2025: Lumo-70B-Instruct, fine-tuned on the Lumo-Iris-DS-Instruct dataset over Meta Llama 3.3 70B Instruct. [Completion: 21st January, 2025]
Lumo-70B-Instruct writes Solana code more capably than any prior Lumo model, converses more fluently, and is the best-suited model in the family for building agents on Solana.
We are committed to continuous improvement and are actively working on training a larger, more powerful model with over 70 billion parameters.
This ambitious project will involve the creation of an expanded dataset comprising over 25,000 high-quality question-answer pairs, further enhancing the model's understanding and capabilities within the Solana ecosystem.
Lumo organised one of the biggest hackathon prize pools a memecoin project had ever seen. The week-long hackathon asked participants to build on top of Lumo's resources and showcase their technical abilities. [Completion: 31st January, 2025]
A total of 78 submissions were received despite the tight deadline, with several notable entries showcasing Lumo's abilities; visit the link below for more information:
The largest-ever blockchain AI dataset launched on this date: an extremely powerful dataset built from several official Solana documentation sources, Solana StackExchange data dumps, and more (19+ authoritative references). [Completion: 1st February, 2025]
It consists of 95,127 high-quality question-answer pairs. This dataset is 3.3x larger than its predecessor, Lumo-Iris-DS-Instruct, with enhanced precision, comprehensive coverage, and an optimized architecture for large-scale AI fine-tuning in the Solana ecosystem.
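The growth figures quoted across the dataset lineage can be sanity-checked with quick arithmetic:

```python
# Quick arithmetic check of the dataset growth figures quoted above.
lumo_8b = 5_502         # Lumo-8B-DS-Instruct pairs
lumo_iris = 28_518      # Lumo-Iris-DS-Instruct pairs
latest_dataset = 95_127  # pairs in the 1st February, 2025 dataset

print(round(lumo_iris / lumo_8b, 1))         # ~5.2x, matching the "5x" claim
print(round(latest_dataset / lumo_iris, 1))  # ~3.3x, matching the "3.3x" claim
```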
Lumo Labs envisions creating the most comprehensive open-source library of AI agents specifically designed for the Solana ecosystem.
This toolkit will empower developers to build sophisticated AI-powered applications on Solana, including:
Decentralized AI agents
Autonomous market makers
Predictive analytics tools
And much more.