This is why we are proud to introduce a novel approach based on a hardware-aware, training-free NAS method. Unlike traditional training-free NAS techniques, which focus solely on performance proxies, our approach integrates the energy consumption measured on specific hardware, producing DNN designs tailored to a variety of devices.
NebulOS is inherently user-centric: it allows users to customize the weight given to energy efficiency, ensuring that the final model aligns with the end user's needs, resources, and preferences. Such a framework helps democratize advanced AI solutions by extending their applicability to scenarios with limited computational resources.
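To make the idea concrete, here is a minimal, hypothetical sketch of how a training-free performance proxy and a device-specific energy estimate can be combined into a single, user-weighted score used to rank candidate architectures. The names (`hardware_aware_score`, `proxy_score`, `energy_cost`, `alpha`) and the linear trade-off are illustrative assumptions, not NebulOS's actual API or implementation.

```python
def hardware_aware_score(proxy_score: float, energy_cost: float, alpha: float = 0.5) -> float:
    """Combine a training-free performance proxy with a device-specific energy estimate.

    Illustrative sketch only; the linear trade-off below is an assumption.

    proxy_score : training-free estimate of accuracy potential, normalized to [0, 1]
    energy_cost : estimated energy on the target device, normalized to [0, 1]
    alpha       : user-chosen weight on energy efficiency
                  (0 = rank by proxy only, 1 = rank by energy only)
    """
    return (1.0 - alpha) * proxy_score - alpha * energy_cost


# Example: rank a few candidate architectures for an energy-sensitive user.
candidates = {
    "arch_a": {"proxy": 0.82, "energy": 0.60},
    "arch_b": {"proxy": 0.78, "energy": 0.25},
    "arch_c": {"proxy": 0.90, "energy": 0.95},
}
alpha = 0.7  # this user weighs energy efficiency more than raw accuracy potential
best = max(
    candidates,
    key=lambda k: hardware_aware_score(candidates[k]["proxy"], candidates[k]["energy"], alpha),
)
print(best)  # with alpha = 0.7, the low-energy "arch_b" comes out on top
```

The same candidates would rank differently for a small `alpha`, which is precisely the user-controlled trade-off described above.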
This has the potential to empower researchers at regular institutions and SMEs to contribute to the AI race, bringing the frontier of AI innovation to the global scale needed to ensure this technology remains genuinely beneficial to all. In the pursuit of inclusive and sustainable AI development, and to foster and leverage a community-driven approach, NebulOS is completely open source on GitHub (fracapuano/NebulOS), where users can effortlessly access, modify, propose changes to, and expand upon the work presented.
We researched the best launch strategy for our open-source project by interviewing individual developers and small AI-acceleration research groups, and concluded that clear communication is essential for community development.
We crafted the README file as the cover of our GitHub repository, where we concisely conveyed the project's aim and explained what our algorithm can accomplish. With this, potential users can quickly ramp up on the latest and upcoming features, and learn how they themselves can contribute to the project. Furthermore, we developed a web application that accessibly showcases our work and attracts new potential contributors.