The convergence of Artificial Intelligence (AI) and the Internet of Things (IoT) presents unprecedented opportunities for advanced data processing and analysis. As IoT devices evolve to process data within constrained resources, the demand for real-time AI computation that leverages local sensory data while preserving data privacy becomes paramount. However, achieving an adaptive and reliable AIoT system that seamlessly integrates sensing and computing poses significant challenges.
This workshop aims to address these challenges by focusing on the development of resource-efficient Mobile and Embedded Large Language Model (LLM) systems within the AIoT framework. We aim to explore strategies for maintaining system robustness in dynamic environments, optimizing resource and energy utilization on IoT devices, and updating AI models using local data. The workshop serves as a platform for researchers, system developers, and practitioners to discuss the design, development, deployment, and operational hurdles of adaptive AIoT systems.
Contributions encompassing both theoretical and practical aspects are encouraged, including advancements in embedded and edge AI systems, on-device machine learning algorithms, and methodological considerations in AIoT systems. Join us in exploring the frontier of resource-efficient Mobile and Embedded Large Language Model Systems in the context of AIoT. Potential topics of interest include, but are not limited to:
Submission site: HotCRP Link