

Resilient Edge-Cloud Autonomous Learning with Timely Inferences

Project Advisor and Mentors: Professor Anand Sarwate, Professor Waheed Bajwa, Nitya Sathyavageeswaran, Vishakha Ramani, Yu Wu, Aliasghar Mohammadsalehi, Muhammad Zulqarnain

Team: Shreya Venugopaal, Haider Abdelrahman, Tanushree Mehta, Lakshya Gour, Yunhyuk Chang

Objective: The purpose of this project is to use the ORBIT testbed to design and run experiments that analyze the training and inference behavior of various ML models across multiple edge devices. Additionally, students will develop a latency profiling framework for MEC-assisted machine learning using techniques such as splitting models across multiple ORBIT nodes and networks, as well as early exits. They will then analyze the latency-accuracy tradeoffs of these models and measure the associated network delays.
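
As a rough illustration of the model-splitting and early-exit ideas above, the sketch below shows a tiny PyTorch model cut into an edge-side "head" with an early-exit classifier and a server-side "tail". The architecture, layer sizes, and the 0.9 confidence threshold are illustrative assumptions, not the project's actual configuration; in the real experiments the intermediate features would be sent over the network to another ORBIT node rather than run in the same process.

{{{#!python
# Minimal split-inference / early-exit sketch (hypothetical model; layer sizes,
# the confidence threshold, and the MNIST-like input shape are assumptions).
import torch
import torch.nn as nn
import torch.nn.functional as F

class EdgeHead(nn.Module):
    """Runs on the edge node; produces intermediate features and an early-exit guess."""
    def __init__(self):
        super().__init__()
        self.conv = nn.Conv2d(1, 16, 3, padding=1)
        self.pool = nn.MaxPool2d(2)
        self.exit_fc = nn.Linear(16 * 14 * 14, 10)   # cheap early-exit classifier

    def forward(self, x):
        feat = self.pool(F.relu(self.conv(x)))       # intermediate features to offload
        exit_logits = self.exit_fc(feat.flatten(1))  # early prediction on the edge
        return feat, exit_logits

class ServerTail(nn.Module):
    """Runs on the server node; finishes the computation on offloaded features."""
    def __init__(self):
        super().__init__()
        self.conv = nn.Conv2d(16, 32, 3, padding=1)
        self.fc = nn.Linear(32 * 7 * 7, 10)

    def forward(self, feat):
        feat = F.max_pool2d(F.relu(self.conv(feat)), 2)
        return self.fc(feat.flatten(1))

def split_inference(x, head, tail, threshold=0.9):
    """Exit early on the edge if confident enough; otherwise offload to the tail."""
    with torch.no_grad():
        feat, exit_logits = head(x)
        conf, pred = F.softmax(exit_logits, dim=1).max(dim=1)
        if conf.item() >= threshold:
            return pred.item(), "edge (early exit)"
        # Here both halves run in one process; on ORBIT the feature tensor
        # would instead be transmitted to the node hosting the tail.
        return tail(feat).argmax(dim=1).item(), "server (full path)"

head, tail = EdgeHead().eval(), ServerTail().eval()
print(split_inference(torch.randn(1, 1, 28, 28), head, tail))
}}}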

Week 1

Summary

  • Understood the goal of the project.
  • Got familiar with Linux by practicing some simple Linux commands.

Next Steps

  • Create and train a “small” and a “large” neural network
  • Attempt to simulate the difference in their inference performance (a rough timing sketch follows this list)
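
A minimal sketch of the kind of small-vs-large comparison described above (the two architectures and the CPU-only timing loop are illustrative assumptions; the actual measurements would be made on ORBIT nodes):

{{{#!python
# Compare average single-image inference latency of a "small" and a "large"
# fully connected network (hypothetical architectures, local CPU timing only).
import time
import torch
import torch.nn as nn

small = nn.Sequential(nn.Flatten(), nn.Linear(28 * 28, 64), nn.ReLU(), nn.Linear(64, 10))
large = nn.Sequential(
    nn.Flatten(),
    nn.Linear(28 * 28, 1024), nn.ReLU(),
    nn.Linear(1024, 1024), nn.ReLU(),
    nn.Linear(1024, 10),
)

def mean_latency_ms(model, n_runs=100):
    """Average forward-pass time for one 28x28 input over n_runs repetitions."""
    model.eval()
    x = torch.randn(1, 1, 28, 28)
    with torch.no_grad():
        model(x)                       # warm-up pass
        start = time.perf_counter()
        for _ in range(n_runs):
            model(x)
    return (time.perf_counter() - start) / n_runs * 1e3

for name, model in [("small", small), ("large", large)]:
    print(f"{name}: {mean_latency_ms(model):.3f} ms per inference")
}}}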

Week 2

Summary

* * *

Next Steps

* * *

Week 3

Summary

* * *

Next Steps

* * *

Week 4

Summary

* * *

Next Steps

* * *

Links to Presentations

  • Week 2
  • Week 3
  • Mid Sem
  • Week 4
  • Week 5
  • Final