Project Title: Resilient Edge-Cloud Autonomous Learning with Timely Inferences
Project Advisor and Mentors: Professor Anand Sarwate, Professor Waheed Bajwa, Nitya Sathyavageeswaran, Vishakha Ramani, Yu Wu, Aliasghar Mohammadsalehi, Muhammad Zulqarnain
Names: Shreya Venugopaal, Haider Abdelrahman, Tanushree Mehta, Lakshya Gour, Yunhyuk Chang

Objective: The purpose of this project is to use the ORBIT testbed to prototype and evaluate timely edge-cloud machine learning inference.

Main objectives:
Learn how to design and run experiments on ORBIT
Prototype a handoff scheme for ML prediction
Develop a latency profiling framework for MEC-assisted machine learning

– Approach: split the network between edge and cloud, add early exits, and simulate multiple CPU speeds and network delays
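The tradeoff the approach above targets (exit early on a slow edge CPU vs. offload over a delayed link to a faster server) can be sketched with a toy latency model. All speeds, payload sizes, and delays below are illustrative assumptions, not measurements from ORBIT.

```python
# Toy latency model for deciding between an early exit on the edge device and
# offloading to a server. All numbers are made-up assumptions for illustration.
def inference_latency(flops, cpu_flops_per_s, payload_bytes=0,
                      bandwidth_bps=None, prop_delay_s=0.0):
    """Compute time, plus network transfer time if data is offloaded."""
    latency = flops / cpu_flops_per_s
    if payload_bytes:
        latency += payload_bytes * 8 / bandwidth_bps + prop_delay_s
    return latency

# Early exit on a slow edge CPU vs. running the full model on a fast server.
edge = inference_latency(1e8, cpu_flops_per_s=1e9)
cloud = inference_latency(1e10, cpu_flops_per_s=1e12,
                          payload_bytes=32_000, bandwidth_bps=50e6,
                          prop_delay_s=0.02)
print(f"edge: {edge:.4f}s  cloud: {cloud:.4f}s")
```

Sweeping the CPU speed and link parameters in a model like this is one way to design the simulation experiments before running them on real nodes.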

Progress
Week 1:
Goals:
Team introductions
Kickoff meetings
Refine and understand the research questions
Go over goals for the summer
For next week:
Go through WINLAB orientation (TBD)
Reading for next time:
PPA (Patterns, Predictions, and Actions) Chapters 1 and 2
Coding:
Install Python (via Anaconda or another distribution)
Go through the interactive intro to Python online or by downloading the notebook
Review the basics of Python linked from the UT Austin site to learn basic syntax, flow control, etc.
Week 2:
Goals:
Summary:
Basics of pattern recognition and machine learning (PPA: Patterns, Predictions, and Actions)
Set up a PyTorch instance on an ORBIT node
Created a node image with PyTorch
Basics of PyTorch
Created small machine learning models
Loaded the MNIST (Modified National Institute of Standards and Technology) database onto the node
Computed the second moment matrix
Ran PCA and trained an SVM on MNIST
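The second-moment-matrix and PCA steps above can be sketched in NumPy; random data stands in for the flattened MNIST digits, and the SVM that would consume the reduced features (e.g. scikit-learn's SVC) is omitted. The dimensions and component count are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
# Random stand-in for flattened MNIST digits (real images are 28*28 = 784-dim).
X = rng.normal(size=(500, 784))

# Second moment matrix of the centered data.
Xc = X - X.mean(axis=0)
M = Xc.T @ Xc / Xc.shape[0]

# PCA: project onto the top-k eigenvectors of the second moment matrix.
k = 50
eigvals, eigvecs = np.linalg.eigh(M)   # eigenvalues in ascending order
components = eigvecs[:, -k:]           # top-k principal directions
Z = Xc @ components                    # reduced k-dimensional features

print(Z.shape)  # (500, 50)
```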
Next Steps:
Create and train a “small” and a “large” neural network
Simulate the difference in their performance at inference
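A minimal sketch of what the “small” and “large” models might look like in PyTorch, sized for CIFAR-10-shaped inputs (3x32x32 images, 10 classes). The architectures here are hypothetical placeholders, not the models actually trained; comparing parameter counts is one simple proxy for their inference cost.

```python
import torch
import torch.nn as nn

# Hypothetical "small" and "large" classifiers; the real architectures differ.
small = nn.Sequential(nn.Flatten(), nn.Linear(3 * 32 * 32, 10))
large = nn.Sequential(
    nn.Flatten(),
    nn.Linear(3 * 32 * 32, 512), nn.ReLU(),
    nn.Linear(512, 512), nn.ReLU(),
    nn.Linear(512, 10),
)

def n_params(model):
    return sum(p.numel() for p in model.parameters())

x = torch.randn(8, 3, 32, 32)          # a dummy batch
print(small(x).shape, large(x).shape)  # both: torch.Size([8, 10])
print(n_params(small), n_params(large))
```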
Links:
Week 3:
Goals:
Summary:
Created and trained ‘small’ and ‘large’ neural networks
Compared their performance on the CIFAR-10 dataset
Established a connection between two nodes
Communicated test data between nodes to compare the accuracy and delay of our NN models
Need to serialize as bytes instead of transferring as strings
Discussed various research papers related to our project
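The byte-serialization fix noted above can be sketched as length-prefixed framing: send the raw array bytes behind a fixed-size length header instead of a string representation, and split the message into chunks for transmission. This is a standalone illustration of the framing, not the actual node code; the chunk size is an assumption.

```python
import struct
import numpy as np

def frame(arr: np.ndarray) -> bytes:
    """Serialize an array as raw bytes behind a 4-byte length header,
    rather than sending a string representation."""
    payload = arr.astype(np.float32).tobytes()
    return struct.pack("!I", len(payload)) + payload

def unframe(buf: bytes) -> np.ndarray:
    (n,) = struct.unpack("!I", buf[:4])
    return np.frombuffer(buf[4:4 + n], dtype=np.float32)

x = np.arange(6, dtype=np.float32)
msg = frame(x)
# Optional chunking for transmission; the receiver just reassembles the bytes.
chunks = [msg[i:i + 1024] for i in range(0, len(msg), 1024)]
y = unframe(b"".join(chunks))
print(np.array_equal(x, y))  # True
```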
Next Steps:
Send the data as bytes instead of strings
Measure the transfer and processing times
Read more papers on early exit and edge-cloud computing
Divide data into “chunks” for faster and more efficient transmission
Design experiments to test different architectures and implementations of early exiting and split computing
Track and add Age of Information (AoI) metrics
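The Age of Information metric in the last step can be sketched as follows: at time t, the age is t minus the generation time of the freshest update received so far, and the time-average AoI is the area under the resulting sawtooth curve divided by the horizon. The timestamps in the example are made up.

```python
# Time-average Age of Information from a log of received updates.
def average_aoi(events, horizon):
    """events: (receive_time, generation_time) pairs, sorted by receive_time.
    Assumes an update generated at t=0 is already in hand at t=0."""
    area, t_prev, gen_prev = 0.0, 0.0, 0.0
    for recv, gen in events:
        # Age grows linearly from (t_prev - gen_prev) until the next reception,
        # so each segment contributes a trapezoid to the area.
        area += (t_prev - gen_prev + recv - gen_prev) / 2 * (recv - t_prev)
        t_prev, gen_prev = recv, gen
    area += (t_prev - gen_prev + horizon - gen_prev) / 2 * (horizon - t_prev)
    return area / horizon

avg = average_aoi([(1.0, 0.5), (2.0, 1.8)], horizon=3.0)
print(avg)  # ~0.733
```

In the experiments, the generation time would be stamped on each inference input at the sender and logged at the receiver alongside the transfer and processing times.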
Presentation links:
Link - Lakshya
Link - Shreya
Link - Haider
Link - James
Link - Tanushree
Week 4:
Goals:
Summary:
Next Steps: