   * Divide data into “chunks” for faster and more efficient transmission (see the chunking sketch after this list)
   * Design experiments to test different architectures and implementations of Early Exiting and Split Computing
   * Track and add Age of Information (AoI) metrics (see the AoI sketch after this list)
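
A minimal sketch of the chunking idea, assuming the data being transmitted is a serialized PyTorch tensor sent over a TCP socket; the chunk size, function names, and length-prefix framing are placeholder choices, not decisions that have been made yet.

```python
import io
import socket

import torch

CHUNK_SIZE = 4096  # placeholder chunk size in bytes


def tensor_to_chunks(tensor, chunk_size=CHUNK_SIZE):
    """Serialize a tensor and split the resulting bytes into fixed-size chunks."""
    buffer = io.BytesIO()
    torch.save(tensor, buffer)
    payload = buffer.getvalue()
    return [payload[i:i + chunk_size] for i in range(0, len(payload), chunk_size)]


def send_in_chunks(sock: socket.socket, tensor) -> None:
    """Send an 8-byte length header, then each chunk in order."""
    chunks = tensor_to_chunks(tensor)
    total_size = sum(len(chunk) for chunk in chunks)
    sock.sendall(total_size.to_bytes(8, "big"))  # receiver reads exactly this many bytes
    for chunk in chunks:
        sock.sendall(chunk)
```

The length header lets the receiving device know how many bytes to read before deserializing, regardless of how the chunks arrive on the wire.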
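
A minimal sketch of an Age of Information (AoI) tracker, assuming each received update carries its sender-side generation timestamp; the class and method names are placeholders. Measuring AoI across two devices only makes sense if their clocks are synchronized, which is one place the Precision Time Protocol (PTP) mentioned under Week 5 would help.

```python
import time


class AoITracker:
    """Tracks AoI(t) = t - u(t), where u(t) is the generation time of the
    freshest update received so far."""

    def __init__(self):
        self.freshest_generation_time = None  # u(t), seconds since the epoch

    def record_update(self, generation_time: float) -> None:
        """Call when an update arrives; generation_time is stamped by the sender."""
        if (self.freshest_generation_time is None
                or generation_time > self.freshest_generation_time):
            self.freshest_generation_time = generation_time

    def current_age(self) -> float:
        """Current Age of Information in seconds (infinite before any update arrives)."""
        if self.freshest_generation_time is None:
            return float("inf")
        return time.time() - self.freshest_generation_time
```
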
== Week 5
**Summary**
  * Figured out how to properly split a neural network (NN) using split computing (see the split-inference sketch after this list)
  * First experimented with image inference on a single device
  * Split the neural network (ResNet-18) onto two different devices
  * Ran an inference with the split NN across a network
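
A minimal sketch of how a torchvision ResNet-18 can be split into a head (run on the first device) and a tail (run on the second device); the split index, input size, and use of random weights are placeholders rather than the configuration actually used.

```python
import torch
import torch.nn as nn
from torchvision.models import resnet18

model = resnet18(weights=None).eval()   # random weights; torchvision >= 0.13 API
layers = list(model.children())         # conv1, bn1, relu, maxpool, layer1-4, avgpool, fc

split_index = 6                         # placeholder: split right after layer2
head = nn.Sequential(*layers[:split_index])                 # runs on device A
tail = nn.Sequential(*layers[split_index:-1],               # runs on device B
                     nn.Flatten(),
                     layers[-1])

with torch.no_grad():
    image = torch.randn(1, 3, 224, 224)     # dummy input batch
    intermediate = head(image)              # this is the tensor that crosses the network
    logits = tail(intermediate)

print(intermediate.shape)  # torch.Size([1, 128, 28, 28]) for this split point
print(logits.shape)        # torch.Size([1, 1000])
```

In the two-device setup only the intermediate tensor has to cross the network, for example with the chunked sender sketched above.
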
**Next Steps**
  * Take a step back
  * Ask: what questions do we want to answer?
  * As you vary the threshold for asking for help, how does the average latency change (over the dataset)? This question is open-ended; see the threshold-sweep sketch after this list.
  * Transfer to Precision Time Protocol (PTP)
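
A minimal sketch of the experiment behind the threshold question, assuming an early-exit head that returns both exit logits and an intermediate tensor, plus a run_tail_remotely call that offloads to the second device; every name here is a placeholder for whatever the real pipeline provides.

```python
import time

import torch.nn.functional as F


def average_latency(dataset, run_head, run_tail_remotely, threshold):
    """Mean end-to-end latency (seconds) over the dataset for one confidence threshold."""
    latencies = []
    for image, _label in dataset:
        start = time.perf_counter()
        exit_logits, intermediate = run_head(image)          # local early-exit branch
        confidence = F.softmax(exit_logits, dim=-1).max().item()
        if confidence < threshold:                           # not confident enough: ask for help
            exit_logits = run_tail_remotely(intermediate)    # offload the rest over the network
        latencies.append(time.perf_counter() - start)
    return sum(latencies) / len(latencies)


# Sweep the threshold: low values exit locally more often (lower latency),
# high values ask for help more often (higher latency, usually higher accuracy).
# for t in (0.5, 0.6, 0.7, 0.8, 0.9):
#     print(t, average_latency(val_set, run_head, run_tail_remotely, t))
```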