Real-Time Cyber Physical Systems Application on MobilityFirst

Table of Contents

  1. 2015 Winlab Summer Internship
    1. Projects
    1. Indoor Localization
    2. Introduction
      1. Motivation
      2. What is ORBIT Lab?
      3. Overall Approach
      4. Resources
      5. Procedure
      6. Plan of Action
      7. Weekly Presentations
      8. Team
    1. SDR in ORBIT: Spectrum Sensing
      1. Introduction
      2. Team
      3. Objectives
      4. Weekly Progress
      5. Experiments
    1. LTE Unlicensed (LTE-U)
      1. Introduction
      2. Objectives
      3. Theory
      4. Analyzing Tools
      5. Experiment 1: Transmit and Receive LTE Signal
      6. Experiment 2: The Waterfall Plot
      7. Experiment 3: eNB and UE GUI
      8. Experiment 4: Varying Bandwidths
      9. Experiment 5: Working with TDD or FDD
      10. Experiment 6: TDD with Varying Bandwidths
      11. Experiment 7: TDD Waterfall Plot
      12. Poster
      13. Members
      14. Materials
      15. Resources
    1. Distributed Simulation of Power Grid
      1. Introduction
      2. Objectives
      3. People
      4. Resources
    1. Context-Aware IoT Application on MobilityFirst
      1. Introduction
      2. Objectives
      3. System Architecture
      4. Network Diagram
      5. Experiment Tools
      6. Results
      7. Future Work
      8. Team member
    1. Real-Time Cyber Physical Systems Application on MobilityFirst
      1. Github Repo
      2. Introduction
      3. Preliminary Goal
      4. Outline of the Project
      5. Tasks
      6. Image Processing
      7. Weekly Summary
      8. Team
      9. Presentation Slides
    1. GNRS Assisted Inter Domain Routing
      1. Introduction
    1. GNRS Management
      1. Introduction
      2. Work Milestones
    1. Effective Password Cracking Using GPU
      1. Introduction
      2. Objectives
      3. GPU
      4. Experiment
      5. Tools and Resources
  2. Body Sensor Networks
    1. Introduction
    2. Project Overview
    3. Data Collection
      1. Initial BCI data
    4. Data Analysis
    5. Tools/ Resources
    1. Unity Traffic Simulation
      1. Introduction
      2. Objectives
      3. People
    1. Mobile Security
      1. Introduction
      2. Motivation
    2. Resources
  3. Dynamic Video Encoding
    1. Introduction
    2. Goals
    3. Background Information
      1. Anatomy of a Video File
      2. What is a CODEC?
      3. H.264 Compression Algorithm
      4. Scalable Video Coding
      5. Network Emulator Test Results
      6. DASH Multi-Bitrate Encoding
      7. DASH Content Generation
      8. Bitrate Profiles
      9. Video Encoding Algorithms
      10. GPAC
    4. Presentations
    5. People

Github Repo

https://github.com/MrHohn/opencv-CPS

Introduction

Most Cyber Physical Systems (http://en.wikipedia.org/wiki/Cyber-physical_system) are characterized by stringent network requirements in terms of both latency (i.e. less than 100 ms response time) and scalability (i.e. trillion-order scalability for CPS devices/objects).

MobilityFirst, through its name-based Virtual Network architecture used to provide Application Specific Routing (ASR) to services and applications, can be exploited to help these applications meet their requirements.

The goal of the project is to implement a CPS application based on computer vision (i.e. object recognition) that will be used in conjunction with MobilityFirst’s Virtual Network to showcase the benefits of the network service.

Preliminary Goal

(Figure: preliminary-goal)

Outline of the Project

(Figure: outline)

Tasks

  • Part A: Cyber Physical System (Higher Priority Tasks)
    • Get familiar with the available camera system
    • Implement an application that transmits video over the network in a standard format. Requirements:
      • Control of the number of frames per second transmitted over the network (a minimal capture sketch appears after this task list)
      • Potentially start with transmitting single frames (i.e. still pictures)
    • Implement a server application for object recognition; standard libraries are available
    • Collect a training set of objects/buildings
    • Implement a simple graphical interface to display processing results
  • Part B: MobilityFirst
  • Integration
    • Port the CPS application to the MobilityFirst API. Two options:
      • Through the IP-to-MF proxy and vice-versa
      • Replacement of the network logic with a native MF implementation (preferred option)
    • Test on an ORBIT MF topology
    • Run on top of the MobilityFirst Virtual Network with Application Specific Routing (ASR)
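
As a rough illustration of the frame-rate-controlled video transmission task above, the following sketch grabs frames from a webcam with OpenCV, JPEG-encodes each frame (a standard format), and hands it to a placeholder sendFrame() routine. The target frame rate, the camera index, and sendFrame() are assumptions for illustration only, not the project's actual client code.

```cpp
// Sketch: frame-rate-controlled capture and (placeholder) transmission.
#include <opencv2/core/core.hpp>
#include <opencv2/highgui/highgui.hpp>
#include <vector>
#include <unistd.h>   // usleep

static void sendFrame(const std::vector<uchar>& jpeg)
{
    // Placeholder: in the real system this would write the buffer to a
    // TCP socket or, after integration, to the MobilityFirst API.
    (void)jpeg;
}

int main()
{
    const int targetFps = 5;                       // frames per second sent over the network (assumed)
    const int framePeriodUs = 1000000 / targetFps;

    cv::VideoCapture cam(0);                       // integrated webcam (device index assumed)
    if (!cam.isOpened()) return 1;

    cv::Mat frame;
    std::vector<uchar> jpeg;
    while (cam.read(frame)) {
        cv::imencode(".jpg", frame, jpeg);         // encode the frame as JPEG
        sendFrame(jpeg);
        usleep(framePeriodUs);                     // crude rate control to hit the target FPS
    }
    return 0;
}
```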

Image Processing

We use the SURF (Speeded Up Robust Features) algorithm to identify objects and match them between images.

The core idea of the SURF algorithm can be summarized as follows:

1. Use the Hessian matrix and scale space to detect the key points of an image.
2. Use SURFDescriptorExtractor to compute the feature vectors (descriptors) of those key points.
3. Use a BruteForce matcher to match feature vectors between two images.
4. Use drawMatches to draw the matched key points.

From the image processing perspective, each picture consists of n × m pixels, and each pixel has a corresponding determinant value of the Hessian matrix. Given a point p = (x, y) within an image [1], the Hessian matrix H(p, σ) at point p and scale σ is defined as

H(p, \sigma) = \begin{pmatrix} L_{xx}(p, \sigma) & L_{xy}(p, \sigma) \\ L_{xy}(p, \sigma) & L_{yy}(p, \sigma) \end{pmatrix}

where L_{xx}(p, \sigma), L_{xy}(p, \sigma), and L_{yy}(p, \sigma) are the second-order derivatives of the grayscale image.

After the determinant value of each pixel has been calculated, a scale space containing repeatedly smoothed versions of the image (obtained with a Gaussian filter) is constructed. A scale space can be regarded as an image pyramid built from multiple layers of the image at distinct scales. The determinant value of each pixel is compared with those of its 26 surrounding pixels: 9 in the scale above, 9 in the scale below, and 8 neighbors at the same scale. A local maximum becomes a key point. From a circular region around the key point, a square region is built to find its orientation and extract the SURF descriptor, which forms the feature vector. After all feature vectors of the two images have been calculated in this way, the BruteForce algorithm matches feature vectors by finding, for each vector, the one with the shortest Euclidean distance; each pair of feature vectors with the shortest distance is a matched pair. Finally, we use drawMatches to draw the matched key points.
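
As a concrete reference, the following is a minimal sketch of the SURF pipeline described above, assuming OpenCV 2.4 with the nonfree module (where SURF is provided); the image file names and the Hessian threshold are placeholders rather than the project's actual settings.

```cpp
// Sketch: SURF key point detection, descriptor extraction, brute-force
// matching, and visualization of the matches.
#include <opencv2/core/core.hpp>
#include <opencv2/highgui/highgui.hpp>
#include <opencv2/features2d/features2d.hpp>
#include <opencv2/nonfree/features2d.hpp>
#include <vector>

int main()
{
    cv::Mat query = cv::imread("query.jpg", CV_LOAD_IMAGE_GRAYSCALE);  // placeholder file names
    cv::Mat train = cv::imread("train.jpg", CV_LOAD_IMAGE_GRAYSCALE);
    if (query.empty() || train.empty()) return 1;

    // 1. Detect key points using the determinant of the Hessian.
    cv::SurfFeatureDetector detector(400 /* Hessian threshold, assumed */);
    std::vector<cv::KeyPoint> kpQuery, kpTrain;
    detector.detect(query, kpQuery);
    detector.detect(train, kpTrain);

    // 2. Extract SURF descriptors (feature vectors) at the key points.
    cv::SurfDescriptorExtractor extractor;
    cv::Mat descQuery, descTrain;
    extractor.compute(query, kpQuery, descQuery);
    extractor.compute(train, kpTrain, descTrain);

    // 3. Brute-force matching on Euclidean (L2) distance.
    cv::BFMatcher matcher(cv::NORM_L2);
    std::vector<cv::DMatch> matches;
    matcher.match(descQuery, descTrain, matches);

    // 4. Draw the matched key points.
    cv::Mat output;
    cv::drawMatches(query, kpQuery, train, kpTrain, matches, output);
    cv::imwrite("matches.jpg", output);
    return 0;
}
```

Using cv::BFMatcher with NORM_L2 corresponds to the Euclidean-distance matching described above.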

Weekly Summary

Week 1

 Familiarized ourselves with MobilityFirst.

 Completed the ORBIT tutorials.

Week 2

 Explored object recognition algorithms.

 Developed basic image processing functionality.

 Implemented camera sampling apps using the integrated webcam and mjpg-streamer.

Week 3*

 Continued development of the image processing and camera apps.

Week 4*

 Completed the integration of the client and server components; the system supports real-time video transmission and image matching on Linux.

Week 5-6*

 Implemented the MobilityFirst network for real video transmission, replacing the TCP/IP protocol.

 Added a message distribution API to the MobilityFirst code so that each connecting socket is assigned a thread number as an identifier (a rough sketch of this dispatch appears below).

 Revised the image processing code to support multiple clients, with attention to memory optimization.

 Started to learn the concept of fog computing.

  • To be updated
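
For reference only, the sketch below shows the thread-per-connection dispatch described for Weeks 5-6, but over plain TCP/IP sockets rather than the actual MobilityFirst message distribution API (which is not reproduced here); the port number and the handler body are assumptions.

```cpp
// Sketch: each accepted client socket is handed to a worker thread tagged
// with an incrementing thread number used as the client's identifier.
#include <netinet/in.h>
#include <sys/socket.h>
#include <unistd.h>
#include <atomic>
#include <thread>
#include <cstdio>

static std::atomic<int> nextThreadId(0);

static void handleClient(int clientFd, int threadId)
{
    std::printf("client handled by thread %d\n", threadId);
    // ... receive frames, run SURF matching, send results back ...
    close(clientFd);
}

int main()
{
    int listenFd = socket(AF_INET, SOCK_STREAM, 0);
    sockaddr_in addr = {};
    addr.sin_family = AF_INET;
    addr.sin_addr.s_addr = INADDR_ANY;
    addr.sin_port = htons(9000);                    // port number assumed
    bind(listenFd, (sockaddr*)&addr, sizeof(addr));
    listen(listenFd, 16);

    while (true) {
        int clientFd = accept(listenFd, nullptr, nullptr);
        if (clientFd < 0) continue;
        std::thread(handleClient, clientFd, nextThreadId++).detach();
    }
}
```

In the real system the MobilityFirst code replaces this socket layer, with the per-connection thread number playing the same role of identifying each client on the server side.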

Future Goals

 Integration with Google Glass for live object recognition and information display.

 Build the CPS project on the Android platform.

 Client-side low-resolution object verification for coherent information display.

Team

Karthikeyan Ganesan, Wuyang Zhang, Zihong Zheng, Shantanu Ghosh, Avi Cooper

Presentation Slides

Week 1 (Presentation)

Week 2 (Presentation)

Week 3 (Presentation)

Week 4 (Presentation)

Week 5 (Presentation)

Week 6 (Presentation)

Week 7 (Presentation)

Week 8 (Presentation)

Week 9 (Presentation)

Week 10 (Presentation)

Week 11 (Presentation)
