IMSC Sponsored Projects 2012-2013


Big Data Pricing Schemes

  • Faculty lead: Prof. Hamid Nazerzadeh
  • Description: Design and development of revenue-maximizing pricing schemes for big-data marketplaces, with a focus on transportation sensor data.

Is mobile datapocalypse real?

  • Faculty lead: Prof. Andy Molisch
  • Description: Mobile operators have used many an earnings call and speech to spell out the impending doom that awaits once they use up their precious frequency resources. They insist we are fast approaching a mobile “datapocalypse” in which their networks will no longer be able to meet the enormous demand for mobile broadband. On the other side of the argument, major hardware providers contend that the existing spectrum is inefficiently utilized and that we can easily “squeeze more bits from the same hertz.”

Archived Traffic Data Management System

  • Faculty lead: Prof. Genevieve Giuliano
  • Description: Implement Online Analytical Processing (OLAP) techniques to analyze archived traffic sensor data in support of transportation decision making and planning.
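As a minimal illustration of the kind of OLAP roll-up such an archive could support, the sketch below aggregates toy traffic readings along chosen dimensions. The sensor IDs, schema, and values are invented for illustration and are not real IMSC data.

```python
from collections import defaultdict

# Toy archived traffic readings: (sensor_id, hour, speed_mph).
readings = [
    ("S1", 8, 25.0), ("S1", 8, 27.0), ("S1", 9, 40.0),
    ("S2", 8, 55.0), ("S2", 9, 60.0), ("S2", 9, 58.0),
]

def rollup(rows, dims):
    """Average speed over the chosen dimensions (an OLAP roll-up)."""
    acc = defaultdict(lambda: [0.0, 0])  # key -> [sum, count]
    for sensor, hour, speed in rows:
        key = tuple(v for d, v in (("sensor", sensor), ("hour", hour)) if d in dims)
        acc[key][0] += speed
        acc[key][1] += 1
    return {k: total / n for k, (total, n) in acc.items()}

# Roll up to per-hour averages (the sensor dimension is aggregated away):
print(rollup(readings, {"hour"}))
# Drill down to per-(sensor, hour) cells:
print(rollup(readings, {"sensor", "hour"}))
```

A real deployment would push such aggregations into an OLAP cube or a database with materialized views rather than recompute them per query.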

Study of Congested Corridors in Los Angeles

  • Faculty lead: Prof. James Elliott Moore, II
  • Description: Study the most congested corridors of Los Angeles County using historical traffic sensor data, and perform before-and-after analyses of the “Carmageddon” freeway closures.

Analysis of Mobility Data Using Spatiotemporal Graph-based Techniques

  • Faculty lead: Prof. Antonio Ortega (Electrical Engineering)
  • Description: We consider a scenario in which multiple body-attached sensors are used and assume that the data these sensors provide will be noisy. We will start by analyzing data captured by real sensors in order to develop a better understanding of noise sources and characteristics. We will then study several de-noising approaches. The first will be purely time-based and will seek to develop techniques (e.g., based on wavelets) to smooth the temporal trajectories of the data at each sensor without removing information that is important for evaluation and diagnosis. We will then consider graph-based techniques that improve data quality by exploiting constraints on sensor position that arise because the sensors are attached to the body; for example, two sensors attached to an individual's arm are limited in the extent of their relative motion. We will develop methods where the system is initially calibrated under controlled conditions, and data subsequently acquired about relative sensor positions is used for de-noising. We further plan to explore the potential benefits of recently introduced graph wavelets, which can capture all relevant information about body movement as either vertex data (sensor information) or edge data (distances between sensors). In addition to the de-noising problem, we will also study lossless and lossy techniques to compress the information generated by the sensors, making it easier to store complete records of sensor motion over extended periods of time. The compression techniques will be application-specific, with the goal of preserving key information for signal analysis.
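In its simplest form, the wavelet-based temporal smoothing mentioned above could look like one-level Haar denoising with soft thresholding of the detail coefficients. The sketch below is an assumption about the general approach, not the project's actual method; the threshold value is arbitrary.

```python
import numpy as np

def haar_denoise(signal, threshold):
    """One-level Haar wavelet denoising: transform, soft-threshold the
    detail (high-pass) coefficients, then invert the transform."""
    x = np.asarray(signal, dtype=float)
    assert len(x) % 2 == 0, "even length assumed in this sketch"
    approx = (x[0::2] + x[1::2]) / np.sqrt(2)   # low-pass: trend
    detail = (x[0::2] - x[1::2]) / np.sqrt(2)   # high-pass: noise + edges
    # Soft thresholding shrinks small detail coefficients toward zero.
    detail = np.sign(detail) * np.maximum(np.abs(detail) - threshold, 0.0)
    out = np.empty_like(x)                       # inverse Haar transform
    out[0::2] = (approx + detail) / np.sqrt(2)
    out[1::2] = (approx - detail) / np.sqrt(2)
    return out

# Demo on a synthetic noisy trajectory from one hypothetical sensor channel:
rng = np.random.default_rng(0)
clean = np.sin(np.linspace(0, 2 * np.pi, 64))
noisy = clean + rng.normal(0.0, 0.2, 64)
smoothed = haar_denoise(noisy, threshold=0.3)
```

A multi-level decomposition (e.g., via PyWavelets) and thresholds estimated from the measured noise statistics would replace the fixed single-level setup here.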

Mining Events from Multi-Source Multi-Modal Data

  • Faculty lead: Prof. Yan Liu (Computer Science)
  • Description: In a multi-INT/multi-source environment, integrating the readings collected from multiple data sources/sensors (possibly of different modalities) allows the inherent deficiencies of each source to be compensated for by the strengths of the others. In particular, such multi-source integration enables more effective surveillance of activities of interest. In this project, we use novel data mining techniques to automatically detect events in multi-source, multi-modal data.

Constructing a Dynamic Ontology for Streaming Big Data in a Specific Domain

  • Faculty lead: Prof. Dennis McLeod
  • Description: Build ontologies for dynamically changing data, particularly big data streams. One essential characteristic of such streams is the unpredictability of their semantic relationships, which requires an ontology that can evolve automatically. By analyzing the data streams, the proposed research aims to capture how semantics vary over time so that these dynamic semantics can be used to update the ontology.
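One very simple way to picture an ontology that evolves with a stream is to treat new terms as candidate concepts and repeated co-occurrence as evidence of a semantic relationship. The sketch below is purely illustrative; the class, the co-occurrence heuristic, and the support threshold are assumptions, not the project's design.

```python
from collections import defaultdict

class DynamicOntology:
    """Toy evolving ontology: terms seen in the stream become concepts,
    and term pairs that co-occur often enough become relations."""

    def __init__(self, min_support=2):
        self.concepts = set()
        self.cooccur = defaultdict(int)   # (term_a, term_b) -> count
        self.min_support = min_support

    def observe(self, record):
        """Update the ontology with one stream record (a bag of terms)."""
        terms = sorted(set(record))
        self.concepts.update(terms)
        for i, a in enumerate(terms):
            for b in terms[i + 1:]:
                self.cooccur[(a, b)] += 1

    def relations(self):
        """Pairs whose co-occurrence count reached the support threshold."""
        return {pair for pair, n in self.cooccur.items() if n >= self.min_support}

onto = DynamicOntology()
for rec in [["accident", "freeway"],
            ["accident", "freeway", "delay"],
            ["rain", "delay"]]:
    onto.observe(rec)
print(onto.relations())  # {('accident', 'freeway')}
```

A real system would attach typed, weighted relations and age out stale evidence as the stream's semantics drift, rather than only counting raw co-occurrences.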