Time series are measured and analyzed across the scientific disciplines, with new methods for their analysis being developed regularly. How can we make sense of these hundreds of methods? How can we distinguish a real advance from a new method that simply reproduces the behavior of an existing one? Comprehensive and systematic comparison is paramount in methodological literatures like time-series analysis, but is extremely difficult to achieve in practice. We have recently developed an online portal for comparing time-series data, CompEngine (https://www.comp-engine.org/), that allows scientists to drag-and-drop their data onto our portal to get an answer to the question: “what sorts of data from across science are similar to the data that I measure?” However, there is still no way for scientists to compare their methods to alternative methods from other disciplines. Achieving this would enable scientists to work together more effectively across disciplinary boundaries, towards a unification of methods for time series. Such an endeavor would be transformative in concentrating scientific effort on meaningful interdisciplinary progress, and could become a template for similar efforts applied to other data types.
Aims: In this project, we will develop an online platform to compare time-series analysis methods. Mirroring what CompEngine does for time-series data, the platform would allow a scientist to upload their method's code, compute its outputs across a standard dataset of time series, and search a library of existing features for the most similar methods. The scientist would then be given a ‘uniqueness’ score, and be able to visualize their method in a broader scientific context.
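One way the core comparison step could work is sketched below: each method is represented by the vector of values it outputs across a shared set of benchmark time series, and similarity between methods is measured by correlating these output vectors. All names and the uniqueness formula (one minus the strongest absolute correlation with any existing method) are illustrative assumptions, not CompEngine's actual API or scoring.

```python
# Hypothetical sketch of method-similarity search: a "method" is summarized
# by its outputs on a shared set of benchmark time series, and methods are
# compared by correlating those output vectors. Function and method names
# here are illustrative, not part of any existing CompEngine interface.
import math


def pearson(xs, ys):
    """Pearson correlation between two equal-length output vectors.

    Assumes neither vector is constant (nonzero standard deviation).
    """
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)


def rank_similar_methods(new_outputs, library):
    """Rank existing methods by |correlation| with a new method's outputs.

    new_outputs: list of feature values, one per benchmark time series.
    library: dict mapping method name -> outputs on the same series.
    Returns (list of (name, |r|) sorted most-similar first, uniqueness),
    where uniqueness = 1 - max |r| lies in [0, 1]: near 0 means the new
    method duplicates an existing one, near 1 means it is novel.
    """
    scores = sorted(
        ((name, abs(pearson(new_outputs, outs)))
         for name, outs in library.items()),
        key=lambda pair: -pair[1],
    )
    uniqueness = 1.0 - scores[0][1] if scores else 1.0
    return scores, uniqueness


# Toy example: three existing methods evaluated on five time series.
library = {
    "mean": [0.1, 0.5, 0.9, 0.3, 0.7],
    "variance": [0.2, 0.1, 0.8, 0.9, 0.4],
    "autocorr_lag1": [0.9, 0.2, 0.1, 0.8, 0.5],
}
new_method = [0.12, 0.48, 0.91, 0.33, 0.69]  # nearly duplicates "mean"
ranked, uniq = rank_similar_methods(new_method, library)
```

In this toy case the new method's outputs track the "mean" feature almost exactly, so it would rank first with a uniqueness score near zero, flagging the submission as a near-duplicate rather than a genuine advance. A production system would use many more benchmark series and likely rank-based (Spearman) correlation to tolerate nonlinear monotonic relationships.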
Skills: Web development and C/Python coding.