Open-Code-Interpreter
Introducing Open-Code-Interpreter, an open-source tool for turning your instructions into code. Powered by Hugging Face models, it can handle a wide range of coding tasks. Try it out! #HuggingFace https://github.com/haseeb-heaven/open-code-interpreter
You can use any Hugging Face model, and you don't need to download any models to your system.
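For intuition, here is a minimal sketch of the underlying idea – generating code via the hosted Hugging Face Inference API, so no model weights ever touch your machine. This is not Open-Code-Interpreter's actual code; the model name, prompt, and generation parameters are placeholders:

```python
# Minimal sketch (NOT Open-Code-Interpreter's actual code): the key idea is
# calling the hosted Hugging Face Inference API, so nothing is downloaded
# to your system. Model name and prompt below are illustrative placeholders.
import os
import requests

API_URL = "https://api-inference.huggingface.co/models/codellama/CodeLlama-7b-Instruct-hf"
HEADERS = {"Authorization": f"Bearer {os.environ['HF_TOKEN']}"}  # your HF access token

def instruction_to_code(instruction: str) -> str:
    """Send a natural-language instruction, get generated code back."""
    payload = {
        "inputs": f"Write Python code for the following task:\n{instruction}\n",
        "parameters": {"max_new_tokens": 256, "temperature": 0.2},
    }
    resp = requests.post(API_URL, headers=HEADERS, json=payload, timeout=60)
    resp.raise_for_status()
    # Text-generation endpoints return a list of {"generated_text": ...}
    return resp.json()[0]["generated_text"]

if __name__ == "__main__":
    print(instruction_to_code("Read a CSV file and print the column means."))
```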
/r/Python
https://redd.it/172mzgb
[R] DynaMix: First dynamical systems foundation model enabling zero-shot forecasting of long-term statistics at #NeurIPS2025
Our dynamical systems foundation model DynaMix was accepted to #NeurIPS2025 with outstanding reviews (scores 6, 5, 5, 5) – the first model that can forecast, zero-shot and without any fine-tuning, the long-term behavior of time series from just a short context signal. Test it on #HuggingFace:
https://huggingface.co/spaces/DurstewitzLab/DynaMix
Preprint: https://arxiv.org/abs/2505.13192
Unlike major time series (TS) foundation models (FMs), DynaMix exhibits zero-shot learning of the long-term statistics of unseen dynamical systems, including attractor geometry and power spectrum. It does so with only 0.1% of the parameters and >100x faster inference times than the closest competitor, and with an extremely small training corpus of just 34 dynamical systems – in our minds a paradigm shift in time series foundation models.
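For readers new to dynamical systems reconstruction: "long-term statistics" means judging a model by the geometry and spectral content of long, freely generated trajectories rather than by point-wise forecast error. Here is a minimal sketch of one such metric, a Hellinger-style distance between power spectra (my own illustration assuming numpy/scipy, not the paper's evaluation code):

```python
# Minimal sketch (not the paper's evaluation code) of a "long-term
# statistics" metric: compare the power spectrum of a long generated
# trajectory against the ground-truth one, instead of point-wise error.
import numpy as np
from scipy.signal import welch

def spectrum_distance(x_true: np.ndarray, x_gen: np.ndarray, fs: float = 1.0) -> float:
    """Hellinger distance between normalized power spectra."""
    _, p_true = welch(x_true, fs=fs, nperseg=512)
    _, p_gen = welch(x_gen, fs=fs, nperseg=512)
    p_true /= p_true.sum()          # treat spectra as probability
    p_gen /= p_gen.sum()            # distributions over frequency
    return float(np.sqrt(0.5 * np.sum((np.sqrt(p_true) - np.sqrt(p_gen)) ** 2)))

# Toy example: two noisy sines with slightly different frequencies
t = np.arange(10_000)
x_true = np.sin(0.10 * t) + 0.05 * np.random.randn(t.size)
x_gen = np.sin(0.11 * t) + 0.05 * np.random.randn(t.size)
print(spectrum_distance(x_true, x_gen))  # small but nonzero
```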
https://preview.redd.it/d46h9deagorf1.png?width=1791&format=png&auto=webp&s=7a86714f6e8d7eb269224c0e06ac317f405dfbee
https://preview.redd.it/mullm71cgorf1.png?width=1436&format=png&auto=webp&s=e53055fcc8b1d2f77da88c3896a95d65f3fac893
It even outperforms, or is at least on par with, major TS foundation models like Chronos at forecasting diverse empirical time series (weather, traffic, medical data) of the kind typically used to train TS FMs. This is surprising, because DynaMix's training corpus consists *solely* of simulated limit cycles and chaotic systems, with no empirical data at all!
https://preview.redd.it/8twn70e2horf1.png?width=1127&format=png&auto=webp&s=20a7a7721a29d80bc2f01077b6e8684b54ce21ef
And no, it's based neither on Transformers nor on Mamba – it's a new type of mixture-of-experts architecture built on the recently introduced AL-RNN (https://proceedings.neurips.cc/paper_files/paper/2024/file/40cf27290cc2bd98a428b567ba25075c-Paper-Conference.pdf), specifically designed and trained for dynamical systems reconstruction.
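For intuition, a minimal sketch of an almost-linear RNN (AL-RNN) update step, based on my reading of the linked paper and not the authors' implementation: the latent state evolves linearly except for a small number of ReLU-gated units, making the dynamics piecewise-linear.

```python
# Minimal sketch of an AL-RNN update (my reading of the linked paper, not
# the authors' code): only the last P of M latent units pass through a
# ReLU; all weights here are random, so the dynamics are arbitrary.
import numpy as np

rng = np.random.default_rng(0)
M, P = 16, 3                              # latent dimension, nonlinear units
A = np.diag(rng.uniform(0.8, 0.99, M))    # stable linear self-connections
W = 0.1 * rng.standard_normal((M, M))     # weights on the nonlinear pathway
h = 0.01 * rng.standard_normal(M)         # bias

def al_rnn_step(z: np.ndarray) -> np.ndarray:
    phi = z.copy()
    phi[:M - P] = 0.0                             # linear units bypass W
    phi[M - P:] = np.maximum(phi[M - P:], 0.0)    # ReLU on the last P units
    return A @ z + W @ phi + h

# Roll the latent dynamics forward freely (no input), as in reconstruction
z = rng.standard_normal(M)
trajectory = [z := al_rnn_step(z) for _ in range(1000)]
```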
https://preview.redd.it/j0njmppkgorf1.png?width=1796&format=png&auto=webp&s=e05e275bf6aeba93fb04e8a288cd0fbac6d8fa84
Remarkably, it not only generalizes zero-shot to …
/r/MachineLearning
https://redd.it/1nrqzm7
The DynaMix Space on Hugging Face lets you upload your time series as CSV or NPY, configure the forecast length and settings, and download the resulting forecasts as CSV or NPY.
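If you want to feed the Space a known chaotic system, here is a minimal sketch (assuming numpy/scipy; the exact array layout the Space expects is an assumption on my part) that simulates Lorenz-63 and saves a context signal as .npy:

```python
# Minimal sketch for producing a context signal to try in the Space:
# simulate the chaotic Lorenz-63 system and save it as an .npy file.
# (The (T, variables) layout is an assumption; the Space only documents
# "CSV or NPY" as accepted formats.)
import numpy as np
from scipy.integrate import solve_ivp

def lorenz(t, s, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    x, y, z = s
    return [sigma * (y - x), x * (rho - z) - y, x * y - beta * z]

t_eval = np.arange(0.0, 50.0, 0.01)
sol = solve_ivp(lorenz, (0.0, 50.0), [1.0, 1.0, 1.0], t_eval=t_eval)

context = sol.y.T.astype(np.float32)    # shape (T, 3): one row per time step
np.save("lorenz_context.npy", context)  # upload this file in the Space
```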