Top 10 Python and Data Science Stories of 2016 [audio]
https://talkpython.fm/episodes/show/91/top-10-data-science-stories-of-2016
/r/Python
https://redd.it/5kks1j
talkpython.fm
Top 10 Data Science Stories of 2016
It's been an amazing year for Python and Data Science. It's time to look back at the major headlines and take stock in what we've done as a community.
fancyimpute 0.0.4 : Matrix completion in python
https://pypi.python.org/pypi/fancyimpute/0.0.4
/r/pystats
https://redd.it/3zo6gt
pypi.python.org
fancyimpute 0.0.4 : Python Package Index
Matrix completion and feature imputation algorithms
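For readers new to the term, matrix completion fills in missing entries of a data matrix by exploiting its (approximately) low-rank structure. The sketch below illustrates the idea with plain NumPy via an iterative low-rank SVD fill; it is only a conceptual illustration, not fancyimpute's own API.

```
# Illustrative matrix completion in plain NumPy (not fancyimpute's API):
# repeatedly replace the missing entries with their values in a rank-k
# SVD approximation of the current filled-in matrix.
import numpy as np

def iterative_svd_impute(X, rank=1, n_iter=50):
    X = np.array(X, dtype=float)
    missing = np.isnan(X)
    filled = np.where(missing, np.nanmean(X, axis=0), X)   # start from column means
    for _ in range(n_iter):
        U, s, Vt = np.linalg.svd(filled, full_matrices=False)
        s[rank:] = 0.0                                      # keep the top-k singular values
        low_rank = np.dot(U * s, Vt)
        filled[missing] = low_rank[missing]                 # only overwrite missing cells
    return filled

X = np.array([[1.0, 2.0, np.nan],
              [2.0, np.nan, 6.0],
              [3.0, 6.0, 9.0]])
print(iterative_svd_impute(X, rank=1))
```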
Getting started with Regression and Decision Trees
http://blog.cambridgecoding.com/2016/01/03/getting-started-with-regression-and-decision-trees/
/r/pystats
https://redd.it/3zblpk
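As a quick taste of what the tutorial covers, fitting a decision tree regressor in scikit-learn takes only a few lines (synthetic data; scikit-learn assumed installed):

```
# Tiny decision-tree regression example with scikit-learn on synthetic data.
import numpy as np
from sklearn.tree import DecisionTreeRegressor

rng = np.random.RandomState(0)
X = np.sort(5 * rng.rand(80, 1), axis=0)       # one noisy feature
y = np.sin(X).ravel() + 0.1 * rng.randn(80)

tree = DecisionTreeRegressor(max_depth=3)      # keep the tree shallow to limit overfitting
tree.fit(X, y)
print(tree.predict([[2.5]]))                   # predicted value at x = 2.5
```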
django-markdownx not compiling html to model database
I've been struggling to get [django-markdownx](https://github.com/adi-/django-markdownx) to work in a Django project. I can get the markdown preview to show the compiled markdown in the admin form, but the markdown does not appear to be compiled when it is saved to the model.
Here are two screenshots showing the [admin preview](https://gitlab.com/ahaboverboard/beblog/uploads/0a91e36c838e10bfcf9c479551093b46/Screenshot_from_2016-12-27_23-42-53.png) and the [template](https://gitlab.com/ahaboverboard/beblog/uploads/e50bf8b3bff71300d3bfdedaa7568351/Screenshot_from_2016-12-27_23-44-33.png) still showing raw markdown.
You can view the code at https://gitlab.com/ahaboverboard/beblog/tree/master; the relevant files are [admin.py](https://gitlab.com/ahaboverboard/beblog/blob/master/blog/admin.py), [models.py](https://gitlab.com/ahaboverboard/beblog/blob/master/blog/models.py), and the [post_detail.html template](https://gitlab.com/ahaboverboard/beblog/blob/master/blog/templates/blog/post_detail.html).
Looking at the django-markdownx readme, I'm unsure what I'm missing. My current guesses are that the save_model function in the ModelAdmin is missing something, or that using django-markdownx in the Django admin requires a custom admin form. Any thoughts on the best way forward?
/r/djangolearning
https://redd.it/5kp16u
GitHub
GitHub - neutronX/django-markdownx: Comprehensive Markdown plugin built for Django
Comprehensive Markdown plugin built for Django. Contribute to neutronX/django-markdownx development by creating an account on GitHub.
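One likely explanation, to be confirmed against the package version in use: django-markdownx deliberately stores the raw Markdown in the model field and leaves rendering to display time. A common pattern from its docs is to expose a rendered property via markdownx.utils.markdownify and use that in the template instead of the raw field; roughly (model and field names below are illustrative, not taken from the linked project):

```
# Sketch: render markdownx content at display time instead of expecting
# compiled HTML to be saved on the model.
from django.db import models
from django.utils.safestring import mark_safe
from markdownx.models import MarkdownxField
from markdownx.utils import markdownify

class Post(models.Model):
    body = MarkdownxField()

    @property
    def body_html(self):
        # convert the stored Markdown to HTML when it is displayed
        return mark_safe(markdownify(self.body))
```

In the template, {{ post.body_html }} would then render HTML rather than raw Markdown.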
Can I run django-admin commands when saving a model?
Specifically I was wondering if I could run makemessages on model save.
class TranslationExample(models.Model):
    title = models.CharField(max_length=25)
    body = models.TextField()
    ....

    def save(self):
        ....
        **What to put here to run makemessages?**
        super(TranslationExample, self).save()
/r/django
https://redd.it/5knjp5
reddit
Can I run django-admin commands when saving a model? • /r/django
Specifically I was wondering if I could run makemessages on model save. class TranslationExample(models.Model): title =...
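For reference, management commands can be invoked from Python with django.core.management.call_command, so one hedged answer looks like the sketch below. Note that makemessages scans source files and is slow, so a post_save signal or an explicit deploy step is usually a better home for it than save():

```
# Minimal sketch: running a django-admin command from save().
# (Running makemessages on every save is expensive; shown only to answer the question.)
from django.core.management import call_command
from django.db import models

class TranslationExample(models.Model):
    title = models.CharField(max_length=25)
    body = models.TextField()

    def save(self, *args, **kwargs):
        super(TranslationExample, self).save(*args, **kwargs)
        # equivalent to: django-admin makemessages -l de
        call_command('makemessages', locale=['de'])
```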
I need help with my first genetic algorithm
Hi, I develop open-source cryptocurrency trading software at www.tradewave.net. I have written Python versions of many financial technical indicators; you can find them in the Tradewave forums.
This is my first attempt at a genetic algorithm and I'm not sure where I went wrong. When I optimize the bot manually, backtesting each possible allele state, I get better results than when I let the algorithm run through my evolve() definition and do the same optimization automatically on the same data set.
You can backtest the strategy here (you have to create an account, but it's free):
https://tradewave.net/strategy/unlgoZdiYF
or just read the syntax-highlighted code here:
http://pastebin.com/nLdy2CiY
The whole algorithm is about 400 lines, but I'm pretty sure the problem is in `def evolve()`, which is less than 50 lines without comments. The rest of the code works fine if I pre-train the genome and skip the evolution process. There is a description of the algorithm at the top of the pastebin and everything is well commented.
Can you help me find my error?
Much thanks!
def evolve():
    log('evolve() attempt at locus: %s' % storage.locus)
    # Define simulated backtest window
    depth = 5000; width = 200
    # create deep closing price arrays for each of 3 trading pairs
    ltcbtc_dc, ltcusd_dc, btcusd_dc = close_arrays(depth)
    # matrix of closing price numpy arrays
    price_matrix = [ltcbtc_dc, ltcusd_dc, btcusd_dc]
    # find the current value of the locus to be mutated
    original_state = storage.genome[storage.locus]
    # create 3 simulated starting portfolios w/ the same start balance
    # columns are [usd, btc, ltc]
    portfolio = np.array([[0.0, 1.0, 0.0],
                          [0.0, 1.0, 0.0],
                          [0.0, 1.0, 0.0]])
    # empty any data left in storage.macd_n_smooth from a previous evolution
    storage.macd_0_smooth = [[], []]  # ltcbtc
    storage.macd_1_smooth = [[], []]  # ltcusd
    storage.macd_2_smooth = [[], []]  # btcusd
    # log('DEBUG 1 %s' % portfolio)
    # run a simulated backtest for each allele state
    # possible allele states are 0=usd; 1=btc; 2=ltc
    for allele in range(0, 3):  # do this for each possible allele state (0, 1, 2)
        # log('DEBUG 2 %s' % allele)
        # for every possible slice[-200:] of the deep closing array; oldest first:
        for cut in range(depth, width, -1):
            # if cut == depth: log('DEBUG 3 %s' % allele)
            # create 2d matrix for market slices and macd for each currency pair
            market_slice = [[], [], []]
            macd = [[], [], []]
            # create 1d matrix for last prices
            last = [0, 0, 0]
            for n in range(0, 3):  # do this for each currency pair
                # if (n == 0) and (cut == depth): log('DEBUG 4 %s' % allele)
                # take a slice of the market for each currency pair
                # slice notation reference:
                #   z[-3:]  # only the last 3
                #   z[:3]   # only the first 3
                market_slice[n] = (price_matrix[n][:cut])[-width:]
                # calculate a macd value for each currency pair
                # try:
                macd[n] = ta.MACD(market_slice[n], PERIOD1, PERIOD2)[-1]
                # except Exception as e: log('talib fail %s' % e)
                # smooth the macd for each currency pair
                label = 'macd_' + str(n)
                macd[n] = smooth(label, macd[n], AGGREGATION, 10)
                # price-normalize each macd by sma30
                mean = (sum((market_slice[n])[-30:]) / 30) / 100
                macd[n] = macd[n] / mean
                # extract the last closing price
                last[n] = market_slice[n][-1]
            # calculate all_bull and all_bear from the sim macd arrays for each instrument
            all_bull, all_bear = all_bull_all_bear(macd[0], macd[1], macd[2])
            # if (cut == depth): log('normalized macd %.5f %.5f %.5f'
            #                        % (macd[0][-1], macd[1][-1], macd[2][-1]))
            # Temporarily MODIFY THE GENOME at the current locus
            storage.genome[storage.locus] = allele
            # use the genome definition to return the ideal position
            sim_position = genome(
                macd[0][-1], macd[1][-1], macd[2][-1], all_bull, all_bear)
            # Undo MODIFY THE GENOME
            storage.genome[storage.locus] = original_state
            # check what each simulation is holding
            sim_state = np.argmax(portfolio, axis=1)
            # if the simulator wants to change position,
            # update the simulated portfolio via the last prices:
            # last[0] = ltcbtc close; last[1] = ltcusd close; last[2] = btcusd close
            if sim_position != sim_state[allele]:
                if sim_position == 0:                    # move to usd
                    if sim_state[allele] == 1:           # via btcusd
                        portfolio[allele][0] = portfolio[allele][1] * last[2]
                        portfolio[allele][1] = 0
                    if sim_state[allele] == 2:           # via ltcusd
                        portfolio[allele][0] = portfolio[allele][2] * last[1]
                        portfolio[allele][2] = 0
                if sim_position == 1:                    # move to btc
                    if sim_state[allele] == 0:           # via btcusd
                        portfolio[allele][1] = portfolio[allele][0] / last[2]
                        portfolio[allele][0] = 0
                    if sim_state[allele] == 2:           # via ltcbtc
                        portfolio[allele][1] = portfolio[allele][2] * last[0]
                        portfolio[allele][2] = 0
                if sim_position == 2:                    # move to ltc
                    if sim_state[allele] == 0:           # via ltcusd
                        portfolio[allele][2] = portfolio[allele][0] / last[1]
                        portfolio[allele][0] = 0
                    if sim_state[allele] == 1:           # via ltcbtc
                        portfolio[allele][2] = portfolio[allele][1] / last[0]
                        portfolio[allele][1] = 0
                # if (cut == depth):
                #     log('sim_position %s' % sim_position)
                #     log('sim_state %s' % sim_state)
                #     log(portfolio)
        # move the position back to usd at the end of the simulated backtest
        sim_state = np.argmax(portfolio, axis=1)
        if sim_state[allele] != 0:
            if sim_state[allele] == 1:                   # via btcusd
                portfolio[allele][0] = portfolio[allele][1] * last[2]
                portfolio[allele][1] = 0
            if sim_state[allele] == 2:                   # via ltcusd
                portfolio[allele][0] = portfolio[allele][2] * last[1]
                portfolio[allele][2] = 0
    log(portfolio)
    # determine which allele has the highest USD ROI
    winner = -1
    if portfolio[0][0] > max([portfolio[1][0], portfolio[2][0]]): winner = 0
    if portfolio[1][0] > max([portfolio[0][0], portfolio[2][0]]): winner = 1
    if portfolio[2][0] > max([portfolio[0][0], portfolio[1][0]]): winner = 2
    # if the mutation improves ROI
    if (winner != original_state) and (winner > -1):
        # evolve the genome at this locus to the winning allele
        storage.genome[storage.locus] = winner
        log('!!! EVOLUTION !!! New Genome:')
        log(storage.genome)
        # log('DEBUG 9')
    # shift the locus counter forward to optimize a new allele on the next evolve()
    storage.locus += 1
    # start over after optimizing the full genome
    if storage.locus == len(storage.genome): storage.locus = 0
/r/pystats
https://redd.it/3z2cnt
tradewave.net
The Magic Carpet Freeware
A public strategy shared with the Tradewave community.
reddit
I need help with my first genetic algorithm • /r/pystats
Hi I develop open source cryptocurrency trading software at www.tradewave.net. I have written python versions of many financial technical...
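Stripped of the trading details, evolve() is a one-locus-at-a-time hill climb: try every allele at the current locus, score each candidate genome with the same backtest, and keep the best. A self-contained sketch of that pattern is below (the fitness function is a stand-in, since the real backtest runs on Tradewave). One thing worth checking in the original code, which this structure makes explicit: the smoothing buffers (storage.macd_n_smooth) are reset once per evolve() call rather than once per allele, so state from one candidate's backtest can leak into the next, which could account for the manual-vs-automatic gap.

```
# Stand-alone sketch of the "mutate one locus, keep the best allele" loop.
# fitness() is a placeholder for the simulated backtest (USD ROI of a genome).
ALLELES = (0, 1, 2)                       # 0=usd, 1=btc, 2=ltc

def fitness(genome):
    # placeholder score; in the real strategy this is the backtested USD balance
    return sum(allele == (i % 3) for i, allele in enumerate(genome))

def evolve_one_locus(genome, locus):
    """Try every allele at `locus` and return the genome with the best fitness."""
    best_allele, best_score = genome[locus], fitness(genome)
    for allele in ALLELES:
        candidate = list(genome)
        candidate[locus] = allele
        score = fitness(candidate)        # any per-backtest state should be reset per candidate
        if score > best_score:
            best_allele, best_score = allele, score
    evolved = list(genome)
    evolved[locus] = best_allele
    return evolved

genome = [0, 1, 2, 0, 1]
for locus in range(len(genome)):          # sweep the whole genome once
    genome = evolve_one_locus(genome, locus)
print(genome)
```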
Plotting e**x and ln(x) and seeing a nice inverse
I'm not sure what I'm goofing up here, but I was expecting to see a nice inverse graph from the following code; instead it comes out a bit goofy.
import numpy as np
from matplotlib.pyplot import (plot, subplot, cm, imread, imshow, xlabel,
                               ylabel, title, grid, axis, show, savefig, gcf,
                               figure, close, tight_layout, legend)
from math import e

print()

###############################################################################
def f_inv(n):
    return np.log(n)

def f(n):
    return e**n
###############################################################################

close('all')

t = np.linspace(-10, 10, 100)

e_to_x = f(t)
ln_x = f_inv(t)

plot(t, e_to_x, 'g-o', t, ln_x, 'b-o')
show()
I'm sure it's something pretty basic, cheers
/r/pystats
https://redd.it/3y8i7o
reddit
Plotting e**x and ln(x) and seeing a nice inverse • /r/pystats
I'm not sure what I'm goofing up here but I was expecting to be able to see a nice inverse graph with the following code but it's coming out a bit...
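A hedged diagnosis, consistent with the code shown: over t in [-10, 10], e**t climbs to roughly 22000, which flattens ln(t) to invisibility on a shared linear axis, and np.log(t) is NaN for t <= 0 anyway. Clipping the view and drawing the y = x mirror line makes the inverse relationship visible, for example:

```
# Plot exp and log on a clipped, equal-aspect window with the y = x mirror line
# so the reflection (inverse) relationship is visible.
import numpy as np
import matplotlib.pyplot as plt

t = np.linspace(-4, 4, 400)
t_pos = np.linspace(0.02, 4, 400)        # ln(x) is only defined for x > 0

plt.plot(t, np.exp(t), 'g-', label='e**x')
plt.plot(t_pos, np.log(t_pos), 'b-', label='ln(x)')
plt.plot(t, t, 'k--', label='y = x')     # mirror line
plt.xlim(-4, 4)
plt.ylim(-4, 4)                          # clip the exponential blow-up
plt.gca().set_aspect('equal')            # reflections only look right with equal scales
plt.legend()
plt.show()
```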
Learning Python - 2 or 3?
Learn Python the Hard Way (as many recommend) is adamant about using Python 2, but it is not as focused on DS. Which version should I be learning?
/r/pystats
https://redd.it/3xdhq0
reddit
Learning Python - 2 or 3? • /r/pystats
Learn Python the Hard Way (as many recommend) is adamant about using Python 2, but it is not as focused on DS. Which version should I be learning?
Web Scraping, data analysis and GUI development with Pokemon!
https://www.youtube.com/watch?v=egYVP-TeSg8&list=PLuVTNX0oceI87L2sPUTODZmwn-ORos-9Z
/r/pystats
https://redd.it/3w61ya
YouTube
PokeScrape 1: Scraping an online PokeDex (Pokemon data) with Python
On this video, we will learn how to scrape an online Pokedex for getting Pokemon data with common Python libraries.
IPython Notebook:
https://github.com/snazrul1/PyRevolution/blob/master/Puzzles/PokeScraper.ipynb
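The scraping step in the series uses common libraries; a generic requests + BeautifulSoup skeleton for that kind of table scrape looks roughly like this (the URL and selector are placeholders, not the ones used in the video):

```
# Generic table-scraping skeleton with requests + BeautifulSoup.
# URL and CSS selector are placeholders -- substitute the real Pokedex page.
import requests
from bs4 import BeautifulSoup

resp = requests.get("https://example.com/pokedex")
resp.raise_for_status()

soup = BeautifulSoup(resp.text, "html.parser")
rows = soup.select("table tr")                 # placeholder selector

pokemon = []
for row in rows[1:]:                           # skip the header row
    cells = [td.get_text(strip=True) for td in row.find_all("td")]
    if cells:
        pokemon.append(cells)

print(pokemon[:5])
```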
Scoreboards for Presidential Candidates -- A tutorial by Github User empet
http://nbviewer.jupyter.org/github/empet/Plotly-plots/blob/master/Scoreboard-republican-candidates.ipynb
/r/JupyterNotebooks
https://redd.it/47rq4x
nbviewer.jupyter.org
Notebook on nbviewer
Check out this Jupyter notebook!
Automating Django Deployments with Fabric and Ansible
https://realpython.com/blog/python/automating-django-deployments-with-fabric-and-ansible#.WGPLJbXGC1U.reddit
/r/django
https://redd.it/5kqgcc
reddit
Automating Django Deployments with Fabric and Ansible • /r/django
12 points and 0 comments so far on reddit
Python, Flask and MySQL integration template for AnyChart JS (data visualization)
https://github.com/anychart-integrations/python-flask-mysql-template
/r/flask
https://redd.it/5kpt27
GitHub
anychart-integrations/python-flask-mysql-template
This example shows how to use Anychart library with the Python programming language, Flask microframework and MySQL database. - anychart-integrations/python-flask-mysql-template
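The Flask half of that template boils down to a route that returns chart-ready JSON for the front-end library to consume; a minimal sketch (with the MySQL query stubbed out and made-up field names, not the template's actual code):

```
# Minimal Flask endpoint returning chart-ready JSON; the database access is
# stubbed out and the field names are illustrative, not from the template repo.
from flask import Flask, jsonify

app = Flask(__name__)

def load_rows():
    # in the real template this would query MySQL (e.g. via PyMySQL)
    return [("apples", 1222), ("oranges", 2431), ("pears", 3624)]

@app.route("/data")
def data():
    rows = load_rows()
    return jsonify([{"x": name, "value": value} for name, value in rows])

if __name__ == "__main__":
    app.run(debug=True)
```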
Help with making empirical CDF using statsmodels
It's a minor problem, but can someone explain to me why statsmodels' ECDF function returns an extra value for x and an extra value for y in my [example](http://nbviewer.jupyter.org/github/pybokeh/jupyter_notebooks/blob/master/statistics/nonparametric_methods/Distribution_Free_Methods.ipynb)? I have 44 values in my data set, but the ECDF returns 45 values, so plotting the confidence intervals alongside my ECDF bombs unless I remove the extra values first. Here's the [source](http://statsmodels.sourceforge.net/devel/_modules/statsmodels/distributions/empirical_distribution.html) for the empirical_distribution module. Thanks in advance!
/r/pystats
https://redd.it/3w6iti
nbviewer.jupyter.org
Notebook on nbviewer
Check out this Jupyter notebook!
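The extra point is by design (see the linked source): statsmodels' ECDF prepends x = -inf with y = 0 so the step function is defined to the left of the smallest observation, which is why 44 observations come back as 45 points. Slicing off the first element lines it up with a 44-point confidence band:

```
# statsmodels' ECDF prepends the point (-inf, 0); drop it to match the sample size.
import numpy as np
from statsmodels.distributions.empirical_distribution import ECDF

data = np.random.randn(44)
ecdf = ECDF(data)

print(len(ecdf.x), len(ecdf.y))   # 45 45 -- the first point is (-inf, 0)

x = ecdf.x[1:]                    # the 44 sorted observations
y = ecdf.y[1:]                    # the 44 step heights, 1/44 ... 1
```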
A comprehensive list of alternative kernels for Jupyter--R, Javascript, Coffeescript, Matlab, even Bash
https://github.com/ipython/ipython/wiki/IPython-kernels-for-other-languages
/r/JupyterNotebooks
https://redd.it/47pjbf
GitHub
IPython kernels for other languages
Official repository for IPython itself. Other repos in the IPython organization contain things like the website, documentation builds, etc. - ipython/ipython
Data analysis of Surfing Conditions on Irish East Coast
http://marcoforte.github.io/blog/2015/30/11/Data-analysis-of-Surfing-Conditions-on-Irish-East-Coast/
/r/pystats
https://redd.it/3v06t9
Django Fellowship Program: 2016 retrospective
https://www.djangoproject.com/weblog/2016/dec/28/fellowship-2016-retrospective/
/r/django
https://redd.it/5kt98t
reddit
Django Fellowship Program: 2016 retrospective • /r/django
2 points and 0 comments so far on reddit
Machine Learning and Data Science Tutorials
https://github.com/rasbt/pattern_classification
/r/JupyterNotebooks
https://redd.it/47n03c
GitHub
GitHub - rasbt/pattern_classification: A collection of tutorials and examples for solving and understanding machine learning and…
A collection of tutorials and examples for solving and understanding machine learning and pattern classification tasks - rasbt/pattern_classification
Leveraging Python for Data Visibility - A tutorial on how to leverage python's Data Spyre, Bokeh, and Pygal libraries (xpost from /r/DataScience)
Hi /r/pystats,
I got a recommendation to post this here from /r/python, so here you go!
Last night I gave a talk at the Kansas City Data Science Meetup on how to use Data Spyre to create a lightweight, easy-to-use web application. On top of that, I gave examples of how you can incorporate Bokeh and Pygal plots to add interactivity to your web application. I've put my Jupyter notebook along with all of my example scripts into a Git repo for all to use. Enjoy!
https://github.com/pm8k/dataspyre_tutorial
/r/pystats
https://redd.it/3sm9ju
GitHub
pm8k/dataspyre_tutorial
Contribute to pm8k/dataspyre_tutorial development by creating an account on GitHub.
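For the Bokeh part of the talk, the minimal standalone-plot pattern is only a few lines (a generic example, not code from the tutorial repo):

```
# Standalone Bokeh line plot written to an HTML file (generic example).
from bokeh.plotting import figure, output_file, show

output_file("example.html")                       # where the generated HTML lands

p = figure(title="Demo line", x_axis_label="x", y_axis_label="y")
p.line([1, 2, 3, 4, 5], [6, 7, 2, 4, 5], line_width=2)

show(p)                                           # opens the plot in a browser
```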
Armin explains the Flask globals. I've always wanted to know the reasoning behind this design choice.
https://www.youtube.com/watch?v=1ByQhAM5c1I
/r/Python
https://redd.it/5kqoou
YouTube
Armin Ronacher, "Flask for Fun and Profit", PyBay2016
Learn about building small and large projects with Flask in ways you probably did not see yet.
Abstract
This talk explores how you can build applications and APIs with Flask step by step by being easy to test and scale to larger and more complex scenarios.…
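For anyone who has not met the design the talk discusses: flask.request and flask.g look like module-level globals but are context locals, proxies that resolve to the data of the request currently being handled, which is what keeps the terse Flask style safe on a per-request basis. A tiny illustration:

```
# flask.request and flask.g are context-local proxies, not shared globals:
# each request (or test request context) sees its own values.
from flask import Flask, g, request

app = Flask(__name__)

@app.route("/greet")
def greet():
    g.name = request.args.get("name", "world")   # stored per request, not shared
    return "hello %s" % g.name

if __name__ == "__main__":
    with app.test_request_context("/greet?name=flask"):
        print(greet())                           # -> hello flask
```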