r/RoguelikeDev is running a summer dev-along tutorial series in Python
Interested in making a traditional turn-based roguelike? Want to do it alongside other devs and beginners with a tutorial? Over at r/roguelikedev we're starting a [weekly summer event](https://i.imgur.com/EYJFgdI.png) on June 19th where you can do just that :D
Check out "[RoguelikeDev Does The Complete Roguelike Tutorial](https://www.reddit.com/r/roguelikedev/comments/8ql895/roguelikedev_does_the_complete_roguelike_tutorial/)" for more info, including the planned schedule.
The main tutorial we'll be following is in Python, although some participants may opt to use other languages.
/r/Python
https://redd.it/8qnlcm
Does Flask-SQLAlchemy DB instance need to be closed?
I am converting the Flask tutorial (the latest version of which I found somewhat confusing) to use Postgres with Flask-SQLAlchemy. I noticed that it uses `.close()` to close the db connection. Do I need to do the same thing in Flask-SQLAlchemy? I don't see a `.close()` method on it. Also, when exactly is this `app.teardown_appcontext(close_db)` code run?
http://flask.pocoo.org/docs/1.0/tutorial/database/
/r/flask
https://redd.it/8qlf9l
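On the second question: functions registered with `app.teardown_appcontext` run whenever the application context pops, i.e. at the end of every request (and CLI command), even if the view raised an exception. A minimal sketch that makes the timing visible:

```python
from flask import Flask

app = Flask(__name__)
events = []

@app.teardown_appcontext
def close_db(exception=None):
    # Flask calls every registered teardown function when the app
    # context pops: after each request, even if the view raised.
    # This is the hook where the tutorial closes its DB connection.
    events.append("closed")

with app.test_request_context("/"):
    pass  # the context pops here, firing close_db

print(events)
```

Flask-SQLAlchemy registers an equivalent teardown itself, which removes the scoped session when the context ends, so you normally don't call `.close()` yourself.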
How many of you Djangonauts here are full-time entrepreneurs?
I've been bootstrapping my startup idea full-time, built with Django, for the last year or so. I've benefited tremendously from the advice I received here, from beginning to launch. Thank you so much.

A couple of weeks ago I was contacted by a firm in the industry I used to work in about a possible job opportunity. It would pay probably around $500k USD in total comp. What really surprised me is that, having worked on this idea single-mindedly for around a year, the thought of going back to paid employment and the world of office politics plunged me into something that feels like clinical depression. I knew I hated corporate employment, but I didn't realize I hated it this much.

I'm not trolling here.

This made me want to find out: how many of you Djangonauts lurking here are in a similar situation? You use technology, and Django in particular, not to get employment but to run away from it, as your ticket to freedom from soul-crushing BS. Maybe you are self-taught in everything from HTML to JS to Django, like me. We should connect and probably start something together, or just share ideas.
Always, thanks to all in this community. I've gotten so much from you guys.
/r/django
https://redd.it/8qp1l3
Python 3.7.0rc1 and 3.6.6rc1 now available for testing
https://blog.python.org/2018/06/python-370rc1-and-366rc1-now-available.html
/r/Python
https://redd.it/8qn86l
Oh boy...did I fuck up? [Flask SQLAlchemy]
Haha, well, funny problem. I've made an app with quite a few users, and it seems to be running into some issues. I realised that because the charge IDs from my Shopify app are now too large, no new users can join.
I have some Python code a bit like this.
    class User(db.Model):
        id = db.Column(db.Integer, primary_key=True)
        username = db.Column(db.String(80), unique=True, nullable=False)
        email = db.Column(db.String(120), unique=True, nullable=False)
        charge_id = db.Column(db.Integer, unique=True, nullable=False)

        def __repr__(self):
            return '<User %r>' % self.username
I realise now that my charge_id should be of type BigInteger to accommodate the large charge IDs I'm getting now.

The thing is, in the past, I've never managed to fix this without deleting and recreating the DB. So, is it possible to change the type of charge_id to db.BigInteger without recreating the entire table? Of course, deleting the entire database isn't an option for me.
/r/flask
https://redd.it/8qm96d
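You don't need to recreate the table: widening `Integer` to `BigInteger` is an in-place `ALTER TABLE ... ALTER COLUMN ... TYPE` that Postgres and MySQL perform without data loss. With Flask-Migrate/Alembic the migration might look like the sketch below (table and column names assumed from the model above; the generated revision scaffolding is omitted):

```python
# Hypothetical Alembic migration for widening the column in place.
import sqlalchemy as sa
from alembic import op

def upgrade():
    op.alter_column(
        'user', 'charge_id',
        existing_type=sa.Integer(),
        type_=sa.BigInteger(),
        existing_nullable=False,
    )

def downgrade():
    op.alter_column(
        'user', 'charge_id',
        existing_type=sa.BigInteger(),
        type_=sa.Integer(),
        existing_nullable=False,
    )
```

One caveat: SQLite does not support `ALTER COLUMN`, so there Alembic's batch mode recreates the table behind the scenes.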
Debugging a slow part of Flask app; and how do I run tests?
The slow part is probably an AJAX call, here in [JavaScript](https://github.com/patarapolw/HanziLevelUp/blob/48a6a2d8927b49a9ba0d2809a886a63a143ee628/webapp/static/js/viewVocab.js#L22), and in Python [1](https://github.com/patarapolw/HanziLevelUp/blob/48a6a2d8927b49a9ba0d2809a886a63a143ee628/webapp/views/vocab.py#L62) -> [2](https://github.com/patarapolw/HanziLevelUp/blob/48a6a2d8927b49a9ba0d2809a886a63a143ee628/CJKhyperradicals/dict.py#L24)

Also, I'd like to ask, generally: how do I run tests in (client-side) JavaScript and (server-side) Flask/Python? I am used to pytest/unittest/doctest in Python, but not with Flask at all.
I'd like to know more about speed/performance profiling (server load, client load...)
/r/flask
https://redd.it/8qm73s
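On the server side, the usual pytest workflow carries over: Flask ships a test client that exercises views without running a real server. A minimal sketch, where the `/api/vocab` route is a made-up stand-in for the real AJAX endpoint:

```python
from flask import Flask, jsonify

app = Flask(__name__)

@app.route("/api/vocab")
def vocab():
    # stand-in for the real AJAX endpoint called from viewVocab.js
    return jsonify(words=["hanzi"])

def test_vocab_endpoint():
    # under pytest this function would be collected automatically;
    # the test client issues the request in-process
    client = app.test_client()
    resp = client.get("/api/vocab")
    assert resp.status_code == 200
    assert resp.get_json() == {"words": ["hanzi"]}

test_vocab_endpoint()  # run directly here; with pytest: `pytest test_app.py`
print("ok")
```

For client-side JavaScript, runners like Jest or Mocha fill the pytest role. For server-side profiling, Werkzeug's `ProfilerMiddleware` can dump per-request cProfile stats, which is a reasonable first step for finding where the time goes.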
Qt for Python 5.11 released - Qt Blog
https://blog.qt.io/blog/2018/06/13/qt-python-5-11-released/
/r/Python
https://redd.it/8qrgef
MultipleObjectsReturned on Delete and update view
I just found that I am getting a MultipleObjectsReturned error when a user enters similar required data fields. Here are my model and view:
**Models.py**
    class Product(models.Model):
        name = models.CharField(max_length=50)
        company = models.ForeignKey(Company, on_delete=models.CASCADE)
        serial = models.CharField(max_length=50, null=True, blank=True)
        description = models.TextField()
        date_added = models.DateTimeField(auto_now_add=True, auto_now=False)
**Views.py**
    class ProductUpdateView(UpdateView):
        model = Product
        fields = ['name', 'serial', 'description']
        pk_url_kwarg = 'id'
        slug_field = 'serial'
        slug_url_kwarg = 'serial'
        template_name = 'update_product.html'
        success_url = reverse_lazy('main:product list')

        def get_queryset(self):
            base_qs = super(ProductUpdateView, self).get_queryset()
            return base_qs.filter(company__user=self.request.user.id)
**urls.py**
    path('product/update/<int:pk>=?<slug:serial>', login_required(views.ProductUpdateView.as_view()), name='update product'),
The error I get when two objects are entered with similar info is:

    MultipleObjectsReturned at /product/37=?333=delete
    get() returned more than one Product -- it returned 2!

Is there a way to get these views working, since there is a unique id in the URL path? Or do I have to use `unique=True` on one of the fields to limit what the user can input?
/r/djangolearning
https://redd.it/8qpmlr
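One likely explanation: the view never sees the pk. `pk_url_kwarg = 'id'` tells `SingleObjectMixin.get_object()` to look for an `id` URL kwarg, but the route captures `<int:pk>`, so the pk lookup comes back empty and Django falls back to the non-unique `serial` slug, where `get()` finds two rows. Aligning the names and dropping the slug lookup lets the unique id do the work. A sketch under that assumption (not runnable on its own; imports as in the post):

```python
# views.py -- look objects up by primary key only
class ProductUpdateView(UpdateView):
    model = Product
    fields = ['name', 'serial', 'description']
    # the default pk_url_kwarg is 'pk', matching <int:pk> in the route;
    # with no slug_field/slug_url_kwarg the non-unique serial is never queried
    template_name = 'update_product.html'
    success_url = reverse_lazy('main:product list')

# urls.py
path('product/update/<int:pk>/',
     login_required(views.ProductUpdateView.as_view()),
     name='update product'),
```

The same applies to the delete view if it is configured the same way. With this in place there is no need for `unique=True` on `serial` just to make the views work.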
Introduction to Matplotlib — Data Visualization in Python
https://heartbeat.fritz.ai/introduction-to-matplotlib-data-visualization-in-python-d9143287ae39
/r/Python
https://redd.it/8qrh6d
How to query whole database with flask-sqlalchemy?
I have been working on a Flask API that queries data from a database and returns it to the user on each request, using the Flask-SQLAlchemy ORM.

The problem is that the database has over 200 tables, and writing a model class for each table is something I don't want to do. I hope there is an alternative for such a task.
/r/flask
https://redd.it/8qs66b
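SQLAlchemy's automap extension does exactly this: it reflects the existing schema and generates one mapped class per table (any table with a primary key), so no hand-written models are needed. A sketch demonstrated on an in-memory SQLite database standing in for the real one:

```python
from sqlalchemy import create_engine, text
from sqlalchemy.ext.automap import automap_base
from sqlalchemy.orm import Session

# stand-in for the real database URL
engine = create_engine("sqlite://")
with engine.begin() as conn:
    conn.execute(text("CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT)"))
    conn.execute(text("INSERT INTO users (name) VALUES ('alice')"))

Base = automap_base()
Base.prepare(autoload_with=engine)  # reflects every table with a primary key
User = Base.classes.users           # class named after the table

with Session(engine) as session:
    names = [u.name for u in session.query(User)]
print(names)
```

Tables without primary keys can't be automapped, but they can still be reflected with `MetaData.reflect()` and queried through SQLAlchemy Core.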
DRF GenericAPI GET request taking 500ms to complete, generic.View GET request taking 10ms. What's happening?
I have a class that will check if a value is cached. If it is cached, then it will return the value from a local cache. Otherwise, this value will be generated by a remote database. If I use the DRF APIView, then the call will take around 500ms. If I use the standard View class in Django, then it takes about 10ms to return. I have checked, and both of these are hitting the cache. Is there any magic with the DRF APIView that could be responsible for this insane amount of overhead that I am getting?
    class ItemDetail(views.APIView):
        def get(self, request, pk):
            key_name = 'item:{}'.format(pk)
            data = cache.get(key_name)
            # data should always already exist in the cache, but just in case
            if not data:
                obj = get_object_or_404(Item, pk=pk)
                data = ItemSerializer(obj).data
                # expiration set to None to preserve key
                cache.set(key_name, data, None)
            return JsonResponse(data)
/r/django
https://redd.it/8qvf4a
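`APIView.dispatch` does real work before your `get()` runs: it wraps the request, performs content negotiation, and evaluates `authentication_classes`, `permission_classes`, and `throttle_classes`. If the default authentication or throttling touches that remote database or a session store, that alone can plausibly explain the gap. Since this view returns `JsonResponse` and bypasses DRF's renderers anyway, one experiment is to strip the machinery and compare timings. A sketch of the experiment, not a recommendation (imports as in the post):

```python
class ItemDetail(views.APIView):
    authentication_classes = []  # skip auth backends (session/DB lookups)
    permission_classes = []      # skip permission checks
    throttle_classes = []        # skip throttle storage access

    def get(self, request, pk):
        ...  # same cached lookup as above
```

If the gap closes, re-enable the classes one at a time to find the expensive step; django-debug-toolbar or a profiler will confirm where the time goes.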
[D] Machine Learning - WAYR (What Are You Reading) - Week 44
This is a place to share machine learning research papers, journals, and articles that you're reading this week. If it relates to what you're researching, by all means elaborate and give us your insight, otherwise it could just be an interesting paper you've read.
Please try to provide some insight from your understanding, and please don't post things that are already in the wiki.

Preferably, link the arXiv page (not the PDF; you can easily get to the PDF from the abstract page but not the other way around) or any other pertinent links.
Previous weeks :
|1-10|11-20|21-30|31-40|41-50|
|----|-----|-----|-----|-----|
|[Week 1](https://www.reddit.com/r/MachineLearning/comments/4qyjiq/machine_learning_wayr_what_are_you_reading_week_1/)|[Week 11](https://www.reddit.com/r/MachineLearning/comments/57xw56/discussion_machine_learning_wayr_what_are_you/)|[Week 21](https://www.reddit.com/r/MachineLearning/comments/60ildf/d_machine_learning_wayr_what_are_you_reading_week/)|[Week 31](https://www.reddit.com/r/MachineLearning/comments/6s0k1u/d_machine_learning_wayr_what_are_you_reading_week/)|[Week 41](https://www.reddit.com/r/MachineLearning/comments/7tn2ax/d_machine_learning_wayr_what_are_you_reading_week/)|||
|[Week 2](https://www.reddit.com/r/MachineLearning/comments/4s2xqm/machine_learning_wayr_what_are_you_reading_week_2/)|[Week 12](https://www.reddit.com/r/MachineLearning/comments/5acb1t/d_machine_learning_wayr_what_are_you_reading_week/)|[Week 22](https://www.reddit.com/r/MachineLearning/comments/64jwde/d_machine_learning_wayr_what_are_you_reading_week/)|[Week 32](https://www.reddit.com/r/MachineLearning/comments/72ab5y/d_machine_learning_wayr_what_are_you_reading_week/)|[Week 42](https://www.reddit.com/r/MachineLearning/comments/7wvjfk/d_machine_learning_wayr_what_are_you_reading_week/)||
|[Week 3](https://www.reddit.com/r/MachineLearning/comments/4t7mqm/machine_learning_wayr_what_are_you_reading_week_3/)|[Week 13](https://www.reddit.com/r/MachineLearning/comments/5cwfb6/d_machine_learning_wayr_what_are_you_reading_week/)|[Week 23](https://www.reddit.com/r/MachineLearning/comments/674331/d_machine_learning_wayr_what_are_you_reading_week/)|[Week 33](https://www.reddit.com/r/MachineLearning/comments/75405d/d_machine_learning_wayr_what_are_you_reading_week/)|[Week 43](https://www.reddit.com/r/MachineLearning/comments/807ex4/d_machine_learning_wayr_what_are_you_reading_week/)||
|[Week 4](https://www.reddit.com/r/MachineLearning/comments/4ub2kw/machine_learning_wayr_what_are_you_reading_week_4/)|[Week 14](https://www.reddit.com/r/MachineLearning/comments/5fc5mh/d_machine_learning_wayr_what_are_you_reading_week/)|[Week 24](https://www.reddit.com/r/MachineLearning/comments/68hhhb/d_machine_learning_wayr_what_are_you_reading_week/)|[Week 34](https://www.reddit.com/r/MachineLearning/comments/782js9/d_machine_learning_wayr_what_are_you_reading_week/)|[Week 44](https://www.reddit.com/r/MachineLearning/comments/8aluhs/d_machine_learning_wayr_what_are_you_reading_week/)|
|[Week 5](https://www.reddit.com/r/MachineLearning/comments/4xomf7/machine_learning_wayr_what_are_you_reading_week_5/)|[Week 15](https://www.reddit.com/r/MachineLearning/comments/5hy4ur/d_machine_learning_wayr_what_are_you_reading_week/)|[Week 25](https://www.reddit.com/r/MachineLearning/comments/69teiz/d_machine_learning_wayr_what_are_you_reading_week/)|[Week 35](https://www.reddit.com/r/MachineLearning/comments/7b0av0/d_machine_learning_wayr_what_are_you_reading_week/)||
|[Week 6](https://www.reddit.com/r/MachineLearning/comments/4zcyvk/machine_learning_wayr_what_are_you_reading_week_6/)|[Week 16](https://www.reddit.com/r/MachineLearning/comments/5kd6vd/d_machine_learning_wayr_what_are_you_reading_week/)|[Week 26](https://www.reddit.com/r/MachineLearning/comments/6d7nb1/d_machine_learning_wayr_what_are_you_reading_week/)|[Week 36](https://www.reddit.com/r/MachineLearning/comments/7e3fx6/d_machine_learning_wayr_what_are_you_reading_week/)||
|[Week 7](https://www.reddit.com/r/MachineLearning/comments/52t6mo/machine_learning_wayr_what_are_you_reading_week_7/)|[Week 17](https://www.reddit.com/r/MachineLearning/comments/5ob7dx/discussion_machine_learning_wayr_what_are_you/)|[Week 27](https://www.reddit.com/r/MachineLearning/comments/6gngwc/d_machine_learning_wayr_what_are_you_reading_week/)|[Week 37](https://www.reddit.com/r/MachineLearning/comments/7hcc2c/d_machine_learning_wayr_what_are_you_reading_week/)||
|[Week 8](https://www.reddit.com/r/MachineLearning/comments/53heol/machine_learning_wayr_what_are_you_reading_week_8/)|[Week 18](https://www.reddit.com/r/MachineLearning/comments/5r14yd/discussion_machine_learning_wayr_what_are_you/)|[Week 28](https://www.reddit.com/r/MachineLearning/comments/6jgdva/d_machine_learning_wayr_what_are_you_reading_week/)|[Week 38](https://www.reddit.com/r/MachineLearning/comments/7kgcqr/d_machine_learning_wayr_what_are_you_reading_week/)||
|[Week 9](https://www.reddit.com/r/MachineLearning/comments/54kvsu/machine_learning_wayr_what_are_you_reading_week_9/)|[Week 19](https://www.reddit.com/r/MachineLearning/comments/5tt9cz/discussion_machine_learning_wayr_what_are_you/)|[Week 29](https://www.reddit.com/r/MachineLearning/comments/6m9l1v/d_machine_learning_wayr_what_are_you_reading_week/)|[Week 39](https://www.reddit.com/r/MachineLearning/comments/7nayri/d_machine_learning_wayr_what_are_you_reading_week/)||
|[Week 10](https://www.reddit.com/r/MachineLearning/comments/56s2oa/discussion_machine_learning_wayr_what_are_you/)|[Week 20](https://www.reddit.com/r/MachineLearning/comments/5wh2wb/d_machine_learning_wayr_what_are_you_reading_week/)|[Week 30](https://www.reddit.com/r/MachineLearning/comments/6p3ha7/d_machine_learning_wayr_what_are_you_reading_week/)|[Week 40](https://www.reddit.com/r/MachineLearning/comments/7qel9p/d_machine_learning_wayr_what_are_you_reading_week/)||
Most upvoted papers two months ago:
/u/Molag_Balls: "Variance-based Gradient Compression for Efficient Distributed Deep Learning" A [proposed ICLR paper](https://openreview.net/forum?id=rkEfPeZRb) on a method for reducing the need for communication between workers in a distributed training environment.
/u/theainerd: Andrew Ng released the first 19 chapters of his book Machine Learning Yearning. It is focused not on teaching you ML algorithms, but on how to make ML algorithms work. [Machine Learning Yearning](http://www.mlyearning.org/)
Besides that, there are no rules, have fun.
Note: The /u/ML_WAYR_bot was great, please bring it back :)
/r/MachineLearning
https://redd.it/8qp3at
Daemonizing django-rq workers in production
Hi all,
I'm a bit lost attempting to deploy an app that uses redis as a queue.
I'm deploying the app via Elastic Beanstalk and have automated the deployment in the .ebextensions folder (e.g. wget, make, daemonize server, run, etc.) but am having trouble figuring out how to run workers.
Do I ssh in to my ec2 instance and type "python manage.py rqworker"?
Also should I use anything else to manage workers and the queue?
Thanks!
Edit: my redis.config file for elastic beanstalk looks like this:
    sources:
      /home/ec2-user: http://download.redis.io/releases/redis-4.0.10.tar.gz
    commands:
      redis_build:
        command: make
        cwd: /home/ec2-user/redis-4.0.10
      redis_config_001:
        command: sed -i -e "s/daemonize no/daemonize yes/" redis.conf
        cwd: /home/ec2-user/redis-4.0.10
      redis_config_002:
        command: sed -i -e "s/# maxmemory <bytes>/maxmemory 500MB/" redis.conf
        cwd: /home/ec2-user/redis-4.0.10
      redis_config_003:
        command: sed -i -e "s/# maxmemory-policy volatile-lru/maxmemory-policy allkeys-lru/" redis.conf
        cwd: /home/ec2-user/redis-4.0.10
      redis_server:
        command: src/redis-server redis.conf
        cwd: /home/ec2-user/redis-4.0.10
/r/djangolearning
https://redd.it/8qyslb
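SSH-ing in and running `python manage.py rqworker` by hand won't survive reboots or redeploys; the worker needs a process supervisor. Elastic Beanstalk's Python platform already runs the web app under supervisord, so one common approach is to register the worker as an extra supervisord program via `.ebextensions`. A hypothetical program entry, with paths that vary by platform version (check your instance's `/opt/python/etc/supervisord.conf` for the real ones):

```ini
; hypothetical supervisord program for the rq worker
[program:rqworker]
command=/opt/python/run/venv/bin/python manage.py rqworker default
directory=/opt/python/current/app
autostart=true
autorestart=true
stopasgroup=true
```

supervisord (or systemd, if you prefer) then restarts the worker on crashes and boots, which answers the "anything else to manage workers" question as well.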
How to open matplotlib plots in a popup window?
Seen this being asked several times, but most answers seem outdated. The `%matplotlib qt` magic doesn't seem to work; the graphs still appear inline.

I tried `%matplotlib qt5`, which had some missing dependencies on my Ubuntu machine.
Any suggestions?
Thanks!
/r/IPython
https://redd.it/8qvn0p
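Outside of notebook magics, one way to get pop-out windows is to select an interactive backend before `pyplot` is imported. A minimal sketch, assuming PyQt5 is installed (`pip install pyqt5`); `TkAgg` is a fallback that ships with most CPython builds:

```python
import matplotlib

def popup_plot():
    # Must be set before pyplot creates a figure manager.
    matplotlib.use("Qt5Agg")  # or "TkAgg" if Qt bindings are missing
    import matplotlib.pyplot as plt

    plt.plot([1, 2, 3], [1, 4, 9])
    plt.show()  # opens in its own window rather than inline

```

Run `popup_plot()` from a plain Python session to get a standalone window. Within Jupyter, installing PyQt5 is usually what makes `%matplotlib qt` start working again.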
Data Sets and Challenge Statements Released for this year's Hack for the Sea
The [Hack for the Sea](https://hackforthesea.tech) Crew is proud to present this year's challenge statements and data sets.
They are as follows:
* [How does a changing coastal watershed impact coastal waters?](https://hackforthesea.tech/GLO/challenge/1)
* [Can you predict where and when cod spawning will occur?](https://hackforthesea.tech/GLO/challenge/2)
* [Can you design a mooring that's both eelgrass and user-friendly?](https://hackforthesea.tech/GLO/challenge/3)
* [Can an individual whale be identified based on its blowhole?](https://hackforthesea.tech/GLO/challenge/4)
The event is open to all ages and to anybody ready and willing to lend their skills to help the oceans. And while the summit will be held in person, the community is open and active year round. Join us!
/r/pystats
https://redd.it/8qz3xb
Hack for the Sea - Sept 21-23 2018
The 3rd Annual Hack for the Sea is a marine hackathon where participants take on challenges in marine science, research, and industry.
Mechanical Engineer Trying to Create a Python Data Acquisition GUI
So a little background: I'm a mechanical engineer who does a lot of testing and hates NI/LabVIEW. I've been teaching myself Python via various books, so I would still very much call myself a beginner. I recently purchased some data acquisition hardware, and the supplier provided an API and some example code for interfacing with it, but I would like to create a GUI so the system is more user-friendly and other people can use it without needing to understand the Python code. That said, I know there are a lot of frameworks out there for building a GUI, and I was wondering if anyone had experience and/or recommendations. For reference, the DAQ unit is from UEI.
/r/Python
https://redd.it/8qyjrh
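A minimal sketch of one beginner-friendly option: tkinter, which ships with CPython, so colleagues need nothing beyond a stock Python install. Here `read_channel()` is a hypothetical stand-in for a call into the vendor's DAQ API:

```python
import random
import tkinter as tk

def read_channel():
    """Placeholder for the real DAQ read (hypothetical)."""
    return random.uniform(0.0, 5.0)

def run_gui():
    root = tk.Tk()
    root.title("DAQ Monitor")
    label = tk.Label(root, font=("Helvetica", 24))
    label.pack(padx=20, pady=20)

    def update_reading():
        # Refresh the displayed value, then schedule the next poll in 500 ms.
        label.config(text="Channel 0: {:.3f} V".format(read_channel()))
        root.after(500, update_reading)

    update_reading()
    root.mainloop()
```

Call `run_gui()` to launch. For richer needs (live plots, tables of channels), PyQt5/PySide2 with pyqtgraph is a common step up from tkinter.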
What is the best way of doing integration test with flask?
I am trying to write an integration test that starts multiple server threads on different ports, but it is not working very well. What is the best way of doing it?
    import sys
    import time
    from threading import Thread

    import requests

    from app import main  # the entry point that parses sys.argv and runs the server

    def get_server():
        server = Thread(target=main)
        server.daemon = True
        return server

    def test_update_blockchain():
        sys.argv = ['', '--port', '8002', '--wallet', '1']
        ip1 = 'http://localhost:8002'
        server1 = get_server()
        server1.start()
        time.sleep(0.01)

        sys.argv = ['', '--port', '8003', '--wallet', '2', '--servers', 'localhost:8002']
        ip2 = 'http://localhost:8003'
        server2 = get_server()
        server2.start()
        time.sleep(0.01)

        requests.get(f'{ip1}/mine')
        res = requests.get(f'{ip2}/blockchain')
        assert res.status_code == 200
        blockchain = res.json()
        assert len(blockchain['blocks']) == 1
/r/flask
https://redd.it/8qz382
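When the HTTP behaviour of a single app is what's under test, Flask's built-in test client avoids real sockets, ports, and sleeps entirely. A hedged sketch — `create_app` and the `/mine` and `/blockchain` routes here are hypothetical stand-ins for the poster's own app factory and endpoints:

```python
from flask import Flask, jsonify

def create_app():
    app = Flask(__name__)
    blocks = []  # stand-in for real chain state

    @app.route('/mine')
    def mine():
        blocks.append({'index': len(blocks)})
        return jsonify(ok=True)

    @app.route('/blockchain')
    def blockchain():
        return jsonify(blocks=blocks)

    return app

def test_update_blockchain():
    # The test client calls the app in-process: no threads, no ports.
    client = create_app().test_client()
    client.get('/mine')
    res = client.get('/blockchain')
    assert res.status_code == 200
    assert len(res.get_json()['blocks']) == 1
```

If the cross-server gossip itself is what needs testing, prefer `werkzeug.serving.make_server` (or pytest-flask's `live_server` fixture), which gives you a server handle you can cleanly shut down, over daemon threads plus `time.sleep`.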
Is it a good idea to make a text-based story game in Python?
I'm doing this because Python is like the easiest language but I don't know if I should use it. Help?
/r/Python
https://redd.it/8qyhn6
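Python is a solid fit for this. The core of a text-based story game is just a data structure of rooms plus an input loop — a tiny sketch, with room names and text invented for illustration:

```python
# A map of rooms: each room has descriptive text and named exits.
ROOMS = {
    "cave": {"text": "You are in a dark cave.", "exits": {"north": "forest"}},
    "forest": {"text": "Tall trees surround you.", "exits": {"south": "cave"}},
}

def move(room, direction):
    """Return the next room, or stay put if the exit doesn't exist."""
    return ROOMS[room]["exits"].get(direction, room)

def play():
    room = "cave"
    while True:
        print(ROOMS[room]["text"])
        command = input("> ").strip().lower()
        if command == "quit":
            break
        room = move(room, command)
```

Keeping the world as data (a dict here, or a JSON file later) and the logic in small functions like `move` makes the game easy to grow and to test.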