Setting up Jupyter on the Cloud
This article shows how you can run Jupyter on a remote server, connect to it, and have Jupyter continue to run - even if you get disconnected.
An earlier article, "Save the environment with conda", showed how to make a new environment and use it with Jupyter. This article walks through how to fix Jupyter if it isn't using the correct environment.
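As a quick taste of the kind of check involved, you can confirm from inside a notebook which interpreter the kernel is actually running (a minimal sketch; the environment name `myenv` is just a placeholder):

```python
# Run in a Jupyter cell to see which Python the kernel is using.
# If this does not point into your conda environment
# (e.g. .../envs/myenv/bin/python), the kernel is not using
# the environment you expect.
import sys
print(sys.executable)
```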
Jupyter's focus on quick experimentation encourages the use of global variables, as we may have only one connection to a database, or one dataframe used by all functions. These globals can lead to subtle, hard-to-debug problems. This article shows...
Jupyter notebooks allow for quick experimentation and exploration, but can encourage some bad habits. One subtle source of errors is the use of global variables in a notebook. This is a quick post to show the error and some steps you can take to avoid it.
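A minimal sketch of the kind of subtle bug a global can cause (the names `df` and `total_price` are made up for illustration, not taken from the posts):

```python
import pandas as pd

df = pd.DataFrame({"price": [10, 20, 30]})

def total_price(data):
    # Bug: the function ignores its argument and silently reads the
    # global df, so passing a different dataframe has no effect.
    return df["price"].sum()

other = pd.DataFrame({"price": [1, 2, 3]})
print(total_price(other))  # prints 60, not 6
```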
If your Ubuntu server is shut down (for example, by your AWS instance rebooting), Postgres may be left in an inconsistent state. This post walks through the steps of locating the lock files and getting Postgres up and running again.
ROC (Receiver Operating Characteristic) curves are a great way of measuring the performance of binary classifiers. They show how well a classifier's score (where a higher score means more likely to be in the "positive" class) does at separating...
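For a flavour of what this looks like in practice, a ROC curve can be computed from true labels and classifier scores with scikit-learn (a minimal sketch using made-up labels and scores):

```python
from sklearn.metrics import roc_curve, roc_auc_score

# Made-up binary labels and classifier scores (higher = more likely positive).
y_true = [0, 0, 1, 1]
y_score = [0.1, 0.4, 0.35, 0.8]

fpr, tpr, thresholds = roc_curve(y_true, y_score)
print(fpr, tpr)
print("AUC:", roc_auc_score(y_true, y_score))
```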
Environments allow you to distribute software to other users even when you don't know what packages they have installed. This is a better solution than using requirements.txt, as the packages you install won't interfere with the user's system.
This is the eighth in a series of blog posts where we go through the process of taking a collection of functions and turning them into a deployable Python package. In this post, we summarize the steps needed to make and deploy a Python package.
This is the seventh in a series of blog posts where we go through the process of taking a collection of functions and turning them into a deployable Python package. In this post, we show how to deploy to TestPyPI.
This is the sixth in a series of blog posts where we go through the process of taking a collection of functions and turning them into a deployable Python package. In this post, we show how to include a CSV file into your package. This should be...
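To give a taste of the approach, once a CSV is shipped as package data it can be read with importlib.resources (a minimal sketch; the package name `mypackage` and the file `data.csv` are placeholders, not the names used in the series):

```python
from importlib.resources import files

# Read a CSV bundled inside the installed package (Python 3.9+).
csv_text = files("mypackage").joinpath("data.csv").read_text()
print(csv_text.splitlines()[0])  # print the header row
```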
This is the fifth in a series of blog posts where we go through the process of taking a collection of functions and turning them into a deployable Python package. In this post, we use the tox package to automate some of the deployment steps.
This is the fourth in a series of blog posts where we go through the process of taking a collection of functions and turning them into a deployable Python package. In this post, we use pytest to write unit tests for the Roman numeral package.
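A minimal sketch of what such a test might look like with pytest (the module name `roman` and the function `int_to_roman` are placeholders for whatever the package actually exposes):

```python
import pytest

from roman import int_to_roman  # placeholder import; adjust to the package's real API

@pytest.mark.parametrize("number, expected", [
    (1, "I"),
    (4, "IV"),
    (9, "IX"),
    (14, "XIV"),
])
def test_int_to_roman(number, expected):
    assert int_to_roman(number) == expected
```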
This is the third in a series of blog posts where we go through the process of taking a collection of functions and turning them into a deployable Python package. In this post, we use setuptools to allow people to install our package on their system.
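For context, a setup.py along these lines is the core of what setuptools needs (a minimal sketch; the package name and metadata are placeholders, not the values used in the series):

```python
# setup.py (minimal sketch)
from setuptools import setup, find_packages

setup(
    name="roman",             # placeholder package name
    version="0.1.0",
    packages=find_packages(),
    python_requires=">=3.7",
)
```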
This is the second in a series of blog posts where we go through the process of taking a collection of functions and turning them into a deployable Python package. In this post, we add docstrings so that our users can understand what our package does.
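A short sketch of the idea: a docstring sits just below the def line and shows up in help() (the function below is illustrative, not the package's actual code):

```python
def int_to_roman(number):
    """Convert a positive integer to a Roman numeral string.

    Example:
        >>> int_to_roman(14)
        'XIV'
    """
    ...

help(int_to_roman)  # prints the docstring above
```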
This is the first in a series of blog posts where we go through the process of taking a collection of functions and turning them into a deployable Python package. In this post, we create a Roman Numerals function, and make it into a Python module.
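To give a flavour of the starting point, a function like this saved in a .py file is already a module you can import (a rough sketch, not the exact code from the post):

```python
# roman.py (illustrative sketch)
NUMERALS = [
    (1000, "M"), (900, "CM"), (500, "D"), (400, "CD"),
    (100, "C"), (90, "XC"), (50, "L"), (40, "XL"),
    (10, "X"), (9, "IX"), (5, "V"), (4, "IV"), (1, "I"),
]

def int_to_roman(number):
    """Convert a positive integer to a Roman numeral string."""
    result = ""
    for value, symbol in NUMERALS:
        while number >= value:
            result += symbol
            number -= value
    return result

print(int_to_roman(2024))  # MMXXIV
```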
What is the difference between a production database and a data warehouse? How does that differ from a data lake? Why would I use one over the other? With the volume of data being generated today, there are more and more use cases for data storage. This article...