I've been using Google Cloud services over the last year and I thoroughly enjoy the range they offer. In this walkthrough, I'll guide you through the process of unzipping files uploaded to Google Cloud Storage buckets. But first, let me give yo...
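To give a flavour of where the walkthrough is headed, here's a minimal sketch of downloading a zipped object and writing its contents back to the same bucket. It assumes the google-cloud-storage client library is installed and credentials are configured; the bucket name, object name and `unzipped/` prefix are placeholders, not necessarily the exact approach the post takes.

```python
# A minimal sketch, assuming the google-cloud-storage client library is installed
# and credentials are configured; bucket and object names are placeholders.
import io
import zipfile

from google.cloud import storage


def unzip_blob(bucket_name: str, zip_blob_name: str) -> None:
    """Download a zipped blob and re-upload its members as individual objects."""
    client = storage.Client()
    bucket = client.bucket(bucket_name)

    # Pull the zip archive into memory.
    zip_bytes = bucket.blob(zip_blob_name).download_as_bytes()

    with zipfile.ZipFile(io.BytesIO(zip_bytes)) as archive:
        for member in archive.namelist():
            # Skip directory entries; upload each file under an "unzipped/" prefix.
            if member.endswith("/"):
                continue
            bucket.blob(f"unzipped/{member}").upload_from_string(archive.read(member))


# unzip_blob("my-bucket", "uploads/archive.zip")
```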
In the last post of the Django multi-tenant series, we set up a Django project from scratch. In this part, we'll set up the authentication system using JWTs and connect the API to a Vue...
Over the course of the web scraping series, we've covered a number of approaches to scraping the web. These approaches typically involved writing Python code using the various libraries the language has to offer, such as BeautifulSoup, Selenium an...
This post is a continuation of the Django multi-tenancy series. In this part, we'll begin to implement the backend of the multi-tenant Django app. We'll be setting up a Django project fr...
I've been developing software using Django for a while now, and over the years there are a couple of resources and libraries that I've come to depend on. They're tried and tested and can be easily extended to suit my needs.
I did another post...
When developing Django projects, there comes a need to write one-off scripts that automate a particular task. Before we continue with the implementation, here are some use cases where I've found myself reaching for one.
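As a flavour of what such a script can look like, here's a minimal sketch of a one-off task written as a custom Django management command. The app layout, command name and the "stale users" task are hypothetical placeholders, not the exact use cases covered in the post.

```python
# myapp/management/commands/deactivate_stale_users.py
# A minimal sketch of a one-off script written as a custom management command;
# the app name and the "stale users" task are placeholders for illustration.
from datetime import timedelta

from django.contrib.auth import get_user_model
from django.core.management.base import BaseCommand
from django.utils import timezone


class Command(BaseCommand):
    help = "Deactivate accounts that haven't logged in for 90 days."

    def handle(self, *args, **options):
        cutoff = timezone.now() - timedelta(days=90)
        count = (
            get_user_model()
            .objects.filter(last_login__lt=cutoff, is_active=True)
            .update(is_active=False)
        )
        self.stdout.write(self.style.SUCCESS(f"Deactivated {count} accounts."))
```

Once the file is in place, the script runs like any other management command: `python manage.py deactivate_stale_users`.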
Multi-tenancy is an architecture in which a single instance of a software applicat...
When working with the Python interactive shell, you may end up with a cluttered screen. On Windows and Linux, you can run cls or clear respectively to clear the terminal. But this doesn't seem to cut it while working in the ...
cls
clear
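One common workaround, sketched below and not necessarily the one the post settles on, is to shell out to the platform's own clear command from inside the interpreter:

```python
# A minimal sketch of one common workaround: shell out to the platform's
# clear command from inside the interactive shell.
import os


def clear():
    # os.name is "nt" on Windows, where the command is cls; elsewhere use clear.
    os.system("cls" if os.name == "nt" else "clear")
```

Calling `clear()` from the `>>>` prompt then wipes the screen.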
In the first part of the intro to Django REST framework series, we set up the project and did an overview of what the project is all ...
Hey there, I'd like to introduce you to making web APIs with the popular Python web framework, Django.
You've probably heard that the best way to learn is by doing. So in this walkthrough, we'll be building a RESTful API with Django and the...
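As a taste of what the walkthrough builds towards, here's a minimal sketch of a serializer and viewset pair, assuming Django REST framework is installed and a hypothetical Article model exists; the names are placeholders rather than the project's actual models.

```python
# A minimal sketch, assuming Django REST framework is installed and a
# hypothetical Article model exists; names here are placeholders.
from rest_framework import routers, serializers, viewsets

from myapp.models import Article  # hypothetical app and model


class ArticleSerializer(serializers.ModelSerializer):
    class Meta:
        model = Article
        fields = ["id", "title", "body", "created_at"]


class ArticleViewSet(viewsets.ModelViewSet):
    queryset = Article.objects.all()
    serializer_class = ArticleSerializer


# Wired up in urls.py via a router:
router = routers.DefaultRouter()
router.register(r"articles", ArticleViewSet)
# urlpatterns = [path("api/", include(router.urls))]
```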
In the first and second parts of this series, we introduced ourselves to web scraping and the techniques one can apply to achieve this task. We did so with the BeautifulSoup and Selenium Python libraries. Check them out if you haven't yet.
In the first part of this series, we introduced ourselves to the concept of web scraping using two Python libraries, namely requests and BeautifulSoup. The results were then stored in a JSON file. In this walkthrough, we'll t...
This is the process of extracting information from a webpage by taking advantage of patterns in the ...
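As a minimal sketch of the idea, assuming the requests and BeautifulSoup libraries; the URL and CSS selector here are placeholders for illustration.

```python
# A minimal sketch of pattern-based extraction with requests and BeautifulSoup;
# the URL and the CSS selector are placeholders for illustration.
import requests
from bs4 import BeautifulSoup

response = requests.get("https://example.com/blog")
response.raise_for_status()

soup = BeautifulSoup(response.text, "html.parser")

# Every post title on the page follows the same markup pattern,
# so a single selector pulls them all out.
titles = [tag.get_text(strip=True) for tag in soup.select("h2.post-title")]
print(titles)
```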
Hey there, I was inspired to write this post by my experience trying to move my deployments to Docker, particularly for Django applications, and not being able to find a comprehensive article that covered what I needed. Hopefully this article w...