I was bored, so I decided to learn Python. After a bit of googling I came across Udemy courses and picked their highest-rated Python course: 100 Days of Code by Angela Yu.
So, for a colossal outlay of £15.99, I started learning Python. The course was excellent and had a good chunk, maybe 20 days, focussed on designing and building websites using the Flask / Jinja and Bootstrap frameworks. After going through all this, I decided I needed a project to consolidate everything, so I wrote this website from scratch.
So, what does this website use?
The site is hosted on an E2 series Virtual Machine in the Google Cloud at their London Data Center, which forms part of Google zone europe-west2-c.
The cost is currently approx. £35 / month. See chart for a breakdown:
Cost breakdown for Sep 2023.
NB The costs in the chart are before various discounts and taxes. The discounts reduce the bill by about 25%, as the map loads seem to be free for now.
The virtual machine runs Linux (Ubuntu Server flavour) as an operating system.
This website uses NGINX as a reverse proxy and file server. A reverse proxy is the process which handles your incoming requests to the server and decides what to do with them. In most cases it will just pass your request on to Gunicorn, but for static files (e.g. CSS, JS, JPG, etc.) it also acts as a file server, serving the files directly. Downloading GPX files is handled by Gunicorn, as they are restricted (you have to be logged in). There's an excellent guide to setting this all up:
Flask / Gunicorn / Nginx.
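As a sketch of what that split looks like in an NGINX server block (the domain, paths, and port here are illustrative, not this site's actual config):

```nginx
server {
    listen 80;
    server_name example.com;            # placeholder domain

    # Static assets (CSS, JS, images) are served directly by NGINX
    location /static/ {
        root /home/user/website;        # illustrative path
    }

    # Everything else, including GPX downloads, is proxied to Gunicorn
    location / {
        proxy_pass http://127.0.0.1:8000;
        proxy_set_header Host $host;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
    }
}
```

Note that GPX downloads deliberately fall into the proxied `location /` branch, so Flask can check the login before handing the file over.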
ELSR is served using HTTP/2, a major revision of the HTTP protocol which increases the speed at which web pages are served over the internet. You can test websites for HTTP/2 compatibility using an online HTTP/2 Tester. Configuring NGINX to use HTTP/2 is absolutely trivial, e.g.
NGINX and HTTP/2.
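The trivial bit is literally one word on the TLS listener (the certificate paths below are placeholders):

```nginx
server {
    listen 443 ssl http2;   # 'http2' is the only change needed
    ssl_certificate     /etc/letsencrypt/live/example.com/fullchain.pem;  # placeholder
    ssl_certificate_key /etc/letsencrypt/live/example.com/privkey.pem;    # placeholder
}
```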
Python as the underlying back-end language. The website runs in a Virtual Python Environment on a virtual machine. That's a lot of virtualness...
JavaScript. Initially the site was mainly written in Python, using a few JavaScript frameworks, e.g. Bootstrap. However, I'm now porting more and more functionality away from the server side and Python to the client in JavaScript. This is because you get a better user experience with a more intelligent client. Take the Add Ride form: originally you filled out the form and posted it back to the server, where Python would validate the content, and if any issues were found, the form would be served back to you for correction. This is the standard HTML model. With JavaScript, however, the validation can take place in the browser, so you can't actually submit an invalid form. Form completion is also sped up, e.g. as soon as you pick a date from the calendar, the start time and location auto-complete.
This website uses Gunicorn as an application server (the thing which serves the web pages). If you want to know how to fine-tune Gunicorn threads and workers, there is a guide:
Gunicorn Concurrency.
If you're interested, this website is configured to use five worker processes.
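A Gunicorn config file is itself just Python. A minimal sketch (the bind address and settings here are illustrative, using the common "(2 x cores) + 1" rule of thumb for the worker count):

```python
# gunicorn.conf.py -- a sketch, not this site's actual config
import multiprocessing

bind = "127.0.0.1:8000"  # illustrative; NGINX proxies requests to this address
workers = multiprocessing.cpu_count() * 2 + 1  # common rule of thumb
timeout = 30  # seconds before a hung worker is recycled
```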
Flask as the micro-framework backend for the website. There is an epic free online course on building Flask websites:
Flask Mega Tutorial.
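For context, this is the "micro" in micro-framework: a complete Flask app is only a few lines. The route and message below are made up for illustration:

```python
from flask import Flask

app = Flask(__name__)

@app.route("/")
def home():
    # In the real site this would render a Jinja template instead
    return "Hello from ELSR!"

if __name__ == "__main__":
    app.run(debug=True)  # dev server only; production uses Gunicorn
```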
Jinja as the HTML templating engine. Most of the web pages you see are created on the fly by taking a pre-defined template page and the contents of a database and merging the two together. Jinja is a scripting language which enables content insertion into HTML template pages. Two very useful tips were namespaces and accessing Flask functions from within Jinja.
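The namespace tip matters because ordinary `{% set %}` variables don't survive across loop iterations, whereas a namespace object does. A small, self-contained sketch using Jinja directly from Python (the climbing numbers are invented):

```python
from jinja2 import Template

# Without namespace(), 'total' would reset on every loop iteration.
template = Template(
    "{% set ns = namespace(total=0) %}"
    "{% for climb in climbs %}"
    "{% set ns.total = ns.total + climb %}"
    "{% endfor %}"
    "Total climbing: {{ ns.total }}m"
)

result = template.render(climbs=[120, 85, 240])
print(result)  # Total climbing: 445m
```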
Responsive CSS / JavaScript styling from Bootstrap. This is how the website dynamically adapts to the screen size, e.g. for mobiles etc. If you narrow or widen your browser window and see content move around, that's Bootstrap's JavaScript running in your browser.
Fancy font icons from Font Awesome; the man on a bike comes from Person Biking. Free for up to 50,000 page impressions a month.
Website fonts from Google Fonts (totally free). This site uses Lora, Open Sans, and Roboto for the tables.
All the data is stored in a PostgreSQL database, using SQLAlchemy as the Python interface. Total overkill, but we use PostgreSQL at work, so it's simpler to maintain only one type of database.
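A minimal sketch of the SQLAlchemy pattern, using an in-memory SQLite database instead of PostgreSQL (the Cafe model and its contents are invented for illustration, not the site's actual schema):

```python
from sqlalchemy import create_engine, Column, Integer, String
from sqlalchemy.orm import declarative_base, Session

Base = declarative_base()

class Cafe(Base):
    __tablename__ = "cafes"
    id = Column(Integer, primary_key=True)
    name = Column(String, nullable=False)

# Swap the URL for a postgresql:// one in production
engine = create_engine("sqlite:///:memory:")
Base.metadata.create_all(engine)

with Session(engine) as session:
    session.add(Cafe(name="Example Cafe"))
    session.commit()
    names = [cafe.name for cafe in session.query(Cafe).all()]
```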
Web forms are handled using Flask-WTF. It makes it really trivial to build web forms, normally just a single HTML reference!
Password salting and hashing using Werkzeug Utilities. Salting makes stolen password hashes largely useless to a hacker: even if you use the same password on every website, the hashes will be completely different on each site, so a hacker can't make any use of the information.
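Werkzeug's `generate_password_hash` / `check_password_hash` do all of this for you, but the underlying idea can be sketched with just the standard library:

```python
import hashlib
import hmac
import os

def hash_password(password, salt=None):
    # A fresh random salt per user means the same password
    # hashes differently on every site (and for every user).
    salt = salt if salt is not None else os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return salt, digest

def check_password(password, salt, digest):
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    # Constant-time comparison avoids timing side channels
    return hmac.compare_digest(candidate, digest)
```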
Google Map plugins using the Google Maps API. I'll probably swap this out for OSM as it costs me £0.003 every time you see a map.
GPX file parsing is done using gpxpy. The library makes it very easy to scan the data in the file and work out how close each route passes to each cafe in the database.
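gpxpy exposes each track as a list of latitude/longitude points, so "how close does this route pass to a cafe" boils down to a great-circle distance per point. A standard-library sketch of that calculation (gpxpy's `geo` module also provides its own distance helpers; the 200 m threshold is an invented example):

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two lat/lon points, in km."""
    r = 6371.0  # mean Earth radius in km
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))

def min_distance_km(route_points, cafe_lat, cafe_lon):
    # A route "passes" a cafe if any point comes within, say, 0.2 km
    return min(haversine_km(lat, lon, cafe_lat, cafe_lon)
               for lat, lon in route_points)
```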
Strava chaingang leader board web-scraped using Beautiful Soup.
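A sketch of the scraping pattern with Beautiful Soup; the HTML below is invented as a stand-in, not Strava's actual markup:

```python
from bs4 import BeautifulSoup

# Invented stand-in for a scraped leader board page
html = """
<table id="leaderboard">
  <tr><td>Rider A</td><td>10</td></tr>
  <tr><td>Rider B</td><td>8</td></tr>
</table>
"""

soup = BeautifulSoup(html, "html.parser")
rows = [[cell.get_text() for cell in row.find_all("td")]
        for row in soup.find_all("tr")]
```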
Most of the background images are my own, although I might have stolen one or two from the internet. Uploaded photo images are often quite large, especially from phones (e.g. 3-5 MB), which makes them slow for other users to download. Uploaded images are shrunk using Python Pillow. The shrunken size is 150 kB, which is fine for use as a background image and about the same size as when you share a photo via WhatsApp.
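The shrink step with Pillow is only a few lines; the dimensions and quality setting below are illustrative, not necessarily the site's actual values:

```python
from io import BytesIO
from PIL import Image

# Stand-in for a large phone photo
photo = Image.new("RGB", (4000, 3000), "white")

# thumbnail() shrinks in place and preserves the aspect ratio
photo.thumbnail((1600, 1600))

buffer = BytesIO()
photo.save(buffer, format="JPEG", quality=80)
print(photo.size)  # (1600, 1200)
```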
Elevation graphs for each of the GPX routes are generated using Chart.js.
Sortable and searchable tables are provided by DataTables.
Two Factor Authentication (2FA) and message alert SMSes using Twilio. It costs me approx. 4p per SMS sent. All Admin accounts are protected by 2FA.
Fancy user avatars, for logged-in users, from Gravatar. Mine is "Blake et Mortimer", stolen from their Blake et Mortimer FB page.
Domain courtesy of Google Domains, a whole £10 for "elsr.co.uk". Although, recently, Squarespace has taken over Google Domains.
Site diagnostics and real-time performance data courtesy of Sentry.
Obviously I got stuck many, many times getting this all working and spent many days trawling Stack Overflow, desperately searching for answers to whatever it was I'd screwed up that particular day!
The code for this website was scanned, for free, by GitHub security scanning tools, which found quite a few vulnerabilities which hopefully I've fixed by the time you read this. The complete source code is on GitHub; you can take a look:
ELSR on Github.
As soon as you put a site on the internet, hundreds of robots start probing it, looking for known vulnerabilities in software packages. They're pretty stupid; I get dozens of attacks on Microsoft Exchange servers every day, even though I'm not hosting one. However, they now all get blocked by Fail2Ban, which looks for clients trying it on and blocks their IPs. If you want to see someone probing the site, trying to find the right webpage for admin access:
Site Probing.
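A `jail.local` entry sketching the idea (the jail name and thresholds here are illustrative, not this site's actual config; Fail2Ban ships with ready-made NGINX filters such as `nginx-botsearch`):

```ini
# /etc/fail2ban/jail.local -- illustrative sketch
[nginx-probing]
enabled  = true
port     = http,https
filter   = nginx-botsearch       # one of Fail2Ban's bundled NGINX filters
logpath  = /var/log/nginx/access.log
maxretry = 5                     # strikes before a ban
bantime  = 86400                 # ban for 24 hours (seconds)
```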
Disaster recovery. In the event someone does manage to hack the site and take it down, the server is automatically snapshotted once a week by Google and the images are stored elsewhere in the Google Cloud, so restoring the site shouldn't be too difficult! Famous last words...
Chrome. This website was developed and tested using Google's Chrome browser. Chrome is by far the best browser for developers and also the dominant browser for users, both on desktop and mobile. Its market share is quite impressive:
Source: By StatCounter - OpenOffice Calc, Public Domain
What is even more impressive is that, after being completely defeated by Google, Microsoft abandoned their own browser engine development and now re-badge Chromium, Chrome's open-source base, as 'Edge'. So when you're using Microsoft Edge, you're essentially using Chrome with a different wrapper. It's not often a tech giant gets KO'd and then throws in the towel completely.
Site Optimisation. Google's Chrome browser has an amazing built-in developer tool, called Lighthouse, which will scan any website and analyse it for quality of structure. It can't tell if the content is any good, but it can tell you if the site is well designed and implemented from an HTML point of view. It can also tell you how easy it is for search engines to understand your site when they are crawling it (discovering the content).
You can see the scores for the Home page, below, for Desktop:
And also for mobile:
Some work to be done shrinking image sizes for faster loads on low-quality mobile connections.