FastAPI Uvicorn logging in Production

Hello 🙋‍♂️,

Running a ⏩FastAPI⏩ application in production is very easy and fast, but along the way some Uvicorn logs are lost.

In this article I will discuss how to write a custom UvicornWorker and centralize your logging configuration into a single file.

To keep things as simple as possible I’ve put all my code in a single Python file.

main.py

import uvicorn
from fastapi import FastAPI, APIRouter

router = APIRouter(prefix="")


def create_app():
    fast_app = FastAPI()
    fast_app.include_router(router)
    return fast_app


@router.get("/")
def read_root():
    return {"Hello": "World"}


if __name__ == '__main__':
    app = create_app()
    uvicorn.run(app=app)

Running the code and visiting the root endpoint / at http://127.0.0.1:8000 will return a {"Hello": "World"} JSON response. 😁
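If you prefer testing from Python instead of the browser, here's a minimal sketch (assuming the requests package is installed):

import requests

# Call the root endpoint and print the JSON body.
response = requests.get("http://127.0.0.1:8000/")
print(response.json())  # {'Hello': 'World'}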

When you check the console window, the following log lines are printed:

INFO:     Started server process [10276]
INFO:     Waiting for application startup.
INFO:     Application startup complete.
INFO:     Uvicorn running on http://127.0.0.1:8000 (Press CTRL+C to quit)
INFO:     127.0.0.1:53491 - "GET / HTTP/1.1" 200 OK

Notice the Uvicorn access log "GET / HTTP/1.1" 200 OK.

According to Uvicorn’s deployment docs we should run Uvicorn in production with the following command: gunicorn -k uvicorn.workers.UvicornWorker main:create_app.

(venv2) ➜  FastAPILogging gunicorn -k uvicorn.workers.UvicornWorker main:create_app
[2021-05-17 22:10:44 +0300] [6250] [INFO] Starting gunicorn 20.1.0
[2021-05-17 22:10:44 +0300] [6250] [INFO] Listening at: http://127.0.0.1:8000 (6250)
[2021-05-17 22:10:44 +0300] [6250] [INFO] Using worker: uvicorn.workers.UvicornWorker
[2021-05-17 22:10:44 +0300] [6252] [INFO] Booting worker with pid: 6252
[2021-05-17 22:10:45 +0300] [6252] [WARNING] ASGI app factory detected. Using it, but please consider setting the --factory flag explicitly.
[2021-05-17 22:10:45 +0300] [6252] [INFO] Started server process [6252]
[2021-05-17 22:10:45 +0300] [6252] [INFO] Waiting for application startup.
[2021-05-17 22:10:45 +0300] [6252] [INFO] Application startup complete.

Now, if we visit the root endpoint, the console won’t print “GET / HTTP/1.1” 200 OK anymore. 🤦‍♂️

To fix it we need a custom UvicornWorker ⚙ and a logging configuration file 🗃.

Create a new file and name it logging.yaml, then paste the following contents in it:

version: 1
disable_existing_loggers: false

formatters:
  standard:
    format: "%(asctime)s - %(levelname)s - %(message)s"

handlers:
  console:
    class: logging.StreamHandler
    formatter: standard
    stream: ext://sys.stdout

loggers:
  uvicorn.error:
    propagate: true

root:
  level: INFO
  handlers: [console]

This file will configure our root logger and our Uvicorn logger. To read more on the topic please see Python logging configuration.
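If you want to sanity-check the file before wiring it into Uvicorn, here's a small sketch that loads it with logging.config.dictConfig (PyYAML is already in requirements.txt):

import logging
import logging.config

import yaml

# Load the YAML file and apply it; dictConfig raises if the schema is invalid.
with open("logging.yaml") as f:
    logging.config.dictConfig(yaml.safe_load(f))

logging.getLogger(__name__).info("logging configured")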

Next, we will create a custom UvicornWorker class that sets log_config to the path of our logging.yaml file, passing the logging configuration we’ve just made to Uvicorn. 🦄

I added the following code in main.py:

from uvicorn.workers import UvicornWorker


class MyUvicornWorker(UvicornWorker):
    # CONFIG_KWARGS is merged into the keyword arguments passed to uvicorn's Config.
    CONFIG_KWARGS = {
        "log_config": "/mnt/c/Users/denis/PycharmProjects/FastAPILogging/logging.yaml",
    }

▶ If we run the application with:

gunicorn -k main.MyUvicornWorker main:create_app

We should see the Uvicorn access logs printed in the console. 🦄

(venv2) ➜  FastAPILogging gunicorn -k main.MyUvicornWorker main:create_app
[2021-05-17 22:31:28 +0300] [6278] [INFO] Starting gunicorn 20.1.0
[2021-05-17 22:31:28 +0300] [6278] [INFO] Listening at: http://127.0.0.1:8000 (6278)
[2021-05-17 22:31:28 +0300] [6278] [INFO] Using worker: main.MyUvicornWorker
[2021-05-17 22:31:28 +0300] [6280] [INFO] Booting worker with pid: 6280
2021-05-17 22:31:28,185 - WARNING - ASGI app factory detected. Using it, but please consider setting the --factory flag explicitly.
2021-05-17 22:31:28,185 - INFO - Started server process [6280]
2021-05-17 22:31:28,185 - INFO - Waiting for application startup.
2021-05-17 22:31:28,185 - INFO - Application startup complete.
2021-05-17 22:31:30,129 - INFO - 127.0.0.1:54004 - "GET / HTTP/1.1" 200
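As a side note, for local development without Gunicorn you can feed the same file straight to Uvicorn; uvicorn.run accepts a dict or a .yaml/.json path for log_config (a sketch, assuming logging.yaml sits next to main.py):

if __name__ == '__main__':
    app = create_app()
    # Uvicorn loads the YAML file and applies it via dictConfig.
    uvicorn.run(app=app, log_config="logging.yaml")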

Thanks for reading! 📚

PS: If you want to support this blog, I’ve made a Udemy course on FastAPI. Purchasing it through my referral link will give me 96% of the sale amount. Thanks!

requirements.txt

click==7.1.2
fastapi==0.65.1
gunicorn==20.1.0
h11==0.12.0
httptools==0.2.0
pydantic==1.8.2
PyYAML==5.4.1
starlette==0.14.2
typing-extensions==3.10.0.0
uvicorn==0.13.4
uvloop==0.15.2

Kafka Connect GitHub source connector

Hello!

In this article we will discuss how to quickly get started with Kafka and Kafka Connect to grab all the commits from a GitHub repository.

This is a practical tutorial that saves you some time browsing Kafka’s documentation.

Environment

Kafka is a bit difficult to set up: you will need at least Kafka, ZooKeeper and Kafka Connect. To simplify the setup we’re going to use Docker and docker-compose.

Once you have Docker and docker-compose installed on your system, you can run a single command and get a Kafka environment for development.

Let’s start Kafka by using the Confluent Platform repository; make sure you have Docker, docker-compose and Git installed before proceeding.

git clone https://github.com/confluentinc/cp-all-in-one
cd cp-all-in-one/cp-all-in-one-community
docker-compose up

After the docker-compose up command finishes, Kafka and its related components should be up and running. To make sure all the components are running properly, please start a new terminal window and compare your docker ps output with mine:

CONTAINER ID        IMAGE                                         COMMAND                  CREATED             STATUS                            PORTS                                                                      NAMES
fc95d4828397        confluentinc/ksqldb-examples:5.5.1            "bash -c 'echo Waiti…"   6 minutes ago       Up 6 minutes                                                                                                 ksql-datagen
28e67d645888        confluentinc/cp-ksqldb-cli:5.5.1              "/bin/sh"                6 minutes ago       Up 6 minutes                                                                                                 ksqldb-cli
371fd742ad1f        confluentinc/cp-ksqldb-server:5.5.1           "/etc/confluent/dock…"   6 minutes ago       Up 6 minutes (healthy)            0.0.0.0:8088->8088/tcp                                                     ksqldb-server
caeb86c71308        cnfldemos/kafka-connect-datagen:0.3.2-5.5.0   "/etc/confluent/dock…"   6 minutes ago       Up 6 minutes (health: starting)   0.0.0.0:8083->8083/tcp, 9092/tcp                                           connect
c4d90361c677        confluentinc/cp-kafka-rest:5.5.1              "/etc/confluent/dock…"   6 minutes ago       Up 6 minutes                      0.0.0.0:8082->8082/tcp                                                     rest-proxy
c3752a0c1487        confluentinc/cp-schema-registry:5.5.1         "/etc/confluent/dock…"   6 minutes ago       Up 6 minutes                      0.0.0.0:8081->8081/tcp                                                     schema-registry
14bfbef71822        confluentinc/cp-kafka:5.5.1                   "/etc/confluent/dock…"   6 minutes ago       Up 6 minutes                      0.0.0.0:9092->9092/tcp, 0.0.0.0:9101->9101/tcp, 0.0.0.0:29092->29092/tcp   broker
f2ddb8971efa        confluentinc/cp-zookeeper:5.5.1               "/etc/confluent/dock…"   6 minutes ago       Up 6 minutes                      2888/tcp, 0.0.0.0:2181->2181/tcp, 3888/tcp                                 zookeeper

If it looks similar then you’re ready to start configuring Kafka Connect.

Kafka Connect

Kafka Connect is a great tool for building data pipelines. We’re going to use it to get data from Github into Kafka.

We could write a simple Python producer to do that: query GitHub’s API and produce records to Kafka using a client library. But Kafka Connect comes with additional benefits. It’s like Docker containers for producers and consumers; in fact, in Kafka Connect, producers are called source connectors and consumers are called sink connectors.
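For comparison, such a hand-rolled producer could look roughly like the sketch below (assuming the requests and kafka-python packages; the topic name github-commits is my own choice):

import json

import requests
from kafka import KafkaProducer

# Query GitHub's REST API for the latest commits of apache/kafka.
commits = requests.get("https://api.github.com/repos/apache/kafka/commits").json()

# Produce each commit as a JSON record to a Kafka topic.
producer = KafkaProducer(
    bootstrap_servers="localhost:9092",
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)
for commit in commits:
    producer.send("github-commits", commit)
producer.flush()

With Kafka Connect you get configuration, offset tracking and error handling through a REST API instead of maintaining a loop like this yourself.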

Installing the Github connector

To install the GitHub connector, which is supported by Confluent, you have to download it from here (click download zip).

After the download has finished, unzip the files.

Before installing it, if you visit http://localhost:8083/connector-plugins you will see the currently installed connector plugins; there’s no GitHub connector yet.

To install it, we need to copy its folder into the Kafka Connect container and restart it:

docker cp ./confluentinc-kafka-connect-github-1.0.1/. connect:/usr/share/java
docker restart connect

If we visit http://localhost:8083/connector-plugins again, we’ll see that the connector has been installed successfully:

{"class":"io.confluent.connect.github.GithubSourceConnector","type":"source","version":"1.0.1"},

Configuring the connector

There’s no configuration for the connector yet, so the connector isn’t running. To configure a connector you need to make a POST request to the configuration endpoint. You can have as many configurations as you like; each configuration creates a connector instance of its own, the catch being that each one needs a unique name.

To view the already configured connectors we can visit localhost:8083/connectors/. As expected, an empty JSON array should be returned.

To configure the GitHub connector, I took the sample config from its documentation and adapted it to JSON format.

I’ve also chosen to use the JsonConverter instead of the AvroConverter for simplicity, and instead of getting the stargazers I’m getting the repo’s commits.

Here’s the config:

{
  "name": "MyGithubConnector",
  "config": {
    "connector.class": "io.confluent.connect.github.GithubSourceConnector",
    "confluent.topic.bootstrap.servers": "broker:29092",
    "tasks.max": "1",
    "confluent.topic.replication.factor": "1",
    "github.service.url": "https://api.github.com",
    "github.repositories": "apache/kafka",
    "github.tables": "commits",
    "github.since": "2019-01-01",
    "github.access.token": "YOUR_PERSONAL_ACCESS_TOKEN",
    "topic.name.pattern": "github-avro-${entityName}",
    "key.converter": "org.apache.kafka.connect.json.JsonConverter",
    "value.converter": "org.apache.kafka.connect.json.JsonConverter"
  }
}

Now we need to upload the config to Kafka Connect. I saved it as a file named github-connector-config.json and used curl to upload it. If the upload is successful the server echoes back your config:

➜ cp-all-in-one-community git:(5.5.1-post) ✗ curl -X POST -H "Content-Type: application/json" --data @github-connector-config.json http://localhost:8083/connectors

{"name":"MyGithubConnector","config":{"connector.class":"io.confluent.connect.github.GithubSourceConnector","confluent.topic.bootstrap.servers":"broker:29092","tasks.max":"1","confluent.topic.replication.factor":"1","github.service.url":"https://api.github.com","github.repositories":"apache/kafka","github.tables":"commits","github.since":"2019-01-01","github.access.token":"xxx","topic.name.pattern":"github-avro-${entityName}","key.converter":"org.apache.kafka.connect.json.JsonConverter","value.converter":" org.apache.kafka.connect.json.JsonConverter","name":"MyGithubConnector"},"tasks":[],"type":"source"}%

Visiting http://localhost:8083/connectors/ now lists our GitHub connector:

["MyGithubConnector"]

If we check its status (http://localhost:8083/connectors/MyGithubConnector/status) we should see that the connector is already running:

{"name":"MyGithubConnector","connector":{"state":"RUNNING","worker_id":"connect:8083"},"tasks":[{"id":0,"state":"RUNNING","worker_id":"connect:8083"}],"type":"source"}

The connector will pull all the commits it can get. In the 5 minutes I let it run, it pulled over 2.5k GitHub commits.

I will pause it in order to preserve disk space:

curl -X PUT localhost:8083/connectors/MyGithubConnector/pause
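The Connect REST API is easy to script as well; here's a small helper sketch (assuming the requests package) for checking and pausing a connector from Python:

import requests

CONNECT_URL = "http://localhost:8083"


def connector_status(name):
    # GET /connectors/<name>/status returns the connector and task states.
    return requests.get(f"{CONNECT_URL}/connectors/{name}/status").json()


def pause_connector(name):
    # PUT /connectors/<name>/pause; Kafka Connect replies with 202 Accepted.
    requests.put(f"{CONNECT_URL}/connectors/{name}/pause").raise_for_status()


print(connector_status("MyGithubConnector"))
pause_connector("MyGithubConnector")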

Inspecting the data in Kafka

To inspect the data I use a tool called kafkacat.

Running kafkacat -C -b localhost:9092 -t github-avro-commits should show you the data collected from Github:

...
"commit":{ "author":{ "name":"hejiefang", "email":"xxxx", "date":1546437651000 }, "committer":{ "name":"Manikumar Reddy", "email":"xxxx", "date":1546437651000 }, "message":"MINOR: Fix doc format in upgrade notes\n\nAuthor: hejiefang <he.jiefang@zte.com.cn>\n\nReviewers: Srinivas <xxxx>, Manikumar Reddy <xxxx>\n\nCloses #6076 from hejiefang/modifynotable", "tree":{ "sha":"91db6646f829d6636011d09fdc8643e36280716b", "url":"https://api.github.com/repos/apache/kafka/git/trees/91db6646f829d6636011d09fdc8643e36280716b" }, "url":"https://api.github.com/repos/apache/kafka/git/commits/ffd6f2a2e8a573695d0c1c98e663f0b8198b1b6d", "comment_count":0, "verification":{ "verified":false, "reason":"unsigned", "signature":null, "payload":null } },
...
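If you'd rather inspect the topic from Python, here is a minimal consumer sketch (assuming the kafka-python package):

import json

from kafka import KafkaConsumer

# Read the commits topic from the beginning and print each record.
consumer = KafkaConsumer(
    "github-avro-commits",
    bootstrap_servers="localhost:9092",
    auto_offset_reset="earliest",
    value_deserializer=lambda v: json.loads(v.decode("utf-8")),
)
for message in consumer:
    print(message.value)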

Summary

We’ve successfully set up a simple Kafka playground with docker-compose and installed a SOURCE connector for GitHub. It was fairly easy, and Kafka Connect’s REST API is quite powerful and user friendly.

In the next article we will take the commits data and index it in Elasticsearch using a SINK connector.

Thanks for reading!

Bonus

You can debug the connector by modifying the logging level:

# Set the connector's logger to TRACE
curl -s -X PUT -H "Content-Type:application/json" \
http://localhost:8083/admin/loggers/io.confluent.connect.github.GithubSourceConnector \
-d '{"level": "TRACE"}' \
| jq '.'

Firebase REST Authentication

Hello,

In this article I will show you how to authenticate to Firebase and secure the databases with some simple security rules. I’m using this setup for my NucuCar project.

Authentication

The first step is to enable the Email/Password sign-in method, by going to the Authentication section and clicking the Sign-In Method tab:

Next, we can add a manual user by clicking the Users tab and then Add user.

Now, to log in with our newly created user, we make a POST request with a JSON body to the following endpoint:

// POST https://identitytoolkit.googleapis.com/v1/accounts:signInWithPassword?key=WEB_API_KEY 
{
     "email": "your_user@email.com",
     "password": "your_password",
     "returnSecureToken": true
 }

Note that you need to fill in the WEB_API_KEY parameter as well! You can obtain it by clicking the settings gear next to Project Overview. The General tab should open and it should list the Web API Key.
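The same login request from Python could look like this (a sketch assuming the requests package; fill in your own Web API Key and credentials):

import requests

WEB_API_KEY = "YOUR_WEB_API_KEY"  # Project Overview -> settings gear -> General

# Sign in with email and password; the response contains idToken and refreshToken.
response = requests.post(
    f"https://identitytoolkit.googleapis.com/v1/accounts:signInWithPassword?key={WEB_API_KEY}",
    json={
        "email": "your_user@email.com",
        "password": "your_password",
        "returnSecureToken": True,
    },
)
response.raise_for_status()
tokens = response.json()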

If the login is successful you will get a response similar to the following:

{
  "kind": "identitytoolkit#VerifyPasswordResponse",
  "localId": "drH5pThXcuY1V2w2FSVwlaRZXEN2",
  "email": "xx.xx@xx.xx",
  "displayName": "",
  "idToken": "xx.xx.xx-xx-xx-xx",
  "registered": true,
  "refreshToken": "xx-xx-xx",
  "expiresIn": "3600"
}

In order to make authenticated requests, you’ll have to provide the following header:

Authorization: Bearer ${idToken}

Keep in mind that the idToken expires in 3600 seconds; you’ll need to exchange the refresh token for a new idToken once it expires:

// POST https://securetoken.googleapis.com/v1/token?key=WEB_API_KEY
{
  "grant_type": "refresh_token",
  "refresh_token": "your_refresh_token"
}
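In Python, the refresh could look like this (a sketch assuming the requests package; note that this endpoint returns snake_case fields such as id_token):

import requests

WEB_API_KEY = "YOUR_WEB_API_KEY"

# Exchange the refresh token for a fresh idToken.
response = requests.post(
    f"https://securetoken.googleapis.com/v1/token?key={WEB_API_KEY}",
    data={
        "grant_type": "refresh_token",
        "refresh_token": "your_refresh_token",
    },
)
response.raise_for_status()
new_id_token = response.json()["id_token"]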

Security Rules

To secure my database I’ve used the following security rules. They only allow creating, updating and reading documents; deleting them is forbidden.

rules_version = '2';
service cloud.firestore {
  match /databases/{database}/documents {
    match /{document=**} {
      allow delete: if false;
      allow create, update, read: if (request.auth != null && request.auth.uid != null);
    }
  }
}
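To check the rules end to end, you can make an authenticated Firestore REST call with the idToken from earlier (a sketch assuming the requests package; PROJECT_ID and the cars collection are placeholders of mine):

import requests

PROJECT_ID = "your-project-id"
ID_TOKEN = "your_id_token"

# Reading a collection is allowed by the rules; a DELETE on a document would fail.
response = requests.get(
    f"https://firestore.googleapis.com/v1/projects/{PROJECT_ID}/databases/(default)/documents/cars",
    headers={"Authorization": f"Bearer {ID_TOKEN}"},
)
print(response.status_code, response.json())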

Thanks for reading!

Simple Dark Theme Switch with Vue.JS

Hello,

In this post I’m going to show you how quickly you can add a dark theme switch to your Vue.JS application.

We’re going to start with a blank application, and then create a dark-theme CSS file which we’ll save in public/css/darktheme.css.

This is how the application looks without any CSS.

Now, we’re going to put the following code in darktheme.css:

body {
    background-color: #2c3e50;
}

h1,h2,h3,h4,h5,h6,p {
    color: #42b983;
}

We can test our theme by appending the following line to the <head> of public/index.html:

    <link rel="stylesheet" href="<%= BASE_URL %>css/darktheme.css">

Now let’s make this interactive!

In src/App.vue we’ll add a button and the following methods:

    <button @click="darkThemeSwitch">Switch Theme</button>

  methods: {
    _addDarkTheme() {
      let darkThemeLinkEl = document.createElement("link");
      darkThemeLinkEl.setAttribute("rel", "stylesheet");
      darkThemeLinkEl.setAttribute("href", "/css/darktheme.css");
      darkThemeLinkEl.setAttribute("id", "dark-theme-style");

      let docHead = document.querySelector("head");
      docHead.append(darkThemeLinkEl);
    },
    _removeDarkTheme() {
      let darkThemeLinkEl = document.querySelector("#dark-theme-style");
      let parentNode = darkThemeLinkEl.parentNode;
      parentNode.removeChild(darkThemeLinkEl);
    },
    darkThemeSwitch() {
      let darkThemeLinkEl = document.querySelector("#dark-theme-style");
      if (!darkThemeLinkEl) {
        this._addDarkTheme()
      } else {
        this._removeDarkTheme()
      }
    }
  }

Whenever we click on the Switch Theme button, the dark theme should switch back and forth.

This is a quick and dirty way to add a dark theme switch to your Vue.JS application. You can also take this further by implementing a theme service and persistence support.

Thank you for reading!