How To Perform Unit Testing in Flask

Sep 05, 2024

Introduction

Testing is fundamental to the software development process, ensuring that code behaves as expected and is free of defects. In Python, pytest is a popular testing framework that offers several advantages over the standard unittest module, a built-in testing framework that is part of the standard library. pytest features a simpler syntax, better output, powerful fixtures, and a rich plugin ecosystem. This tutorial will guide you through setting up a Flask application, integrating pytest fixtures, and writing unit tests using pytest.

Prerequisites

Before you begin, you’ll need the following:

  • A server running Ubuntu and a non-root user with sudo privileges and an active firewall. For guidance on how to set this up, please choose your distribution from this list and follow our initial server setup guide. Please make sure to work with a supported version of Ubuntu.

  • Familiarity with the Linux command line. You can visit this guide on the Linux command line primer.

  • A basic understanding of Python programming and the pytest testing framework in Python. You can refer to our tutorial on the PyTest Python Testing Framework to learn more about pytest.

  • Python 3.7 or higher installed on your Ubuntu system. To learn how to run a Python script on Ubuntu, you can refer to our tutorial on How to run a Python script on Ubuntu.

Why pytest is a Better Alternative to unittest

pytest offers several advantages over the built-in unittest framework:

  • pytest allows you to write tests with less boilerplate code, using plain assert statements instead of the more verbose methods required by unittest (see the comparison sketch after this list).

  • It provides more detailed and readable output, making it easier to identify where and why a test failed.

  • pytest fixtures allow for more flexible and reusable test setups than unittest’s setUp and tearDown methods.

  • It makes it easy to run the same test function with multiple sets of input, which is not as straightforward in unittest.

  • pytest has a rich collection of plugins that extend its functionality, from code coverage tools to parallel test execution.

  • It automatically discovers test files and functions that match its naming conventions, saving time and effort in managing test suites.
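
To make the first point concrete, here is a minimal sketch (not part of the application built later in this tutorial; the file name and multiply helper are only illustrative) that checks the same behavior twice: once as a unittest test case with an assertEqual helper, and once as a plain pytest-style function with a bare assert.

comparison_example.py

import unittest


def multiply(x, y):
    return x * y


# unittest style: a TestCase subclass and assert* helper methods
class MultiplyTestCase(unittest.TestCase):
    def test_multiply(self):
        self.assertEqual(multiply(3, 4), 12)


# pytest style: a plain function and a plain assert statement
def test_multiply():
    assert multiply(3, 4) == 12

Running pytest against a file like this collects and runs both tests, since pytest also understands unittest-style test cases.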

Given these benefits, pytest is often the preferred choice for modern Python testing. Let’s set up a Flask application and write unit tests using pytest.

Step 1 - Setting Up the Environment

Ubuntu 24.04 ships with Python 3 by default. Open the terminal and run the following command to double-check the Python 3 installation:

root@ubuntu:~# python3 --version

Output

Python 3.12.3

If Python 3 is already installed on your machine, the above command will return the current version of the Python 3 installation. In case it is not installed, you can run the following command to install Python 3:

root@ubuntu:~# sudo apt install python3

Next, you need to install the pip package installer on your system:

root@ubuntu:~# sudo apt install python3-pip

Once pip is installed, let’s install Flask.

Step 2 - Create a Flask Application

Let’s start by creating a simple Flask application. Create a new directory for your project and navigate into it:

root@ubuntu:~# mkdir flask_testing_app
root@ubuntu:~# cd flask_testing_app

Now, let’s create and activate a virtual environment (named venv here) to manage dependencies:

root@ubuntu:~# python3 -m venv venv
root@ubuntu:~# source venv/bin/activate

Install Flask using pip:

root@ubuntu:~# pip install Flask

Now, let’s create a simple Flask application. Create a new file named app.py and add the following code:

app.py

from flask import Flask, jsonify

app = Flask(__name__)

@app.route('/')
def home():
    return jsonify(message="Hello, Flask!")

@app.route('/about')
def about():
    return jsonify(message="This is the About page")

@app.route('/multiply/<int:x>/<int:y>')
def multiply(x, y):
    result = x * y
    return jsonify(result=result)

if __name__ == '__main__':
    app.run(debug=True)

This application has three routes:

  • /: Returns a simple “Hello, Flask!” message.
  • /about: Returns a simple “This is the About page” message.
  • /multiply/<int:x>/<int:y>: Multiplies two integers and returns the result.

To run the application, execute the following command:

root@ubuntu:~# python3 app.py

Output

 * Serving Flask app "app" (lazy loading)
 * Environment: production
   WARNING: This is a development server. Do not use it in a production deployment.
   Use a production WSGI server instead.
 * Debug mode: on
 * Running on http://127.0.0.1:5000/ (Press CTRL+C to quit)

From the above output you can see that the server is running on http://127.0.0.1 and listening on port 5000. Open another Ubuntu console and execute the below curl commands one by one:

  • GET: curl http://127.0.0.1:5000/
  • GET: curl http://127.0.0.1:5000/about
  • GET: curl http://127.0.0.1:5000/multiply/10/20

Let’s understand what these GET requests do:

  1. curl http://127.0.0.1:5000/: This sends a GET request to the root route (/) of our Flask application. The server responds with a JSON object containing the message “Hello, Flask!”, demonstrating the basic functionality of our home route.

  2. curl http://127.0.0.1:5000/about: This sends a GET request to the /about route. The server responds with a JSON object containing the message “This is the About page”. This shows that our about route is functioning correctly.

  3. curl http://127.0.0.1:5000/multiply/10/20: This sends a GET request to the /multiply route with two parameters: 10 and 20. The server multiplies these numbers and responds with a JSON object containing the result (200). This demonstrates that our multiply route can correctly process URL parameters and perform calculations.

These GET requests let us interact with our Flask application’s API endpoints, retrieving information or triggering actions on the server without modifying any data. They’re useful for fetching data, testing endpoint functionality, and verifying that our routes are responding as expected.

Let’s see each of these GET requests in action:

root@ubuntu:~# curl http://127.0.0.1:5000/

Output

{"message":"Hello, Flask!"} root@ubuntu:~

Output

{"message":"This is nan About page"} root@ubuntu:~

Output

{"result":200}

Step 3 - Installing pytest and Writing Your First Test

Now that you have a basic Flask application, let’s install pytest and write some unit tests.

Install pytest using pip:

root@ubuntu:~# pip install pytest

Create a tests directory to store your test files:

root@ubuntu:~# mkdir tests

Now, let’s create a new file named test_app.py inside the tests directory and add the following code:

test_app.py

import sys
import os
sys.path.append(os.path.dirname(os.path.dirname(os.path.abspath(__file__))))

from app import app
import pytest

@pytest.fixture
def client():
    """A test client for the app."""
    with app.test_client() as client:
        yield client

def test_home(client):
    """Test the home route."""
    response = client.get('/')
    assert response.status_code == 200
    assert response.json == {"message": "Hello, Flask!"}

def test_about(client):
    """Test the about route."""
    response = client.get('/about')
    assert response.status_code == 200
    assert response.json == {"message": "This is the About page"}

def test_multiply(client):
    """Test the multiply route with valid input."""
    response = client.get('/multiply/3/4')
    assert response.status_code == 200
    assert response.json == {"result": 12}

def test_multiply_invalid_input(client):
    """Test the multiply route with invalid input."""
    response = client.get('/multiply/three/four')
    assert response.status_code == 404

def test_non_existent_route(client):
    """Test for a non-existent route."""
    response = client.get('/non-existent')
    assert response.status_code == 404

Let’s break down the functions in this test file:

  1. @pytest.fixture def client(): This is a pytest fixture that creates a test client for our Flask app. It uses the app.test_client() method to create a client that can send requests to our app without running the actual server. The yield statement allows the client to be used in tests and then properly closed after each test.

  2. def test_home(client): This function tests the home route (/) of our app. It sends a GET request to the route using the test client, then asserts that the response status code is 200 (OK) and that the JSON response matches the expected message.

  3. def test_about(client): Similar to test_home, this function tests the about route (/about). It checks for a 200 status code and verifies the JSON response content.

  4. def test_multiply(client): This function tests the multiply route with valid input (/multiply/3/4). It checks that the status code is 200 and that the JSON response contains the correct result of the multiplication.

  5. def test_multiply_invalid_input(client): This function tests the multiply route with invalid input (/multiply/three/four). It checks that the status code is 404 (Not Found), which is the expected behavior when the route can’t match the string inputs to the required integer parameters.

  6. def test_non_existent_route(client): This function tests the behavior of the app when a non-existent route is accessed. It sends a GET request to /non-existent, which is not defined in our Flask app. The test asserts that the response status code is 404 (Not Found), ensuring that our app correctly handles requests to undefined routes.

These tests cover the basic functionality of our Flask app, ensuring that each route responds correctly to valid inputs and that the multiply route handles invalid inputs appropriately. By using pytest, we can easily run these tests to verify that our app is working as expected.

Step 4 - Running the Tests

To run the tests, execute the following command:

root@ubuntu:~# pytest

By default, pytest’s discovery process will recursively scan the current directory and its subdirectories for files whose names either start with “test_” or end with “_test”. Tests located in those files are then executed. You should see output similar to:

Output

platform linux -- Python 3.12.3, pytest-8.3.2, pluggy-1.5.0
rootdir: /home/user/flask_testing_app
collected 5 items

tests/test_app.py .....                                                  [100%]

======================================================= 5 passed in 0.19s ========================================================

This indicates that all tests have passed successfully.
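
If you want to run only part of the suite, pytest also accepts a file path or a specific test’s node ID on the command line. For example, either of the following (run from the project root) limits the run to one file or one test function:

root@ubuntu:~# pytest tests/test_app.py
root@ubuntu:~# pytest tests/test_app.py::test_home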

Step 5 - Using Fixtures in pytest

Fixtures are functions that are used to provide data or resources to tests. They can be used to set up and tear down test environments, load data, or perform other setup tasks. In pytest, fixtures are defined using the @pytest.fixture decorator.
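
As a quick illustration of the “provide data” side of fixtures (separate from the Flask test client used in this tutorial, with purely hypothetical names), a fixture can simply return a value that any test can receive as an argument:

import pytest

@pytest.fixture
def sample_numbers():
    # Provide reusable test data to any test that requests this fixture
    return {"x": 3, "y": 4, "expected": 12}

def test_multiplication_data(sample_numbers):
    assert sample_numbers["x"] * sample_numbers["y"] == sample_numbers["expected"]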

Here’s how to enhance the existing fixture. Update the client fixture to use setup and teardown logic:

test_app.py

@pytest.fixture
def client():
    """Set up a test client for the app with setup and teardown logic."""
    print("\nSetting up the test client")
    with app.test_client() as client:
        yield client
    print("Tearing down the test client")

def test_home(client):
    """Test the home route."""
    response = client.get('/')
    assert response.status_code == 200
    assert response.json == {"message": "Hello, Flask!"}

def test_about(client):
    """Test the about route."""
    response = client.get('/about')
    assert response.status_code == 200
    assert response.json == {"message": "This is the About page"}

def test_multiply(client):
    """Test the multiply route with valid input."""
    response = client.get('/multiply/3/4')
    assert response.status_code == 200
    assert response.json == {"result": 12}

def test_multiply_invalid_input(client):
    """Test the multiply route with invalid input."""
    response = client.get('/multiply/three/four')
    assert response.status_code == 404

def test_non_existent_route(client):
    """Test for a non-existent route."""
    response = client.get('/non-existent')
    assert response.status_code == 404

This setup adds print statements to show the setup and teardown phases in the test output. These can be replaced with actual resource management code if needed.
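
For instance, the same yield pattern can manage a real resource. The following sketch (hypothetical names, not part of this tutorial’s test file) creates a temporary file before each test that requests the fixture and removes it afterwards:

import os
import tempfile

import pytest

@pytest.fixture
def temp_config_file():
    # Setup: create a temporary file the test can use
    fd, path = tempfile.mkstemp(suffix=".cfg")
    os.write(fd, b"DEBUG=1\n")
    os.close(fd)
    yield path
    # Teardown: remove the file after the test finishes
    os.remove(path)

def test_config_file_exists(temp_config_file):
    assert os.path.exists(temp_config_file)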

Let’s try to run the tests again:

root@ubuntu:~# pytest -v -s

The -v flag increases verbosity, and the -s flag allows print statements to be displayed in the console output.

You should see the following output:

Output

platform linux -- Python 3.12.3, pytest-8.3.2, pluggy-1.5.0
rootdir: /home/user/flask_testing_app
cachedir: .pytest_cache
collected 5 items

tests/test_app.py::test_home
Setting up the test client
PASSED
Tearing down the test client

tests/test_app.py::test_about
Setting up the test client
PASSED
Tearing down the test client

tests/test_app.py::test_multiply
Setting up the test client
PASSED
Tearing down the test client

tests/test_app.py::test_multiply_invalid_input
Setting up the test client
PASSED
Tearing down the test client

tests/test_app.py::test_non_existent_route
Setting up the test client
PASSED
Tearing down the test client

============================================ 5 passed in 0.35s =============================================

Step 6 - Adding a Failure Test Case

Let’s add a failing test case to the existing test file. Modify the test_app.py file and add the function below towards the end as a failing test case for an incorrect result:

test_app.py

def test_multiply_edge_cases(client):
    """Test the multiply route with edge cases to demonstrate failing tests."""
    response = client.get('/multiply/0/5')
    assert response.status_code == 200
    assert response.json == {"result": 0}

    response = client.get('/multiply/1000000/1000000')
    assert response.status_code == 200
    assert response.json == {"result": 1000000000000}

    response = client.get('/multiply/2/3')
    assert response.status_code == 200
    assert response.json == {"result": 7}, "This test should fail to demonstrate a failing case"

Let’s break down the test_multiply_edge_cases function and explain what each part does:

  1. Test with zero: This test checks whether the multiply route correctly handles multiplication by zero. We expect the result to be 0 when multiplying any number by zero. This is an important edge case to test because some implementations might have issues with zero multiplication.

  2. Test with large numbers: This test verifies whether the multiply route can handle large numbers without overflow or precision issues. We’re multiplying one million by one million, expecting a result of one trillion. This test matters because it checks the upper limits of the function’s capability. Note that it might fail if the server’s implementation doesn’t handle large numbers properly, which could indicate a need for big-number libraries or a different data type.

  3. Intentional failing test: This test is deliberately set up to fail. It checks whether 2 * 3 equals 7, which is incorrect. It is meant to demonstrate how a failing test looks in the test output. This helps in understanding how to identify and debug failing tests, which is an essential skill in test-driven development and debugging.

By including these edge cases and an intentional failure, you’re testing not only the basic functionality of your multiply route but also its behavior under extreme conditions and its error-reporting capabilities. This approach to testing helps ensure the robustness and reliability of our application.
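
As an aside, a set of input combinations like this can also be expressed with pytest’s parametrization, which runs one test function once per input row and reports each case separately. The sketch below is illustrative and is not added to the test file used for the run that follows (the deliberately wrong expectation is kept so that one parametrized case would fail):

import pytest

@pytest.mark.parametrize("x, y, expected", [
    (0, 5, 0),                           # multiplication by zero
    (1000000, 1000000, 1000000000000),   # large numbers
    (2, 3, 7),                           # intentionally wrong, fails on purpose
])
def test_multiply_parametrized(client, x, y, expected):
    response = client.get(f'/multiply/{x}/{y}')
    assert response.status_code == 200
    assert response.json == {"result": expected}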

Let’s try to run the tests again:

root@ubuntu:~# pytest -v -s

You should see the following output:

Output

platform linux -- Python 3.12.3, pytest-8.3.2, pluggy-1.5.0
rootdir: /home/user/flask_testing_app
cachedir: .pytest_cache
collected 6 items

tests/test_app.py::test_home
Setting up the test client
PASSED
Tearing down the test client

tests/test_app.py::test_about
Setting up the test client
PASSED
Tearing down the test client

tests/test_app.py::test_multiply
Setting up the test client
PASSED
Tearing down the test client

tests/test_app.py::test_multiply_invalid_input
Setting up the test client
PASSED
Tearing down the test client

tests/test_app.py::test_non_existent_route
Setting up the test client
PASSED
Tearing down the test client

tests/test_app.py::test_multiply_edge_cases
Setting up the test client
FAILED
Tearing down the test client

================================================================= FAILURES ==================================================================
_________________________________________________________ test_multiply_edge_cases __________________________________________________________

client = <FlaskClient <Flask 'app'>>

    def test_multiply_edge_cases(client):
        """Test the multiply route with edge cases to demonstrate failing tests."""
        response = client.get('/multiply/0/5')
        assert response.status_code == 200
        assert response.json == {"result": 0}
        response = client.get('/multiply/1000000/1000000')
        assert response.status_code == 200
        assert response.json == {"result": 1000000000000}
        response = client.get('/multiply/2/3')
        assert response.status_code == 200
>       assert response.json == {"result": 7}, "This test should fail to demonstrate a failing case"
E       AssertionError: This test should fail to demonstrate a failing case
E       assert {'result': 6} == {'result': 7}
E         Differing items:
E         {'result': 6} != {'result': 7}
E         Full diff:
E           {
E         -     'result': 7,...
E
E         ...Full output truncated (4 lines hidden), use '-vv' to show

tests/test_app.py:61: AssertionError
========================================================== short test summary info ==========================================================
FAILED tests/test_app.py::test_multiply_edge_cases - AssertionError: This test should fail to demonstrate a failing case
======================================================== 1 failed, 5 passed in 0.32s ========================================================

The failure message above indicates that the test test_multiply_edge_cases in the tests/test_app.py file failed. Specifically, the last assertion in this test function caused the failure.

This intentional failure is useful for demonstrating how test failures are reported and what information is provided in the failure message. It shows the exact line where the failure occurred, the expected and actual values, and the difference between the two.

In a real-world scenario, you would fix the code to make the test pass or adjust the test if the expected result was incorrect. However, in this case, the failure is intentional for educational purposes.

Conclusion

In this tutorial, we covered how to set up unit tests for a Flask application using pytest, integrated pytest fixtures, and demonstrated what a test failure looks like. By following these steps, you can ensure your Flask applications are reliable and maintainable, minimizing bugs and enhancing code quality.

You can refer to the Flask and pytest official documentation to learn more.
