update proper locust test

dev
a2nr 2026-01-03 10:17:20 +07:00
parent f67f41af08
commit e13e9df64f
11 changed files with 284 additions and 362 deletions

1
.gitignore vendored Normal file

@@ -0,0 +1 @@
test/__pycache__/locustfile.cpython-311.pyc

154
README.md

@@ -230,6 +230,81 @@ To add new lessons:
4. Add exercises with the `---EXERCISE---` separator if needed
5. Include expected output, initial code, and solution code as appropriate
### Complete Lesson Template
Here's a complete template with all optional sections for a new lesson:
````markdown
---LESSON_INFO---
**Learning Objectives:**
- Understand the purpose of this lesson
- Learn specific concepts or skills
- Apply knowledge to practical examples

**Prerequisites:**
- Knowledge from previous lessons
- Basic understanding of related concepts
---END_LESSON_INFO---

# Lesson Title

Lesson content goes here...

## Section

More content...

---

## Tables and Other Content

| Column 1 | Column 2 | Column 3 |
|----------|----------|----------|
| Data 1   | Data 2   | Data 3   |
| Data 4   | Data 5   | Data 6   |

---EXERCISE---
# Exercise Title

Exercise instructions go here...

**Requirements:**
- Requirement 1
- Requirement 2

**Expected Output:**
```
Expected output example
```

Try writing your solution in the code editor below!

---EXPECTED_OUTPUT---
Expected output text
---END_EXPECTED_OUTPUT---

---INITIAL_CODE---
#include <stdio.h>

int main() {
    // Write your code here
    printf("Hello, World!\n");
    return 0;
}
---END_INITIAL_CODE---

---SOLUTION_CODE---
#include <stdio.h>

int main() {
    // Write your solution here
    printf("Solution output\n");
    return 0;
}
---END_SOLUTION_CODE---
````
### Markdown Features Supported
- Headers: `#`, `##`, `###`
@@ -250,6 +325,25 @@ To add new lessons:
- Use `---EXPECTED_OUTPUT---` to provide automatic feedback when students complete exercises correctly
- Use `---INITIAL_CODE---` to provide starter code
- Use `---SOLUTION_CODE---` to provide a reference solution
- Use `---LESSON_INFO---` to provide learning objectives and prerequisites in a special information card
### Updating the Home Page
After creating new lessons, update the `content/home.md` file to include links to your new lessons in the `---Available_Lessons---` section:
```markdown
---Available_Lessons---
1. [Lesson Title](lesson/filename.md)
```
### Content Organization
- Store all lesson files in the `content/` directory
- Use descriptive filenames with underscores instead of spaces (e.g., `variables_and_data_types.md`)
- Keep lesson files focused on a single topic or concept
- Use consistent formatting and structure across all lessons
- Include practical examples and exercises where appropriate
## Student Progress Tracking
@@ -405,3 +499,63 @@ In case of compilation errors:
"error": "Compilation failed"
}
```
## Load Testing with Locust
The system includes support for load testing using Locust to simulate multiple concurrent users accessing the LMS.
### Prerequisites for Load Testing
- Docker/Podman installed on the load testing machine
- Network access to the LMS server
### Running Load Tests
1. **Prepare the LMS Server**
- Deploy the LMS container on the target server
- Note the server IP address and port (default: http://<LMS_SERVER_IP>:5000)
2. **Configure Locust**
- Update the `podman-compose.locust.yml` file with the correct LMS server IP address:
```yaml
command: ["locust", "-f", "locustfile.py", "--host", "http://<LMS_SERVER_IP>:5000"]
```
3. **Run Locust Load Test**
- On the load testing machine, navigate to the project directory
- Build and start the Locust container:
```bash
podman-compose -f test/podman-compose.locust.yml up --build
```
- Access the Locust web interface at `http://localhost:8089`
- Configure the number of users, spawn rate, and other parameters
- Start the load test
4. **Alternative: Run Locust Directly**
- Install Locust on the load testing machine:
```bash
pip install -r test/requirements.txt
```
- Run Locust directly:
```bash
cd test
locust -f locustfile.py --host http://<LMS_SERVER_IP>:5000
```
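For unattended runs (for example in CI), the same test can also be driven headlessly from the command line. A possible invocation, assuming Locust is installed locally and `<LMS_SERVER_IP>` is replaced with a real address first:
```bash
# Headless run: 50 simulated students, spawned at 2/s, for 5 minutes.
# --csv results writes results_stats.csv etc. for offline analysis.
cd test
locust -f locustfile.py --host http://<LMS_SERVER_IP>:5000 \
  --users 50 --spawn-rate 2 --run-time 5m \
  --headless --csv results
```
The flag names above are those of recent Locust releases; check `locust --help` for the version pinned in `requirements.txt`.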
### Locust Test Scenarios
The included `locustfile.py` simulates the following user behaviors:
- Browsing the home page
- Viewing lessons
- Compiling C code
- Validating student tokens
- Logging in with tokens
- Tracking student progress
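The weights on the `@task` decorators in `test/locustfile.py` (3, 5, 2, 1, 1, 1) set the relative frequency of these behaviors. A quick way to see the implied request mix:

```python
# Task weights as declared with @task(...) in test/locustfile.py.
weights = {
    "view_home_page": 3,
    "view_lesson": 5,
    "compile_c_code": 2,
    "validate_token": 1,
    "login_with_token": 1,
    "track_progress": 1,
}

total = sum(weights.values())  # 13
for name, weight in weights.items():
    # Each task is picked with probability weight/total.
    print(f"{name}: ~{weight / total:.0%} of requests")
```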
### Monitoring Performance
Monitor the LMS server's resource usage (CPU, memory, disk I/O) during load testing to identify potential bottlenecks. Pay attention to:
- Response times for API requests
- Compilation performance under load
- Database performance (if one is added in the future)
- Container resource limits
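Beyond eyeballing the Locust web UI, a run started with `--csv results` leaves a `results_stats.csv` file that can be checked against thresholds automatically. A minimal sketch (the column names are assumed from recent Locust versions, and the sample numbers and thresholds are hypothetical, not measurements from this project):

```python
import csv
import io

# Illustrative rows in the shape of Locust's --csv stats output.
sample = """Type,Name,Request Count,Failure Count,Average Response Time,95%
GET,/,1200,3,180,450
GET,/lesson/introduction_to_c.md,900,1,210,520
POST,/compile,400,8,950,2100
"""

# Example thresholds; tune to your own performance targets.
MAX_FAILURE_RATE = 0.01  # < 1% failed requests
MAX_AVG_MS = 2000        # average response time under 2 s
MAX_P95_MS = 5000        # 95th percentile under 5 s

def check_stats(csv_text):
    """Return a list of human-readable threshold violations."""
    problems = []
    for row in csv.DictReader(io.StringIO(csv_text)):
        requests = int(row["Request Count"])
        failures = int(row["Failure Count"])
        if requests and failures / requests > MAX_FAILURE_RATE:
            problems.append(f"{row['Name']}: failure rate {failures / requests:.1%}")
        if float(row["Average Response Time"]) > MAX_AVG_MS:
            problems.append(f"{row['Name']}: average {row['Average Response Time']} ms")
        if float(row["95%"]) > MAX_P95_MS:
            problems.append(f"{row['Name']}: p95 {row['95%']} ms")
    return problems

for problem in check_stats(sample):
    print("THRESHOLD EXCEEDED:", problem)
```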

test/Dockerfile

@@ -1,25 +1,22 @@
 FROM python:3.11-slim

-# Install necessary packages
+# Install gcc compiler for C code compilation (needed for communication with LMS)
 RUN apt-get update && \
     apt-get install -y gcc build-essential && \
     rm -rf /var/lib/apt/lists/*

-# Set working directory
-WORKDIR /app
+WORKDIR /locust

-# Copy requirements and install Python dependencies
-COPY test/requirements.txt .
+# Install dependencies
+COPY requirements.txt .
 RUN pip install --no-cache-dir -r requirements.txt

-# Copy application code
-COPY . .
+# Copy locustfile
+COPY locustfile.py .

 # Expose port for Locust web interface
 EXPOSE 8089

-# Set the working directory to test
-WORKDIR /app/test
-
-# Run Locust
-CMD ["locust", "-f", "load_test.py", "--host=http://lms-c:5000"]
+# Command to run Locust
+CMD ["locust", "-f", "locustfile.py"]


@@ -1,171 +0,0 @@
# Load Testing Documentation for C Programming Learning Management System
## Overview
This document describes how to perform load testing on the C Programming Learning Management System to ensure it can handle at least 50 concurrent students accessing the application.
## Requirements
- Python 3.7+ (for local testing)
- Locust (installed via pip)
- Podman (for containerized testing)
- The LMS application running on your local machine or a test server
## Setup
### Option 1: Local Installation
#### 1. Install Locust
```bash
pip install locust
```
Alternatively, install from the requirements file in the test directory:
```bash
cd /path/to/lms-c/test
pip install -r requirements.txt
```
#### 2. Ensure the LMS Application is Running
Start your LMS application:
```bash
cd /path/to/lms-c
./start.sh
```
Or if running directly:
```bash
python app.py
```
By default, the application runs on `http://localhost:5000`.
### Option 2: Containerized Setup with Podman
The load testing can also be run in a containerized environment using Podman. This ensures consistency across different environments and simplifies setup.
#### 1. Using the provided podman-compose file
A podman-compose file is provided in the test directory that sets up both the LMS application and the load testing service:
```bash
cd /path/to/lms-c
podman-compose -f test/podman-compose.yml up --build
```
This will start both the LMS application and the Locust load testing service. The Locust web interface will be available at `http://localhost:8089`.
#### 2. Running only the load test container against an existing LMS
If you already have the LMS running, you can run just the load test container:
```bash
cd /path/to/lms-c
podman build -f test/Dockerfile -t lms-c-load-test .
podman run -p 8089:8089 --network=host lms-c-load-test
```
Note: Using `--network=host` allows the container to access the LMS running on localhost.
## Running the Load Test
### Local Load Test
#### Basic Load Test
To run a load test simulating 50 users with a spawn rate of 2 users per second:
```bash
cd /path/to/lms-c/test
locust -f load_test.py --host=http://localhost:5000
```
Then open your browser and go to `http://localhost:8089` to configure the load test.
#### Command Line Load Test
To run the load test directly from the command line without the web interface:
```bash
locust -f load_test.py --host=http://localhost:5000 --users=50 --spawn-rate=2 --run-time=5m --headless
```
This command will:
- Simulate 50 users (`--users=50`)
- Spawn 2 users per second (`--spawn-rate=2`)
- Run for 5 minutes (`--run-time=5m`)
- Run in headless mode without the web interface (`--headless`)
### Containerized Load Test
#### Using Podman Compose
To run both the LMS and load testing services together:
```bash
cd /path/to/lms-c
podman-compose -f test/podman-compose.yml up --build
```
Then access the Locust web interface at `http://localhost:8089` to configure and start the load test.
#### Running Load Test Only
To run just the load test against an existing LMS service:
```bash
cd /path/to/lms-c
podman build -f test/Dockerfile -t lms-c-load-test .
podman run -p 8089:8089 --network=host lms-c-load-test
```
#### Command Line Load Test in Container
To run the load test directly from the command line in headless mode:
```bash
cd /path/to/lms-c
podman build -f test/Dockerfile -t lms-c-load-test .
podman run --network=host lms-c-load-test locust -f /app/test/load_test.py --host=http://localhost:5000 --users=50 --spawn-rate=2 --run-time=5m --headless
```
## Test Scenarios
The load test script simulates the following student behaviors, weighted by the `@task` values in `load_test.py` (3, 4, 2, 1, 1, 1); the percentages below are the resulting approximate shares:
1. **Viewing the homepage** (~25% of requests) - Students browsing available lessons
2. **Viewing lesson pages** (~33% of requests) - Students reading lesson content
3. **Compiling code** (~17% of requests) - Students submitting C code for compilation
4. **Logging in** (~8% of requests) - Students authenticating with tokens
5. **Validating tokens** (~8% of requests) - Students validating their access tokens
6. **Tracking progress** (~8% of requests) - Students saving their lesson progress
## Monitoring and Metrics
During the load test, Locust provides the following metrics:
- **Users**: Number of concurrent users
- **Failures**: Number of failed requests
- **Requests per second**: Average request rate
- **Average response time**: Mean response time across all requests
- **95th percentile response time**: Response time below which 95% of requests complete
- **99th percentile response time**: Response time below which 99% of requests complete
- **Total requests**: Total number of requests made during the test
## Performance Benchmarks
For the application to be considered capable of handling 50 students simultaneously, it should meet these benchmarks:
- Less than 1% failure rate
- Average response time under 2 seconds
- 95th percentile response time under 5 seconds
- No significant degradation in performance as user count increases
## Customizing the Load Test
You can modify the `load_test.py` file to adjust:
- The number of users simulated
- The distribution of different types of requests
- The wait time between requests
- The sample code used for compilation tests
- The tokens used for authentication
## Troubleshooting
### Common Issues:
1. **Connection Refused**: Ensure the LMS application is running on the specified host/port
2. **High Failure Rate**: Check application logs for errors during the load test
3. **Memory Issues**: Monitor system resources during the test
4. **GCC Compilation Errors**: The application compiles C code, which may consume resources
### Resource Monitoring:
Consider monitoring system resources (CPU, memory, disk I/O) during the test using tools like `htop`, `iostat`, or `vmstat`.
## Conclusion
This load testing setup allows you to verify that the C Programming Learning Management System can handle at least 50 concurrent students. Regular load testing should be performed, especially after major updates, to ensure the application continues to meet performance requirements.
Both local and containerized options are available to suit different deployment scenarios. The containerized approach using Podman ensures consistency across different environments and simplifies the setup process.

test/load_test.py

@@ -1,145 +0,0 @@
"""
Load testing script for C Programming Learning Management System
Using Locust to simulate 50+ concurrent students accessing the application
"""
from locust import HttpUser, TaskSet, task, between
import random
import json


class StudentBehavior(TaskSet):
    """
    Define the behavior of a student using the LMS
    """

    def on_start(self):
        """Initialize the user session - potentially with a valid token"""
        self.token = None
        self.student_name = None
        self.current_lesson = None
        self.lessons = []
        # Get available lessons
        response = self.client.get("/")
        if response.status_code == 200:
            # In a real scenario, we would parse the response to get lesson names
            # For now, we'll use a predefined list of possible lessons
            self.lessons = [
                "variables.md", "loops.md", "functions.md",
                "arrays.md", "pointers.md", "structures.md"
            ]

    @task(3)
    def view_homepage(self):
        """View the main page with all lessons"""
        self.client.get("/")

    @task(4)
    def view_lesson(self):
        """View a random lesson page"""
        if self.lessons:
            lesson = random.choice(self.lessons)
            self.client.get(f"/lesson/{lesson}")

    @task(2)
    def compile_code(self):
        """Submit code for compilation (simulating exercise completion)"""
        sample_codes = [
            '''#include <stdio.h>
int main() {
    printf("Hello, World!\\n");
    return 0;
}''',
            '''#include <stdio.h>
int main() {
    int a = 5, b = 10;
    printf("Sum: %d\\n", a + b);
    return 0;
}''',
            '''#include <stdio.h>
int main() {
    for(int i = 0; i < 5; i++) {
        printf("Count: %d\\n", i);
    }
    return 0;
}'''
        ]
        code = random.choice(sample_codes)
        response = self.client.post(
            "/compile",
            json={"code": code},
            headers={"Content-Type": "application/json"}
        )

    @task(1)
    def login(self):
        """Attempt to log in with a token"""
        # Using a random token for testing - in a real scenario,
        # we would use valid tokens from a pool
        tokens_pool = [
            "STUDENT001", "STUDENT002", "STUDENT003", "STUDENT004", "STUDENT005",
            "STUDENT006", "STUDENT007", "STUDENT008", "STUDENT009", "STUDENT010"
        ]
        token = random.choice(tokens_pool)
        response = self.client.post(
            "/login",
            json={"token": token},
            headers={"Content-Type": "application/json"}
        )
        if response.status_code == 200:
            try:
                data = response.json()
                if data.get("success"):
                    self.token = token
                    self.student_name = data.get("student_name")
            except json.JSONDecodeError:
                pass  # Handle non-JSON responses

    @task(1)
    def validate_token(self):
        """Validate a token"""
        tokens_pool = [
            "STUDENT001", "STUDENT002", "STUDENT003", "STUDENT004", "STUDENT005",
            "STUDENT006", "STUDENT007", "STUDENT008", "STUDENT009", "STUDENT010"
        ]
        token = random.choice(tokens_pool)
        response = self.client.post(
            "/validate-token",
            json={"token": token},
            headers={"Content-Type": "application/json"}
        )

    @task(1)
    def track_progress(self):
        """Track progress for a lesson (only if logged in)"""
        if self.token and self.lessons:
            lesson_name = random.choice(self.lessons).replace('.md', '')
            response = self.client.post(
                "/track-progress",
                json={
                    "token": self.token,
                    "lesson_name": lesson_name,
                    "status": "completed"
                },
                headers={"Content-Type": "application/json"}
            )


class WebsiteUser(HttpUser):
    """
    Main user class for the load test
    """
    tasks = [StudentBehavior]
    wait_time = between(1, 5)  # Wait between 1-5 seconds between requests
    # Host where the application is running
    host = "http://localhost:5000"

105
test/locustfile.py Normal file

@@ -0,0 +1,105 @@
from locust import HttpUser, TaskSet, task, between
import random
import json


class CProgrammingLMSUser(HttpUser):
    """
    Load testing for C Programming Learning Management System
    """
    wait_time = between(1, 3)

    def on_start(self):
        """
        Called when a user starts
        """
        # Get a list of available lessons to choose from
        self.lessons = []
        response = self.client.get("/")
        if response.status_code == 200:
            # In a real implementation, we would parse the response to get lesson URLs
            # For now, we'll use a predefined list of lesson paths
            self.lessons = [
                "introduction_to_c.md",
                "variables_and_data_types.md"
            ]

    @task(3)
    def view_home_page(self):
        """
        Task to view the home page
        """
        self.client.get("/")

    @task(5)
    def view_lesson(self):
        """
        Task to view a random lesson
        """
        if self.lessons:
            lesson = random.choice(self.lessons)
            self.client.get(f"/lesson/{lesson}")

    @task(2)
    def compile_c_code(self):
        """
        Task to compile C code using the API
        """
        # Sample C code for testing
        c_code = """
#include <stdio.h>

int main() {
    printf("Hello, World!\\n");
    return 0;
}
"""
        response = self.client.post(
            "/compile",
            data={"code": c_code},
            headers={"Content-Type": "application/x-www-form-urlencoded"}
        )

    @task(1)
    def validate_token(self):
        """
        Task to validate a student token
        """
        # Use a random token for testing
        token = f"token_{random.randint(1000, 9999)}"
        response = self.client.post(
            "/validate-token",
            json={"token": token}
        )

    @task(1)
    def login_with_token(self):
        """
        Task to login with a student token
        """
        # Use a random token for testing
        token = f"token_{random.randint(1000, 9999)}"
        response = self.client.post(
            "/login",
            json={"token": token}
        )

    @task(1)
    def track_progress(self):
        """
        Task to track student progress
        """
        # Use a random token and lesson for testing
        token = f"token_{random.randint(1000, 9999)}"
        lesson_name = random.choice(["introduction_to_c", "variables_and_data_types"])
        response = self.client.post(
            "/track-progress",
            json={
                "token": token,
                "lesson_name": lesson_name,
                "status": "completed"
            }
        )

test/podman-compose.locust.yml

@@ -0,0 +1,14 @@
version: '3.8'

services:
  locust:
    build: .
    ports:
      - "8089:8089"
    volumes:
      - ./test:/locust
    command: ["locust", "-f", "locustfile.py", "--host", "http://<LMS_SERVER_IP>:5000"]

networks:
  default:
    driver: bridge

test/podman-compose.yml

@@ -1,33 +0,0 @@
version: '3.8'

services:
  lms-c:
    build: .
    ports:
      - "5000:5000"
    volumes:
      - ./content:/app/content
      - ./static:/app/static
      - ./templates:/app/templates
      - ./tokens.csv:/app/tokens.csv
    environment:
      - FLASK_ENV=development
    command: python app.py

  load-test:
    build:
      context: .
      dockerfile: test/Dockerfile
    ports:
      - "8089:8089"
    depends_on:
      - lms-c
    volumes:
      - ./test:/app/test
    environment:
      - LOCUST_HOST=http://lms-c:5000
    command: >
      sh -c "sleep 10 &&
             locust -f /app/test/load_test.py
             --host=http://lms-c:5000
             --web-host=0.0.0.0"

test/requirements.txt

@@ -1 +1 @@
-locust==2.24.1
+locust==2.29.1