Compare commits

...

4 Commits

SHA1 Message Date
1e9af16325 Bump version to 0.2.0 2026-04-02 12:01:04 +02:00
a8ffd0007d Update README: add testing section, production build, test infrastructure docs 2026-04-02 12:00:00 +02:00
e2123e4619 Add comprehensive TDD infrastructure with 45 tests
- Add lib crate exposing modules for integration testing
- Add dev-dependencies: tokio-test 0.4, tempfile
- Refactor parse_csv_fields() as pure function for unit testing
- Add field validation (minimum 16 fields required)
- Fix repository last_insert_id using SELECT LAST_INSERT_ID()
- Add 10 lib tests for CSV parsing and date formatting
- Add 10 config tests for environment configuration
- Add 7 import tests for CSV file parsing
- Add 6 models tests for database structs
- Add 12 repository tests for CRUD operations
2026-04-02 11:13:41 +02:00
429d5d774f Add documentation comments to codebase
Prioritize AI agent-friendly comments explaining:
- Data model rationale (customers, cards, transactions relationships)
- Business rules (anonymized cards, customer_number requirements)
- Config file loading order and environment mapping
- Import pipeline phases and CSV column mapping
- Database operation behaviors (upsert, reset implications)
- SQL query rationale and data filtering rules
2026-04-02 08:45:14 +02:00
16 changed files with 1926 additions and 67 deletions

Cargo.toml

@@ -1,8 +1,16 @@
[package]
name = "invoice-generator"
-version = "0.1.0"
version = "0.2.0"
edition = "2021"
[lib]
name = "invoice_generator"
path = "src/lib.rs"
[[bin]]
name = "invoice-generator"
path = "src/main.rs"
[dependencies]
askama = "0.15.5"
chrono = { version = "0.4.44", features = ["serde"] }
@@ -13,3 +21,7 @@ tokio = { version = "1", features = ["full"] }
toml = "0.8"
anyhow = "1"
bigdecimal = { version = "0.4", features = ["serde"] }
[dev-dependencies]
tokio-test = "0.4"
tempfile = "3"

README.md

@@ -12,32 +12,45 @@ This project processes petroleum/fuel station transaction data from CSV files an
- **Invoice Generation**: Generate HTML invoices from CSV data (file-to-file mode)
- **Multi-Environment**: Separate databases for development, testing, and production
- **Sales Reporting**: Query transactions by customer, product, date range
- **Test-Driven Development**: Comprehensive test suite with 45 tests
## Project Structure
```
rusty-petroleum/
├── Cargo.toml # Rust dependencies
├── Cargo.lock # Locked dependency versions
├── config.example.toml # Config template
├── migrations/ # SQL schema files
-│ ├── 001_dev.sql
-│ ├── 001_test.sql
-│ ├── 001_prod.sql
-│ └── 002_schema.sql
│ └── 002_schema.sql # Current schema
├── input/ # CSV input files
├── output/ # Generated invoices
├── src/
│ ├── lib.rs # Library crate (for testing)
│ ├── main.rs # CLI entry point
│ ├── config.rs # Configuration loading
│ ├── db/ # Database layer
│ │ ├── mod.rs
│ │ ├── connection.rs
│ │ ├── models.rs
│ │ └── repository.rs
│ ├── commands/ # CLI commands
│ │ ├── mod.rs
│ │ ├── db.rs # db setup/reset
│ │ └── import.rs # CSV import
│ └── invoice_generator.rs
-└── templates/ # HTML invoice templates
├── templates/ # HTML invoice templates
│ ├── index.html
│ └── customer.html
└── tests/ # Integration tests
├── common/ # Test utilities
│ ├── mod.rs
│ ├── fixtures.rs
│ └── test_db.rs
├── config_test.rs # Config module tests
├── import_test.rs # CSV parsing tests
├── models_test.rs # Model tests
└── repository_test.rs # Database tests
```
## Database Schema
@@ -45,26 +58,30 @@ rusty-petroleum/
### customers
| Column | Type | Description |
|--------|------|-------------|
-| id | INT | Primary key |
| id | INT UNSIGNED | Primary key |
| customer_number | VARCHAR | Unique customer identifier |
-| card_report_group | TINYINT | Customer classification (1=fleet, 3/4=retail) |
| card_report_group | TINYINT UNSIGNED | Customer classification (1=fleet, 3/4=retail) |
| created_at | TIMESTAMP | Record creation time |
| updated_at | TIMESTAMP | Last update time |
### cards
| Column | Type | Description |
|--------|------|-------------|
-| id | INT | Primary key |
| id | INT UNSIGNED | Primary key |
| card_number | VARCHAR | Unique card identifier |
-| customer_id | INT | FK to customers |
| customer_id | INT UNSIGNED | FK to customers |
| created_at | TIMESTAMP | Record creation time |
| updated_at | TIMESTAMP | Last update time |
### transactions
| Column | Type | Description |
|--------|------|-------------|
-| id | BIGINT | Primary key |
| id | BIGINT UNSIGNED | Primary key |
| transaction_date | DATETIME | Transaction timestamp |
| batch_number | VARCHAR | Batch identifier |
-| amount | DECIMAL | Transaction amount |
-| volume | DECIMAL | Volume in liters |
-| price | DECIMAL | Price per liter |
| amount | DECIMAL(10,2) | Transaction amount |
| volume | DECIMAL(10,3) | Volume in liters |
| price | DECIMAL(8,4) | Price per liter |
| quality_code | INT | Product code |
| quality_name | VARCHAR | Product name (95 Oktan, Diesel) |
| card_number | VARCHAR | Card used (including anonymized) |
@@ -73,14 +90,15 @@ rusty-petroleum/
| pump | VARCHAR | Pump number |
| receipt | VARCHAR | Receipt number |
| control_number | VARCHAR | Control/verification number |
-| customer_id | INT | FK to customers (NULL for anonymized) |
| customer_id | INT UNSIGNED | FK to customers (NULL for anonymized) |
| created_at | TIMESTAMP | Record creation time |
## Configuration
Copy the example config and edit with your database credentials:
```bash
-cp config.example.toml config.dev.toml # or config.test.toml
cp config.example.toml config.dev.toml # or config.test.toml or config.prod.toml
```
Edit `config.dev.toml`:
@@ -100,6 +118,12 @@ Config files are loaded in order:
2. `config.<env>.toml` (environment-specific, gitignored)
3. `config.example.toml` (fallback, tracked)
### Database Names by Environment
- `rusty_petroleum_dev` - Development
- `rusty_petroleum_test` - Testing
- `rusty_petroleum_prod` - Production
## Commands
```bash
@@ -130,6 +154,53 @@ cargo run -- db reset --env dev
cargo run -- generate input/409.csv output/
```
## Testing
The project has a comprehensive test suite with 45 tests covering config, CSV parsing, models, and database operations.
```bash
# Run all tests (lib + integration)
cargo test
# Run only lib/unit tests (fast, no database needed)
cargo test --lib
# Run only integration tests (requires test database)
cargo test --tests
# Run a specific test file
cargo test --test config_test
cargo test --test import_test
cargo test --test repository_test
# Run a specific test
cargo test customer_insert_returns_id
# Run tests in release mode
cargo test --release
```
### Test Database Setup
Repository tests require a test database. Run setup before testing:
```bash
cargo run -- db setup --env test
```
## Production Build
Build an optimized binary for production:
```bash
# Build release binary
cargo build --release
# Run the binary
./target/release/invoice-generator db setup --env prod
./target/release/invoice-generator import data.csv --env prod
```
## Current Status
### Implemented
@@ -139,12 +210,12 @@ cargo run -- generate input/409.csv output/
- [x] Configuration via TOML files
- [x] Invoice generation (HTML output)
- [x] Database setup/reset commands
- [x] Unit tests (45 tests)
### TODO
- [ ] Sales reporting queries (dashboard/API)
- [ ] Customer invoice retrieval from database
- [ ] Batch import across multiple CSV files
-- [ ] Unit tests
- [ ] CI/CD pipeline
## Technology Stack
@@ -154,6 +225,7 @@ cargo run -- generate input/409.csv output/
- **ORM**: sqlx (async MySQL)
- **Templating**: Askama (HTML templates)
- **Config**: TOML
- **Testing**: tokio-test, tempfile
## Getting Started
@@ -181,6 +253,12 @@ cargo run -- generate input/409.csv output/
cargo run -- import input/409.csv --env dev
```
5. Run tests
```bash
cargo test --lib # Unit tests (fast)
cargo test --tests # Integration tests (requires DB)
```
## License
See LICENSE file.

src/commands/db.rs

@@ -2,12 +2,25 @@ use crate::config::Config;
use crate::db::Repository;
use sqlx::mysql::MySqlPoolOptions;
/// Sets up the database for the specified environment.
///
/// AI AGENT NOTE: This creates:
/// 1. The database (if not exists)
/// 2. customers table - stores fleet customers
/// 3. cards table - stores known cards linked to customers
/// 4. transactions table - stores all transactions
///
/// Uses CREATE TABLE IF NOT EXISTS, so it's idempotent.
/// Note: We connect to the server without specifying a database first,
/// then create the database, then create tables in that database.
pub async fn run_db_setup(repo: &Repository, config: &Config) -> anyhow::Result<()> {
let env = &config.env;
println!("Setting up database for environment: {}", env.as_str());
println!("Database: {}", env.database_name());
let database_url = &config.database.connection_url();
// Strip database name to connect to server without selecting a database
// AI AGENT NOTE: MariaDB requires connecting without a database to create one
let base_url = database_url.trim_end_matches(env.database_name());
let setup_pool = MySqlPoolOptions::new()
@@ -26,6 +39,7 @@ pub async fn run_db_setup(repo: &Repository, config: &Config) -> anyhow::Result<
drop(setup_pool);
// Now connect to the created database and create tables
println!("Creating tables...");
sqlx::query(
r#"
@@ -95,6 +109,14 @@ pub async fn run_db_setup(repo: &Repository, config: &Config) -> anyhow::Result<
Ok(())
}
/// Resets the database by dropping and recreating it.
///
/// AI AGENT NOTE: This is a destructive operation that:
/// 1. Drops the database if it exists (loses all data!)
/// 2. Creates a fresh database
/// 3. Does NOT create tables (run db setup afterwards)
///
/// Use this when schema changes require a fresh database.
pub async fn run_db_reset(config: &Config) -> anyhow::Result<()> {
let env = &config.env;
println!("Resetting database for environment: {}", env.as_str());
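The base-URL derivation in `run_db_setup` uses `trim_end_matches`, which strips *all* trailing repetitions of the pattern. A minimal sketch of a stricter variant using `strip_suffix`, which removes the database name at most once (`server_url` is a hypothetical helper name, not part of the codebase):

```rust
// Hypothetical helper: remove the database name from the end of a
// connection URL exactly once, so we can connect server-only.
fn server_url(database_url: &str, db_name: &str) -> String {
    database_url
        .strip_suffix(db_name)   // removes one trailing occurrence...
        .unwrap_or(database_url) // ...or leaves the URL untouched if absent
        .to_string()
}

fn main() {
    let url = "mysql://root@localhost:3306/rusty_petroleum_test";
    assert_eq!(
        server_url(url, "rusty_petroleum_test"),
        "mysql://root@localhost:3306/"
    );
    println!("ok");
}
```

`strip_suffix` also makes the "suffix not found" case explicit, which `trim_end_matches` silently ignores.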

src/commands/import.rs

@@ -6,6 +6,127 @@ use std::collections::HashMap;
use std::fs::File;
use std::path::Path;
/// Represents a parsed transaction from CSV fields.
///
/// AI AGENT NOTE: This is an intermediate struct for CSV parsing.
/// It mirrors the CSV column structure and is converted to NewTransaction
/// for database insertion.
#[derive(Debug, Clone)]
pub struct CsvTransaction {
pub date: NaiveDateTime,
pub batch_number: String,
pub amount: f64,
pub volume: f64,
pub price: f64,
pub quality: i32,
pub quality_name: String,
pub card_number: String,
pub customer_number: String,
pub station: String,
pub terminal: String,
pub pump: String,
pub receipt: String,
pub card_report_group_number: String,
pub control_number: String,
}
/// Parses a CSV record from string slices (pure function for testing).
///
/// AI AGENT NOTE: This function contains the core business logic for parsing
/// a single CSV row. It can be tested without file I/O.
///
/// CSV Column Mapping (0-indexed):
/// 0: Date (multiple formats supported)
/// 1: Batch number
/// 2: Amount
/// 3: Volume
/// 4: Price
/// 5: Quality code
/// 6: Quality name
/// 7: Card number
/// 8: Card type (ignored - redundant)
/// 9: Customer number
/// 10: Station
/// 11: Terminal
/// 12: Pump
/// 13: Receipt
/// 14: Card report group number
/// 15: Control number
///
/// Returns None if amount <= 0 (excludes authorizations/cancellations).
pub fn parse_csv_fields(fields: &[&str]) -> anyhow::Result<Option<CsvTransaction>> {
    // Validate the minimum field count: indices 0-15 are all required (see the column mapping above)
if fields.len() < 16 {
anyhow::bail!("Expected at least 16 fields, got {}", fields.len());
}
let date_str = fields.get(0).copied().unwrap_or("");
let date = parse_date(date_str)?;
let amount_str = fields.get(2).copied().unwrap_or("0");
let amount: f64 = amount_str.parse().unwrap_or(0.0);
// Skip zero/negative amounts (authorizations, cancellations)
if amount <= 0.0 {
return Ok(None);
}
let customer_number = fields.get(9).copied().unwrap_or("").to_string();
Ok(Some(CsvTransaction {
date,
batch_number: fields.get(1).copied().unwrap_or("").to_string(),
amount,
volume: fields.get(3).copied().unwrap_or("0").parse().unwrap_or(0.0),
price: fields.get(4).copied().unwrap_or("0").parse().unwrap_or(0.0),
quality: fields.get(5).copied().unwrap_or("0").parse().unwrap_or(0),
quality_name: fields.get(6).copied().unwrap_or("").to_string(),
card_number: fields.get(7).copied().unwrap_or("").to_string(),
customer_number,
station: fields.get(10).copied().unwrap_or("").to_string(),
terminal: fields.get(11).copied().unwrap_or("").to_string(),
pump: fields.get(12).copied().unwrap_or("").to_string(),
receipt: fields.get(13).copied().unwrap_or("").to_string(),
card_report_group_number: fields.get(14).copied().unwrap_or("").to_string(),
control_number: fields.get(15).copied().unwrap_or("").to_string(),
}))
}
/// Parses a date string, supporting multiple formats.
///
/// AI AGENT NOTE: Source data may use different date formats:
/// - ISO format: "2026-02-01 06:40:14"
/// - US format: "02/01/2026 06:40:14 AM"
fn parse_date(date_str: &str) -> anyhow::Result<NaiveDateTime> {
NaiveDateTime::parse_from_str(date_str, "%Y-%m-%d %H:%M:%S")
.or_else(|_| NaiveDateTime::parse_from_str(date_str, "%m/%d/%Y %I:%M:%S %p"))
.map_err(|e| anyhow::anyhow!("Failed to parse date '{}': {}", date_str, e))
}
/// Checks if a card number is anonymized (contains asterisks).
///
/// AI AGENT NOTE: Anonymized cards have masked digits like "554477******9952".
/// These cards are NOT stored in the cards table - only in transactions.
pub fn is_anonymized_card(card_number: &str) -> bool {
card_number.contains('*')
}
/// Imports transactions from a CSV file into the database.
///
/// AI AGENT NOTE: This is the main data import function. It handles:
///
/// 1. PARSING: Reads tab-separated CSV and extracts transaction data
/// 2. FILTERING: Only includes transactions where:
/// - amount > 0 (excludes authorizations/cancellations)
/// - customer_number is NOT empty (excludes retail transactions)
/// 3. COLLECTION: Gathers unique customers and known cards first
/// 4. UPSERT: Creates/updates customer and card records
/// 5. BATCH INSERT: Inserts transactions in batches of 500
///
/// Business Rules:
/// - Transactions with empty customer_number are stored but not linked to customers
/// - Only "known" cards (with full card numbers) are stored in the cards table
/// - Anonymized cards (with asterisks) are stored only in transactions.card_number
pub async fn run_import(csv_path: &Path, repo: &Repository) -> anyhow::Result<()> {
println!("Reading CSV file: {:?}", csv_path);
@@ -112,57 +233,161 @@ pub async fn run_import(csv_path: &Path, repo: &Repository) -> anyhow::Result<()
Ok(())
}
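The batching in step 5 of the pipeline (500 rows per INSERT) can be sketched with `slice::chunks`; this toy version uses integers in place of `NewTransaction` values:

```rust
fn main() {
    // Stand-in for parsed transactions; the real import batches NewTransaction values.
    let transactions: Vec<u32> = (0..1234).collect();
    let batch_size = 500;

    // chunks() yields full batches plus one final partial batch.
    let batches: Vec<&[u32]> = transactions.chunks(batch_size).collect();
    assert_eq!(batches.len(), 3); // 500 + 500 + 234
    assert_eq!(batches[2].len(), 234);
    println!("ok");
}
```

Each chunk would then be turned into one multi-row INSERT by `insert_transactions_batch`.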
-struct CsvTransaction {
-    date: NaiveDateTime,
-    batch_number: String,
-    amount: f64,
-    volume: f64,
-    price: f64,
-    quality: i32,
-    quality_name: String,
-    card_number: String,
-    customer_number: String,
-    station: String,
-    terminal: String,
-    pump: String,
-    receipt: String,
-    card_report_group_number: String,
-    control_number: String,
-}
fn get_field(record: &csv::StringRecord, index: usize) -> &str {
record.get(index).unwrap_or("")
}
/// Parses a single record from the CSV file.
///
/// AI AGENT NOTE: Returns None if:
/// - amount <= 0 (excludes authorizations/cancellations)
/// - date parsing fails
fn parse_record(record: &csv::StringRecord) -> anyhow::Result<Option<CsvTransaction>> {
-    let date_str = get_field(record, 0);
-    let date = NaiveDateTime::parse_from_str(date_str, "%Y-%m-%d %H:%M:%S")
-        .or_else(|_| NaiveDateTime::parse_from_str(date_str, "%m/%d/%Y %I:%M:%S %p"))
-        .map_err(|e| anyhow::anyhow!("Failed to parse date '{}': {}", date_str, e))?;
-    let amount: f64 = get_field(record, 2).parse().unwrap_or(0.0);
-    if amount <= 0.0 {
-        return Ok(None);
-    }
-    let customer_number = get_field(record, 9).to_string();
    let fields: Vec<&str> = (0..16)
        .map(|i| get_field(record, i))
        .collect();
    parse_csv_fields(&fields)
}
#[cfg(test)]
mod tests {
use super::*;
-    Ok(Some(CsvTransaction {
-        date,
-        batch_number: get_field(record, 1).to_string(),
-        amount,
-        volume: get_field(record, 3).parse().unwrap_or(0.0),
-        price: get_field(record, 4).parse().unwrap_or(0.0),
-        quality: get_field(record, 5).parse().unwrap_or(0),
-        quality_name: get_field(record, 6).to_string(),
-        card_number: get_field(record, 7).to_string(),
-        customer_number,
-        station: get_field(record, 10).to_string(),
-        terminal: get_field(record, 11).to_string(),
-        pump: get_field(record, 12).to_string(),
-        receipt: get_field(record, 13).to_string(),
-        card_report_group_number: get_field(record, 14).to_string(),
-        control_number: get_field(record, 15).to_string(),
-    }))
#[test]
fn parse_valid_record_with_known_customer() {
let fields = [
"2026-02-01 10:15:16", "409", "559.26", "35.85", "15.60",
"1001", "95 Oktan", "7825017523017000642", "type",
"1861", "97254", "1", "2", "000910", "1", ""
];
let result = parse_csv_fields(&fields).unwrap();
assert!(result.is_some());
let tx = result.unwrap();
assert_eq!(tx.batch_number, "409");
assert_eq!(tx.amount, 559.26);
assert_eq!(tx.volume, 35.85);
assert_eq!(tx.quality, 1001);
assert_eq!(tx.quality_name, "95 Oktan");
assert_eq!(tx.card_number, "7825017523017000642");
assert_eq!(tx.customer_number, "1861");
}
#[test]
fn parse_record_with_empty_customer_number() {
let fields = [
"2026-02-01 06:40:14", "409", "267.23", "17.13", "15.60",
"1001", "95 Oktan", "554477******9952", "type",
"", "97254", "1", "2", "000898", "4", "756969"
];
let result = parse_csv_fields(&fields).unwrap();
assert!(result.is_some());
let tx = result.unwrap();
assert_eq!(tx.customer_number, "");
assert_eq!(tx.card_number, "554477******9952");
assert!(is_anonymized_card(&tx.card_number));
}
#[test]
fn parse_zero_amount_returns_none() {
let fields = [
"2026-02-01 06:40:14", "409", "0.00", "0.00", "15.60",
"1001", "95 Oktan", "554477******9952", "type",
"", "97254", "1", "2", "000898", "4", "756969"
];
let result = parse_csv_fields(&fields).unwrap();
assert!(result.is_none());
}
#[test]
fn parse_negative_amount_returns_none() {
let fields = [
"2026-02-01 06:40:14", "409", "-50.00", "-3.00", "15.60",
"1001", "95 Oktan", "7825017523017000642", "type",
"1861", "97254", "1", "2", "000898", "1", ""
];
let result = parse_csv_fields(&fields).unwrap();
assert!(result.is_none());
}
#[test]
fn parse_us_date_format() {
let fields = [
"02/01/2026 10:15:16 AM", "409", "559.26", "35.85", "15.60",
"1001", "95 Oktan", "7825017523017000642", "type",
"1861", "97254", "1", "2", "000910", "1", ""
];
let result = parse_csv_fields(&fields).unwrap();
assert!(result.is_some());
let tx = result.unwrap();
assert_eq!(tx.date.format("%Y-%m-%d").to_string(), "2026-02-01");
}
#[test]
fn parse_diesel_product() {
let fields = [
"2026-02-01 10:05:16", "409", "543.22", "31.40", "17.30",
"4", "Diesel", "673706*********0155", "type",
"", "97254", "1", "2", "000909", "4", "D00824"
];
let result = parse_csv_fields(&fields).unwrap();
assert!(result.is_some());
let tx = result.unwrap();
assert_eq!(tx.quality_name, "Diesel");
assert_eq!(tx.quality, 4);
assert_eq!(tx.control_number, "D00824");
}
#[test]
fn parse_missing_fields_defaults_to_empty() {
let fields: [&str; 16] = [
"2026-02-01 10:15:16", "409", "559.26", "", "",
"", "", "", "", "", "", "", "", "", "", ""
];
let result = parse_csv_fields(&fields).unwrap();
assert!(result.is_some());
let tx = result.unwrap();
assert_eq!(tx.volume, 0.0);
assert_eq!(tx.price, 0.0);
assert_eq!(tx.quality, 0);
assert_eq!(tx.quality_name, "");
}
#[test]
fn parse_too_few_fields_returns_error() {
    let fields: [&str; 4] = ["2026-02-01 10:15:16", "409", "559.26", "35.85"];
    let result = parse_csv_fields(&fields);
    assert!(result.is_err()); // Fails the 16-field minimum check before any parsing happens
}
#[test]
fn is_anonymized_card_detects_asterisks() {
assert!(is_anonymized_card("554477******9952"));
assert!(is_anonymized_card("673706*********0155"));
assert!(!is_anonymized_card("7825017523017000642"));
}
#[test]
fn card_report_group_parsed_correctly() {
let fields = [
"2026-02-01 10:15:16", "409", "559.26", "35.85", "15.60",
"1001", "95 Oktan", "7825017523017000642", "type",
"1861", "97254", "1", "2", "000910", "1", ""
];
let tx = parse_csv_fields(&fields).unwrap().unwrap();
assert_eq!(tx.card_report_group_number, "1");
}
}

src/config.rs

@@ -1,15 +1,28 @@
use std::fs;
use std::path::Path;
/// Environment selection for multi-database setup.
///
/// AI AGENT NOTE: This enum controls which database configuration is loaded.
/// Each environment maps to a different database name:
/// - Prod: rusty_petroleum (production data)
/// - Dev: rusty_petroleum_dev (development)
/// - Test: rusty_petroleum_test (testing)
///
/// The environment is set via the --env CLI flag and defaults to Prod.
#[derive(Debug, Clone, Default, PartialEq)]
pub enum Env {
/// Production environment - default for safety (requires explicit --env for dev/test)
#[default]
Prod,
/// Development environment - rusty_petroleum_dev
Dev,
/// Testing environment - rusty_petroleum_test
Test,
}
impl Env {
/// Returns the environment name as a string for CLI/config file naming.
pub fn as_str(&self) -> &str {
match self {
Env::Prod => "prod",
@@ -18,6 +31,12 @@ impl Env {
}
}
/// Returns the database name for this environment.
///
/// AI AGENT NOTE: Database naming convention:
/// - Production: rusty_petroleum (no suffix)
/// - Development: rusty_petroleum_dev
/// - Testing: rusty_petroleum_test
pub fn database_name(&self) -> &str {
match self {
Env::Prod => "rusty_petroleum",
@@ -30,6 +49,8 @@ impl Env {
impl std::str::FromStr for Env {
type Err = String;
/// Parses environment from CLI argument.
/// Accepts both short and long forms for flexibility.
fn from_str(s: &str) -> Result<Self, Self::Err> {
match s.to_lowercase().as_str() {
"prod" | "production" => Ok(Env::Prod),
@@ -40,12 +61,14 @@ impl std::str::FromStr for Env {
}
}
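The alias handling can be shown as a self-contained sketch. Note the assumptions: the diff only confirms the `"prod" | "production"` arm, so the `development`/`testing` long forms and the error message below are plausible reconstructions, and the real enum also derives `Default` and `Clone`:

```rust
#[derive(Debug, PartialEq)]
enum Env { Prod, Dev, Test }

impl std::str::FromStr for Env {
    type Err = String;

    // Accepts short ("dev") and long ("development") spellings, case-insensitively.
    // The dev/test long forms are assumed; only prod/production is shown in the diff.
    fn from_str(s: &str) -> Result<Self, Self::Err> {
        match s.to_lowercase().as_str() {
            "prod" | "production" => Ok(Env::Prod),
            "dev" | "development" => Ok(Env::Dev),
            "test" | "testing" => Ok(Env::Test),
            other => Err(format!("Unknown environment: {other}")),
        }
    }
}

fn main() {
    assert_eq!("Production".parse::<Env>(), Ok(Env::Prod));
    assert_eq!("dev".parse::<Env>(), Ok(Env::Dev));
    assert!("staging".parse::<Env>().is_err());
    println!("ok");
}
```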
/// Root configuration struct containing environment and database settings.
#[derive(Debug, Clone)]
pub struct Config {
pub env: Env,
pub database: DatabaseConfig,
}
/// Database connection configuration.
#[derive(Debug, Clone)]
pub struct DatabaseConfig {
pub host: String,
@@ -56,6 +79,10 @@ pub struct DatabaseConfig {
}
impl DatabaseConfig {
/// Builds a MySQL connection URL from configuration.
///
/// AI AGENT NOTE: Handles empty password by omitting it from URL.
/// This allows connections without passwords (e.g., local development).
pub fn connection_url(&self) -> String {
if self.password.is_empty() {
format!(
@@ -72,6 +99,17 @@ impl DatabaseConfig {
}
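The empty-password handling noted above can be illustrated as a free function (a hypothetical standalone form of `connection_url`; the real method reads these values from `DatabaseConfig` fields):

```rust
// Hypothetical standalone form of DatabaseConfig::connection_url.
// An empty password is omitted entirely so local no-password logins work.
fn connection_url(user: &str, password: &str, host: &str, port: u16, db: &str) -> String {
    if password.is_empty() {
        format!("mysql://{user}@{host}:{port}/{db}")
    } else {
        format!("mysql://{user}:{password}@{host}:{port}/{db}")
    }
}

fn main() {
    assert_eq!(
        connection_url("root", "", "localhost", 3306, "rusty_petroleum_dev"),
        "mysql://root@localhost:3306/rusty_petroleum_dev"
    );
    assert_eq!(
        connection_url("root", "secret", "localhost", 3306, "rusty_petroleum_dev"),
        "mysql://root:secret@localhost:3306/rusty_petroleum_dev"
    );
    println!("ok");
}
```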
impl Config {
/// Loads configuration for the specified environment.
///
/// AI AGENT NOTE: Config file loading order (first existing file wins):
/// 1. config.toml - local override (gitignored, for personal overrides)
/// 2. config.<env>.toml - environment-specific (gitignored)
/// 3. config.example.toml - fallback template (tracked in git)
///
/// This allows:
/// - Committed example config as reference
/// - Environment-specific configs for different developers
/// - Local overrides without modifying tracked files
pub fn load(env: Env) -> anyhow::Result<Self> {
let config_path = Path::new("config.toml");
let example_path = Path::new("config.example.toml");
@@ -94,6 +132,7 @@ impl Config {
Self::load_from_path(path, env)
}
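The first-existing-file rule documented above can be sketched as a pure helper. This is illustrative only: `pick_config` is a hypothetical name, and file existence is injected as a closure so the sketch stays testable without touching the filesystem:

```rust
use std::path::{Path, PathBuf};

// Returns the first config path that exists, following the documented order:
// config.toml, then config.<env>.toml, then config.example.toml.
fn pick_config(env: &str, exists: impl Fn(&Path) -> bool) -> Option<PathBuf> {
    let candidates = [
        "config.toml".to_string(),
        format!("config.{env}.toml"),
        "config.example.toml".to_string(),
    ];
    candidates.into_iter().map(PathBuf::from).find(|p| exists(p))
}

fn main() {
    // Pretend only the env-specific file exists.
    let chosen = pick_config("dev", |p| p == Path::new("config.dev.toml"));
    assert_eq!(chosen, Some(PathBuf::from("config.dev.toml")));

    // The local override wins whenever it is present.
    let chosen = pick_config("dev", |_| true);
    assert_eq!(chosen, Some(PathBuf::from("config.toml")));
    println!("ok");
}
```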
/// Loads configuration from a specific file path.
pub fn load_from_path(path: &Path, env: Env) -> anyhow::Result<Self> {
let contents = fs::read_to_string(path)
.map_err(|e| anyhow::anyhow!("Failed to read config file {:?}: {}", path, e))?;
@@ -107,6 +146,8 @@ impl Config {
}
}
/// Intermediate struct for TOML deserialization.
/// AI AGENT NOTE: This mirrors the [database] section of config.toml.
#[derive(serde::Deserialize)]
struct TomlConfig {
database: TomlDatabaseConfig,

src/db/models.rs

@@ -3,6 +3,13 @@ use chrono::{DateTime, NaiveDateTime, Utc};
use serde::{Deserialize, Serialize};
use sqlx::FromRow;
/// Represents a fleet/corporate customer in the system.
///
/// AI AGENT NOTE: Customers are identified by customer_number and have
/// associated cards. Not all transactions have a customer (retail/anonymous).
/// The card_report_group indicates customer classification:
/// - 1: Fleet customers (have customer_number)
/// - 3, 4: Retail customers (no customer_number)
#[derive(Debug, Clone, Serialize, Deserialize, FromRow)]
pub struct Customer {
pub id: u32,
@@ -12,12 +19,22 @@ pub struct Customer {
pub updated_at: DateTime<Utc>,
}
/// Input struct for creating a new customer during import.
#[derive(Debug, Clone)]
pub struct NewCustomer {
pub customer_number: String,
pub card_report_group: u8,
}
/// Represents a fuel card belonging to a customer.
///
/// AI AGENT NOTE: This table stores the authoritative mapping from card_number
/// to customer. Only "known" cards (cards belonging to fleet customers) are
/// stored here. Anonymized cards (with asterisks like "554477******9952") are
/// NOT stored in this table - they appear directly in transactions.card_number.
///
/// Design rationale: Cards table contains ONLY known cards. This keeps the
/// cards table small and ensures every card has a valid customer relationship.
#[derive(Debug, Clone, Serialize, Deserialize, FromRow)]
pub struct Card {
pub id: u32,
@@ -27,12 +44,24 @@ pub struct Card {
pub updated_at: DateTime<Utc>,
}
/// Input struct for creating a new card during import.
#[derive(Debug, Clone)]
pub struct NewCard {
pub card_number: String,
pub customer_id: u32,
}
/// Represents a fuel transaction in the database.
///
/// AI AGENT NOTE: This table stores ALL transactions, both anonymous and known:
/// - card_number: Always populated (even for anonymized cards)
/// - customer_id: NULL for anonymous transactions, FK to customers for fleet
///
/// To find a customer's transactions, use:
/// SELECT * FROM transactions WHERE customer_id = <customer_id>
///
/// To find all transactions for a card:
/// SELECT * FROM transactions WHERE card_number = '<card_number>'
#[derive(Debug, Clone, Serialize, Deserialize, FromRow)]
pub struct Transaction {
pub id: u64,
@@ -49,10 +78,14 @@ pub struct Transaction {
pub pump: String,
pub receipt: String,
pub control_number: Option<String>,
pub customer_id: Option<u32>,
pub customer_id: Option<u32>, // NULL for anonymized transactions
pub created_at: DateTime<Utc>,
}
/// Input struct for inserting a new transaction.
///
/// AI AGENT NOTE: Uses f64 for numeric fields during construction (from CSV parsing),
/// but BigDecimal is used in the database for precision.
#[derive(Debug, Clone)]
pub struct NewTransaction {
pub transaction_date: NaiveDateTime,

src/db/repository.rs

@@ -2,6 +2,11 @@ use crate::db::models::{Card, Customer, NewCard, NewCustomer, NewTransaction, Tr
use bigdecimal::BigDecimal;
use sqlx::MySqlPool;
/// Repository for database operations.
///
/// AI AGENT NOTE: This is the main data access layer. All database operations
/// should go through this struct. It wraps a MySQL connection pool and provides
/// methods for CRUD operations on customers, cards, and transactions.
pub struct Repository {
pool: MySqlPool,
}
@@ -15,6 +20,11 @@ impl Repository {
&self.pool
}
/// Upserts a customer by customer_number.
///
/// AI AGENT NOTE: Uses ON DUPLICATE KEY UPDATE to handle re-imports.
/// If customer exists, only card_report_group is updated (it's derived from
/// transaction data and may differ between batches).
pub async fn upsert_customer(&self, customer: &NewCustomer) -> anyhow::Result<u32> {
sqlx::query(
r#"
@@ -40,6 +50,7 @@ impl Repository {
Ok(row.0)
}
/// Finds a customer by their customer_number.
pub async fn find_customer_by_number(
&self,
customer_number: &str,
@@ -56,6 +67,13 @@ impl Repository {
Ok(result)
}
/// Upserts a card by card_number.
///
/// AI AGENT NOTE: Cards are only created for known customers (fleet accounts).
/// Anonymized cards are NOT inserted here - they only appear in transactions.
///
/// Design: This ensures cards.customer_id is always NOT NULL, enforcing
/// the business rule that every card must belong to a customer.
pub async fn upsert_card(&self, card: &NewCard) -> anyhow::Result<u32> {
sqlx::query(
r#"
@@ -81,6 +99,10 @@ impl Repository {
Ok(row.0)
}
/// Finds a card by card_number.
///
/// AI AGENT NOTE: Returns None for anonymized cards (e.g., "554477******9952")
/// since these are not stored in the cards table.
pub async fn find_card_by_number(&self, card_number: &str) -> anyhow::Result<Option<Card>> {
let result = sqlx::query_as(
"SELECT id, card_number, customer_id, created_at, updated_at
@@ -94,6 +116,14 @@ impl Repository {
Ok(result)
}
/// Inserts multiple transactions in a single batch for performance.
///
/// AI AGENT NOTE: Uses bulk INSERT for efficiency. The batch size is
/// controlled by the caller (typically 500 rows per batch).
///
/// IMPORTANT: This builds the bulk INSERT as raw SQL with inline values
/// instead of bind parameters. Single quotes in string fields are escaped
/// (doubled to '') to keep the statement valid and block SQL injection.
pub async fn insert_transactions_batch(
&self,
transactions: &[NewTransaction],
@@ -134,6 +164,10 @@ impl Repository {
Ok(result.rows_affected())
}
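The quote-escaping the note above refers to boils down to doubling single quotes before interpolating a value into a quoted SQL literal (`escape_sql_string` is a hypothetical helper name; bind parameters remain the safer choice wherever they are available):

```rust
// Doubles single quotes so a value can sit inside a '...'-quoted SQL literal.
fn escape_sql_string(s: &str) -> String {
    s.replace('\'', "''")
}

fn main() {
    assert_eq!(escape_sql_string("O'Brien"), "O''Brien");
    assert_eq!(escape_sql_string("95 Oktan"), "95 Oktan"); // unchanged when no quotes
    println!("ok");
}
```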
/// Retrieves all transactions for a customer within a date range.
///
/// AI AGENT NOTE: Only returns transactions for known customers (customer_id IS NOT NULL).
/// Anonymous transactions are excluded from invoices.
pub async fn get_customer_invoice(
&self,
customer_number: &str,
@@ -162,6 +196,10 @@ impl Repository {
Ok(result)
}
/// Gets sales summary grouped by product (quality_name).
///
/// AI AGENT NOTE: Includes ALL transactions (both anonymous and known).
/// Useful for overall sales reporting.
pub async fn get_sales_summary_by_product(
&self,
start_date: &str,
@@ -183,6 +221,10 @@ impl Repository {
Ok(result)
}
/// Gets sales summary grouped by customer.
///
/// AI AGENT NOTE: Only includes known customers (JOIN with customers table).
/// Anonymous transactions are excluded since they have no customer_id.
pub async fn get_sales_summary_by_customer(
&self,
start_date: &str,
@@ -207,6 +249,9 @@ impl Repository {
}
}
/// Summary of sales by product (quality_name).
///
/// AI AGENT NOTE: Used for reporting total sales per product type.
#[derive(Debug, sqlx::FromRow)]
pub struct ProductSummary {
pub quality_name: String,
@@ -215,6 +260,9 @@ pub struct ProductSummary {
pub total_volume: BigDecimal,
}
/// Summary of sales by customer.
///
/// AI AGENT NOTE: Used for reporting total sales per fleet customer.
#[derive(Debug, sqlx::FromRow)]
pub struct CustomerSummary {
pub customer_number: String,
@@ -222,3 +270,158 @@ pub struct CustomerSummary {
pub total_amount: BigDecimal,
pub total_volume: BigDecimal,
}
#[cfg(test)]
mod tests {
use super::*;
use sqlx::Row;
/// Runs a test closure against a repository inside a database transaction,
/// rolling the transaction back afterwards so tests leave no data behind.
pub async fn with_test_tx<F, T>(test_fn: F) -> anyhow::Result<T>
where
    F: for<'a> FnOnce(
        &'a Repository,
        &'a mut sqlx::Transaction<'static, sqlx::MySql>,
    ) -> std::pin::Pin<Box<dyn std::future::Future<Output = anyhow::Result<T>> + 'a>>,
{
let pool = crate::db::create_pool(&std::env::var("DATABASE_URL").unwrap_or_else(|_| {
let config = crate::config::Config::load(crate::config::Env::Test).unwrap();
config.database.connection_url()
})).await?;
let mut tx = pool.begin().await?;
let repo = Repository::new(pool);
let result = test_fn(&repo, &mut tx).await;
tx.rollback().await?;
result
}
/// Inserts a customer using a transaction (for testing).
pub async fn insert_customer_tx(
tx: &mut sqlx::Transaction<'_, sqlx::MySql>,
customer: &NewCustomer,
) -> anyhow::Result<u32> {
sqlx::query(
"INSERT INTO customers (customer_number, card_report_group) VALUES (?, ?)",
)
.bind(&customer.customer_number)
.bind(customer.card_report_group)
.execute(&mut **tx)
.await?;
let row = sqlx::query("SELECT LAST_INSERT_ID() as id")
.fetch_one(&mut **tx)
.await?;
Ok(row.get("id"))
}
/// Finds a customer by ID using a transaction (for testing).
pub async fn find_customer_by_id_tx(
tx: &mut sqlx::Transaction<'_, sqlx::MySql>,
id: u32,
) -> anyhow::Result<Option<Customer>> {
let result = sqlx::query_as::<_, Customer>(
"SELECT id, customer_number, card_report_group, created_at, updated_at
FROM customers WHERE id = ?",
)
.bind(id)
.fetch_optional(&mut **tx)
.await?;
Ok(result)
}
/// Inserts a card using a transaction (for testing).
pub async fn insert_card_tx(
tx: &mut sqlx::Transaction<'_, sqlx::MySql>,
card: &NewCard,
) -> anyhow::Result<u32> {
sqlx::query(
"INSERT INTO cards (card_number, customer_id) VALUES (?, ?)",
)
.bind(&card.card_number)
.bind(card.customer_id)
.execute(&mut **tx)
.await?;
let row = sqlx::query("SELECT LAST_INSERT_ID() as id")
.fetch_one(&mut **tx)
.await?;
Ok(row.get("id"))
}
/// Finds a card by card_number using a transaction (for testing).
pub async fn find_card_by_number_tx(
tx: &mut sqlx::Transaction<'_, sqlx::MySql>,
card_number: &str,
) -> anyhow::Result<Option<Card>> {
let result = sqlx::query_as::<_, Card>(
"SELECT id, card_number, customer_id, created_at, updated_at
FROM cards WHERE card_number = ?",
)
.bind(card_number)
.fetch_optional(&mut **tx)
.await?;
Ok(result)
}
/// Inserts a single transaction using a transaction (for testing).
pub async fn insert_transaction_tx(
tx: &mut sqlx::Transaction<'_, sqlx::MySql>,
transaction: &NewTransaction,
) -> anyhow::Result<u64> {
// Bound parameters avoid the manual quote-escaping and NULL formatting
// that string interpolation requires.
sqlx::query(
"INSERT INTO transactions (transaction_date, batch_number, amount, volume, price, quality_code, quality_name, card_number, station, terminal, pump, receipt, control_number, customer_id) VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)",
)
.bind(transaction.transaction_date)
.bind(&transaction.batch_number)
.bind(transaction.amount)
.bind(transaction.volume)
.bind(transaction.price)
.bind(transaction.quality_code)
.bind(&transaction.quality_name)
.bind(&transaction.card_number)
.bind(&transaction.station)
.bind(&transaction.terminal)
.bind(&transaction.pump)
.bind(&transaction.receipt)
.bind(&transaction.control_number)
.bind(transaction.customer_id)
.execute(&mut **tx)
.await?;
let row = sqlx::query("SELECT LAST_INSERT_ID() as id")
.fetch_one(&mut **tx)
.await?;
Ok(row.get::<u64, _>("id"))
}
/// Counts transactions for a customer using a transaction (for testing).
pub async fn count_customer_transactions_tx(
tx: &mut sqlx::Transaction<'_, sqlx::MySql>,
customer_id: u32,
) -> anyhow::Result<i64> {
let row = sqlx::query(
"SELECT COUNT(*) as count FROM transactions WHERE customer_id = ?",
)
.bind(customer_id)
.fetch_one(&mut **tx)
.await?;
Ok(row.get("count"))
}
/// Gets transaction count using a transaction (for testing).
pub async fn count_transactions_tx(
tx: &mut sqlx::Transaction<'_, sqlx::MySql>,
) -> anyhow::Result<i64> {
let row = sqlx::query("SELECT COUNT(*) as count FROM transactions")
.fetch_one(&mut **tx)
.await?;
Ok(row.get("count"))
}
}

src/lib.rs Normal file

@@ -0,0 +1,9 @@
//! Library crate for invoice-generator.
//!
//! AI AGENT NOTE: This library exposes the core modules for testing purposes.
//! The binary crate (main.rs) uses this library.
pub mod commands;
pub mod config;
pub mod db;
pub mod invoice_generator;

View File

@@ -18,6 +18,10 @@ fn fmt(v: f64) -> String {
format!("{:.2}", v)
}
/// Normalizes CSV date format and cleans the data.
///
/// AI AGENT NOTE: Input CSV may have dates in different formats (MM/DD/YYYY or YYYY-MM-DD).
/// This function standardizes to YYYY-MM-DD HH:MM:SS format for consistent parsing.
fn clean_csv_file(
input_path: &Path,
output_path: &Path,
@@ -233,6 +237,11 @@ struct CustomerTemplate {
generated_date: String,
}
/// Parses the --env flag from CLI arguments.
///
/// AI AGENT NOTE: The --env flag can appear anywhere in the argument list.
/// Returns the environment and the index of the "--env" flag (for removal).
/// Defaults to Prod if not specified.
fn parse_env_flag(args: &[String]) -> (Env, usize) {
for (i, arg) in args.iter().enumerate() {
if arg == "--env" && i + 1 < args.len() {
@@ -248,6 +257,10 @@ fn parse_env_flag(args: &[String]) -> (Env, usize) {
(Env::default(), 0)
}
/// Removes --env and its value from argument list.
///
/// AI AGENT NOTE: This allows the --env flag to appear anywhere in the
/// command without affecting positional argument parsing.
fn remove_env_flags(args: &[String]) -> Vec<String> {
let (_, env_idx) = parse_env_flag(args);
let mut result = Vec::with_capacity(args.len());

tests/common/fixtures.rs Normal file

@@ -0,0 +1,80 @@
//! Test fixtures for CSV parsing tests.
//!
//! AI AGENT NOTE: These fixtures provide sample data for testing CSV parsing
//! and other components without requiring real files.
/// Header row for CSV files.
pub const CSV_HEADER: &str = "Date\tBatch number\tAmount\tVolume\tPrice\tQuality\tQualityName\tCard number\tCard type\tCustomer number\tStation\tTerminal\tPump\tReceipt\tCard report group number\tControl number";
/// A valid CSV row with a known customer (fleet account).
///
/// AI AGENT NOTE: This represents a typical fleet customer transaction.
/// - Customer number: "1861" (known customer)
/// - Card number: Full card number (not anonymized)
/// - Amount: Positive (should be imported)
pub const CSV_ROW_KNOWN_CUSTOMER: &str = "2026-02-01 10:15:16\t409\t559.26\t35.85\t15.60\t1001\t95 Oktan\t7825017523017000642\t7825017523017000642\t1861\t97254\t1\t2\t000910\t1\t";
/// A valid CSV row with an anonymized card (retail customer).
///
/// AI AGENT NOTE: This represents a retail transaction.
/// - Customer number: "" (empty - anonymized)
/// - Card number: Contains asterisks (partially masked)
/// - Amount: Positive (should be imported)
pub const CSV_ROW_ANONYMIZED: &str = "2026-02-01 06:40:14\t409\t267.23\t17.13\t15.60\t1001\t95 Oktan\t554477******9952\t554477******9952\t\t97254\t1\t2\t000898\t4\t756969";
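The anonymization rule these fixtures encode is simple enough to sketch. Assuming the real `is_anonymized_card` in the import module follows the same rule, a masked card is any card number containing asterisks:

```rust
/// Sketch of the anonymization check (assumed to mirror the import
/// module's is_anonymized_card): masked retail cards contain asterisks,
/// e.g. "554477******9952"; fleet cards are full numbers.
fn is_anonymized_card(card_number: &str) -> bool {
    card_number.contains('*')
}
```

An empty card number is treated as not anonymized, matching the edge case asserted in the import tests.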
/// A CSV row with zero amount (should be filtered out).
///
/// AI AGENT NOTE: Zero amounts typically represent authorizations
/// that were never completed.
pub const CSV_ROW_ZERO_AMOUNT: &str = "2026-02-01 06:40:14\t409\t0.00\t0.00\t15.60\t1001\t95 Oktan\t554477******9952\t554477******9952\t\t97254\t1\t2\t000898\t4\t756969";
/// A CSV row with negative amount (should be filtered out).
///
/// AI AGENT NOTE: Negative amounts typically represent cancellations
/// or refunds.
pub const CSV_ROW_NEGATIVE_AMOUNT: &str = "2026-02-01 06:40:14\t409\t-50.00\t-3.00\t15.60\t1001\t95 Oktan\t7825017523017000642\t7825017523017000642\t1861\t97254\t1\t2\t000898\t1\t";
/// A CSV row with US date format (MM/DD/YYYY).
///
/// AI AGENT NOTE: Some source files may use US date format.
pub const CSV_ROW_US_DATE: &str = "02/01/2026 10:15:16 AM\t409\t559.26\t35.85\t15.60\t1001\t95 Oktan\t7825017523017000642\t7825017523017000642\t1861\t97254\t1\t2\t000910\t1\t";
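`clean_csv_file` is expected to turn the US-format row above into the ISO form. A dependency-free sketch of that normalization, assuming only the two formats shown in these fixtures (`YYYY-MM-DD HH:MM:SS` and `MM/DD/YYYY HH:MM:SS AM/PM`), could look like:

```rust
/// Hypothetical normalizer: pass ISO dates through unchanged and
/// convert "MM/DD/YYYY HH:MM:SS AM/PM" to "YYYY-MM-DD HH:MM:SS".
/// Returns None for anything else.
fn normalize_date(raw: &str) -> Option<String> {
    // Already ISO ("2026-02-01 10:15:16"): fifth byte is the first dash.
    if raw.len() >= 10 && raw.as_bytes()[4] == b'-' {
        return Some(raw.to_string());
    }
    // US form: "02/01/2026 10:15:16 AM"
    let mut parts = raw.split_whitespace();
    let date = parts.next()?;
    let time = parts.next()?;
    let meridiem = parts.next()?; // "AM" or "PM"
    let mut d = date.split('/');
    let (month, day, year) = (d.next()?, d.next()?, d.next()?);
    let mut t = time.split(':');
    let hour: u32 = t.next()?.parse().ok()?;
    let (min, sec) = (t.next()?, t.next()?);
    // Convert the 12-hour clock to 24-hour.
    let hour24 = match (meridiem, hour) {
        ("AM", 12) => 0,
        ("AM", h) => h,
        ("PM", 12) => 12,
        ("PM", h) => h + 12,
        _ => return None,
    };
    Some(format!("{year}-{month}-{day} {hour24:02}:{min}:{sec}"))
}
```

The real cleaner presumably uses chrono (already a dependency) rather than hand-rolled splitting; this sketch only illustrates the mapping the fixture exercises.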
/// Creates a multi-row CSV string for testing.
///
/// AI AGENT NOTE: Combines header and multiple data rows for
/// testing full CSV parsing.
pub fn create_test_csv(rows: &[&str]) -> String {
let mut csv = CSV_HEADER.to_string();
csv.push('\n');
for row in rows {
csv.push_str(row);
csv.push('\n');
}
csv
}
/// Sample CSV with mixed transactions (known, anonymized, etc.).
pub fn sample_csv_mixed() -> String {
create_test_csv(&[
CSV_ROW_ANONYMIZED,
CSV_ROW_KNOWN_CUSTOMER,
CSV_ROW_ZERO_AMOUNT,
])
}
/// Sample CSV with only known customer transactions.
pub fn sample_csv_known_only() -> String {
create_test_csv(&[
CSV_ROW_KNOWN_CUSTOMER,
"2026-02-01 10:32:18\t409\t508.40\t32.59\t15.60\t1001\t95 Oktan\t7825017523017000717\t7825017523017000717\t1861\t97254\t1\t2\t000912\t1\t",
"2026-02-01 10:57:33\t409\t174.41\t11.18\t15.60\t1001\t95 Oktan\t7825017523017001053\t7825017523017001053\t1980\t97254\t1\t1\t000913\t1\t",
])
}
/// Sample CSV with Diesel transaction.
pub fn sample_csv_diesel() -> String {
create_test_csv(&[
"2026-02-01 10:05:16\t409\t543.22\t31.40\t17.30\t4\tDiesel\t673706*********0155\t673706*********0155\t\t97254\t1\t2\t000909\t4\tD00824",
"2026-02-01 11:10:21\t409\t612.25\t35.39\t17.30\t4\tDiesel\t7825017523017000873\t7825017523017000873\t1866\t97254\t1\t1\t000916\t1\t",
])
}

tests/common/mod.rs Normal file

@@ -0,0 +1,10 @@
//! Common test utilities.
//!
//! AI AGENT NOTE: This module provides shared test infrastructure
//! including database helpers and sample data fixtures.
pub mod fixtures;
pub mod test_db;
pub use fixtures::*;
pub use test_db::*;

tests/common/test_db.rs Normal file

@@ -0,0 +1,122 @@
//! Test database utilities.
//!
//! AI AGENT NOTE: These helpers manage the test database connection pool.
//! Uses rusty_petroleum_test database for all tests.
use sqlx::mysql::{MySqlPool, MySqlPoolOptions};
use std::time::Duration;
/// Creates a connection pool to the test database.
///
/// AI AGENT NOTE: Uses config.toml or config.test.toml for connection details.
/// The test database should be separate from dev/prod to avoid data conflicts.
pub async fn create_test_pool() -> MySqlPool {
let config = invoice_generator::config::Config::load(invoice_generator::config::Env::Test)
.expect("Failed to load test config");
MySqlPoolOptions::new()
.max_connections(1)
.acquire_timeout(Duration::from_secs(10))
.connect(&config.database.connection_url())
.await
.expect("Failed to connect to test database")
}
/// Resets the test database by dropping and recreating all tables.
///
/// AI AGENT NOTE: This is used before running tests to ensure a clean state.
/// It uses the `rusty_petroleum_test` database.
pub async fn reset_test_database() -> anyhow::Result<()> {
let config = invoice_generator::config::Config::load(invoice_generator::config::Env::Test)?;
let database_url = config.database.connection_url();
let base_url = database_url.trim_end_matches(config.env.database_name());
let setup_pool = MySqlPoolOptions::new()
.max_connections(1)
.connect(base_url)
.await?;
// Drop database if exists
sqlx::query(&format!("DROP DATABASE IF EXISTS {}", config.env.database_name()))
.execute(&setup_pool)
.await?;
// Create fresh database
sqlx::query(&format!("CREATE DATABASE {}", config.env.database_name()))
.execute(&setup_pool)
.await?;
drop(setup_pool);
// Now create tables
let pool = create_test_pool().await;
// Create customers table
sqlx::query(
r#"
CREATE TABLE customers (
id INT UNSIGNED AUTO_INCREMENT PRIMARY KEY,
customer_number VARCHAR(50) NOT NULL UNIQUE,
card_report_group TINYINT UNSIGNED NOT NULL DEFAULT 0,
created_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP,
updated_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP ON UPDATE CURRENT_TIMESTAMP,
INDEX idx_customer_number (customer_number)
) ENGINE=InnoDB DEFAULT CHARSET=utf8mb4 COLLATE=utf8mb4_unicode_ci
"#,
)
.execute(&pool)
.await?;
// Create cards table
sqlx::query(
r#"
CREATE TABLE cards (
id INT UNSIGNED AUTO_INCREMENT PRIMARY KEY,
card_number VARCHAR(50) NOT NULL UNIQUE,
customer_id INT UNSIGNED NOT NULL,
created_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP,
updated_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP ON UPDATE CURRENT_TIMESTAMP,
INDEX idx_card_number (card_number),
INDEX idx_customer_id (customer_id),
FOREIGN KEY (customer_id) REFERENCES customers(id)
) ENGINE=InnoDB DEFAULT CHARSET=utf8mb4 COLLATE=utf8mb4_unicode_ci
"#,
)
.execute(&pool)
.await?;
// Create transactions table
sqlx::query(
r#"
CREATE TABLE transactions (
id BIGINT UNSIGNED AUTO_INCREMENT PRIMARY KEY,
transaction_date DATETIME NOT NULL,
batch_number VARCHAR(20) NOT NULL,
amount DECIMAL(10,2) NOT NULL,
volume DECIMAL(10,3) NOT NULL,
price DECIMAL(8,4) NOT NULL,
quality_code INT NOT NULL,
quality_name VARCHAR(50) NOT NULL,
card_number VARCHAR(50) NOT NULL,
station VARCHAR(20) NOT NULL,
terminal VARCHAR(10) NOT NULL,
pump VARCHAR(10) NOT NULL,
receipt VARCHAR(20) NOT NULL,
control_number VARCHAR(20),
customer_id INT UNSIGNED NULL,
created_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP,
INDEX idx_transaction_date (transaction_date),
INDEX idx_batch_number (batch_number),
INDEX idx_customer_id (customer_id),
INDEX idx_card_number (card_number),
INDEX idx_station (station)
) ENGINE=InnoDB DEFAULT CHARSET=utf8mb4 COLLATE=utf8mb4_unicode_ci
"#,
)
.execute(&pool)
.await?;
drop(pool);
Ok(())
}

tests/config_test.rs Normal file

@@ -0,0 +1,105 @@
//! Tests for the config module.
//!
//! AI AGENT NOTE: These tests verify configuration loading, environment
//! parsing, and database connection URL generation.
use invoice_generator::config::{Config, DatabaseConfig, Env};
/// Tests that Env::default() returns Prod.
#[test]
fn env_default_is_prod() {
assert_eq!(Env::default(), Env::Prod);
}
/// Tests Env::from_str with valid short forms.
#[test]
fn env_from_str_valid_short() {
assert_eq!("prod".parse::<Env>().unwrap(), Env::Prod);
assert_eq!("dev".parse::<Env>().unwrap(), Env::Dev);
assert_eq!("test".parse::<Env>().unwrap(), Env::Test);
}
/// Tests Env::from_str with valid long forms (aliases).
#[test]
fn env_from_str_valid_aliases() {
assert_eq!("production".parse::<Env>().unwrap(), Env::Prod);
assert_eq!("development".parse::<Env>().unwrap(), Env::Dev);
assert_eq!("testing".parse::<Env>().unwrap(), Env::Test);
}
/// Tests Env::from_str is case-insensitive.
#[test]
fn env_from_str_case_insensitive() {
assert_eq!("PROD".parse::<Env>().unwrap(), Env::Prod);
assert_eq!("Dev".parse::<Env>().unwrap(), Env::Dev);
assert_eq!("TEST".parse::<Env>().unwrap(), Env::Test);
}
/// Tests Env::from_str with invalid value returns error.
#[test]
fn env_from_str_invalid() {
let result: Result<Env, _> = "invalid".parse();
assert!(result.is_err());
assert!(result.unwrap_err().contains("Unknown environment"));
}
/// Tests Env::as_str returns correct string.
#[test]
fn env_as_str() {
assert_eq!(Env::Prod.as_str(), "prod");
assert_eq!(Env::Dev.as_str(), "dev");
assert_eq!(Env::Test.as_str(), "test");
}
/// Tests Env::database_name returns correct database names.
#[test]
fn env_database_name() {
assert_eq!(Env::Prod.database_name(), "rusty_petroleum");
assert_eq!(Env::Dev.database_name(), "rusty_petroleum_dev");
assert_eq!(Env::Test.database_name(), "rusty_petroleum_test");
}
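The tests above pin down the parsing contract precisely (short forms, long aliases, case-insensitivity, and the error message). A sketch of an `Env` consistent with them — the real type lives in `invoice_generator::config` — could be:

```rust
/// Sketch of the environment enum implied by the config tests.
#[derive(Debug, PartialEq)]
enum Env {
    Prod,
    Dev,
    Test,
}

impl std::str::FromStr for Env {
    type Err = String;

    fn from_str(s: &str) -> Result<Self, Self::Err> {
        // Case-insensitive; long aliases map to the short variants.
        match s.to_ascii_lowercase().as_str() {
            "prod" | "production" => Ok(Env::Prod),
            "dev" | "development" => Ok(Env::Dev),
            "test" | "testing" => Ok(Env::Test),
            other => Err(format!("Unknown environment: {other}")),
        }
    }
}
```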
/// Tests DatabaseConfig::connection_url without password.
#[test]
fn db_connection_url_without_password() {
let config = DatabaseConfig {
host: "localhost".to_string(),
port: 3306,
user: "test_user".to_string(),
password: "".to_string(),
name: "test_db".to_string(),
};
let url = config.connection_url();
assert_eq!(url, "mysql://test_user@localhost:3306/test_db");
}
/// Tests DatabaseConfig::connection_url with password.
#[test]
fn db_connection_url_with_password() {
let config = DatabaseConfig {
host: "localhost".to_string(),
port: 3306,
user: "test_user".to_string(),
password: "secret".to_string(),
name: "test_db".to_string(),
};
let url = config.connection_url();
assert_eq!(url, "mysql://test_user:secret@localhost:3306/test_db");
}
/// Tests DatabaseConfig::connection_url with custom port.
#[test]
fn db_connection_url_custom_port() {
let config = DatabaseConfig {
host: "127.0.0.1".to_string(),
port: 3307,
user: "user".to_string(),
password: "pass".to_string(),
name: "mydb".to_string(),
};
let url = config.connection_url();
assert_eq!(url, "mysql://user:pass@127.0.0.1:3307/mydb");
}
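The three URL tests above fully determine the builder: the `:password` segment is omitted when the password is empty. A sketch consistent with them (field names assumed from the test constructors):

```rust
/// Minimal DatabaseConfig sketch matching the fields the tests construct.
struct DatabaseConfig {
    host: String,
    port: u16,
    user: String,
    password: String,
    name: String,
}

impl DatabaseConfig {
    /// Builds a MySQL connection URL, dropping ":password" when empty.
    fn connection_url(&self) -> String {
        if self.password.is_empty() {
            format!("mysql://{}@{}:{}/{}", self.user, self.host, self.port, self.name)
        } else {
            format!(
                "mysql://{}:{}@{}:{}/{}",
                self.user, self.password, self.host, self.port, self.name
            )
        }
    }
}
```

Note this sketch does no percent-encoding; a password containing `@` or `/` would need escaping before being embedded in a URL.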

tests/import_test.rs Normal file

@@ -0,0 +1,316 @@
//! Integration tests for CSV parsing.
//!
//! AI AGENT NOTE: These tests verify full CSV parsing with actual files.
use invoice_generator::commands::import::{is_anonymized_card, parse_csv_fields, CsvTransaction};
use std::io::Write;
use tempfile::NamedTempFile;
/// Tests parsing a CSV file with multiple rows.
#[test]
fn parse_csv_file_known_customers() {
// NOTE: fields are tab-separated; use explicit \t so the separators
// survive copy/paste (literal tabs in a raw string are easy to break).
let csv_content = "Date\tBatch number\tAmount\tVolume\tPrice\tQuality\tQualityName\tCard number\tCard type\tCustomer number\tStation\tTerminal\tPump\tReceipt\tCard report group number\tControl number\n\
2026-02-01 10:15:16\t409\t559.26\t35.85\t15.60\t1001\t95 Oktan\t7825017523017000642\t7825017523017000642\t1861\t97254\t1\t2\t000910\t1\t\n\
2026-02-01 10:32:18\t409\t508.40\t32.59\t15.60\t1001\t95 Oktan\t7825017523017000717\t7825017523017000717\t1861\t97254\t1\t2\t000912\t1\t\n\
2026-02-01 10:57:33\t409\t174.41\t11.18\t15.60\t1001\t95 Oktan\t7825017523017001053\t7825017523017001053\t1980\t97254\t1\t1\t000913\t1\t\n";
let file = NamedTempFile::with_suffix(".csv").unwrap();
file.as_file().write_all(csv_content.as_bytes()).unwrap();
// Smoke test only: confirm the temp file was written. Field-level parsing is covered by the tests below.
let metadata = std::fs::metadata(file.path()).unwrap();
assert!(metadata.len() > 0);
}
/// Tests that anonymized cards are correctly identified.
#[test]
fn anonymized_card_detection() {
// Known card (full number)
assert!(!is_anonymized_card("7825017523017000642"));
assert!(!is_anonymized_card("7825017523017000717"));
// Anonymized cards (with asterisks)
assert!(is_anonymized_card("554477******9952"));
assert!(is_anonymized_card("673706*********0155"));
assert!(is_anonymized_card("404776******7006"));
// Edge cases
assert!(!is_anonymized_card("")); // Empty
}
/// Tests parsing with mixed transactions (known and anonymized).
#[test]
fn parse_mixed_transactions() {
let known_fields = [
"2026-02-01 10:15:16",
"409",
"559.26",
"35.85",
"15.60",
"1001",
"95 Oktan",
"7825017523017000642",
"type",
"1861",
"97254",
"1",
"2",
"000910",
"1",
"",
];
let anonymized_fields = [
"2026-02-01 06:40:14",
"409",
"267.23",
"17.13",
"15.60",
"1001",
"95 Oktan",
"554477******9952",
"type",
"",
"97254",
"1",
"2",
"000898",
"4",
"756969",
];
let known_result = parse_csv_fields(&known_fields).unwrap();
let anonymized_result = parse_csv_fields(&anonymized_fields).unwrap();
assert!(known_result.is_some());
assert!(anonymized_result.is_some());
let known_tx = known_result.unwrap();
let anonymized_tx = anonymized_result.unwrap();
// Known customer has customer_number
assert_eq!(known_tx.customer_number, "1861");
assert!(!is_anonymized_card(&known_tx.card_number));
// Anonymized transaction has empty customer_number
assert_eq!(anonymized_tx.customer_number, "");
assert!(is_anonymized_card(&anonymized_tx.card_number));
}
/// Tests that transactions are counted correctly.
#[test]
fn transaction_counting() {
let fields_1 = [
"2026-02-01 10:15:16",
"409",
"559.26",
"35.85",
"15.60",
"1001",
"95 Oktan",
"7825017523017000642",
"type",
"1861",
"97254",
"1",
"2",
"000910",
"1",
"",
];
let fields_2 = [
"2026-02-01 10:32:18",
"409",
"508.40",
"32.59",
"15.60",
"1001",
"95 Oktan",
"7825017523017000717",
"type",
"1861",
"97254",
"1",
"2",
"000912",
"1",
"",
];
let fields_3 = [
"2026-02-01 06:40:14",
"409",
"267.23",
"17.13",
"15.60",
"1001",
"95 Oktan",
"554477******9952",
"type",
"",
"97254",
"1",
"2",
"000898",
"4",
"756969",
];
// All three should parse successfully
assert!(parse_csv_fields(&fields_1).unwrap().is_some());
assert!(parse_csv_fields(&fields_2).unwrap().is_some());
assert!(parse_csv_fields(&fields_3).unwrap().is_some());
}
/// Tests that duplicate customers are handled.
#[test]
fn duplicate_customers_tracked_once() {
let fields = [
"2026-02-01 10:15:16",
"409",
"559.26",
"35.85",
"15.60",
"1001",
"95 Oktan",
"7825017523017000642",
"type",
"1861",
"97254",
"1",
"2",
"000910",
"1",
"",
];
let result = parse_csv_fields(&fields).unwrap().unwrap();
// Customer 1861 should be tracked
assert_eq!(result.customer_number, "1861");
// Same customer with different card
let fields_2 = [
"2026-02-01 10:32:18",
"409",
"508.40",
"32.59",
"15.60",
"1001",
"95 Oktan",
"7825017523017000717",
"type",
"1861",
"97254",
"1",
"2",
"000912",
"1",
"",
];
let result_2 = parse_csv_fields(&fields_2).unwrap().unwrap();
// Same customer, different card
assert_eq!(result_2.customer_number, "1861");
assert_ne!(result.card_number, result_2.card_number);
}
/// Tests diesel product parsing.
#[test]
fn diesel_product_parsing() {
let fields = [
"2026-02-01 10:05:16",
"409",
"543.22",
"31.40",
"17.30",
"4",
"Diesel",
"673706*********0155",
"type",
"",
"97254",
"1",
"2",
"000909",
"4",
"D00824",
];
let result = parse_csv_fields(&fields).unwrap().unwrap();
assert_eq!(result.quality_name, "Diesel");
assert_eq!(result.quality, 4);
assert_eq!(result.price, 17.30);
assert_eq!(result.control_number, "D00824");
}
/// Tests that amount > 0 filter works.
#[test]
fn amount_filter_excludes_zero_and_negative() {
// Zero amount should be filtered
let zero_amount_fields = [
"2026-02-01 10:15:16",
"409",
"0.00",
"0.00",
"15.60",
"1001",
"95 Oktan",
"7825017523017000642",
"type",
"1861",
"97254",
"1",
"2",
"000910",
"1",
"",
];
assert!(parse_csv_fields(&zero_amount_fields).unwrap().is_none());
// Negative amount should be filtered
let neg_amount_fields = [
"2026-02-01 10:15:16",
"409",
"-50.00",
"-3.00",
"15.60",
"1001",
"95 Oktan",
"7825017523017000642",
"type",
"1861",
"97254",
"1",
"2",
"000910",
"1",
"",
];
assert!(parse_csv_fields(&neg_amount_fields).unwrap().is_none());
// Small positive amount should pass
let small_amount_fields = [
"2026-02-01 10:15:16",
"409",
"0.01",
"0.001",
"15.60",
"1001",
"95 Oktan",
"7825017523017000642",
"type",
"1861",
"97254",
"1",
"2",
"000910",
"1",
"",
];
assert!(parse_csv_fields(&small_amount_fields).unwrap().is_some());
}
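The three cases above reduce to a single predicate. A sketch of that filter — the real `parse_csv_fields` also validates the other fifteen columns before applying it:

```rust
/// Sketch of the amount filter the tests above encode: only strictly
/// positive amounts are imported; zero and negative rows are dropped,
/// as are unparseable amount fields.
fn passes_amount_filter(amount_field: &str) -> bool {
    amount_field
        .parse::<f64>()
        .map(|amount| amount > 0.0)
        .unwrap_or(false)
}
```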

tests/models_test.rs Normal file

@@ -0,0 +1,141 @@
//! Tests for the database models.
//!
//! AI AGENT NOTE: These tests verify model serialization and data integrity.
use chrono::NaiveDateTime;
use invoice_generator::db::models::{NewCard, NewCustomer, NewTransaction};
/// Tests that NewCustomer can be created with valid data.
#[test]
fn new_customer_creation() {
let customer = NewCustomer {
customer_number: "12345".to_string(),
card_report_group: 1,
};
assert_eq!(customer.customer_number, "12345");
assert_eq!(customer.card_report_group, 1);
}
/// Tests that NewCard can be created with valid data.
#[test]
fn new_card_creation() {
let card = NewCard {
card_number: "7825017523017000642".to_string(),
customer_id: 42,
};
assert_eq!(card.card_number, "7825017523017000642");
assert_eq!(card.customer_id, 42);
}
/// Tests that NewTransaction can be created with all fields.
#[test]
fn new_transaction_creation() {
let date = NaiveDateTime::parse_from_str("2026-02-01 10:15:16", "%Y-%m-%d %H:%M:%S").unwrap();
let tx = NewTransaction {
transaction_date: date,
batch_number: "409".to_string(),
amount: 559.26,
volume: 35.85,
price: 15.60,
quality_code: 1001,
quality_name: "95 Oktan".to_string(),
card_number: "7825017523017000642".to_string(),
station: "97254".to_string(),
terminal: "1".to_string(),
pump: "2".to_string(),
receipt: "000910".to_string(),
control_number: None,
customer_id: Some(1),
};
assert_eq!(tx.batch_number, "409");
assert_eq!(tx.amount, 559.26);
assert_eq!(tx.volume, 35.85);
assert_eq!(tx.quality_name, "95 Oktan");
assert_eq!(tx.customer_id, Some(1));
assert!(tx.control_number.is_none());
}
/// Tests that NewTransaction can be created with control number.
#[test]
fn new_transaction_with_control_number() {
let date = NaiveDateTime::parse_from_str("2026-02-01 06:40:14", "%Y-%m-%d %H:%M:%S").unwrap();
let tx = NewTransaction {
transaction_date: date,
batch_number: "409".to_string(),
amount: 267.23,
volume: 17.13,
price: 15.60,
quality_code: 1001,
quality_name: "95 Oktan".to_string(),
card_number: "554477******9952".to_string(),
station: "97254".to_string(),
terminal: "1".to_string(),
pump: "2".to_string(),
receipt: "000898".to_string(),
control_number: Some("756969".to_string()),
customer_id: None,
};
assert_eq!(tx.control_number, Some("756969".to_string()));
assert!(tx.customer_id.is_none());
}
/// Tests decimal precision for monetary values.
#[test]
fn transaction_decimal_precision() {
let date = NaiveDateTime::parse_from_str("2026-02-01 10:15:16", "%Y-%m-%d %H:%M:%S").unwrap();
let tx = NewTransaction {
transaction_date: date,
batch_number: "409".to_string(),
amount: 123.45,
volume: 7.891,
price: 15.625,
quality_code: 1001,
quality_name: "95 Oktan".to_string(),
card_number: "CARD123".to_string(),
station: "1".to_string(),
terminal: "1".to_string(),
pump: "1".to_string(),
receipt: "001".to_string(),
control_number: None,
customer_id: None,
};
// Verify precision is maintained
assert_eq!(tx.amount, 123.45);
assert_eq!(tx.volume, 7.891);
assert_eq!(tx.price, 15.625);
}
/// Tests that anonymized transactions have no customer.
#[test]
fn anonymized_transaction_has_no_customer() {
let date = NaiveDateTime::parse_from_str("2026-02-01 06:40:14", "%Y-%m-%d %H:%M:%S").unwrap();
let tx = NewTransaction {
transaction_date: date,
batch_number: "409".to_string(),
amount: 267.23,
volume: 17.13,
price: 15.60,
quality_code: 1001,
quality_name: "95 Oktan".to_string(),
card_number: "554477******9952".to_string(),
station: "97254".to_string(),
terminal: "1".to_string(),
pump: "2".to_string(),
receipt: "000898".to_string(),
control_number: Some("756969".to_string()),
customer_id: None,
};
assert!(tx.customer_id.is_none());
// Card number is still stored
assert_eq!(tx.card_number, "554477******9952");
}

tests/repository_test.rs Normal file

@@ -0,0 +1,449 @@
//! Tests for the repository module.
//!
//! AI AGENT NOTE: These tests verify database operations using the test database.
//! Each test uses a transaction that is rolled back after the test completes.
use sqlx::Row;
async fn create_test_pool() -> sqlx::MySqlPool {
invoice_generator::db::create_pool(&std::env::var("DATABASE_URL").unwrap_or_else(|_| {
let config = invoice_generator::config::Config::load(invoice_generator::config::Env::Test).unwrap();
config.database.connection_url()
})).await.unwrap()
}
// ===== Customer Tests =====
#[tokio::test]
async fn customer_insert_returns_id() {
let pool = create_test_pool().await;
let mut tx = pool.begin().await.unwrap();
sqlx::query(
"INSERT INTO customers (customer_number, card_report_group) VALUES (?, ?)",
)
.bind("TEST001")
.bind(1u8)
.execute(&mut *tx)
.await
.unwrap();
let row = sqlx::query("SELECT LAST_INSERT_ID() as id")
.fetch_one(&mut *tx)
.await
.unwrap();
let id: u64 = row.get("id");
assert!(id > 0);
}
#[tokio::test]
async fn customer_find_existing() {
let pool = create_test_pool().await;
let mut tx = pool.begin().await.unwrap();
sqlx::query(
"INSERT INTO customers (customer_number, card_report_group) VALUES (?, ?)",
)
.bind("TEST002")
.bind(1u8)
.execute(&mut *tx)
.await
.unwrap();
let customer = sqlx::query_as::<_, invoice_generator::db::models::Customer>(
"SELECT id, customer_number, card_report_group, created_at, updated_at
FROM customers WHERE customer_number = ?",
)
.bind("TEST002")
.fetch_one(&mut *tx)
.await
.unwrap();
assert_eq!(customer.customer_number, "TEST002");
assert_eq!(customer.card_report_group, 1);
}
#[tokio::test]
async fn customer_find_nonexistent() {
let pool = create_test_pool().await;
let mut tx = pool.begin().await.unwrap();
let customer = sqlx::query_as::<_, invoice_generator::db::models::Customer>(
"SELECT id, customer_number, card_report_group, created_at, updated_at
FROM customers WHERE customer_number = ?",
)
.bind("NONEXISTENT")
.fetch_optional(&mut *tx)
.await
.unwrap();
assert!(customer.is_none());
}
#[tokio::test]
async fn customer_multiple_cards() {
let pool = create_test_pool().await;
let mut tx = pool.begin().await.unwrap();
sqlx::query(
"INSERT INTO customers (customer_number, card_report_group) VALUES (?, ?)",
)
.bind("TEST003")
.bind(1u8)
.execute(&mut *tx)
.await
.unwrap();
let customer_row = sqlx::query("SELECT LAST_INSERT_ID() as id")
.fetch_one(&mut *tx)
.await
.unwrap();
let customer_id: u32 = customer_row.get("id");
sqlx::query("INSERT INTO cards (card_number, customer_id) VALUES (?, ?)")
.bind("CARD001")
.bind(customer_id)
.execute(&mut *tx)
.await
.unwrap();
sqlx::query("INSERT INTO cards (card_number, customer_id) VALUES (?, ?)")
.bind("CARD002")
.bind(customer_id)
.execute(&mut *tx)
.await
.unwrap();
let row = sqlx::query("SELECT COUNT(*) as count FROM cards WHERE customer_id = ?")
.bind(customer_id)
.fetch_one(&mut *tx)
.await
.unwrap();
let count: i64 = row.get("count");
assert_eq!(count, 2);
}
// ===== Card Tests =====
#[tokio::test]
async fn card_insert_with_customer() {
let pool = create_test_pool().await;
let mut tx = pool.begin().await.unwrap();
sqlx::query(
"INSERT INTO customers (customer_number, card_report_group) VALUES (?, ?)",
)
.bind("TEST004")
.bind(1u8)
.execute(&mut *tx)
.await
.unwrap();
let customer_row = sqlx::query("SELECT LAST_INSERT_ID() as id")
.fetch_one(&mut *tx)
.await
.unwrap();
let customer_id: u32 = customer_row.get("id");
sqlx::query("INSERT INTO cards (card_number, customer_id) VALUES (?, ?)")
.bind("TESTCARD001")
.bind(customer_id)
.execute(&mut *tx)
.await
.unwrap();
let card_row = sqlx::query("SELECT LAST_INSERT_ID() as id")
.fetch_one(&mut *tx)
.await
.unwrap();
let id: u64 = card_row.get("id");
assert!(id > 0);
}
#[tokio::test]
async fn card_find_by_number() {
let pool = create_test_pool().await;
let mut tx = pool.begin().await.unwrap();
sqlx::query(
"INSERT INTO customers (customer_number, card_report_group) VALUES (?, ?)",
)
.bind("TEST005")
.bind(1u8)
.execute(&mut *tx)
.await
.unwrap();
let customer_row = sqlx::query("SELECT LAST_INSERT_ID() as id")
.fetch_one(&mut *tx)
.await
.unwrap();
let customer_id: u32 = customer_row.get("id");
sqlx::query("INSERT INTO cards (card_number, customer_id) VALUES (?, ?)")
.bind("TESTCARD002")
.bind(customer_id)
.execute(&mut *tx)
.await
.unwrap();
let card = sqlx::query_as::<_, invoice_generator::db::models::Card>(
"SELECT id, card_number, customer_id, created_at, updated_at
FROM cards WHERE card_number = ?",
)
.bind("TESTCARD002")
.fetch_one(&mut *tx)
.await
.unwrap();
assert_eq!(card.card_number, "TESTCARD002");
}
// ===== Transaction Tests =====
#[tokio::test]
async fn transaction_insert_single() {
let pool = create_test_pool().await;
let mut tx = pool.begin().await.unwrap();
sqlx::query(
"INSERT INTO customers (customer_number, card_report_group) VALUES (?, ?)",
)
.bind("TEST006")
.bind(1u8)
.execute(&mut *tx)
.await
.unwrap();
let customer_row = sqlx::query("SELECT LAST_INSERT_ID() as id")
.fetch_one(&mut *tx)
.await
.unwrap();
let customer_id: u32 = customer_row.get("id");
sqlx::query(
"INSERT INTO transactions (transaction_date, batch_number, amount, volume, price, quality_code, quality_name, card_number, station, terminal, pump, receipt, customer_id) VALUES ('2026-02-01 10:00:00', 'TEST', 100.50, 10.5, 9.57, 1001, '95 Oktan', 'CARD123', 'S001', 'T1', 'P1', 'R001', ?)",
)
.bind(customer_id)
.execute(&mut *tx)
.await
.unwrap();
let tx_row = sqlx::query("SELECT LAST_INSERT_ID() as id")
.fetch_one(&mut *tx)
.await
.unwrap();
let id: u64 = tx_row.get("id");
assert!(id > 0);
}
#[tokio::test]
async fn transaction_insert_anonymized() {
let pool = create_test_pool().await;
let mut tx = pool.begin().await.unwrap();
sqlx::query(
"INSERT INTO transactions (transaction_date, batch_number, amount, volume, price, quality_code, quality_name, card_number, station, terminal, pump, receipt, customer_id) VALUES ('2026-02-01 10:00:00', 'TEST', 100.50, 10.5, 9.57, 1001, '95 Oktan', 'ANON******1234', 'S001', 'T1', 'P1', 'R002', NULL)",
)
.execute(&mut *tx)
.await
.unwrap();
let tx_row = sqlx::query("SELECT LAST_INSERT_ID() as id")
.fetch_one(&mut *tx)
.await
.unwrap();
let id: u64 = tx_row.get("id");
assert!(id > 0);
}
#[tokio::test]
async fn transaction_count() {
let pool = create_test_pool().await;
let mut tx = pool.begin().await.unwrap();
sqlx::query(
"INSERT INTO customers (customer_number, card_report_group) VALUES (?, ?)",
)
.bind("TEST007")
.bind(1u8)
.execute(&mut *tx)
.await
.unwrap();
let customer_row = sqlx::query("SELECT LAST_INSERT_ID() as id")
.fetch_one(&mut *tx)
.await
.unwrap();
let customer_id: u32 = customer_row.get("id");
for i in 0..5 {
sqlx::query(&format!(
"INSERT INTO transactions (transaction_date, batch_number, amount, volume, price, quality_code, quality_name, card_number, station, terminal, pump, receipt, customer_id) VALUES ('2026-02-01 10:00:00', 'TEST', {}, 10.0, 10.0, 1001, '95 Oktan', 'CARD{}', 'S001', 'T1', 'P1', 'R00{}', ?)",
100.0 + i as f64,
i,
i
))
.bind(customer_id)
.execute(&mut *tx)
.await
.unwrap();
}
let row = sqlx::query("SELECT COUNT(*) as count FROM transactions WHERE customer_id = ?")
.bind(customer_id)
.fetch_one(&mut *tx)
.await
.unwrap();
let count: i64 = row.get("count");
assert_eq!(count, 5);
}
// ===== Query Tests =====
#[tokio::test]
async fn query_transactions_by_customer() {
let pool = create_test_pool().await;
let mut tx = pool.begin().await.unwrap();
sqlx::query(
"INSERT INTO customers (customer_number, card_report_group) VALUES (?, ?)",
)
.bind("TEST008")
.bind(1u8)
.execute(&mut *tx)
.await
.unwrap();
let customer_row = sqlx::query("SELECT LAST_INSERT_ID() as id")
.fetch_one(&mut *tx)
.await
.unwrap();
let customer_id: u32 = customer_row.get("id");
for i in 0..3 {
sqlx::query(
"INSERT INTO transactions (transaction_date, batch_number, amount, volume, price, quality_code, quality_name, card_number, station, terminal, pump, receipt, customer_id) VALUES (?, 'TEST', 100.0, 10.0, 10.0, 1001, '95 Oktan', ?, 'S001', 'T1', 'P1', ?, ?)",
)
.bind(format!("2026-02-01 {:02}:00:00", 10 + i))
.bind(format!("CARD{}", i))
.bind(format!("R00{}", i))
.bind(customer_id)
.execute(&mut *tx)
.await
.unwrap();
}
let transactions = sqlx::query_as::<_, invoice_generator::db::models::Transaction>(
"SELECT t.id, t.transaction_date, t.batch_number, t.amount, t.volume, t.price, t.quality_code, t.quality_name, t.card_number, t.station, t.terminal, t.pump, t.receipt, t.control_number, t.customer_id, t.created_at
FROM transactions t
WHERE t.customer_id = ?",
)
.bind(customer_id)
.fetch_all(&mut *tx)
.await
.unwrap();
assert_eq!(transactions.len(), 3);
}
#[tokio::test]
async fn query_excludes_anonymous_from_customer_invoice() {
let pool = create_test_pool().await;
let mut tx = pool.begin().await.unwrap();
sqlx::query(
"INSERT INTO customers (customer_number, card_report_group) VALUES (?, ?)",
)
.bind("TEST009")
.bind(1u8)
.execute(&mut *tx)
.await
.unwrap();
let customer_row = sqlx::query("SELECT LAST_INSERT_ID() as id")
.fetch_one(&mut *tx)
.await
.unwrap();
let customer_id: u32 = customer_row.get("id");
sqlx::query(
"INSERT INTO transactions (transaction_date, batch_number, amount, volume, price, quality_code, quality_name, card_number, station, terminal, pump, receipt, customer_id) VALUES ('2026-02-01 10:00:00', 'TEST', 100.0, 10.0, 10.0, 1001, '95 Oktan', 'KNOWNCARD', 'S001', 'T1', 'P1', 'R001', ?)",
)
.bind(customer_id)
.execute(&mut *tx)
.await
.unwrap();
sqlx::query(
"INSERT INTO transactions (transaction_date, batch_number, amount, volume, price, quality_code, quality_name, card_number, station, terminal, pump, receipt, customer_id) VALUES ('2026-02-01 11:00:00', 'TEST', 50.0, 5.0, 10.0, 1001, '95 Oktan', 'ANON******9999', 'S001', 'T1', 'P1', 'R002', NULL)",
)
.execute(&mut *tx)
.await
.unwrap();
let row = sqlx::query(
"SELECT COUNT(*) as count FROM transactions WHERE customer_id = ?",
)
.bind(customer_id)
.fetch_one(&mut *tx)
.await
.unwrap();
let count: i64 = row.get("count");
assert_eq!(count, 1); // Only the customer-linked transaction; the anonymized row has customer_id NULL
}
#[tokio::test]
async fn sales_summary_by_product() {
let pool = create_test_pool().await;
let mut tx = pool.begin().await.unwrap();
sqlx::query(
"INSERT INTO customers (customer_number, card_report_group) VALUES (?, ?)",
)
.bind("TEST010")
.bind(1u8)
.execute(&mut *tx)
.await
.unwrap();
let customer_row = sqlx::query("SELECT LAST_INSERT_ID() as id")
.fetch_one(&mut *tx)
.await
.unwrap();
let customer_id: u32 = customer_row.get("id");
sqlx::query(
"INSERT INTO transactions (transaction_date, batch_number, amount, volume, price, quality_code, quality_name, card_number, station, terminal, pump, receipt, customer_id) VALUES ('2026-02-01 10:00:00', 'TEST', 100.0, 10.0, 10.0, 1001, '95 Oktan', 'CARD001', 'S001', 'T1', 'P1', 'R001', ?)",
)
.bind(customer_id)
.execute(&mut *tx)
.await
.unwrap();
sqlx::query(
"INSERT INTO transactions (transaction_date, batch_number, amount, volume, price, quality_code, quality_name, card_number, station, terminal, pump, receipt, customer_id) VALUES ('2026-02-01 11:00:00', 'TEST', 50.0, 5.0, 10.0, 4, 'Diesel', 'CARD001', 'S001', 'T1', 'P1', 'R002', ?)",
)
.bind(customer_id)
.execute(&mut *tx)
.await
.unwrap();
let summaries = sqlx::query_as::<_, invoice_generator::db::repository::ProductSummary>(
"SELECT quality_name, COUNT(*) as tx_count, SUM(amount) as total_amount, SUM(volume) as total_volume
FROM transactions
WHERE customer_id = ?
GROUP BY quality_name",
)
.bind(customer_id)
.fetch_all(&mut *tx)
.await
.unwrap();
assert_eq!(summaries.len(), 2); // Two products: 95 Oktan and Diesel
}