This document outlines the detailed requirements for the DiRPi Accelerator Diagnostics Project. The project aims to develop a web-based interface for monitoring and controlling a network of Raspberry Pi devices (DiRPis) connected to detector hardware at accelerator facilities. It involves multiple stakeholders and locations, including LANL and FEL, and is intended to serve the different groups working on accelerator diagnostics.
- DiRPi Devices: Raspberry Pi computers equipped with detector hardware. Fifteen DiRPi devices are distributed across multiple locations (e.g., Drift Tube #1).
- Server: A latest-generation Mac Mini with 14 TB of external storage for the database and 256 GB of local storage for metadata.
- Backend: Django Framework
- Database: MongoDB for storing run data and configurations.
- Frontend: HTML, CSS, JavaScript for interface design.
- Data Processing: Python for data analytics and ROOT for specialized data processing tasks.
- Job Queue Service: To be determined; it will manage the queuing of data processing tasks.
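As a concrete but non-binding illustration of how the stack could be wired together, the sketch below connects Django to MongoDB through the mongoengine ODM. The connector choice, database name, and host are assumptions; this document only fixes Django and MongoDB.

```python
# settings.py (excerpt): a minimal sketch, assuming mongoengine as the
# Django-to-MongoDB connector. Database name and host are placeholders.
import mongoengine

mongoengine.connect(
    db="dirpi_diagnostics",            # placeholder database name
    host="mongodb://localhost:27017",  # MongoDB instance on the Mac Mini server
)
```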
- core
  - User Authentication: Secure access to the application with password-protected user accounts.
  - User Profile Management: Allows users to manage their profiles, including settings and preferences.
- control
  - DiRPi Device Management: Enables users to manage individual DiRPi devices, send configurations, and initiate runs.
- monitor
  - DiRPi Monitoring: Real-time status tracking of DiRPi devices and groups. Includes ping-based health checks and alerting mechanisms.
- analysis
  - Run Management: Record, search, and monitor runs. Supports tagging and keyword-based search.
  - Data Analysis and Visualization: Data analytics with support for specialized data filters, anomaly detection, and various visualization options.
- docs
  - Documentation: User guides, FAQs, technical documentation, and self-help resources.
- queue
  - Job Queue Management: Enables users to queue data processing tasks for both real-time and batch processing.
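A minimal sketch of how these six apps could be registered in the Django project is shown below; the standard contrib apps are assumed defaults rather than requirements from this document.

```python
# settings.py (excerpt): one Django app per functional area listed above.
INSTALLED_APPS = [
    "django.contrib.admin",
    "django.contrib.auth",
    "django.contrib.contenttypes",
    "django.contrib.sessions",
    "django.contrib.messages",
    "django.contrib.staticfiles",
    "core",      # authentication and user profiles
    "control",   # DiRPi device management and run control
    "monitor",   # ping-based health monitoring and alerting
    "analysis",  # run management, analytics, and visualization
    "docs",      # documentation and self-help resources
    "queue",     # job queue management
]
```

If the queue app name clashes with Python's standard-library queue module in practice, it could be registered under a different package name (e.g., a hypothetical jobqueue) while keeping the /queue/ URL.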
- User
  - Custom fields: associated_groups, associated_runs, settings
- DiRPiGroup
  - Fields: group_name, location, password, associated_users, health_status
- DiRPiDevice
  - Fields: device_id, group_id, configurations, status, last_ping_time, health_status
- Run
  - Fields: run_id, metadata, configurations, associated_group, tags, status
- DataRecord
  - Fields: run_id, timestamp, data_values, anomalies
- JobQueue
  - Fields: job_id, status, associated_run, created_at, completed_at
- Documentation
  - Fields: doc_id, title, content, type
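As an illustration, the sketch below writes two of these models as mongoengine documents. The use of mongoengine and the individual field types are assumptions; this document only fixes the field names.

```python
# A minimal sketch of two of the data models above, assuming mongoengine-backed
# documents stored in MongoDB. Field types and defaults are illustrative.
from mongoengine import (DateTimeField, DictField, Document, ListField,
                         StringField)


class DiRPiDevice(Document):
    device_id = StringField(required=True, unique=True)
    group_id = StringField(required=True)
    configurations = DictField()          # last configuration pushed to the device
    status = StringField(default="idle")
    last_ping_time = DateTimeField()
    health_status = StringField(default="unknown")


class Run(Document):
    run_id = StringField(required=True, unique=True)
    metadata = DictField()                # run number, dates, number of events, trigger settings
    configurations = DictField()
    associated_group = StringField()
    tags = ListField(StringField())
    status = StringField(default="recording")
```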
- UserRegistrationForm
- UserLoginForm
- UserSettingsForm: For updating user settings.
- DiRPiGroupForm
- DiRPiDeviceForm
- RunForm
- DataFilterForm: For selecting data filters during analysis.
- JobQueueForm
- DocumentationForm: For adding or editing documentation.
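A sketch of one of these forms is given below, assuming plain (non-model) Django forms since the underlying store is MongoDB; the filter choices are illustrative placeholders, not a fixed list.

```python
# A minimal sketch of DataFilterForm: lets the user pick filters to apply during
# analysis of a run. Filter names and time bounds are placeholders.
from django import forms

FILTER_CHOICES = [
    ("peak_width", "Peak-width correction"),
    ("double_peak", "Double-peak rejection"),
    ("anomaly", "Anomaly detection"),
]


class DataFilterForm(forms.Form):
    run_id = forms.CharField(max_length=64)
    filters = forms.MultipleChoiceField(choices=FILTER_CHOICES, required=False)
    start_time = forms.DateTimeField(required=False)
    end_time = forms.DateTimeField(required=False)
```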
- HomeView: Dashboard with an overview of DiRPi groups, runs, and job queue status.
- AboutView
- UserRegisterView
- UserLoginView
- UserSettingsView: User settings page.
- DiRPiGroupListView
- DiRPiGroupDetailView
- DiRPiDeviceDetailView
- RunListView
- RunDetailView
- DataAnalysisView: For performing advanced analytics on a run.
- JobQueueView
- DocumentationListView: List all documentation.
- DocumentationDetailView: View details of a specific document.
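The sketch below illustrates one possible shape for RunListView with tag-based filtering; the template path, query parameter, and model import path are assumptions about the eventual project layout.

```python
# A minimal sketch of RunListView, assuming the mongoengine-backed Run document
# sketched earlier. Template name and query parameter are placeholders.
from django.views.generic import TemplateView

from analysis.models import Run   # assumed app/module layout


class RunListView(TemplateView):
    template_name = "analysis/run_list.html"

    def get_context_data(self, **kwargs):
        context = super().get_context_data(**kwargs)
        tag = self.request.GET.get("tag")
        runs = Run.objects.all()
        if tag:
            runs = runs.filter(tags=tag)   # keep only runs carrying the given tag
        context["runs"] = runs
        return context
```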
- `/` : Home
- `/about/` : About
- `/register/` : User Registration
- `/login/` : User Login
- `/settings/` : User Settings
- `/logout/` : User Logout
- `/groups/` : DiRPi Groups
- `/groups/<id>` : DiRPi Group Details
- `/devices/<id>` : DiRPi Device Details
- `/runs/` : Run List
- `/runs/<id>` : Run Details
- `/analysis/<id>` : Data Analysis for a Run
- `/queue/` : Job Queue
- `/docs/` : Documentation List
- `/docs/<id>` : Documentation Details
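A partial urls.py sketch for these routes follows; the view import paths mirror the app layout assumed earlier, and only a subset of routes is shown.

```python
# urls.py (excerpt): a sketch of the route table above using Django's path().
from django.urls import path

from analysis import views as analysis_views   # assumed module layout
from core import views as core_views

urlpatterns = [
    path("", core_views.HomeView.as_view(), name="home"),
    path("about/", core_views.AboutView.as_view(), name="about"),
    path("runs/", analysis_views.RunListView.as_view(), name="run-list"),
    path("runs/<str:run_id>/", analysis_views.RunDetailView.as_view(), name="run-detail"),
    path("analysis/<str:run_id>/", analysis_views.DataAnalysisView.as_view(), name="run-analysis"),
    # ...the remaining routes (/groups/, /devices/, /queue/, /docs/) follow the same pattern.
]
```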
- Users must be able to log in and log out securely.
- Password recovery options must be available.
- Create, modify, and delete password-protected DiRPi groups.
- Add, modify, and delete individual DiRPi devices within a group.
- View real-time status and configurations of DiRPi devices.
- Monitor for inactivity or malfunction based on DiRPi pings.
- Send configurations to DiRPi devices, to be applied on each device's next ping (see the endpoint sketch after this list).
- Initiate, terminate, and monitor data acquisition runs.
- Search for runs based on various metadata such as run number, tags, keywords, dates, number of events, and trigger settings.
- View detailed analytics for each run, including rate as a function of time, pulse height spectra, and other Data Quality Monitoring (DQM) plots.
- Apply various data processing algorithms including filters for signal correction and anomaly detection.
- User-defined Python or ROOT scripts for custom data processing.
- Machine Learning algorithms for detecting anomalies based on characteristic functions.
- Users should be able to queue jobs for data processing.
- Support for recalibration and reprocessing of old runs while preserving original data.
- Comprehensive user guides and technical documentation covering topics such as self-calibration of LYSO.
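The ping-driven configuration requirement above could be served by an endpoint along the lines of the sketch below; the URL shape, the "pending" configuration key, and the response format are assumptions rather than the final DiRPi protocol.

```python
# A possible ping endpoint: each DiRPi polls it periodically; the server records
# the ping time and returns any configuration queued for the device, so new
# settings take effect on the device's next ping. Names here are placeholders.
from django.http import JsonResponse
from django.utils import timezone
from django.views.decorators.csrf import csrf_exempt

from control.models import DiRPiDevice   # assumed app/module layout


@csrf_exempt
def dirpi_ping(request, device_id):
    device = DiRPiDevice.objects.get(device_id=device_id)
    device.last_ping_time = timezone.now()
    device.health_status = "ok"
    device.save()
    # Hand back any configuration staged for this device (none => empty response).
    pending = (device.configurations or {}).get("pending")
    return JsonResponse({"config": pending})
```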
- Security: All groups should be password protected. Additionally, the system should be protected against arbitrary code execution.
- Concurrent Operations: Multiple DiRPi devices and data processing jobs must be able to operate concurrently.
- Extensibility: The system should be designed to be easily extensible for future features and requirements.
- Data Integrity: Ensure all data is consistently recorded and stored. Implement checks to validate the integrity of the data.
- Integration of GPT for automated macro-writing tasks and possibly for checking code for malicious content.
- Specialized data filters for signal correction based on peak width, double peak structure, etc.
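As one possible realization of the peak-width and double-peak filters mentioned above, the following scipy-based sketch flags waveforms for correction; the thresholds are placeholders that would ultimately come from the actual DiRPi pulse shapes.

```python
# A minimal sketch of a peak-width / double-peak quality filter for one
# digitized waveform. All thresholds below are illustrative placeholders.
import numpy as np
from scipy.signal import find_peaks


def flag_waveform(samples, min_width=3, max_width=40, double_peak_gap=10):
    """Return a list of quality flags for one waveform."""
    peaks, props = find_peaks(np.asarray(samples, dtype=float),
                              width=min_width, prominence=5.0)
    flags = []
    if peaks.size == 0:
        flags.append("no_peak")        # nothing above the prominence cut
    if np.any(props.get("widths", np.array([])) > max_width):
        flags.append("wide_peak")      # candidate for width-based signal correction
    if peaks.size >= 2 and np.min(np.diff(peaks)) < double_peak_gap:
        flags.append("double_peak")    # double-peak structure
    return flags
```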