An Automated Video Feedback Application
Feelback was deployed at https://feelbackdemo.me.
Update: the Azure hosting subscription has ended, so the live demo is no longer available.
Demo video: `demo.mp4`
The CLI is used only to create annotated videos and visualizations, and for testing.
It DOES NOT generate analytics.
Sample Usage
```
python feelback_cli.py input_video -o output -f native -v
```
All Options
```
Feelback is An Automated Video Feedback Framework

positional arguments:
  input_video           Input Video File Path

options:
  -h, --help            show this help message and exit
  -o filename, --output filename
                        Save processed video to filename.mp4
  --output-annotations annotations [annotations ...]
                        Which annotations to add to processed video
                        annotations can be any of: ['all', 'none', 'ids', 'age', 'gender', 'emotions', 'attention']
                        You can add multiple annotations separated by space
                        Default: all
  --dump filename       Dump Feelback Object After Processing
  --load filename       Load Dumped Feelback Object [For Debug]
  --output-key-moments filename
                        Save Key Moments Visualizations to file
  -f N | native, --fps N | native
                        Process N frames every second, or `native` to process all frames
  -v, --verbose         Enable more verbosity
```
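For example, the following call (file names are placeholders, not files shipped with the repo) annotates only emotions and attention, processes 5 frames per second, saves the key-moment visualizations, and dumps the Feelback object for later debugging:

```
# All flags are documented in the help output above; file names are hypothetical.
python feelback_cli.py lecture.mp4 \
    -o lecture_annotated \
    --output-annotations emotions attention \
    --output-key-moments key_moments.png \
    --dump lecture.dump \
    -f 5 -v
```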
Server Setup

- Create a `.env` file like the `example.env` file and change the following variables: `UPLOAD_FOLDER`, `ANNOTATED_UPLOAD_FOLDER`, `DATABASE_URL`, `THUMBNAILS_FOLDER` (a sketch of such a file follows this list)
- Run `python feelback_server.py` to start the development server
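A minimal `.env` sketch with placeholder values (the local folder paths and the SQLAlchemy-style SQLite URL are assumptions for illustration); copy `example.env` and adapt the values to your environment:

```
# Placeholder values for illustration only; see example.env for the real template.
UPLOAD_FOLDER=./uploads
ANNOTATED_UPLOAD_FOLDER=./uploads/annotated
THUMBNAILS_FOLDER=./thumbnails
DATABASE_URL=sqlite:///feelback.db
```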
Frontend Setup

- Install Node.js >= 16
- `cd` into `feelback-front`
- Run `npm install`
- Run `npm run serve`
- To build for production, run `npm run build` (the commands are collected below)
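The same frontend steps collected into a single shell session, run from the repository root:

```
# Frontend steps from the list above.
cd feelback-front
npm install       # install dependencies

# For local development:
npm run serve

# Or, to build for production:
npm run build
```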