I am starting a new task where users need to upload data files and attach metadata to them. There could be a batch of around 50 files in a single upload. Each file needs to be annotated individually; however, some group properties apply across the whole batch.
The workflow will go something like this:
- Identify files to be archived
- Ask the user for group attributes
- Upload files to the server
- Create an entry for each file in the database
- Extract as much info as possible from the binary files (multiple formats) and attach it to that record
- For each incomplete record, ask the user to fill in the blanks (see the sketch after this list)
- This could be done by the original uploader or assigned to another user for help
- When all data is complete, submit the record and file to another existing service for archiving in a database.
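To make the "record" and "fill in the blanks" steps concrete, here is a rough sketch of the data model I have in mind. Names like `FileRecord` and `REQUIRED_FIELDS` are purely illustrative, not from any particular tool, and the required fields would really depend on the archive service:

```python
from dataclasses import dataclass, field
from typing import Optional

# Fields every record must have before it can be submitted for archiving.
# (Illustrative only; the real list depends on the downstream archive service.)
REQUIRED_FIELDS = ["instrument", "acquisition_date", "operator"]

@dataclass
class FileRecord:
    filename: str
    group_attrs: dict                                 # supplied once for the whole batch
    metadata: dict = field(default_factory=dict)      # extracted + manually entered
    assigned_to: Optional[str] = None                 # user asked to fill in the blanks

    def missing_fields(self) -> list:
        """Fields still needed before this record can be submitted."""
        combined = {**self.group_attrs, **self.metadata}
        return [f for f in REQUIRED_FIELDS if not combined.get(f)]

    def is_complete(self) -> bool:
        return not self.missing_fields()

# After the automated extraction pass, incomplete records would be queued
# for manual entry:
records = [FileRecord("scan_001.bin", {"operator": "alice"})]
todo = [r for r in records if not r.is_complete()]
```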
I am totally fine building this as a custom tool (website or CLI), but this pattern of a workflow mixing automated steps (file reading) and manual steps sounds like it might fit something already out there.
I guess I am just asking whether any tools come to mind that are easy for humans to use and do this kind of thing well.
Perhaps some ETL tools? I looked at Apache Airflow, but it doesn't seem to support the kind of manual intervention I am after (a form for manual data entry).
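To be clear about what I mean by a manual step, here is a hypothetical workflow definition showing the mix of automated and human steps I want. None of these functions exist in Airflow or any other tool I know of; they just express the shape of the workflow:

```python
# Hypothetical workflow sketch; every function here is made up to illustrate
# the pattern, not the API of any real workflow engine.
def archive_workflow(files, group_attrs):
    records = [create_db_entry(f, group_attrs) for f in files]  # automated
    for record in records:
        extract_metadata(record)                                # automated
    for record in records:
        if record.missing_fields():
            # Pause the workflow here and present a web form to a human;
            # this is the step I can't find in the ETL tools I've looked at.
            request_manual_entry(record, assignee=record.assigned_to)
    for record in records:
        submit_to_archive_service(record)                       # automated
```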
Any suggestions would be appreciated.