uploading sequence data needs to be handled properly
** single fastq upload works, however it's very slow, even on the same VM... it takes 10+ minutes to upload 400MB
** create the proper tables (IUS?) and link them to the proper records
** make the upload location configurable, and store uploads as processing/file events in the DB
** file checking: need to make sure we only support fastq files encoding either base space or color space data
** after uploading there is a new processing row with workflow_run_id and ancestor_workflow_run_id blank, status pending, exit_status and process_exit_status null, run_start/stop_tstmp null, algorithm null
** this processing row is linked to the lane! It shouldn't be; it should be linked via the IUS
** this is ambiguous. The web form asks for a sample to associate the upload with, yet it then links the uploaded processing event to a lane. A given sample can have many lanes associated with it, and each of those lanes can carry more than one sample, so attaching the event to a lane rather than to an IUS makes it ambiguous. Further, a given sample may not have any sequencer_runs/lanes/IUSs associated with it at all, so how would the upload processing event get linked in that case? There is no processing_sample or sample_processing table to link directly to a sample, although there probably should be, along with processing_experiment and processing_study.
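The ambiguity above can be demonstrated with a toy schema (a sqlite3 stand-in for the real database; table and column names follow the notes, and `processing_ius` is a hypothetical link table). With two samples multiplexed on one lane, a lane-level link cannot say which sample an uploaded fastq belongs to, while an IUS-level link resolves it uniquely:

```python
import sqlite3

db = sqlite3.connect(":memory:")
db.executescript("""
CREATE TABLE sample     (sample_id INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE lane       (lane_id INTEGER PRIMARY KEY);
-- an IUS ties exactly one sample to exactly one lane
CREATE TABLE ius        (ius_id INTEGER PRIMARY KEY,
                         sample_id INTEGER REFERENCES sample,
                         lane_id   INTEGER REFERENCES lane);
CREATE TABLE processing (processing_id INTEGER PRIMARY KEY, status TEXT);
-- hypothetical link table: processing event <-> IUS
CREATE TABLE processing_ius (processing_id INTEGER REFERENCES processing,
                             ius_id        INTEGER REFERENCES ius);
""")

# two samples multiplexed on the same lane
db.executemany("INSERT INTO sample VALUES (?, ?)", [(1, "sampA"), (2, "sampB")])
db.execute("INSERT INTO lane VALUES (1)")
db.executemany("INSERT INTO ius VALUES (?, ?, ?)", [(10, 1, 1), (11, 2, 1)])

# the upload's processing event, linked through an IUS rather than the lane
db.execute("INSERT INTO processing VALUES (100, 'pending')")
db.execute("INSERT INTO processing_ius VALUES (100, 10)")

# resolving the upload back to its sample is now unambiguous
row = db.execute("""
    SELECT s.name
    FROM processing p
    JOIN processing_ius pi ON pi.processing_id = p.processing_id
    JOIN ius i             ON i.ius_id = pi.ius_id
    JOIN sample s          ON s.sample_id = i.sample_id
""").fetchone()
print(row[0])  # sampA
```

Had the event been linked to lane 1 instead, the same join would return both sampA and sampB, which is exactly the ambiguity the note describes.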
** when I try to upload to a sample that is not associated with a lane/IUS, the web app throws an error... but only one time out of three... not sure what happened, maybe a state issue?
** in the GUI it just shows the link through...