Automating data quality control, processing, and analysis pipelines with an open-source, reproducible toolkit packaged in Docker containers maximizes data value and minimizes time spent on manual checks. Achieving high throughput with these pipelines requires more computational resources than a standard laboratory workstation provides, motivating their migration to high-performance computing systems. We created an open-source wrapper for the higher-security Singularity images required to run resting-state functional connectivity workflows on high-performance computing systems; the wrapper extends pipeline functionality with report collection, network-based statistics, and versioning documentation. The pipeline was then benchmarked and demonstrated on an existing aging and cognition data set.