[Use Cases]
Here are a few areas where Symphony can help solve problems and add value.
When generating large quantities of LC-MS files, or very large individual files, it is not unusual for the analysis workflow to develop a bottleneck: files build up on the instrument-linked PC and can take a substantial amount of time to copy to a network drive for archiving or processing. Symphony can fully automate this transfer, so the user no longer has to wait while copying takes place, and expensive MS instrumentation spends more time generating data rather than sitting idle.
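As a rough illustration of what such an automated transfer involves, here is a minimal Python sketch. It assumes the acquisition PC writes Waters .raw directories to a local folder and that a run is complete once its files have stopped changing; the paths and the five-minute settle window are hypothetical, not Symphony settings.

```python
import shutil
import time
from pathlib import Path

ACQ_DIR = Path(r"D:\MassLynx\Data")          # hypothetical acquisition folder
ARCHIVE_DIR = Path(r"\\fileserver\archive")  # hypothetical network share
SETTLE_SECONDS = 300                         # no writes for 5 min => run complete

def latest_mtime(raw_dir: Path) -> float:
    """Most recent modification time of any file inside a .raw directory."""
    return max(
        (f.stat().st_mtime for f in raw_dir.rglob("*") if f.is_file()),
        default=raw_dir.stat().st_mtime,
    )

def transfer_completed_runs() -> None:
    """Copy every settled .raw directory that is not already archived."""
    for raw_dir in ACQ_DIR.glob("*.raw"):
        target = ARCHIVE_DIR / raw_dir.name
        if target.exists():
            continue  # already archived on a previous pass
        if time.time() - latest_mtime(raw_dir) < SETTLE_SECONDS:
            continue  # acquisition may still be writing to this run
        shutil.copytree(raw_dir, target)

if __name__ == "__main__":
    while True:
        transfer_completed_runs()
        time.sleep(60)  # poll once a minute
```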
It is often desirable to reduce the size of raw LC-MS files to speed up transfer over a network, to reduce data archiving overheads, or to accelerate data processing. This can be fully automated using the Waters Data Compression tool in combination with Symphony. Both nominal mass and high-resolution MS data can be reduced, and file size reductions of 5-10x are not uncommon. The level of reduction can be tailored to meet your particular needs.
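The hedged sketch below shows what a scripted compression step might look like within such a pipeline. The executable path and command line are placeholders only; the Waters Data Compression tool's actual interface is configured through Symphony, and its real options will differ.

```python
import subprocess
from pathlib import Path

# Placeholder path; the real tool is installed and configured via Symphony.
COMPRESSOR = Path(r"C:\Waters\DataCompression\compress.exe")

def compress_raw(raw_dir: Path) -> None:
    """Run the (hypothetical) compressor on one .raw directory."""
    # The argument list is illustrative only; consult the tool's
    # documentation for its real command-line options.
    result = subprocess.run(
        [str(COMPRESSOR), str(raw_dir)],
        capture_output=True,
        text=True,
    )
    if result.returncode != 0:
        raise RuntimeError(f"Compression failed for {raw_dir}: {result.stderr}")

# Example: compress every archived run in a (hypothetical) network folder.
for raw_dir in Path(r"\\fileserver\archive").glob("*.raw"):
    compress_raw(raw_dir)
```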
It is possible to link a series of tasks together to create a completely automated data analysis pipeline, from LC-MS acquisition to identified analytes. In a proteomics experiment, for example, samples are added to the MassLynx Sample List, and the entire process of acquisition, data transfer from the instrument-linked PC to a processing PC, peak detection, de-isotoping, database searching, and results visualisation can then run without manual intervention.
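To make the chaining of tasks concrete, here is a minimal Python sketch of such a linear pipeline. The step names mirror the proteomics example above; each function body is a placeholder for the corresponding Symphony task, not a real implementation.

```python
from typing import Callable

# Placeholder task functions; in Symphony each would be a configured module.
def transfer(sample: str) -> None: ...
def peak_detect(sample: str) -> None: ...
def deisotope(sample: str) -> None: ...
def db_search(sample: str) -> None: ...
def visualise(sample: str) -> None: ...

# Ordered pipeline: each step starts only after the previous one completes.
PIPELINE: list[Callable[[str], None]] = [
    transfer,
    peak_detect,
    deisotope,
    db_search,
    visualise,
]

def run_pipeline(sample: str) -> None:
    for step in PIPELINE:
        print(f"STEP {step.__name__} starting for {sample}")
        step(sample)
        print(f"STEP {step.__name__} OK")

run_pipeline("sample_001")
```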
Symphony provides a number of logging and reporting functions that allow you to check that a specified processing pipeline has run to completion. If any part of the process fails to execute, this can be easily detected and flagged.
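One possible shape for such a completion check is sketched below: it scans a plain-text pipeline log for failed steps. The log path and the "STEP &lt;name&gt; OK/FAILED" line format (matching the sketch above) are assumptions for illustration, not Symphony's actual log format.

```python
from pathlib import Path

def failed_steps(log_path: Path) -> list[str]:
    """Return the names of steps that logged a FAILED status."""
    failures = []
    for line in log_path.read_text().splitlines():
        # Assumed format: "STEP <name> OK" or "STEP <name> FAILED".
        if line.startswith("STEP ") and line.endswith("FAILED"):
            failures.append(line.split()[1])
    return failures

bad = failed_steps(Path("pipeline.log"))
if bad:
    print("Pipeline incomplete; failed steps:", ", ".join(bad))
else:
    print("Pipeline ran to completion.")
```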
“By automating routine data-processing steps, Symphony saves our operators time and allows us to conduct the most time-consuming parts of the informatics workflow in parallel with acquisition. In the best case, it can save months of processing time and, in combination with noise reduction, petabytes of storage.
We see great value in the modular nature of Symphony, allowing us to rapidly develop and test new processes for handling experimental data, including real-time QC, prospective fault detection, and tools for ensuring data integrity.”
“Symphony offers a solution to many challenges, providing a platform with automated, flexible, and adaptable workflows for high-throughput handling of proteomic data.
Even the simple step of seamlessly and automatically copying raw files to a remote file location whilst a column is conditioning maximises the time we can use the instrument for analysis. Previously, the instrument could be idle for 1-2 hours whilst data was copied to a filestore in preparation for processing. With three Synapts generating data 24/7 in our laboratory, this alone is a major advance.
Symphony’s flexibility in executing sample-specific workflows directly from the MassLynx sample list will have a major impact on our productivity.
The scalable client-server architecture makes Symphony perfect for large-scale, high-throughput MS data processing, where the processing of highly complex data can only be addressed by calling on a range of computational resources.”
“New approaches are continuously being developed to extract increasing amounts of information from data-rich, ion mobility-assisted HDMSE experiments.
Plugging new algorithms into an automated Symphony pipeline provides the ingredients for exponential growth in the information content that can be extracted from both new and archived samples.
Automation makes it possible to find optimal parameter settings and reduces the likelihood of errors, without significant time penalties.
I was amazed at the level of detail that I can see using these approaches!”