MIRC CTP
Revision as of 14:43, 5 August 2007
This article describes a proposed project to develop a stand-alone, configurable processing application for clinical trials data, based on MIRC components and using the MIRC internet transport mechanism. The article is intended for people who have used MIRC for clinical trials data acquisition and management. Please use the Discussion tab to add comments.
MIRC supports clinical trials through two applications, one for data acquisition at an imaging center (FieldCenter) and one for management of the data at a principal investigator's site (MIRC).
The FieldCenter application acquires images via the DICOM protocol, anonymizes them, and transfers them (typically using HTTP, although DICOM is also supported) to a principal investigator's MIRC site. FieldCenter also contains a client for the Update Service of a MIRC site, allowing the FieldCenter application to save data on, and obtain software updates from, the principal investigator's site.
The MIRC site software contains a partially configurable processing pipeline for clinical trials data, consisting of:
- A receiver for HTTP connections from FieldCenter applications transferring data files into the processing pipeline.
- A receiver for DICOM datasets for insertion into the processing pipeline.
- A user-defined component for processing data received by the HttpImportService before it is further processed by other components.
- A component for anonymizing DICOM objects or XML objects.
- A component providing queue management and submission of data objects to a user-defined interface to an external database management system.
- A component in the DicomImportService pipeline providing queue management and transmission of data objects to one or more external systems using the HTTP protocol.
- A component in the HttpImportService pipeline providing queue management and transmission of data objects to one or more external systems using the DICOM protocol.
The processing pipelines for the HttpImportService and the DicomImportService are not symmetrical. For example, the HttpImportService does not have access to the anonymizer except as part of the DatabaseExportService, and objects received via one protocol can only be exported via the other. While these limitations are consistent with the requirements of most trials, a completely symmetrical design would better support more sophisticated trials while still satisfying the requirements of simple ones.
The following are proposed top-level requirements for the implementation:
- Single-click installation.
- Processing pipeline supporting a configurable number of stages, with the class implementing each stage being configurable.
- Support for multiple quarantines.
- Pre-defined implementations for key components:
- HTTP Import
- DICOM Import
- DICOM Anonymizer
- XML Anonymizer
- Storage Service
- Database Export
- HTTP Export
- DICOM Export
- Web-based monitoring of the application's status, including:
- quarantine sizes
- status of each pipeline stage:
- stage name
- queue size, when relevant
- date/time of last object received
- Support for the FieldCenter Update Service client.
The core of the proposed implementation is a manager that orchestrates one or more pipelines.
A Pipeline is a manager that moves data objects through a sequence of processing stages. Each stage in the pipeline performs a specific function on one or more of the four basic object types supported by MIRC.
Each stage is an implementation of a specific interface. All stages expose certain basic methods that provide status information as well as access to the stage's output object. Each Pipeline contains one ImportService as its first component. Each pipeline stage is provided access to a Quarantine directory, which may be unique to the stage, into which the Pipeline will place objects that are rejected by a stage, thus aborting further processing. At the end of the pipeline, the manager calls the ImportService to remove the processed object from its queue.
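The stage contract described above could be sketched as follows. This is a hypothetical illustration; the interface name, method names, and the PassThroughStage class are assumptions, not the actual org.rsna.trials API.

```java
import java.io.File;

// Hypothetical stage contract: every stage exposes basic status
// information and access to its output object (assumed names).
interface PipelineStage {
    String getName();          // stage name for the status page
    File getQuarantine();      // directory for rejected objects
    long getLastObjectTime();  // time the last object was handled
    Object getOutputObject();  // the stage's output object
}

// Trivial example stage that passes its input through unchanged.
class PassThroughStage implements PipelineStage {
    private final String name;
    private final File quarantine;
    private long lastTime = 0;
    private Object output = null;

    PassThroughStage(String name, File quarantine) {
        this.name = name;
        this.quarantine = quarantine;
    }
    public String getName() { return name; }
    public File getQuarantine() { return quarantine; }
    public long getLastObjectTime() { return lastTime; }
    public Object getOutputObject() { return output; }

    // Record the time, remember the output, and pass the object on.
    Object process(Object obj) {
        lastTime = System.currentTimeMillis();
        output = obj;
        return obj;
    }
}
```

The manager would walk the configured stages in order, calling each one and quarantining any rejected object to abort further processing.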
An ImportService receives objects via a protocol and enqueues them for processing by subsequent stages.
A StorageService stores an object in a file system. It is not queued, and it therefore must complete before subsequent stages can proceed. A StorageService may return the current object or the stored object in response to a request for the output object, depending on its implementation.
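The return-stored-file behavior might look like the following sketch. The class shape and the boolean switch are assumptions based on the description above and the return-stored-file attribute in the configuration example; this is not the actual org.rsna.trials.StorageService.

```java
import java.io.File;
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.StandardCopyOption;

// Hypothetical StorageService sketch: stores synchronously (no queue)
// and returns either the current object or the stored copy.
class StorageService {
    private final File root;
    private final boolean returnStoredFile;

    StorageService(File root, boolean returnStoredFile) {
        this.root = root;
        this.returnStoredFile = returnStoredFile;
        root.mkdirs();
    }

    // Copies the object into the storage root; subsequent stages
    // cannot proceed until this call completes.
    File store(File object) throws IOException {
        File stored = new File(root, object.getName());
        Files.copy(object.toPath(), stored.toPath(),
                   StandardCopyOption.REPLACE_EXISTING);
        return returnStoredFile ? stored : object;
    }
}
```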
A Processor is a generic class to perform some kind of processing on an object. It is not queued. A processor exposes methods with calling signatures that are unique to the object type. In the context of the current MIRC implementation, a Preprocessor is a Processor, as is an Anonymizer. The result of a processing stage is an object that is passed to the next stage in the pipeline.
An ExportService provides queued transmission to an external system via a defined protocol. Objects in the queue are full copies of the objects submitted; therefore, subsequent processing is not impeded if a queue is paused, and modifications made subsequently do not affect the queue entry, even if they occur before transmission. (Note: This is different from the current MIRC implementation.)
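The copy-on-enqueue behavior described above can be illustrated with a minimal sketch (hypothetical, not the actual ExportService implementation): because a full copy is placed in the queue at submission time, modifying the original object afterward does not change the queue entry.

```java
import java.util.ArrayDeque;
import java.util.Arrays;
import java.util.Deque;

// Hypothetical export queue demonstrating copy-on-enqueue semantics.
class ExportQueue {
    private final Deque<byte[]> queue = new ArrayDeque<>();

    // A full copy of the object is enqueued, so later modifications
    // to the submitted object do not affect the queue entry.
    void enqueue(byte[] object) {
        queue.add(Arrays.copyOf(object, object.length));
    }

    byte[] next() { return queue.poll(); }
}
```

This is also why a paused queue does not impede subsequent processing: later stages operate on the original object, not on the queued copy.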
The configuration is specified by an XML file as in this example:
<Configuration>
  <Server port="80" />
  <Pipeline name="Main Pipeline">
    <ImportService
      name="HTTP Import"
      class="org.rsna.trials.HttpImportService"
      root="D:\abc\http-import"
      port="7777"
      quarantine="HttpImportQuarantine" />
    <Processor
      name="The Preprocessor"
      class="org.myorg.MyPreprocessor"
      quarantine="PreprocessorQuarantine" />
    <Processor
      name="Main Anonymizer"
      class="org.rsna.trials.Anonymizer"
      dicom-script="dicom-anonymizer-1.properties"
      xml-script="xml-anonymizer-1.script"
      zip-script="zip-anonymizer-1.script"
      quarantine="MainAnonymizerQuarantine" />
    <ExportService
      name="Database Export"
      class="org.rsna.trials.DatabaseExportService"
      adapter-class="org.myorg.MyDatabaseAdapter"
      root="D:\abc\database-export"
      quarantine="DatabaseExportQuarantine" />
    <Processor
      name="Provenance Remover"
      class="org.rsna.trials.Anonymizer"
      dicom-script="dicom-anonymizer-2.properties"
      xml-script="xml-anonymizer-2.script"
      zip-script="zip-anonymizer-2.script"
      quarantine="ProvenanceRemoverQuarantine" />
    <StorageService
      name="Storage"
      class="org.rsna.trials.StorageService"
      root="D:\abc\storage"
      return-stored-file="no"
      quarantine="StorageQuarantine" />
    <ExportService
      name="PACS Export"
      class="org.rsna.trials.DicomExportService"
      root="D:\abc\pacs-export"
      dest-url="dicom://DestinationAET:ThisAET@ipaddress:port"
      quarantine="PacsExportQuarantine" />
    <ExportService
      name="Other Export"
      class="org.rsna.trials.HttpExportService"
      root="D:\abc\other-export"
      dest-url="http://ipaddress:port"
      quarantine="OtherExportQuarantine" />
  </Pipeline>
</Configuration>
Multiple Pipeline elements may be included, but each must have its own ImportService element, and their ports must not conflict.
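For example, two pipelines might be configured with distinct ImportService ports (a hypothetical sketch; the DICOM Import class name, port numbers, and elided stages are assumptions):

```xml
<Configuration>
  <Server port="80" />
  <Pipeline name="HTTP Pipeline">
    <ImportService
      name="HTTP Import"
      class="org.rsna.trials.HttpImportService"
      root="D:\abc\http-import"
      port="7777"
      quarantine="HttpImportQuarantine" />
    <!-- further stages for this pipeline -->
  </Pipeline>
  <Pipeline name="DICOM Pipeline">
    <ImportService
      name="DICOM Import"
      class="org.rsna.trials.DicomImportService"
      root="D:\abc\dicom-import"
      port="8888"
      quarantine="DicomImportQuarantine" />
    <!-- further stages for this pipeline -->
  </Pipeline>
</Configuration>
```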
Each pipeline stage class has a constructor that is called with its configuration element, making it possible for special processor implementations to be passed additional parameters from the configuration.
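The constructor convention above suggests that the manager could instantiate stages by reflection from the class attribute. The following is a hypothetical sketch under that assumption; StageFactory and DemoStage are illustrative names, not part of the actual implementation.

```java
import java.lang.reflect.Constructor;
import org.w3c.dom.Element;

// Hypothetical factory: loads the class named in the "class" attribute
// and invokes the constructor that takes the configuration element.
class StageFactory {
    static Object createStage(Element config) throws Exception {
        String className = config.getAttribute("class");
        Class<?> c = Class.forName(className);
        Constructor<?> ctor = c.getConstructor(Element.class);
        return ctor.newInstance(config);
    }
}

// Hypothetical demo stage with the required constructor; a special
// processor could read any extra attributes from the element here.
class DemoStage {
    final String name;
    public DemoStage(Element config) {
        name = config.getAttribute("name");
    }
}
```

Because each stage receives its whole configuration element, custom processors can define additional attributes without any change to the manager.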
To provide access to the status of the components, the application can be configured with one or more HTTP or HTTPS connectors which serve web pages and support servlet-like functionality. If multiple connectors are configured, they all access the same set of servlets and the same server ROOT directory.
The ConfigurationServlet displays the contents of the configuration file in a web page.
The StatusServlet displays the status of all the pipeline stages in a web page.
The UpdateService supports the Update Service clients in FieldCenter applications, serving software updates and saving the remapping tables in trials configured to use it.
Open Questions
- Is it necessary to support more than one Pipeline in the application?
- Is it necessary or desirable to serve files from Storage objects via HTTP or HTTPS?