The Hosted Processing Facility for the Exploitation of Large Datasets (HPFELD) project aims to demonstrate a way of making the processing of very large datasets far easier for users.
HPFELD is a collaborative project involving STFC, Magellium and Terradue. There are several barriers to the exploitation of large datasets held at data centres. Many large scientific and commercial data archives exist, but members of the user community may be unable to access their data easily. Even when access is possible, transferring (and storing) the huge quantities of data involved remains a problem. This confinement of data is a major obstacle to the progress of both research and business.

A far more sensible and forward-looking approach is to couple the archives to processing capacity made available over the network. Remote users upload their processing algorithms for local execution, and only the data-reduced result is then downloaded. This yields large savings in time and effort for data users. The key innovation being developed within the project is the generalisation of such a service to work with all types of data centre.
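The "move the code to the data" pattern described above can be sketched as follows. This is a minimal illustrative example, not the project's actual interface: the names `run_at_archive` and `mean_and_count` are assumptions introduced for illustration.

```python
# Hypothetical sketch of the compute-to-data pattern: a user-supplied
# reduction algorithm runs where the data lives, and only the small
# reduced result travels back over the network.
from typing import Callable, Iterable

def run_at_archive(records: Iterable[dict],
                   user_reduce: Callable[[Iterable[dict]], dict]) -> dict:
    """Execute a user-supplied algorithm next to the archive's data.

    Only the reduced result is returned, avoiding transfer of the
    full dataset to the remote user.
    """
    return user_reduce(records)

# Example: a large in-archive dataset reduced to two summary numbers.
archive = ({"value": v} for v in range(1_000_000))

def mean_and_count(records):
    total = count = 0
    for r in records:
        total += r["value"]
        count += 1
    return {"count": count, "mean": total / count}

result = run_at_archive(archive, mean_and_count)
print(result)  # only this small dictionary crosses the network
```

In a real deployment the uploaded algorithm would of course be sandboxed and scheduled by the data centre, but the essential saving is the same: the full archive never leaves its host.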