It’s been a busy year for the OLRC team. As we wind down for the break, we wanted to share a quick update on what’s been happening, as well as what’s in store for 2015. 

This fall, remote node locations were selected, and Scholars Portal systems staff travelled to assist with the installation of the storage racks. In early 2015, two more installations will take place, giving us a total of five nodes distributed widely across Ontario.

[Photo: an IBM server stack]
[Photo: a man standing in a server room surrounded by server stacks full of wires]

Also this fall, we congratulated Steve Marks as he went on parental leave, and welcomed Dale Askey from McMaster University as the new chair of the OLRC steering committee.

Earlier this month, Dale gave a short talk on the progress of the OLRC at Scholars Portal Day. You can view his slides here; they outline the initial drive to build the OLRC, as well as what's ahead.



Naming a project is tricky. You want to convey the breadth and scope of the work you’re doing, as well as the benefits to users of the final service. But you also want the name to be catchy, memorable, and easy to say. 

The Ontario Digital Library Research Cloud, aka the ODLRC, was our first try at a name. It acknowledged the breadth of what we were trying to do, but you wouldn’t call it memorable or easy to say!

After much discussion, the Steering Committee has opted to keep most of this name (with the ‘Digital’ deemed unnecessary). We will use ‘the OLRC’ to refer to the large-scale distributed storage and preservation project currently underway. Services built on top of the OLRC, such as text mining or data visualization tools, will be given their own names, but will be “powered by the OLRC.”

Some legacy documentation may be altered to reflect this change and avoid confusion.

And so: the Ontario Library Research Cloud is born!


When the ODLRC is fully operational, it will consist of four or more data storage centres at different OCUL locations. These data centres, or nodes, will be constantly talking to each other, making sure data is being replicated between them for preservation and ease of access. 

In order to make an informed decision about where the nodes should live, the ODLRC steering committee needs evidence of how the servers will perform in various possible circumstances and configurations. What’s the most efficient way to set up this network? How much bandwidth will it need? How will node 1 react if node 4 goes offline?

For the past month, Scholars Portal and the University of Toronto have been working with York University and Ryerson University on a pilot project to answer these questions. Using a shared 10 Gbps connection through ORION, with routers extended through all three schools, we’ve started testing network capacity for content uploading, replication, and downloading.

During testing, the proxy node reached maximum network capacity during the upload of 30 TB of test content, while the data nodes were working at 50–60% capacity. We’ll next test increasing proxy node capacity (we started with 1 Gbps), and measure the effect of adding additional nodes.

[Graph: network traffic at Ryerson]
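For a rough sense of the scale involved, here is a back-of-the-envelope transfer-time calculation. It assumes an idealized, fully saturated link with no protocol or disk overhead, so real-world numbers (like those measured in the pilot) will always be slower; the figures are illustrative, not measurements from the project.

```python
# Back-of-the-envelope estimate: how long to move a given volume of data
# over links of different speeds, assuming ideal sustained utilization.

def transfer_hours(data_tb: float, link_gbps: float) -> float:
    """Hours to move `data_tb` terabytes over a `link_gbps` gigabit/s link."""
    bits = data_tb * 1e12 * 8            # terabytes -> bits
    seconds = bits / (link_gbps * 1e9)   # divide by link speed in bits/s
    return seconds / 3600

if __name__ == "__main__":
    for gbps in (1, 10):
        hours = transfer_hours(30, gbps)
        print(f"30 TB over a {gbps} Gbps link: ~{hours:.1f} hours")
```

Even under these ideal assumptions, the gap between a 1 Gbps and a 10 Gbps proxy connection is the difference between days and hours for a 30 TB upload, which is why proxy node capacity is the next variable to test.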

Next up, we’re going to look at the way these nodes are talking to each other, how viewing and downloading of content is affected during uploads of large datasets, and do some disaster scenario testing.

Within the next few weeks we will have a much better idea of the network’s needs, and can begin making plans to add nodes outside of Toronto.




Last Friday, library staff from nine OCUL institutions gathered at Ryerson University for the first ODLRC Hackfest.

[Photo by MJ Suhonos]

One goal of the Hackfest was to build familiarity with Swift, a module of the OpenStack cloud management software used to connect individual storage nodes into a distributed network. More broadly, the day was meant to strengthen relationships across OCUL, encouraging future collaborative efforts around the ODLRC. Knowledge gained from the hackfest, and from future days like it, will help each institution leverage their investment in this multi-year project, which will provide storage, preservation, and access to over a petabyte of local content.
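To give a flavour of what Swift does under the hood: it decides where each object lives by hashing the object's path onto a fixed set of partitions, each of which is assigned to several storage devices so that every object ends up with multiple replicas. The sketch below is a deliberately simplified toy version of that idea (the node names, partition count, and round-robin assignment are invented for illustration); Swift's real ring-builder is considerably more sophisticated.

```python
# Toy illustration of Swift-style object placement: hash an object path onto
# one of 2**PART_POWER partitions, then map that partition to REPLICAS nodes.
# This is a simplified sketch, NOT OpenStack Swift's actual ring-builder.
import hashlib

PART_POWER = 8                                    # 2**8 = 256 partitions
NODES = ["node1", "node2", "node3", "node4", "node5"]  # hypothetical nodes
REPLICAS = 3

def partition(obj_path: str) -> int:
    """Map an object path to a partition using the top bits of an MD5 hash."""
    digest = hashlib.md5(obj_path.encode()).hexdigest()
    return int(digest, 16) >> (128 - PART_POWER)

def replica_nodes(obj_path: str) -> list:
    """Choose REPLICAS distinct nodes for an object (toy round-robin scheme)."""
    part = partition(obj_path)
    return [NODES[(part + i) % len(NODES)] for i in range(REPLICAS)]

if __name__ == "__main__":
    for path in ("/account/container/thesis.pdf", "/account/container/data.csv"):
        print(path, "->", replica_nodes(path))
```

Because placement is purely a function of the hash, every node can independently compute where any object belongs, which is what lets a distributed network of storage nodes stay in agreement without a central index.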

[Photo by Mari Vihuri]


The morning was spent learning about Swift; in the afternoon, small groups got to work on a range of projects, many of which are now posted to the ODLRC GitHub.

For a more visual look at our day, see the ODLRC Hackfest Storify.

A huge thank-you to Nick Ruest (York University), who organized the day; MJ Suhonos (Ryerson University), who handled on-site logistics; and Mari Vihuri (Scholars Portal Graduate Student Library Assistant), who documented the day!


Through an RFP process administered by the University of Toronto, the ODLRC technical team has selected Dell as the hardware vendor for the cloud storage systems.

Purchased equipment includes Dell PowerEdge R720xd servers, each populated with two 2.8 GHz Xeon processors, 256 GB of RAM, and two 200 GB SSD drives, which will be used to run the operating system and the OpenStack software. Each head unit also contains twelve 4 TB SAS drives, for an internal storage capacity of 48 TB.

[Photos: server racks]


The Ontario Digital Library Research Cloud (ODLRC) has been awarded a Productivity and Innovation Fund grant from the Ontario Ministry of Training, Colleges, and Universities (MTCU).

Ten OCUL institutions jointly requested $1.2 million to be spent on hardware, software, and staffing for the development of the ODLRC.  The collaborative nature of the project, as well as the significant cost savings brought about through large-scale consortial purchasing, made the proposal a success, and full funding was awarded.

 Using this funding, as well as funding committed by each partner institution, Scholars Portal and the partner libraries will build a high capacity, geographically distributed storage and computing network for Ontario researchers. The ODLRC will house large volumes of digital content, allowing for cost effective and sustainable long-term preservation. It will also support data and text mining through the development, hosting, and support of emerging digital research tools.

For more information contact