I have spent my entire career either working in the labs or supporting them, first in Quality and later in IT. Having started as a QC Analyst in the labs, I can honestly say that IT was our least favorite department. They were always bringing in new software for us to use and introducing new controls that we felt limited our ability to do our work. After only a few years in Quality, I was asked to participate in a software project as an SME. I always enjoy a challenge and thought this would be a great way for me to “educate IT” on how things worked in the lab. Little did I know that this project would change my entire career, and my perspective!
After several years, I transitioned to an IT role supporting the QC labs. I was tasked with bringing in new systems and introducing controls. I was part of the Y2K team, responsible for ensuring that our computerized lab systems were compliant with the latest regulations, and I was part of the transition to 21 CFR Part 11, implementing electronic records and electronic signatures (ERES) in the lab. I had become the person everyone thought was a hindrance.
As I moved through my career, most of my time was spent deploying and supporting enterprise systems such as LIMS and ELNs. Several years ago, I expanded my role to supporting instruments and analytical software for both QC and Research labs, and quickly learned that supporting Research was a different type of challenge. While Quality labs expect to have controls in place, Research expects full access to their systems and software from anywhere, at any time. This expectation became even more prevalent post-pandemic, as scientists had grown used to accessing their experiments remotely.
So, how does one successfully implement controls in a Research lab? And how do you balance the desire to move at the speed of science with the need to protect your data?
The first step was to assess how the scientists were accessing their data. Were they using VPN to connect directly to the lab? Was the lab even on a segregated network? Was the data encrypted? We knew that the labs at most of our research sites were on a segregated network, which was a step in the right direction. Unfortunately, we found that our users were connecting to the business network over VPN and then using Remote Desktop to reach their instruments, moving data between networks along the way. Not only was this inefficient, it was also a security risk. We had to address it quickly, while ensuring there wasn’t a negative impact on the scientists. Fortunately, we identified tools that gave our scientists direct access to their instruments: we created a VPN connection specific to the labs and disabled Remote Desktop on instrument computers. With some training and user guides, we quickly gained adoption from the scientists. After about a year, the solution was so well received that our QC labs wanted a similar setup so they could monitor instrument runs remotely.
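One way to verify that a Remote Desktop lockdown like this actually took effect is to sweep the instrument PCs and confirm the RDP port (TCP 3389) no longer accepts connections. The sketch below is illustrative only, not part of the original rollout; the host names and the `rdp_reachable` helper are hypothetical.

```python
import socket

def rdp_reachable(host: str, port: int = 3389, timeout: float = 2.0) -> bool:
    """Return True if the host accepts TCP connections on the given port.

    Illustrative audit helper: after disabling Remote Desktop on
    instrument computers, a sweep like this can confirm the port is
    actually closed from the network you are testing on.
    """
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        # Connection refused, timed out, or host name did not resolve.
        return False

# Hypothetical instrument hosts -- placeholder names, not real systems.
instruments = ["hplc-01.lab.invalid", "gcms-02.lab.invalid"]
still_open = [h for h in instruments if rdp_reachable(h)]
```

A sweep like this is worth re-running periodically, since a reimaged or replacement instrument PC can quietly come back with Remote Desktop enabled again.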
Several years later, and at a new company, I faced a similar challenge. We didn’t have any QC labs, but we had several Research labs. We were also a rather young company, with a small, very new IT department. Our scientists had been managing their own instruments and access since the company’s founding. Now they had an IT department telling them they could no longer buy and install any software they wanted, and that they had to get approval to remotely access their systems. We also segregated the lab network, which created significant hardship for the labs. This was not a great start to the relationship with a new IT department, and we almost immediately had to roll back the network segregation.
Over the next year, the IT team worked with the scientists to understand their true needs for data access. We took the time to educate the scientists on the importance of data security and why the new controls mattered. IT partnered with the scientists, and even recruited one of them to join the IT team. We implemented an ELN and encouraged our scientists to engage with IT to put controls and systems in place that protect their data. We eased into new policies for remotely accessing lab data, solicited scientist feedback when selecting a Scientific Data Management Solution, and started to build a data governance framework.
There have been many lessons learned, but there’s still so much work to be done.
At a large, well-established company, it was much easier to deploy controls in the Research labs: we leveraged corporate policies to help ensure compliance, and there was little resistance within the labs. A smaller, younger company requires significantly more finesse. If you try to deploy too much, too fast, you will fail. You will damage the relationship between IT and the scientists, and it will be difficult to regain that trust. By taking things slow and engaging the scientists throughout the process, you make them feel like part of the solution, and they will contribute to the success of the team. Because, at the end of the day, we all want the same thing: secure data that is readily accessible to our scientists when they need it. When done right, IT CAN move at the speed of science!