The amount of data available to clinical research teams and drug developers is massive. The real challenge lies in putting that mountain of data to work for maximum benefit.
Outsourcing-Pharma recently connected with two representatives from CitiusTech, touching upon how to get the right infrastructure in place, efficiently process the data, and elevate outcomes for patients:
- Shreejit Nair, senior vice president and head of life sciences
- Ravinder Singh, vice president, consulting
OSP: Could you please share your perspective on the evolution of data in clinical research and drug development?
CT: The adage “data is the new oil” holds especially true for the pharma industry. The last few years have been a perfect data storm for biopharma. The variety of patient data available from NGS, imaging, and diagnostics has opened up new possibilities for personalized medicine through a better understanding of disease, leading to novel breakthrough therapies.
Data has also accelerated the pace of innovation, with AI and ML moving into the mainstream of clinical research and drug development and augmenting human expertise with speed and scale. This was evident in the search for COVID-19 treatments, where pharma companies screened thousands of compounds for therapeutic potential in a very short time.
Lastly, population and reimbursement datasets make pharma a serious player in the transition to value-based care, helping make drugs accessible and affordable to patients.
OSP: With the amount of data available to pharma firms and their research partners, what do you see as the key considerations and challenges?
CT: The first challenge is making data reliable. Managing large volumes of data and making it fit for end use is critical; it is what lets you trust the results, whether the data feeds research, market access, manufacturing, planning, or evidence generation.
Given how critical data is in life sciences, there is little room for error. As we continue to innovate with AI/ML, a robust data foundation that delivers reliable data and replicable results becomes the core requirement for scaling and succeeding with AI in life sciences.
The second challenge is dealing with the complexity of the privacy and confidentiality regulations that govern inbound and outbound data streams. Ensuring that data is used only for its intended purpose requires a robust implementation of data privacy procedures, consent management processes, and infrastructure capable of managing large-scale, high-speed data streams.
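The interview does not prescribe an implementation, but as a minimal sketch of purpose-based consent checking (all names and fields here are hypothetical), a deny-by-default gate like this can sit in front of any data access:

```python
from dataclasses import dataclass, field

@dataclass
class ConsentRecord:
    patient_id: str
    # Purposes the patient has explicitly consented to, e.g. {"research"}
    allowed_purposes: set = field(default_factory=set)

def is_use_permitted(record: ConsentRecord, purpose: str) -> bool:
    """Deny by default: permit only purposes covered by recorded consent."""
    return purpose in record.allowed_purposes

consent = ConsentRecord("P-001", {"research", "evidence_generation"})
print(is_use_permitted(consent, "research"))       # True
print(is_use_permitted(consent, "market_access"))  # False: not consented
```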
The last challenge is making data meaningful. While business users focus on deriving analytics from the data, resolving the underlying complexities of data integration and harmonization is what ultimately puts usable value in those users' hands, as the sketch below illustrates.
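As one hypothetical example of that harmonization work (column names and the unit conversion are invented for illustration), consider reconciling lab results arriving from two sites in different schemas and units:

```python
import pandas as pd

# Two sites report the same lab value with different schemas and units.
site_a = pd.DataFrame({"patient": ["P-001"], "glucose_mg_dl": [99.0]})
site_b = pd.DataFrame({"subject_id": ["P-002"], "glucose_mmol_l": [5.5]})

# Map both feeds onto one canonical schema (mg/dL; 1 mmol/L ≈ 18 mg/dL).
harmonized = pd.concat(
    [
        site_a.rename(columns={"patient": "patient_id"}),
        site_b.rename(columns={"subject_id": "patient_id"})
              .assign(glucose_mg_dl=lambda d: d["glucose_mmol_l"] * 18.0)
              .drop(columns=["glucose_mmol_l"]),
    ],
    ignore_index=True,
)
print(harmonized)  # one table, one patient_id column, one unit
```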
OSP: Could you please share best practices for securely storing massive amounts of data and putting it to good use?
CT: Adoption of newer technologies, specifically in the life sciences domain, is picking up pace rapidly. For example, the latest research instruments let scientists visualize highly detailed 3D models in a fraction of the time previously required. These technologies generate enormous datasets that need to be stored and managed effectively.
Some simple rules to ensure secure data storage include:
- Building a strong data storage security policy
- Protecting management interfaces
- Implementing a data loss prevention (DLP) solution
- Monitoring user data access controls
- Keeping a strict control on data in the cloud
- Managing access through a policy-as-code approach (see the sketch after this list)
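The interview does not spell out what policy as code looks like in practice. As a hedged sketch (the roles and resources are hypothetical), the idea is to keep the access policy in version-controlled code so changes are reviewed and audited, and to evaluate it at request time:

```python
# Access policy expressed as data that lives in version control,
# so changes are reviewed and audited like any other code change.
POLICY = {
    "clinical_scientist": {"read": {"trial_results", "lab_data"}},
    "data_engineer":      {"read": {"lab_data"}, "write": {"lab_data"}},
}

def is_allowed(role: str, action: str, resource: str) -> bool:
    """Deny by default; allow only what the policy explicitly grants."""
    return resource in POLICY.get(role, {}).get(action, set())

assert is_allowed("clinical_scientist", "read", "trial_results")
assert not is_allowed("data_engineer", "read", "trial_results")
```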
Some of the best practices to put data to good use include:
- Data catalogs
- Data discovery solutions
- API-based access (see the sketch after this list)
- Data governance solutions
- Data virtualization (in case of varied data storage)
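As a minimal illustration of API-based access to a data catalog (the endpoint, parameters, and response shape are all assumptions; real platforms differ), a consumer would discover datasets over HTTP rather than reaching into storage directly:

```python
import requests

# Hypothetical catalog endpoint and token: actual URLs, auth schemes,
# and response fields depend entirely on the platform in use.
CATALOG_URL = "https://data-catalog.example.com/api/v1/datasets"

resp = requests.get(
    CATALOG_URL,
    params={"domain": "clinical_trials", "format": "parquet"},
    headers={"Authorization": "Bearer <token>"},
    timeout=10,
)
resp.raise_for_status()
for dataset in resp.json():
    print(dataset["name"], dataset["owner"])
```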
Making data uniquely identifiable and storing it in structured formats for analysis ensures ease of use and supports the research that addresses many of today's life-threatening health conditions.
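One common way to make records uniquely identifiable without exposing raw identifiers is a keyed hash; this sketch deliberately simplifies key management (a real system would pull the secret from a secrets manager):

```python
import hashlib
import hmac

# Hard-coded only to keep the sketch self-contained; in practice the
# key comes from a secrets manager and is rotated under policy.
SECRET_KEY = b"replace-with-managed-secret"

def pseudonymous_id(mrn: str) -> str:
    """Derive a stable, non-reversible research ID from a record number."""
    return hmac.new(SECRET_KEY, mrn.encode(), hashlib.sha256).hexdigest()[:16]

print(pseudonymous_id("MRN-12345"))  # same input always yields the same ID
```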
OSP: Analytics can be harnessed to improve life-sciences operations and to combat fraud, waste, and abuse. Could you please share insights on how analytics will drive those improvements?
CT: Data cycles in the healthcare ecosystem across payers, providers, pharmacies, and regulatory bodies are evolving. There is a collaborative effort to develop closed-loop models in which all stakeholders get access to cross-domain data.
This provides visibility into insights that were difficult to obtain when data sat in siloed systems. For example, patterns in prescriptions versus indications, or even the severity of indications, can reveal whether drugs and therapies are being used appropriately. Developing and adopting key biomarkers further helps deliver the right treatment to the right person with maximum precision.
The transition to a value-based care model will also drive the ecosystem to accelerate its efforts to translate deep disease insights into clinical practice. Collaboration across pharma and payer organizations will help streamline clinical practice by precisely identifying outliers and making targeted interventions, as the sketch below illustrates.
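The speakers do not name a method for finding outliers; as one simple, assumption-laden sketch (the prescriber counts are invented), prescribing patterns can be screened with a z-score over claims-style data:

```python
import statistics

# Monthly prescription counts per prescriber for one drug (invented figures).
counts = {"dr_a": 42, "dr_b": 38, "dr_c": 45, "dr_d": 40, "dr_e": 120}

mean = statistics.mean(counts.values())
std = statistics.stdev(counts.values())

# Flag prescribers more than 1.5 standard deviations above the mean
# as candidates for targeted review, not as proven fraud.
outliers = {who: n for who, n in counts.items() if (n - mean) / std > 1.5}
print(outliers)  # {'dr_e': 120}
```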
The third area is the emergence of decision support systems that will accelerate the standardization of care through early and accurate diagnosis, reducing the waste caused by trial-and-error approaches in several therapy areas.
OSP: What role do artificial intelligence, machine learning, and other advanced technologies play to help create new drugs and advanced personalized medicines?
CT: AI is revolutionizing the healthcare industry, especially pharma and life sciences. Analyzing huge amounts of data in a short period of time is key, and with the increasing use of AI, organizations can derive insights from every piece of data available. Using AI, organizations have also built the ability to rapidly re-evaluate their analyses as they receive new datasets.
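The interview does not describe tooling; as one hedged sketch of re-evaluating a model as new data arrives, scikit-learn's incremental `partial_fit` API can update a classifier on a fresh batch instead of retraining from scratch (the data here is synthetic):

```python
import numpy as np
from sklearn.linear_model import SGDClassifier

rng = np.random.default_rng(0)
model = SGDClassifier()

# Initial batch: synthetic features standing in for assay readouts.
X0 = rng.normal(size=(200, 5))
y0 = (X0[:, 0] > 0).astype(int)
model.partial_fit(X0, y0, classes=[0, 1])

# A new dataset arrives: update the existing model incrementally.
X1 = rng.normal(size=(50, 5))
y1 = (X1[:, 0] > 0).astype(int)
model.partial_fit(X1, y1)

print("accuracy on new batch:", model.score(X1, y1))
```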
Since patients can now monitor their own health with smartwatches and other wearables, they are becoming better informed, and so is the pharma industry.
Organizations analyzing this data now have the freedom to take an increasingly personalized approach to designing therapies and treatments, and to accurately predict and manage the health conditions that may arise in certain patient groups. Personalized medicine has the potential to improve, and even save, the lives of many people, and the adoption of AI/ML is the driving force behind these future breakthroughs.
OSP: Where do you see the greatest opportunities for improvement and advancement in the gathering, processing, and utilization of patient and clinical data?
CT: In today’s value-based care environment, it is pivotal that data analysts are equipped with the right tools to optimize data processing and create actionable recommendations that drive improved outcomes. Using a data operating system with an enterprise data warehouse (EDW) is a critical step toward building a robust analytics infrastructure.
A streamlined model for improving care and delivering better outcomes will reduce lead time and enable analysts to draw strategic insights from the disparate data collected across systems, as the sketch below illustrates.
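The interview does not tie the EDW recommendation to any product; as a toy stand-in (SQLite plays the warehouse here, and the tables and figures are invented), the point is that once disparate feeds land in a single queryable store, cross-domain questions become single queries:

```python
import sqlite3

# SQLite stands in for the warehouse; production EDWs differ in scale, not concept.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE lab_results (patient_id TEXT, test TEXT, value REAL)")
conn.execute("CREATE TABLE claims (patient_id TEXT, drug TEXT, cost REAL)")

conn.executemany("INSERT INTO lab_results VALUES (?, ?, ?)",
                 [("P-001", "HbA1c", 8.1), ("P-002", "HbA1c", 6.4)])
conn.executemany("INSERT INTO claims VALUES (?, ?, ?)",
                 [("P-001", "metformin", 12.0), ("P-002", "metformin", 12.0)])

# One join answers a question that previously spanned two siloed systems.
rows = conn.execute("""
    SELECT l.patient_id, l.value, c.drug
    FROM lab_results l JOIN claims c ON l.patient_id = c.patient_id
    WHERE l.test = 'HbA1c' AND l.value > 7.0
""").fetchall()
print(rows)  # [('P-001', 8.1, 'metformin')]
```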