
The drug discovery challenge and how imaging can help

Nate Holmes, product manager for machine vision and motion control at National Instruments, finds that sophisticated imaging systems are required for the high-content screening used in drug discovery. Here, he reports from the Vision Summit at this year’s NIWeek, which took place in Austin, Texas, at the beginning of August

‘Over half the drugs on the market today are discovered by accident,’ explained Dr Urban Liebel, co-founder of Acquifer, based in Karlsruhe, Germany. ‘Also noteworthy, major drug companies invest over $2 billion on average to develop a new drug.’

While there is some debate over the actual figures, nearly everyone agrees that bringing a new drug to market is expensive, with costs running into billions of dollars. Dr Liebel delivered a compelling keynote address at the Vision Summit during NIWeek 2014 in Austin, Texas, where he spoke about introducing the next generation of standardised platforms to decrease cost and improve results in the drug discovery process.

Drug discovery involves determining the effects of various compounds, and combinations of compounds, on a sample organism. The process is called high-content screening (HCS) and requires modifying the biology of a suitable sample with a green fluorescent protein, which helps identify the reactions of the sample organism to the new compound under test. Put simply, Dr Liebel says, it is a matter of asking a question and converting that question into photons. The result of an experiment can then be measured using vision and image processing to determine what is happening with the fluorescent compounds in the organisms.
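
As a simple illustration of what converting photons to measurements can mean in practice, the short Python sketch below segments the bright, fluorescently labelled regions of an image and reports their count and intensity. It is only an illustrative example built on the open source scikit-image library, not the actual analysis pipeline used by Dr Liebel’s team.

```python
# Illustrative sketch only: segment bright (fluorescently labelled) regions
# of an image and report simple measurements. Not the actual HCS pipeline.
import numpy as np
from skimage import filters, measure

def measure_fluorescence(image: np.ndarray) -> dict:
    threshold = filters.threshold_otsu(image)   # separate signal from background
    mask = image > threshold
    labels = measure.label(mask)                # label connected bright regions
    return {
        "regions": int(labels.max()),
        "total_intensity": float(image[mask].sum()),
        "mean_intensity": float(image[mask].mean()) if mask.any() else 0.0,
    }

# Example on synthetic data: a dark field with one bright labelled region.
img = np.zeros((128, 128), dtype=np.float32)
img[40:60, 40:60] = 1.0
print(measure_fluorescence(img))
```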

One of the challenges with HCS is the sheer number of experiments that must be performed to screen at a genome-wide or compound-library-wide level to determine effects. Each question requires hundreds of thousands, if not millions, of experiments. Zebrafish larvae and yeast are examples of sample organisms Dr Liebel uses in his experiments. When dealing with such large sample sets, consistency is critical, and the orientation of samples can cause issues with the process. He and his team are solving these challenges in new ways with simple changes, such as using cost-effective and readily available 3D-printed parts to orient samples, which also drives consistency and simplicity in the automated inspection process.

Once an experiment has been devised to convert a question into photons, image analysis is performed using high-performance processing elements and open source machine vision libraries to convert the photons into bytes. It is worth noting that HCS generates a massive amount of data, on the order of 100 Gbytes to 100 Tbytes per experiment. Consider a single screen: 384 wells x 50 z-slices x 3 colours x 5 time points equals 288,000 images, which at roughly 8 Mbytes per image amounts to more than 2 Tbytes of data. Of course, millions of images are required to answer the right questions and discover a promising drug candidate.
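
To put those figures in perspective, the arithmetic can be checked with a few lines of Python; the per-image size of roughly 8 Mbytes and the well, slice, colour and time-point counts are taken directly from the example above.

```python
# Back-of-the-envelope estimate of the data volume for one HCS run,
# using the figures quoted above (384 wells, 50 z-slices, 3 colours,
# 5 time points, ~8 Mbytes per image).
wells, z_slices, colours, time_points = 384, 50, 3, 5
mbytes_per_image = 8

images = wells * z_slices * colours * time_points   # 288,000 images
total_gbytes = images * mbytes_per_image / 1000     # ~2,304 Gbytes

print(f"{images:,} images, roughly {total_gbytes / 1000:.1f} Tbytes per run")
```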

To make this process feasible, HCS must be automated. Dr Liebel calls this automation challenge a true team sport, because different disciplines need to come together to build an HCS machine. Such a machine has many necessary components, including motion control of five axes (x, y, z, light and objectives), positioning resolution on the order of 1 nm, complex machine vision analysis, fluorescence microscopy, and the processing, management and storage of very large data sets. Ultimately, driving efficiencies in the process will reduce sampling time and energy consumption.
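
To make the shape of that automation concrete, the sketch below shows how such an acquisition loop might be structured in Python. The stage, filter wheel, camera and storage objects are hypothetical placeholders rather than any real NI or Acquifer API; the point is simply how motion, illumination and imaging nest around the wells, z-slices, colours and time points described above.

```python
# Hypothetical sketch of an automated HCS acquisition loop.
# stage, filter_wheel, camera and storage are placeholder objects, not a real API.
from dataclasses import dataclass

@dataclass
class AcquisitionPlan:
    wells: list[tuple[float, float]]   # (x, y) well positions
    z_slices: list[float]              # focus positions
    channels: list[str]                # fluorescence colours
    time_points: int

def run_screen(stage, filter_wheel, camera, storage, plan: AcquisitionPlan):
    """Iterate over time points, wells, z-slices and channels,
    capturing one image per combination and writing it to local storage."""
    for t in range(plan.time_points):
        for x, y in plan.wells:
            stage.move_xy(x, y)                  # coarse positioning
            for z in plan.z_slices:
                stage.move_z(z)                  # fine focus (nm-scale resolution)
                for channel in plan.channels:
                    filter_wheel.select(channel) # switch excitation/emission filters
                    image = camera.capture()
                    storage.save(image, well=(x, y), z=z,
                                 channel=channel, time_point=t)
```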

Dr Liebel explains that there are relatively few true microscope experts (usually physicists) available for this type of work. Traditional systems are expensive, ranging from $400,000 to $600,000, and daily operation usually requires experts. Because these experts tweak and tune each machine, it is very difficult to achieve consistent, reproducible results across different machines and varying experimental conditions. For example, nobody has developed a light source that stays constant across thousands of hours of operation, because there has never been a need: experts operating the machine could always tweak settings to compensate. Traditional instruments are simply not designed to deliver reproducible quality across different labs. One of the main goals of developing a next generation HCS machine is to make systems easy to operate and to design them so that machines in different labs can deliver reproducible quality standards.

To complicate matters, cloud or off-site storage is not an option for all of the generated data: it is too slow, and the datasets cannot be accessed quickly enough. To address this challenge directly, Dr Liebel and his team developed a ‘micro IT’ setup local to each machine. This includes RAID arrays for fast storage and processing, network interfaces to the instruments with transfer speeds of 10 Gbit/s or more, and a high-performance processing module to handle image processing.
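
A rough calculation shows why a fast, local link matters: even over a dedicated 10 Gbit/s interface, moving a single multi-terabyte screen takes on the order of half an hour, and a shared off-site connection would be far slower. The figures below are illustrative, reusing the roughly 2.3 Tbyte example from earlier; the 80 per cent link efficiency is an assumption.

```python
# Rough transfer-time estimate for a single HCS dataset,
# using the ~2.3 Tbyte example from earlier in the article.
dataset_tbytes = 2.3
link_gbit_per_s = 10     # dedicated local link, as in the 'micro IT' setup
efficiency = 0.8         # assumed protocol/overhead efficiency (illustrative)

dataset_bits = dataset_tbytes * 1e12 * 8
seconds = dataset_bits / (link_gbit_per_s * 1e9 * efficiency)
print(f"~{seconds / 60:.0f} minutes at {link_gbit_per_s} Gbit/s")  # ~38 minutes
```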

The development of next generation HCS machines is possible with the collaboration of multi-disciplinary teams using customisable off-the-shelf technology. Ultimately, this powerful combination will drive disruptive innovation. As equipment becomes more cost-effective, easier to use and more reliable, HCS capabilities will become available to a larger audience, accelerating the search for cures for many diseases.

Further information

National Instruments
