The development of high-throughput, data-intensive biomedical research assays and technologies has created a need for researchers to develop strategies for analyzing, integrating, and interpreting the massive amounts of data they generate. Although a wide variety of statistical methods have been designed to accommodate 'big data,' experience with artificial intelligence (AI) techniques suggests that they may be particularly well suited to this task. In addition, the application of these assays reveals great heterogeneity in the pathophysiologic factors and processes that contribute to disease, suggesting a need to tailor, or 'personalize,' medicines to the nuanced and often unique features of individual patients. Given how important data-intensive assays are to revealing appropriate intervention targets and strategies for treating an individual with a disease, AI can play an important role in the development of personalized medicines. We describe many areas where AI can play such a role and argue that AI's ability to advance personalized medicine will depend critically not only on the refinement of relevant assays, but also on ways of storing, aggregating, accessing, and ultimately integrating the data they produce. We also point out the limitations of many AI techniques in developing personalized medicines and consider areas for further research.
Keywords: Artificial intelligence; Big data; Clinical trials; Personalized medicine.