Large data management is the organization, administration, and governance of large volumes of structured and unstructured data. It ensures a high level of data quality and accessibility for intelligence and analytics applications.
Biological datasets have become increasingly large and complex. Knowledge databases and publicly available datasets are available for use in experimental planning and comparison of results...
Technological advances in high-throughput, low-cost DNA sequencing, coupled with the availability of a high-quality reference assembly, allow us to interrogate the genome with greater precision...
Survival rates for early-stage non-small cell lung cancer (NSCLC) remain unacceptably low compared with those of other common solid tumors. This mortality reflects a weakness in conventional staging, as...
Illumina next-generation sequencing (NGS) and microarray technologies are revolutionizing cancer research, enabling variant discovery, variant detection, and molecular monitoring in cancer. Join u...
In this presentation, I describe pathway-based analyses of genotyping data to identify pathways related to the development of complex diseases, with a focus on lung cancer and selected autoimm...
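As a loose illustration of the kind of over-representation test that pathway-based analyses often build on, the sketch below computes a hypergeometric enrichment p-value for a pathway gene set against a list of disease-associated genes. The gene identifiers, set sizes, and background size are hypothetical and chosen purely for illustration; they are not drawn from the presentation itself.

```python
# Minimal sketch of pathway over-representation analysis (illustrative only;
# gene identifiers and set sizes below are hypothetical).
from scipy.stats import hypergeom

def pathway_enrichment_pvalue(pathway_genes, hit_genes, background_size):
    """One-sided hypergeometric test: are the associated (hit) genes
    over-represented in the pathway relative to the genomic background?"""
    pathway = set(pathway_genes)
    hits = set(hit_genes)
    overlap = len(pathway & hits)
    # P(X >= overlap) when drawing len(hits) genes from a background of
    # background_size genes, of which len(pathway) belong to the pathway.
    return hypergeom.sf(overlap - 1, background_size, len(pathway), len(hits))

if __name__ == "__main__":
    pathway = ["EGFR", "KRAS", "BRAF", "MAP2K1", "MAPK1"]   # hypothetical pathway members
    hits = ["EGFR", "KRAS", "TP53", "STK11", "MAPK1"]       # hypothetical associated genes
    p = pathway_enrichment_pvalue(pathway, hits, background_size=20000)
    print(f"enrichment p-value: {p:.3g}")
```

In practice, such tests are run across many curated pathways with correction for multiple testing; this sketch shows only the core calculation for a single gene set.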
With advances in next-generation sequencing, whole-exome and whole-genome sequencing (WGES) are now accessible as tools in many applications. In the clinical setting, WGES is proving to be very val...
As next-generation sequencing (NGS) platforms advance in speed, ease of use, and cost-effectiveness, many translational researchers are transitioning from microarrays to RNA sequencing...
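For researchers making that transition, the sketch below illustrates, in deliberately simplified form, a common first step with RNA-seq count data: counts-per-million normalization followed by a log2 fold-change between two conditions. The count matrix and sample groupings are invented for illustration; production analyses typically rely on dedicated tools such as DESeq2 or edgeR, which model count dispersion properly.

```python
# Simplified illustration of RNA-seq count normalization and fold-change.
# The counts below are hypothetical; this is not a substitute for a full
# differential-expression workflow (e.g. DESeq2 or edgeR).
import numpy as np

# Rows = genes, columns = samples (2 tumor, 2 normal); hypothetical counts.
counts = np.array([
    [1500, 1800, 300, 250],   # gene A
    [  90,  110, 100,  95],   # gene B
    [  10,    5, 400, 450],   # gene C
])
genes = ["geneA", "geneB", "geneC"]
tumor_cols, normal_cols = [0, 1], [2, 3]

# Counts-per-million normalization to adjust for library size differences.
cpm = counts / counts.sum(axis=0, keepdims=True) * 1e6

# Mean CPM per condition, with a pseudocount to avoid taking log of zero.
tumor_mean = cpm[:, tumor_cols].mean(axis=1) + 1
normal_mean = cpm[:, normal_cols].mean(axis=1) + 1
log2_fc = np.log2(tumor_mean / normal_mean)

for gene, fc in zip(genes, log2_fc):
    print(f"{gene}: log2 fold-change (tumor vs normal) = {fc:+.2f}")
```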