
Trends in Telemedicine & E-health

Advanced Methods Towards Functional Genomics & Medical Informatics: The Artificial Intelligence (AI) and Integrated Computing Perspectives

Zarif Bin Akhtar1* and Ahmed Tajbiul Rawol2

1Department of Computing, Institute of Electrical and Electronics Engineers (IEEE), USA

2Department of Computer Science, American International University-Bangladesh (AIUB), Bangladesh

*Corresponding author: Zarif Bin Akhtar, Department of Computing, Institute of Electrical and Electronics Engineers (IEEE), USA

Submission: May 15, 2025; Published: June 19, 2025

DOI: 10.31031/TTEH.2025.05.000622

ISSN: 2689-2707
Volume 5 Issue 5

Abstract

The convergence of Artificial Intelligence (AI) and enhanced computing has revolutionized functional genomics and medical science, enabling groundbreaking advancements in disease diagnosis, treatment and personalized medicine. AI-driven approaches, particularly Machine Learning (ML) and Deep Learning (DL), facilitate the rapid analysis of genomic data, unveiling critical genetic variations linked to complex diseases. Additionally, AI-powered predictive models improve drug discovery, biomarker identification and targeted therapeutic interventions. This investigation explores innovative methodologies integrating AI and computational techniques to enhance genomic research, focusing on their impact on clinical decision-making and translational medicine. Moreover, it discusses the ethical considerations and challenges in deploying AI-driven solutions in genomic medicine, including data privacy, algorithmic bias and interpretability. By leveraging AI and advanced computing frameworks, functional genomics can achieve unprecedented precision in disease modeling and therapeutic development. This study aims to provide a comprehensive analysis of emerging AI-powered innovations in functional genomics, paving the way for future advancements in personalized healthcare and precision medicine.

Keywords: Artificial Intelligence (AI); Biomedical Engineering (BME); Bioinformatics; Biomedical instrumentation measurements and design; Biomedical image processing; Deep Learning (DL); Functional genomics; Machine Learning (ML); Medical informatics

Introduction

Functional genomics has emerged as a transformative field in medical science, offering deep insights into the complex interplay between genetic variations and disease phenotypes. With the exponential growth of genomic data, traditional methods of analysis have become increasingly inadequate in handling the scale, complexity and variability of such datasets. In response, Artificial Intelligence (AI) and enhanced computing have been integrated into functional genomics, revolutionizing the way genetic data is processed, interpreted and applied in clinical settings. AI-powered methodologies, particularly Machine Learning (ML) and Deep Learning (DL), have demonstrated remarkable potential in biomarker discovery, disease prediction, drug development and personalized medicine [1-3]. These techniques enable researchers to extract meaningful patterns from vast genomic datasets, accelerating the identification of disease-associated mutations and enhancing precision in diagnostics and therapeutic decision-making. Additionally, advanced computing frameworks, including cloud computing, High-Performance Computing (HPC) and quantum computing, have further optimized genomic data processing, facilitating real-time analysis and large-scale simulations that were previously unattainable [4-6].

The integration of AI in Genome-Wide Association Studies (GWAS), transcriptomics, proteomics and epigenomics has significantly improved our understanding of the genetic basis of diseases, allowing for the development of targeted treatment strategies. Furthermore, AI-driven predictive models have been instrumental in drug repurposing, pharmacogenomics and CRISPR-based gene editing, offering new avenues for disease management and precision therapeutics [7-9]. However, despite these advancements, challenges persist, including data privacy concerns, algorithmic bias and the interpretability of AI-generated insights. Ethical considerations and regulatory frameworks must be carefully addressed to ensure the responsible application of AI in genomic research and medical practice [10]. This investigation explores the innovative role of AI and enhanced computing in functional genomics and medical science, analyzing cutting-edge methodologies, real-world applications and future directions. By leveraging AI’s computational power, functional genomics can achieve unparalleled precision in genetic research, leading to groundbreaking discoveries and advancements in personalized medicine and next-generation healthcare solutions.

Methods and Experimental Analysis

The methodology for this study follows a systematic approach integrating functional genomics, Artificial Intelligence (AI) and enhanced computing techniques to analyze and interpret genomic data for medical applications. The framework consists of several key components, including data collection, preprocessing, AI model development, computational techniques and validation methods.

Data collection and sources

The study utilizes multi-omics datasets, including genomic, transcriptomic, proteomic and epigenomic data, obtained from publicly available databases; an illustrative loading sketch follows the list.
a. The Cancer Genome Atlas (TCGA): cancer-related genomic data.
b. Genome Aggregation Database (gnomAD): large-scale population genetics.
c. Gene Expression Omnibus (GEO): transcriptomic and proteomic datasets.
d. Human Gene Mutation Database (HGMD): pathogenic mutation analysis.
Additionally, clinical datasets related to genetic diseases, drug responses, and biomarker discovery are incorporated to assess AI-based predictions.
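
As a purely illustrative aid, the sketch below shows how tabular expression data downloaded from such repositories might be read with pandas; the parser logic reflects the standard GEO series-matrix layout, but the in-memory example data and all identifiers are invented placeholders rather than actual study inputs.

import io
import pandas as pd

def load_series_matrix(path_or_buffer):
    """Read the tabular part of a GEO series-matrix file (metadata lines start with '!')."""
    return pd.read_csv(path_or_buffer, sep="\t", comment="!", index_col=0)

# Tiny in-memory stand-in for a downloaded file, so the sketch runs as-is.
fake_file = io.StringIO(
    "!Series_title\t\"example\"\n"
    "ID_REF\tSample1\tSample2\n"
    "GeneA\t5.1\t6.3\n"
    "GeneB\t2.4\t2.2\n"
)
expression = load_series_matrix(fake_file)
print(expression)          # genes x samples expression table
print(expression.shape)    # (2, 2)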

Data preprocessing and feature engineering

The collected data undergoes rigorous preprocessing to ensure quality, accuracy and consistency; a minimal scikit-learn sketch follows the list.
A. Data cleaning: Removal of noise, duplicate sequences and missing values using statistical imputation techniques.
B. Normalization & scaling: Log transformation and min-max scaling are applied to genomic expression values for uniform distribution.
C. Feature selection & dimensionality reduction: Principal Component Analysis (PCA), autoencoders and Recursive Feature Elimination (RFE) to extract the most informative genetic markers.
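
A minimal sketch of these preprocessing steps with scikit-learn, run here on synthetic data; the matrix dimensions, labels and parameter choices are illustrative assumptions rather than the study's actual configuration.

import numpy as np
from sklearn.preprocessing import MinMaxScaler
from sklearn.decomposition import PCA
from sklearn.feature_selection import RFE
from sklearn.ensemble import RandomForestClassifier

# Synthetic stand-in for an expression matrix (samples x genes) and binary labels.
rng = np.random.default_rng(0)
X = rng.lognormal(mean=2.0, sigma=1.0, size=(100, 200))
y = rng.integers(0, 2, size=100)

# Log transformation stabilises the skewed distribution of expression values.
X_log = np.log2(X + 1)

# Min-max scaling maps every feature to the [0, 1] range.
X_scaled = MinMaxScaler().fit_transform(X_log)

# PCA compresses the features into a small number of components.
X_pca = PCA(n_components=20).fit_transform(X_scaled)

# Recursive Feature Elimination keeps the most informative markers.
selector = RFE(RandomForestClassifier(n_estimators=50, random_state=0),
               n_features_to_select=20, step=10)
X_selected = selector.fit_transform(X_scaled, y)

print(X_pca.shape, X_selected.shape)   # (100, 20) (100, 20)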

AI model development

A variety of AI models are employed to analyze functional genomic data; a brief classification sketch follows the list.
a) Supervised learning models: Random Forest, Support Vector Machines (SVM) and XGBoost for disease classification.
b) Deep learning architectures: Convolutional Neural Networks (CNNs) for image-based genomic sequencing data and Recurrent Neural Networks (RNNs) for sequential gene expression analysis.
c) Graph Neural Networks (GNNs): Used to model gene-gene interactions and predict functional pathways.
d) Self-Supervised Learning (SSL): Leveraged for training models on large unlabelled genomic datasets before fine-tuning with labelled data.
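
The following sketch illustrates only the supervised-learning portion of this setup, using scikit-learn on synthetic features; XGBoost would slot into the same loop if installed. All shapes and hyperparameters are placeholder assumptions, not the study's settings.

import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.ensemble import RandomForestClassifier
from sklearn.svm import SVC
from sklearn.metrics import accuracy_score

# Synthetic stand-in for a preprocessed variant/expression feature matrix.
rng = np.random.default_rng(1)
X = rng.normal(size=(200, 50))
y = rng.integers(0, 2, size=200)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)

models = {
    "random_forest": RandomForestClassifier(n_estimators=200, random_state=0),
    "svm_rbf": SVC(kernel="rbf"),
    # XGBoost (if installed) would be added here, e.g. xgboost.XGBClassifier().
}

for name, model in models.items():
    model.fit(X_train, y_train)
    preds = model.predict(X_test)
    print(name, round(accuracy_score(y_test, preds), 3))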

Enhanced computing techniques

To handle large-scale genomic data efficiently, advanced computing techniques are integrated; a simple parallelization sketch follows the list.
1) High-Performance Computing (HPC): Parallelized processing of sequencing data for accelerated analysis.
2) Cloud computing (Google Cloud, AWS, Azure): Enables scalable storage and real-time AI model deployment.
3) Quantum computing (experimental phase): Investigated for solving complex genomic interactions using quantum algorithms.
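
As a simple illustration of the parallelization idea behind HPC-style processing, the sketch below fans a toy per-chunk computation across local CPU cores; a real pipeline would distribute work through a cluster scheduler (e.g. SLURM) or MPI, and the random sequences here are placeholders for sequencing reads.

from concurrent.futures import ProcessPoolExecutor
import numpy as np

def gc_content(sequence: str) -> float:
    """Toy per-chunk computation standing in for heavier sequence analysis."""
    return (sequence.count("G") + sequence.count("C")) / max(len(sequence), 1)

def main() -> None:
    rng = np.random.default_rng(2)
    # Hypothetical chunks of sequencing reads (random strings as placeholders).
    chunks = ["".join(rng.choice(list("ACGT"), size=1000)) for _ in range(64)]

    # Fan the chunks out across local CPU cores.
    with ProcessPoolExecutor() as pool:
        results = list(pool.map(gc_content, chunks))

    print(round(sum(results) / len(results), 3))

if __name__ == "__main__":
    main()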

Model validation and performance metrics

The trained AI models are validated using cross-validation techniques and benchmarked against traditional bioinformatics methods. Performance is evaluated using the following metrics; a short evaluation sketch follows the list.
a. Accuracy, precision, recall and F1-score for classification tasks.
b. ROC-AUC and precision-recall curves for predictive modeling.
c. Mean Squared Error (MSE) and R² scores for regression-based genomics predictions.
d. Shapley Additive Explanations (SHAP) for interpretability and feature importance analysis.
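
A compact sketch of this evaluation step, computing the listed classification metrics with scikit-learn on synthetic data; SHAP usage is indicated only as a comment, assuming the optional shap package is available.

import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_predict
from sklearn.metrics import (accuracy_score, precision_score, recall_score,
                             f1_score, roc_auc_score)

rng = np.random.default_rng(3)
X = rng.normal(size=(300, 40))          # placeholder feature matrix
y = rng.integers(0, 2, size=300)        # placeholder labels

model = RandomForestClassifier(n_estimators=200, random_state=0)

# 5-fold cross-validated predictions, then the classification metrics listed above.
pred = cross_val_predict(model, X, y, cv=5)
proba = cross_val_predict(model, X, y, cv=5, method="predict_proba")[:, 1]

print("accuracy ", round(accuracy_score(y, pred), 3))
print("precision", round(precision_score(y, pred), 3))
print("recall   ", round(recall_score(y, pred), 3))
print("F1       ", round(f1_score(y, pred), 3))
print("ROC-AUC  ", round(roc_auc_score(y, proba), 3))

# SHAP-based feature importance would follow the same pattern, e.g.:
#   import shap
#   explainer = shap.TreeExplainer(model.fit(X, y))
#   shap_values = explainer.shap_values(X)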

Ethical considerations and data privacy

A. Compliance with ethical guidelines (HIPAA, GDPR) for genomic data protection.
B. Bias mitigation strategies to ensure AI models do not disproportionately affect specific populations.
C. Transparency and explainability (XAI): Implementing explainable AI techniques to enhance trust in AI-driven genomic interpretations.

Case studies and real-world applications

The investigation also includes several case studies demonstrating AI’s impact.
a) Early cancer detection using deep learning on mutation signatures.
b) Predictive modeling for drug response based on pharmacogenomic data.
c) AI-assisted CRISPR gene-editing optimization for precision medicine.

This methodology provides a structured approach to integrating AI and enhanced computing in functional genomics and medical science. By combining advanced machine learning techniques, high-performance computing and ethical AI frameworks, the study aims to enhance genomic interpretation, accelerate precision medicine applications and pave the way for AI-driven genomic advancements.

Background Research and Available Knowledge

Functional genomics is a rapidly evolving field within molecular biology that seeks to understand the dynamic functions of genes, proteins and their interactions on a genome-wide scale. Unlike classical genetics, which primarily focuses on studying individual genes and their mutations, functional genomics employs large-scale experimental and computational approaches to investigate how genetic components work together to regulate biological processes [1-3]. By integrating data from various high-throughput techniques, this field provides a holistic view of gene expression, transcriptional regulation and protein interactions, ultimately shedding light on complex cellular mechanisms [4-6]. Advances in Next-Generation Sequencing (NGS), gene-editing tools like CRISPR-Cas9 and bioinformatics platforms have significantly accelerated discoveries in this domain, allowing researchers to systematically analyze the roles of thousands of genes simultaneously [7-9]. One of the fundamental aspects of functional genomics is the exploration of gene function from two major perspectives: The “selected effect” and the “causal role.” The “selected effect” concept considers gene function from an evolutionary standpoint, suggesting that genes have developed specific roles due to natural selection.

In contrast, the “causal role” perspective defines function based on a gene’s necessity within biological systems, regardless of evolutionary history. These perspectives guide researchers in interpreting gene behavior across different organisms, particularly in understanding conserved genetic pathways and identifying novel regulatory networks [10-12]. Functional genomics also encompasses various subfields, such as transcriptomics, proteomics, metabolomics and epigenomics, each of which provides valuable insights into different layers of genetic regulation. By integrating data from these disciplines, scientists can better understand how genes influence health, disease and adaptation in different environments [13-15]. A wide range of experimental techniques enables the study of gene function at multiple levels. At the DNA level, technologies like chromatin immunoprecipitation followed by sequencing (ChIP-seq) help identify transcription factor binding sites, revealing key regulatory elements involved in gene expression. Additionally, techniques such as DNase I hypersensitivity assays and ATAC-seq (Assay for Transposase-Accessible Chromatin using sequencing) provide insights into chromatin accessibility, highlighting potential enhancer and promoter regions that modulate gene activity.

Genetic interaction mapping, another essential tool, allows researchers to systematically study how different genes interact, helping to uncover functional relationships between seemingly unrelated genetic elements [1-11]. By leveraging these approaches, scientists can construct comprehensive gene regulatory networks that explain how different genes contribute to various cellular functions. At the RNA level, functional genomics employs transcriptome-wide approaches such as RNA sequencing (RNA-seq) and microarrays to analyze gene expression patterns across different conditions, tissues and developmental stages. RNA-seq, in particular, has revolutionized the field by providing high-resolution data on gene expression, alternative splicing events and non-coding RNA functions. Another powerful tool, Massively Parallel Reporter Assays (MPRAs), enables high-throughput functional testing of regulatory DNA elements by linking them to a reporter gene and assessing their activity in different cellular contexts [11-33]. These approaches help identify gene expression changes associated with diseases, environmental stressors or drug treatments, paving the way for novel therapeutic interventions. At the protein level, functional genomics utilizes techniques like yeast two-hybrid screening, Affinity Purification-Mass Spectrometry (AP-MS) and proximity labelling methods to study protein-protein interactions.

Understanding how proteins interact is crucial, as many cellular processes rely on protein complexes rather than individual proteins acting in isolation [12-22]. Mass spectrometry-based proteomics has provided significant advancements in mapping protein interaction networks, revealing key signaling pathways that govern cell behavior. Additionally, emerging techniques such as single-cell proteomics and spatial transcriptomics enable researchers to study gene and protein function in unprecedented detail, allowing for the dissection of cellular heterogeneity in complex tissues. The integration of functional genomics with computational biology and Artificial Intelligence (AI) further enhances its capabilities, enabling the analysis of vast datasets generated from genome-wide studies. Machine learning algorithms and network-based models help identify gene-disease associations, predict protein structures and infer regulatory interactions [18-38]. These computational approaches accelerate the discovery of novel biomarkers, drug targets and personalized therapeutic strategies. Functional genomics has already made significant contributions to precision medicine, synthetic biology and agricultural biotechnology, with applications ranging from identifying cancer-related genes to engineering stress-resistant crops. Functional genomics serves as a cornerstone of modern biological research, providing crucial insights into the molecular mechanisms underlying gene function, regulation and interaction. As technologies continue to evolve, this field will play an increasingly vital role in advancing our understanding of complex biological systems [22-44], ultimately leading to new discoveries in medicine, biotechnology and environmental sciences.

Bioinformatics methods for functional genomics

With the vast amounts of data generated by functional genomics techniques, bioinformatics plays an essential role in extracting biologically meaningful patterns from complex datasets. Computational methods such as data clustering and Principal Component Analysis (PCA) are commonly used for unsupervised machine learning tasks, including class detection, while supervised machine learning techniques like artificial neural networks and Support Vector Machines (SVMs) are employed for classification and class prediction [25-45]. These approaches allow researchers to uncover hidden patterns within gene expression data, identify relationships between genes and classify biological samples based on functional characteristics. Additionally, functional enrichment analysis is widely used to assess the over- or under-expression of gene categories relative to a background set, which is particularly useful in RNA interference (RNAi) screens for determining positive or negative regulators [36-55]. Various tools facilitate these analyses, including gene ontology-based methods such as DAVID and Gene Set Enrichment Analysis (GSEA) for functional category assessments, pathway-based approaches like Ingenuity and Pathway Studio, and protein complex-based analysis through tools like COMPLEAT. These bioinformatics techniques enable researchers to navigate and interpret large-scale functional genomics data, leading to more precise insights into gene functions and regulatory mechanisms.
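
As one concrete example of the enrichment logic mentioned above, the sketch below applies a one-sided hypergeometric test for over-representation of a gene category in a hit list; the gene counts are invented for illustration and dedicated tools such as DAVID or GSEA add annotation handling and multiple-testing correction on top of this core idea.

from scipy.stats import hypergeom

def enrichment_p_value(hits_in_list: int, list_size: int,
                       category_size: int, background_size: int) -> float:
    """One-sided hypergeometric test for over-representation of a gene category.

    hits_in_list    : genes from the category found in the selected gene list
    list_size       : total genes in the selected list (e.g. RNAi screen hits)
    category_size   : genes annotated to the category in the background
    background_size : total genes in the background set
    """
    return hypergeom.sf(hits_in_list - 1, background_size, category_size, list_size)

# Hypothetical numbers: 15 of 200 screen hits fall in a 300-gene pathway
# drawn from a 20,000-gene background.
print(f"{enrichment_p_value(15, 200, 300, 20000):.2e}")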

Deep mutational scanning and phylogenetic analysis

Deep Mutational Scanning (DMS) has emerged as a powerful approach to study protein function, stability and interactions by systematically introducing mutations and assessing their effects. New computational techniques, such as ‘phydms,’ have been developed to integrate the results of DMS experiments with phylogenetic trees, enabling researchers to compare experimental mutation effects with evolutionary constraints observed in nature. This approach provides valuable insights into whether laboratory-based selection pressures align with natural selection processes. By leveraging phydms, scientists can refine their experimental conditions to better reflect evolutionary constraints, enhancing the biological relevance of their findings [28-48]. Furthermore, DMS has been employed to investigate protein-protein interactions, using thermodynamic models to predict the effects of mutations across dimer interfaces. These studies help identify critical amino acids involved in protein interactions and provide a deeper understanding of protein function at the molecular level. Additionally, DMS can aid in protein structure determination, as strong positive epistasis between two mutations often indicates spatial proximity within a protein’s three-dimensional structure. This principle has been successfully demonstrated in studies using the GB1 protein, highlighting the potential of DMS to reveal structural features and interaction networks within proteins.
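
One common way to quantify such epistasis from DMS data is a log-additive comparison of single- and double-mutant fitness values; the sketch below uses this definition (one of several in the DMS literature) with invented enrichment-derived fitness numbers.

import numpy as np

def epistasis_score(w_single_a: float, w_single_b: float, w_double: float) -> float:
    """Log-additive epistasis between two mutations from DMS fitness values.

    Positive values indicate the double mutant is fitter than expected from the
    single mutants, a pattern often interpreted as evidence of structural or
    functional coupling between the two positions.
    """
    return np.log(w_double) - (np.log(w_single_a) + np.log(w_single_b))

# Hypothetical fitness values relative to wild type (wild type = 1.0).
print(round(epistasis_score(0.4, 0.5, 0.35), 3))   # > 0: positive epistasis
print(round(epistasis_score(0.4, 0.5, 0.10), 3))   # < 0: negative epistasis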

Machine learning applications in functional genomics

The interpretation of large-scale functional genomics experiments, such as Massively Parallel Reporter Assays (MPRA), has increasingly relied on machine learning models. One such approach involves using a gapped k-mer Support Vector Machine (SVM) model to identify enriched sequence motifs within cis-regulatory elements that exhibit high activity [33-55]. This method enables the discovery of key regulatory sequences that influence gene expression by comparing functional and non-functional variants. In addition to SVM models, deep learning and random forest algorithms have been employed to analyze high-dimensional functional genomics data. These advanced computational models have significantly improved the prediction accuracy of gene regulatory elements, providing deeper insights into the role of non-coding DNA in gene expression. As machine learning techniques continue to evolve, their integration with functional genomics promises to enhance our understanding of gene regulation, facilitate the identification of novel regulatory elements and refine models of gene expression networks.
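
The published approach relies on gapped k-mer string kernels (gkm-SVM); as a simplified stand-in, the sketch below classifies synthetic sequences with contiguous k-mer counts and a linear SVM to convey the underlying idea, with a GC-rich motif planted in the "active" class as a made-up signal.

from itertools import product
import numpy as np
from sklearn.svm import SVC

def kmer_counts(seq, k=4):
    """Count every contiguous DNA k-mer in a sequence (gapped k-mers omitted for brevity)."""
    kmers = ["".join(p) for p in product("ACGT", repeat=k)]
    index = {kmer: i for i, kmer in enumerate(kmers)}
    counts = np.zeros(len(kmers))
    for i in range(len(seq) - k + 1):
        kmer = seq[i:i + k]
        if kmer in index:
            counts[index[kmer]] += 1
    return counts

rng = np.random.default_rng(4)
def random_seq(n):
    return "".join(rng.choice(list("ACGT"), size=n))

# Synthetic "active" elements carrying a planted GC-rich motif vs. random background.
active = [random_seq(100) + "GCGCGC" + random_seq(100) for _ in range(50)]
inactive = [random_seq(206) for _ in range(50)]

X = np.array([kmer_counts(s) for s in active + inactive])
y = np.array([1] * 50 + [0] * 50)

clf = SVC(kernel="linear").fit(X, y)
print(round(clf.score(X, y), 3))   # training accuracy on the toy data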

Major consortium projects in functional genomics

The ENCODE project: The Encyclopedia of DNA Elements (ENCODE) project represents one of the most ambitious initiatives in functional genomics, aiming to map all functional elements in the human genome. ENCODE has provided groundbreaking insights into both coding and non-coding regions of DNA, revealing that a significant portion of the genome is transcribed into various RNA molecules, including non-coding RNAs that play essential regulatory roles. Through genomic tiling arrays, researchers have identified previously unknown transcriptional regulatory sites, expanding our understanding of gene control mechanisms. Additionally, ENCODE has elucidated various chromatin-modifying processes, shedding light on how epigenetic factors influence gene expression. The project’s findings have transformed genomic research by demonstrating that vast regions of the genome, once considered “junk DNA”, are actively involved in regulatory functions, thereby redefining the concept of genetic functionality.

The GTEx project: The Genotype-Tissue Expression (GTEx) project focuses on elucidating the role of genetic variation in shaping transcriptomic differences across multiple tissues. By collecting and analyzing over 11,000 tissue samples from more than 700 post-mortem donors, GTEx has provided a comprehensive resource for understanding how genetic variation affects gene expression across diverse biological contexts. The project has been instrumental in identifying expression Quantitative Trait Loci (eQTLs), which are genetic variants associated with differences in gene expression levels. One of GTEx’s key contributions has been in distinguishing tissue-specific from tissue-shared eQTLs, allowing researchers to explore how genetic variation contributes to tissue-specific functions and disease susceptibility. The dataset serves as a vital genomic resource for studying gene regulation and offers valuable insights into the genetic basis of complex diseases by linking genetic variations to functional consequences at the transcriptomic level.
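
GTEx's production eQTL pipelines include covariate correction and stringent multiple-testing control; the sketch below is only a minimal illustration of the core single-variant association test, regressing simulated expression values on allele dosage.

import numpy as np
from scipy.stats import linregress

rng = np.random.default_rng(5)
n = 400

# Hypothetical data: genotype coded as 0/1/2 copies of the alternate allele,
# expression simulated with a modest additive genotype effect.
genotype = rng.integers(0, 3, size=n)
expression = 5.0 + 0.4 * genotype + rng.normal(scale=1.0, size=n)

# A basic single-variant eQTL test: regress expression on allele dosage.
fit = linregress(genotype, expression)
print(f"effect size (slope) = {fit.slope:.3f}, p-value = {fit.pvalue:.2e}")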

The Atlas of Variant Effects Alliance: Founded in 2020, the Atlas of Variant Effects Alliance (AVE) is an international consortium dedicated to mapping the functional impact of all possible genetic variants related to disease. AVE aims to construct comprehensive variant effect maps that detail the functional consequences of every single nucleotide change in genes and regulatory elements. This initiative is crucial for understanding how genetic variations contribute to disease phenotypes and for developing precision medicine approaches tailored to an individual’s genetic makeup. Supported by institutions such as the Brotman Baty Institute at the University of Washington and the National Human Genome Research Institute (NHGRI), AVE leverages high-throughput functional genomics methods to systematically assess genetic variant effects. By providing a detailed catalog of variant functions, the project facilitates more accurate disease modeling, improves genetic risk assessments and accelerates the discovery of therapeutic targets. Bioinformatics and computational methods have become indispensable tools in functional genomics, enabling the analysis of large-scale data and the identification of biologically meaningful patterns. Techniques such as clustering, machine learning and functional enrichment analysis allow researchers to classify gene functions, predict regulatory elements and explore gene-disease associations. Deep mutational scanning and phylogenetic comparisons further enhance our ability to study protein function, interactions and structure. Meanwhile, machine learning models are advancing our understanding of non-coding DNA and gene regulation. Large-scale consortium projects like ENCODE, GTEx and AVE continue to drive progress in functional genomics by mapping genomic elements, exploring tissue-specific gene regulation and characterizing genetic variant effects. As computational techniques evolve, their integration with experimental functional genomics will lead to deeper insights into gene function, advancing personalized medicine and genomic research.

Functional Genetics and Genomics: A Deep Dive

Functional genomics plays a crucial role in modern clinical research by utilizing high-throughput technologies such as bulk RNA sequencing and spatial transcriptomics. These approaches, often paired with Next-Generation Sequencing (NGS), provide deep insights into the cellular transcriptome, allowing for more precise disease diagnosis and personalized treatment. Although alternative techniques like microarrays and real-time PCR exist, they lack the comprehensive data offered by RNA sequencing. Bioinformatics plays a key role in analyzing patient-specific functional genomic data, ultimately forming the foundation of precision medicine by tailoring treatments to individual genetic profiles. Current clinical research in functional genomics has focused primarily on cancer, where genomic and epigenomic alterations drive disease progression. The UK government is actively advancing this field through initiatives such as the National NHS Genomic Medicine Service, aiming to integrate Whole-Genome Sequencing (WGS) into routine cancer diagnostics. Functional genomic techniques, such as RNA sequencing, have demonstrated their efficacy by detecting relapsing cancer up to 200 days before it appears on CT scans. Research institutions and pharmaceutical companies, including AstraZeneca and GSK, are leveraging CRISPR technology to study cancer biology and accelerate drug discovery. This progress has the potential to improve treatment effectiveness and pave the way for novel therapeutic strategies.

The rapid advancements in functional genomics have significantly impacted clinical applications, particularly in the treatment of Acute Myeloid Leukemia (AML) and breast cancer. Technologies such as short hairpin RNA (shRNA) and CRISPR-Cas9 have facilitated deeper investigations into disease mechanisms, bringing potential cures closer to reality. The field has also played a pivotal role in guiding drug development, exemplified by the discovery of HER2 overexpression in breast cancer, which led to the creation of Herceptin. By identifying new drug targets, functional genomics accelerates drug discovery while lowering costs, benefiting both the pharmaceutical industry and patients. Expanding such studies across various diseases could revolutionize medical treatments and improve patient outcomes. Despite its promise, the field faces several challenges. One of the primary barriers is the high cost of sequencing, limiting widespread clinical adoption. Although WGS has become more accessible, costs remain a concern, particularly for healthcare systems like the NHS, where routine genomic testing is not universally available. However, early cancer diagnosis through genomic methods can significantly reduce long-term treatment costs, supporting the argument for increased investment in this technology.

Another challenge is the lack of population diversity in genomic studies. Most Genome-Wide Association Studies (GWAS) are predominantly conducted in high-income countries with a heavy European bias, leading to disparities in clinical genome interpretation for underrepresented populations. Addressing this gap requires more diverse reference genomes to ensure equitable access to genomic medicine. Further uncertainties exist regarding the applicability of functional genomic data across different tissues and cell types. Blood-based genomic data may not always be directly comparable to data derived from complex tissues such as the brain, necessitating comprehensive functionally annotated genome assemblies. Additionally, most multi-omics studies rely on animal models rather than human cells, creating challenges in translating findings to human disease applications. Future research must prioritize human-based clinical trials and engineered physiologically relevant animal models to improve translatability. Ethical considerations also play a significant role in shaping the field. The debate over human gene editing, particularly germline editing, has led to strict regulations, slowing research progress. While these ethical constraints ensure patient safety, they also highlight the need for continued discussion on responsible genomic advancements. Clinical functional genomics is transforming modern medicine by enabling early disease detection, personalized treatments and targeted drug development.

Although challenges such as cost, diversity, tissue specificity and ethical concerns persist, ongoing research and technological advancements will likely address these limitations. The integration of functional genomics into healthcare has the potential to revolutionize medical practice, leading to more effective treatments and improved patient outcomes worldwide. To provide a better understanding of these perspectives, Figures 1 & 2 offer illustrative overview visualizations. The advancement of functional genomics has been significantly driven by the rapid evolution of single-cell analysis technologies, omics datasets and high-throughput platforms. These innovations have enhanced our ability to examine the genome, transcriptome, proteome and metabolome at a single-cell resolution. One of the key technologies in this domain is single-cell sequencing, which enables large-scale analysis of thousands of individual cells, allowing researchers to understand how genetic variants contribute to disease pathogenesis. The reemergence of mass spectrometry as a critical analytical tool has further revolutionized proteomic and metabolomic studies at the single-cell level, providing deeper insights into cellular functions. Single-cell analysis has become a cornerstone of functional genomics since its development in 2011, offering unprecedented insights into cell-specific interactions and molecular profiling. This technology has been particularly beneficial in haematological cancer diagnosis, immune response monitoring and the study of infectious diseases such as COVID-19.

Figure 1: A visualization of functional genomics 1.


Figure 2: A visualization of functional genomics 2.


The application of single-cell multi-omics approaches has proven instrumental in characterizing tumour heterogeneity, making it an essential tool for cancer research. One of the most widely used single-cell transcriptomics techniques involves RNA sequencing at the single-cell level, enabling comprehensive gene activity measurements. Cells are individually labelled using sequencing barcode technologies like microfluidics, pooled together and sequenced, allowing researchers to analyze gene clusters and associate them with specific cell types or tissue domains. Another prominent technology in functional genomics is NanoString, a highly sensitive DNA microarray initially developed for cancer diagnostics but now widely applied in immunology and other medical fields.

Unlike traditional Next-Generation Sequencing (NGS) methods, the NanoString nCounter system provides amplification-free, direct molecular profiling, minimizing amplification bias and increasing the accuracy of nucleic acid quantification. Although NanoString is a powerful tool, it is currently considered complementary to NGS rather than a complete replacement, particularly in high-throughput genomic studies. Spatial transcriptomics has emerged as a revolutionary molecular profiling technique that enables gene activity mapping across entire tissue samples. By positioning samples on spatially barcoded reverse-transcription primers, this method facilitates high-resolution transcriptional profiling, aiding in disease diagnosis, cellular heterogeneity assessment and cancer stem cell identification.

The integration of spatial transcriptomics with NGS platforms provides a holistic view of gene expression patterns, offering valuable insights into tissue-specific molecular functions. This technique holds promise for personalized medicine, particularly in oncology, where understanding spatial gene expression variations is critical for effective therapeutic interventions. CRISPR-Cas9 technology represents one of the most groundbreaking developments in functional genomics, offering precise gene-editing capabilities that can be leveraged to study genetic mutations and their implications for disease. By utilizing guide RNA (gRNA) to target specific DNA sequences, CRISPR enables the Cas9 enzyme to introduce double-stranded breaks, which are subsequently repaired through non-homologous end joining or homology-directed repair. This ability to efficiently edit, delete or replace DNA sequences has made CRISPR-Cas9 a cost-effective and widely accessible tool in genetic research. However, while CRISPR technology holds immense therapeutic potential, concerns regarding off-target effects and unintended mutations remain significant challenges, particularly in clinical applications. Clinical trials have highlighted several safety concerns with CRISPR-based gene editing, including the risk of inducing harmful mutations that could contribute to oncogenesis. The immunogenic nature of the Cas9 enzyme has restricted initial trials to immunologically privileged sites such as the eye. To mitigate these risks, advanced editing approaches using Cas9 nickase have been developed, which produce single-strand breaks rather than double-strand breaks, reducing the likelihood of large genomic deletions and chromosomal rearrangements.
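
To make the targeting step concrete, the purely illustrative sketch below scans the forward strand of a made-up sequence for 20-nt protospacers adjacent to an SpCas9 NGG PAM; it is not a guide-design tool and ignores reverse-strand sites, on-target efficiency scoring and genome-wide off-target assessment.

import re

def find_spcas9_targets(dna: str):
    """List candidate 20-nt protospacers immediately 5' of an NGG PAM (SpCas9)."""
    dna = dna.upper()
    targets = []
    # Lookahead keeps overlapping candidates; group 1 is the protospacer, followed by NGG.
    for match in re.finditer(r"(?=([ACGT]{20})[ACGT]GG)", dna):
        targets.append((match.start(1), match.group(1)))
    return targets

example = "TTGACCTGAAGCTGACCGGTACCTTGAAGCTTGGAGGCTAGCTAGGTACCGG"   # invented sequence
for pos, protospacer in find_spcas9_targets(example):
    print(pos, protospacer)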

Base and prime editing techniques, which build on these nickase-based strategies, offer a more precise and reliable method for genome correction, making them potential candidates for future applications in germline and in utero editing. Looking ahead, the continued refinement of CRISPR-based technologies could revolutionize clinical functional genomics, with potential applications in genetic screening and disease prevention. If proven safe and effective, in utero gene editing may become a preferable alternative to preimplantation genetic diagnosis, as it eliminates the ethical concerns associated with embryo destruction. However, the ethical implications of germline editing remain a contentious issue, necessitating careful consideration before these technologies are widely adopted. As advancements in functional genomics continue, it is crucial to balance innovation with ethical responsibility, ensuring that gene-editing technologies are both safe and ethically sound before they are integrated into clinical practice.

The functional genomics market has seen significant growth, particularly following the success of genome sequencing during the COVID-19 pandemic. Government interest in genomics has surged, resulting in substantial investments such as the UK’s £200 million life science investment program launched in 2021, which is expected to generate around £600 million in long-term capital for the industry.

This investment is driven by the increasing incidence of diseases such as cancer and the expanding applications of genomics in healthcare. North America held the largest share of the genomics market in 2020, led by companies such as Illumina, Inc. and Thermo Fisher Scientific. The market’s expansion is primarily fueled by sequencing technologies, which accounted for the largest technological share in 2019. Drug discovery and development, a key application area, also dominated the genomics market, highlighting the sector’s growing importance in pharmaceutical innovation. With the global genomics market expected to more than double by 2025 and create approximately 133,000 jobs by 2030, a significant challenge remains: The shortage of trained professionals in the field. Addressing this gap requires strategic investments in workforce training to meet the rising demand for genomics expertise. The future of functional genomics lies in its integration into drug production and diagnostics. While companies such as GSK, Novartis and AstraZeneca have begun incorporating genomic methods into their workflows, traditional drug development processes still dominate the pharmaceutical industry. One of the primary obstacles in drug development is the high failure rate of clinical trials, with over 50% of drugs failing in phase III due to inefficacy. These failures result in substantial financial losses and prolonged development timelines. However, functional genomics offers a promising solution by enhancing target identification and improving drug efficacy, leading to higher clinical trial success rates.

By leveraging functional genomic techniques, pharmaceutical companies can streamline drug discovery, reduce costs and expedite the availability of critical medications, ultimately transforming the industry’s approach to drug development. Despite its potential, functional genomics remains underutilized in personalized medicine. There is a pressing need for more clinical investigations to integrate functional genomic pathways into mainstream healthcare. Personalized medicine, powered by genomics, has the potential to revolutionize treatment strategies by identifying the most effective medications tailored to an individual’s genetic profile. This approach is particularly valuable as molecular changes often precede clinical symptoms, enabling early intervention and improved patient outcomes. Conducting clinical trials that utilize functional genomics will provide deeper insights into disease mechanisms and treatment responses, leading to a more precise and effective medical landscape. By tracking molecular changes alongside clinical phenotypes, researchers can refine therapeutic approaches and enhance the overall success of medical treatments.

Case Studies Analysis

The case studies highlight the complexities of genetic testing, particularly in managing expectations, interpreting negative findings and ensuring appropriate follow-up for Incidental Findings (IFs). In the first scenario, a 5-year-old female with developmental disabilities underwent exome sequencing, but no diagnostic results were found. Despite thorough counseling, the parents mistakenly assumed that a negative result meant the condition was not genetic. This case underscores the importance of setting clear expectations about the limitations of genetic testing during pretest and post-test counseling. Misconceptions about the scope of exome sequencing must be addressed to ensure families understand that negative findings do not rule out a genetic cause. In the second case, a healthy 40-year-old woman enrolled in a study discovered an IF-a pathogenic MYH7 variant linked to Hypertrophic Cardiomyopathy (HCM). While she understood the importance of follow-up care, she prioritized her child’s congenital heart defect over her own screening. This case highlights the challenges healthy individuals face in acting upon unexpected genetic findings, particularly when they lack symptoms or a family history of the condition. Genetic counselors must balance reinforcing medical recommendations with preventing undue anxiety, especially when discussing conditions with reduced penetrance and variable expressivity.

The third case involves a 16-month-old male with a brain tumour, whose tumour and blood exome sequencing revealed no treatment-impacting mutations but identified an IF-a maternally inherited SCN5A variant associated with Long QT Syndrome (LQTS). While the child required cardiology follow-up, the father primarily focused on the absence of chemo-resistance markers, perceiving the negative tumor sequencing results as “good news.” This case illustrates that families dealing with acute illnesses may struggle to prioritize follow-up for IFs and may misinterpret negative results as reassuring. Ensuring patient-participants grasp the implications of their genomic findings, regardless of their immediate clinical significance, is crucial. Collectively, these case studies emphasize the need for clear and continuous genetic counseling to manage expectations, reinforce the importance of follow-up and prevent misunderstandings about the significance of negative or incidental genetic findings. They also underscore the evolving nature of genetic knowledge and the necessity for longitudinal follow-up to ensure the effective clinical integration of genomic testing results. The challenges of returning large amounts of data in genetic testing are exemplified by a 17-year-old female with probable LQTS, whose results included several variants of unknown significance and Incidental Findings (IFs).

The overwhelming nature of such detailed reports can cause confusion and anxiety, underscoring the need for structured result delivery, potentially across multiple visits and the use of simplified reports or counseling letters. The varying responses of families to identical genetic results highlight the unpredictability of emotional reactions. Two unrelated families received negative exome sequencing results for their daughters with developmental delays. One family found relief in avoiding a diagnostic label, while the other expressed disappointment at not having an answer. This contrast emphasizes the importance of personalized counseling that acknowledges diverse perspectives rather than assuming uniform responses.

Follow-up testing for family members can present significant hurdles, particularly when targeted clinical tests are unavailable. A 34-year-old female identified as a cystinuria carrier faced this issue when clinical testing for her husband was not an option. The case highlights the limitations in genetic testing accessibility and the need to consider follow-up testing availability when designing research protocols. The atypical presentation of well-known Mendelian conditions can further complicate genetic counseling. A 66-year-old female diagnosed with Hypertrophic Cardiomyopathy (HCM) was later found to carry a pathogenic variant in PTPN11, associated with LEOPARD syndrome, despite lacking its classic symptoms.

Such findings challenge traditional diagnostic approaches and require clinicians to support patients in navigating uncertain prognoses and accessing relevant resources. This continued case study highlights the complexities surrounding the return of genomic results to family members when a patient-participant dies before receiving them. A 43-year-old woman diagnosed with inflammatory breast cancer was enrolled in a study performing tumor and germline exome sequencing. The analysis identified a TP53 pathogenic variant, raising concerns about germline mosaicism and its implications for her family. Unfortunately, the patient-participant passed away before further testing could be conducted and the study’s consent process did not specify how to handle result disclosure in such cases. After internal discussions, study personnel decided to share the results with her husband due to the clinical significance for her relatives. The disclosure had significant implications for the patient-participant’s family. Her daughter underwent site-specific testing and was negative for the TP53 mutation, providing reassurance about her cancer risk. Separately, the patient-participant’s sister, who had previously been diagnosed with a brain tumor, sought clinical genetic evaluation at another institution.

Based on her personal and family history, she met the criteria for TP53 germline testing and opted for full sequencing, which revealed no mutations. While the results impacted the family in different ways, they underscored the necessity of clear guidelines for posthumous result disclosure in genomic studies. This case emphasizes the importance of informed consent discussions that include provisions for sharing results after a patient-participant’s death. Establishing a clear plan regarding who should receive such information and what specific findings should be disclosed can help guide families through genetic risk assessments. Additionally, given the growing use of exome and genome sequencing in oncology research, especially for terminally ill patients, standardized protocols for posthumous result communication are needed. Future research should explore best practices for obtaining informed consent regarding result disclosure and effectively returning findings to designated family members. For a more concise overview, Figures 3 & 4 provide visualizations relating to these case study considerations.

Figure 3: An overview of the case studies analysis 1.


Figure 4: An overview of the case studies analysis 2.


Recent Innovations within Functional Genomics

Genomic and transcriptomic profiling

Techniques like ATAC-seq, various DNA methylation assays and ChIP-seq provide detailed information about chromatin structure and epigenetic modifications. RNA-seq and CAGE reveal insights into gene expression and transcription start sites. Specialized methods like ribosome profiling, CLIP-seq and RNA modification assays expand our understanding of RNA biology. These methods, coupled with computational tools like segmentation algorithms, allow for the integration and interpretation of complex genomic datasets. Spatial transcriptomics, a groundbreaking advancement, combines functional genomics assays with positional information within tissues, enabling the study of cellular organization and interactions.

3D genome structure and interactions

Techniques like Hi-C and ChIA-PET have revealed the importance of 3D genome structure in gene regulation. These methods map chromatin interactions, identifying Topologically Associated Domains (TADs) and chromatin loops, which bring distal regulatory elements into close proximity. Methods like ChIRP-seq, MARGI, GRID-seq and ChAR-seq help identify interactions between RNA and chromatin, further elucidating the complex interplay between different genomic components.

High-throughput perturbation and functional genomics

The CRISPR/Cas9 system has revolutionized functional genomics by enabling highly efficient and scalable genome editing and perturbation. Various CRISPR-based tools allow for gene knockout, knockdown (CRISPRi), activation and even epigenome editing. Large-scale CRISPR screens, coupled with next-generation sequencing, facilitate the identification of genes involved in specific cellular processes. Perturb-seq combines CRISPRi with single-cell expression profiling, providing insights into the global transcriptional response to perturbations. Complementary RNA-targeting methods like ASO gapmers, siRNAs and shRNAs offer alternative approaches to study gene function by directly manipulating RNA transcripts.

Long-read sequencing and capture methods

Long-read sequencing technologies are crucial for resolving complex genomic structures and transcript isoforms. These methods allow for the sequencing of long DNA and RNA molecules, overcoming limitations of short-read sequencing. Capture methods, employing oligonucleotide probes, enable targeted sequencing of specific genomic regions, particularly useful for studying low-abundance transcripts.

Single-cell and comparative genomics

Single-cell technologies have revolutionized our understanding of cellular heterogeneity. Methods like STRT, Smart-seq and droplet-based approaches allow for the profiling of individual cells, revealing variations in gene expression and genomic makeup. Deconvolution methods offer an alternative approach to analyze bulk tissue samples by estimating the proportions of different cell types. Comparative genomics, while traditionally used for gene annotation, faces challenges with less conserved regulatory regions. Comparative transcriptomics, particularly between human and mouse, helps understand the conservation and divergence of gene expression patterns and regulatory networks. These studies reveal complex relationships between species and highlight the importance of considering cell-type-specific and gene-specific expression patterns.

Significant Trends in Functional Genomics

Functional genomics is a rapidly evolving field focused on understanding the roles of genes and their interactions, particularly in relation to disease. Several key trends are driving innovation in this area.

Leveraging omics and NGS

Functional genomics integrates various “omics” technologies (genomics, transcriptomics, proteomics, etc.) with Next-Generation Sequencing (NGS) to comprehensively analyze gene function. This allows researchers to study gene expression, protein interactions, and other functional aspects of the genome.

Epigenome editing

Advances in epigenome editing, often using CRISPR-based systems, enable researchers to directly modify epigenetic marks (like DNA methylation or histone modifications) to study their impact on gene expression and cellular function. This is crucial for understanding how gene regulation is influenced by these modifications.

Improved understanding of interactions

Functional genomics is providing deeper insights into genetic interaction mapping (how genes work together) and DNA/protein interactions. Researchers are gaining a more detailed picture of how proteins bind to DNA to regulate gene expression and other processes.

Enhanced deep mutational sequencing

Deep mutational sequencing, particularly at the protein level, is becoming more sophisticated. This allows scientists to study the effects of a large number of mutations on protein function, helping to identify critical regions of proteins and understand how mutations contribute to disease.

Disease modeling and drug discovery

A major application of functional genomics is in improving disease modeling. By understanding how genes contribute to disease, researchers can create better models to study disease mechanisms and identify potential drug targets. This is leading to more effective and targeted therapies.

High-throughput platforms

Startups are developing high-throughput functional genomics platforms that enable rapid and large-scale analysis of gene function. These platforms are accelerating the pace of research and allowing for the study of complex biological systems. They are particularly useful for identifying vulnerabilities in cells, such as cancer cells, which can be exploited for therapeutic purposes.

Deep Learning (DL) & Machine Learning (ML) in Functional Genomics

Machine learning has become an essential tool in biomedical research due to its ability to address complex datasets and provide valuable insights. One common use of machine learning is for making predictions based on measurable data. For example, in psychiatric medicine, machine learning has been used to predict mood based on smartphone recordings of everyday behaviors. In neuroscience, machine learning techniques have been employed to decode neural activity and infer intentions from brain measurements, enabling advancements in prosthetics and interactive devices. Machine learning also serves as a benchmark for evaluating human-generated models, helping to identify missing principles or misguided approaches. Additionally, machine learning aids in understanding complex systems by determining nonlinear relationships between variables and identifying shared information between components of a system. As datasets in biomedical research continue to grow in complexity, machine learning becomes indispensable. Humans are limited in their ability to comprehend and model complex datasets, often missing important patterns and structures. Machine learning techniques excel in capturing complex relationships and can handle large, multifaceted datasets. Moreover, machine learning addresses challenges posed by nonlinearity and recurrence, which are prevalent in biological systems. By embracing the complexity inherent in biomedical data, machine learning provides better fits and more accurate predictions compared to simpler models. Machine learning also supports the collection of a large number of variables, as it can improve predictions even when the contributions of individual variables are unclear.

The application of machine learning in neuroscience serves as a compelling example of its capabilities. In neural decoding, machine learning techniques have outperformed traditional linear approaches in predicting intentions based on brain activity. Neural network-based methods, along with ensemble methods that combine multiple techniques, have achieved remarkable results. Machine learning also challenges the common practice of using simple models in neural encoding, where signals from neurons are analyzed in relation to external variables. Machine learning algorithms, such as neural networks and extreme gradient-boosted trees, have surpassed generalized linear models in capturing the complex relationships between neural activity and external variables. By setting benchmarks and providing more accurate descriptions of neural computations, machine learning enhances our understanding of the human brain. While machine learning techniques may seem complex, their implementation has become increasingly accessible. With the availability of user-friendly software packages and automated machine learning tools, biomedical scientists can easily apply machine learning to their research without extensive knowledge of specific algorithms. This empowers researchers to focus on formulating scientific questions and interpreting the results generated by machine learning models. To be more specific, machine learning has become a necessity in biomedical research due to its ability to address complex datasets, make accurate predictions, benchmark human-generated models, and enhance understanding.

From predicting mood based on smartphone data to decoding neural activity and modeling complex biological systems, machine learning offers valuable insights and advancements in various biomedical disciplines. As datasets continue to grow, machine learning’s capacity to handle complexity and capture nonlinear relationships will be crucial in furthering our understanding of biological processes and improving healthcare outcomes. Deep learning (DL) is increasingly being applied to functional genomics, showing promising results in various tasks. Several DL architectures are employed, including Convolutional Neural Networks (CNNs) for feature extraction and image-like data, Recurrent Neural Networks (RNNs) and Long Short-Term Memory networks (LSTMs) for sequential data like gene expression time series, Generative Adversarial Networks (GANs) for data generation and improving classification, and Autoencoders (AEs) for unsupervised learning and dimensionality reduction. Capsule Networks (CapsuleNets), a newer architecture, are also being explored. Techniques like multi-model fusion and transfer learning further enhance predictive accuracy by combining different models or leveraging pre-trained models. In functional genomics, DL models are used for various purposes. RNNs and LSTMs have been used for miRNA and target prediction with higher accuracy than traditional methods.

Feed-forward neural networks analyze RNA-Seq data and can outperform other methods like LASSO. Deep networks, particularly AEs, are effective as a pre-processing step for clustering gene expression data and learning the encoding of transcriptomic machinery. Denoising AEs identify biological signals and patterns in gene expression data. Multi-layer feedforward networks infer gene expression from landmark genes. CNNs like DeepChrome predict gene expression from histone modifications and LSTMs like AttentiveChrome enhance this by interpreting dependencies among chromatin factors. DeepVariant, a CNN, is a highly accurate variant caller. DeepFIGV predicts epigenetic variation from DNA sequence. DL models are also used to predict drug response in cancer, identify cancer subtypes from multi-omics data (DNA methylation, gene and miRNA expression) and infer various properties of biological samples through multi-task and transfer learning. CNNs like CNNC infer gene relationships from single-cell expression data and DeepCpG predicts missing methylation states in single-cell methylation data. DanQ combines CNNs and RNNs to predict DNA function. FBGAN uses GANs to optimize synthetic gene sequences. These examples demonstrate the versatility and potential of DL in addressing diverse challenges in functional genomics.
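
None of these published models is reproduced here; the sketch below only illustrates the shared pattern of applying a 1D CNN to one-hot-encoded DNA, implemented in PyTorch and trained for a few steps on random toy sequences with made-up labels.

import random
import torch
import torch.nn as nn

def one_hot(seqs):
    """Encode DNA strings as a (batch, 4 channels, length) tensor."""
    lookup = {"A": 0, "C": 1, "G": 2, "T": 3}
    x = torch.zeros(len(seqs), 4, len(seqs[0]))
    for i, seq in enumerate(seqs):
        for j, base in enumerate(seq):
            x[i, lookup[base], j] = 1.0
    return x

model = nn.Sequential(
    nn.Conv1d(4, 32, kernel_size=8),   # filters act like learned motif detectors
    nn.ReLU(),
    nn.AdaptiveMaxPool1d(1),           # strongest activation of each filter along the sequence
    nn.Flatten(),
    nn.Linear(32, 1),                  # logit for, e.g., "regulatory" vs. "background"
)

# Toy batch: eight random 100-bp sequences with made-up labels.
random.seed(0)
seqs = ["".join(random.choices("ACGT", k=100)) for _ in range(8)]
labels = torch.tensor([1.0, 0.0, 1.0, 0.0, 1.0, 0.0, 1.0, 0.0]).unsqueeze(1)

optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.BCEWithLogitsLoss()

for step in range(5):                  # a few illustrative optimisation steps
    optimizer.zero_grad()
    loss = loss_fn(model(one_hot(seqs)), labels)
    loss.backward()
    optimizer.step()
    print(step, round(loss.item(), 4))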

AI Integrations within Functional Genomics

Machine learning has revolutionized the field of biology and bioinformatics, enabling researchers to analyze complex biological data, make predictions and gain deeper insights into various biological processes. In genomics, machine learning techniques have been applied to regulatory genomics, structural genomics and functional genomics. They have helped predict gene expression, classify protein structures and identify gene functions and interactions. Machine learning methods combined with natural language processing have also facilitated the analysis of large genomics-related datasets, aiding in relation extraction and named entity recognition. One of the significant applications of machine learning in genomics is genome sequencing. Next-generation sequencing techniques, empowered by machine learning algorithms, have drastically reduced the time and cost required to sequence genomes. Machine learning has also played a crucial role in gene editing processes, such as CRISPR, by assisting in the selection of the correct DNA sequence for editing. In proteomics, machine learning has contributed to the analysis of protein components, their interactions and their roles within organisms. Mass spectrometry-enabled proteomics has been enhanced by machine learning algorithms, which help identify proteins from mass spectral peaks and improve the accuracy of protein recognition. These advancements have facilitated the diagnosis of diseases and expanded our understanding of protein patterns.

Microarrays, used to detect gene expression, have benefited from machine learning techniques, particularly in gene classification and clustering. Machine learning has made it easier to identify significant interactions in complex experiments and analyze large-scale microarray datasets. It has also enabled the prediction of future gene stages and the discovery of relationships between genes and diseases. Text mining, powered by machine learning and natural language processing, has been valuable in extracting and analyzing information from biological publications. This technology enables researchers to process and analyze large volumes of documents, aiding in large-scale protein and molecule interaction analysis, translation of content into different languages, searching for drug targets and automatic annotation of gene and protein functions. In systems biology, machine learning has become instrumental in modeling complex biological interactions and behaviors. It helps capture the interactions between biological components and simulate the behavior of biological systems.
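A minimal sketch of the clustering use case follows, assuming a synthetic microarray matrix of genes by experimental conditions; hierarchical clustering on correlation distance is one common choice for expression data, though the method and cluster count here are illustrative.

```python
# Minimal sketch: clustering genes from a (synthetic) microarray expression
# matrix to group genes with similar expression patterns across conditions.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(3)
expr = rng.normal(size=(1000, 12))      # 1000 genes x 12 experimental conditions (synthetic)

# Hierarchical clustering on correlation distance, a common choice for expression profiles.
Z = linkage(expr, method="average", metric="correlation")
gene_clusters = fcluster(Z, t=5, criterion="maxclust")   # cut the tree into 5 gene clusters
```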

Machine learning techniques, such as probabilistic graphical models and genetic algorithms, have been used to model genetic networks and regulatory structures. They have also facilitated the identification of relationships between phenotypes and genotypes, shedding light on the critical genetic composition of organisms. Machine learning has transformed the field of biology and bioinformatics by providing powerful tools to analyze and interpret complex biological data, make accurate predictions and enhance our understanding of biological processes. It has accelerated research in genomics, proteomics, microarrays, text mining and systems biology, opening up a whole new avenue for discoveries and advancements in the field of biology.

Functional Genomics: A Healthcare Retrospective

Machine learning and Artificial Intelligence (AI) have made significant contributions to the healthcare industry, enhancing patient care and improving quality of life. These technologies are being used in various applications to transform healthcare delivery. One important area is drug discovery and manufacturing, where machine learning is used in the early stages to assist in finding alternative options for multifactorial disease therapy. Precision medicine and next-generation sequencing techniques have proven valuable in this process. Medical imaging and diagnosis have also benefited greatly from machine learning and AI. Computer vision technologies, powered by deep learning and machine learning algorithms, enable advanced analysis of medical images. This technology is used in applications such as tumor detection, radiology interpretation and quantitative analysis of 3D medical images. Projects like Microsoft’s InnerEye are using machine learning to improve medical image analysis and diagnosis. Personalized medicine is another promising application of machine learning in healthcare. By leveraging predictive analytics on patient data, machine learning algorithms can assist in generating personalized treatment options.

This approach goes beyond traditional diagnostic methods and takes into account individual patient characteristics, health history and genetic information. Machine learning algorithms can analyze large datasets and identify patterns that can guide personalized treatment decisions. Machine learning is also being used in stroke diagnosis and treatment. Pattern recognition algorithms help in diagnosing, treating and predicting complications in neurological diseases, including stroke. Algorithms such as Support Vector Machines (SVMs) and 3D Convolutional Neural Networks (CNNs) are used to predict motor deficits in stroke patients, aiding in personalized rehabilitation planning. In the field of biology and bioinformatics, machine learning tools have revolutionized data analysis and modeling. DeepVariant is a deep-learning tool used for genome data mining. It accurately predicts common genetic variations and provides scalable, cloud-based solutions for complex genomics datasets.
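As a hedged sketch of the SVM use case mentioned above, the following trains a support vector classifier on synthetic tabular clinical features to predict a binary outcome; the features, labels and cross-validation setup are illustrative assumptions, not derived from any cited stroke cohort.

```python
# Minimal sketch: an SVM on synthetic clinical features predicting a binary outcome.
import numpy as np
from sklearn.svm import SVC
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(4)
X = rng.normal(size=(300, 20))          # e.g., imaging-derived and clinical covariates (synthetic)
y = rng.integers(0, 2, size=300)        # hypothetical motor-deficit outcome

model = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
scores = cross_val_score(model, X, y, cv=5)
print("cross-validated accuracy:", scores.mean())
```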

Atomwise’s algorithms enable the study of the 3D structure of proteins and other molecules with atomic precision, facilitating drug discovery. CellProfiler, a software package powered by machine learning methods, allows for the quantitative measurement of individual cell features from microscopy images, enabling high-throughput analysis of biological samples. Machine learning, particularly through deep learning algorithms, extracts meaningful information from large datasets such as genomes or images and builds models based on the extracted features. These models can then be used for analysis and prediction on other biological datasets. The application of machine learning in biology and bioinformatics has accelerated research, enabling the discovery of new patterns and relationships in complex biological systems. Machine learning is transforming healthcare and biology by enabling more accurate diagnosis, personalized treatment and advanced data analysis. These technologies have the potential to revolutionize the field, leading to better patient outcomes and advancements in biological research.

AI Perspectives within Healthcare Informatics

Artificial Intelligence (AI) has revolutionized the healthcare industry by improving patient care and outcomes. AI in healthcare has the potential to transform the way we diagnose diseases, develop treatments and prevent illnesses. The use of AI technology, such as machine learning and natural language processing, has enabled medical professionals to make more accurate diagnoses, personalize treatments and streamline clinical processes. Machine learning, one of the most common AI techniques in healthcare, has facilitated medical diagnosis and treatment by processing large amounts of clinical data, identifying patterns and making predictions with higher accuracy. It has been used for precision medicine, predicting treatment success based on individual patient characteristics and detecting correlations and changes in health data that may indicate health risks. Deep learning, a subset of machine learning, has also been applied to tasks such as speech recognition and natural language processing, aiding in medical record analysis and clinical decision-making. To illustrate generative AI-enabled computing in healthcare informatics, Figure 5 provides a visual representation.

Figure 5:AI enabled healthcare visualization in terms of generative AI.


Natural Language Processing (NLP) is another AI technology transforming healthcare. NLP enables computers to interpret and use human language, allowing for improved diagnosis and accuracy, personalized treatment recommendations and streamlined clinical processes. By extracting valuable information from medical records and health data, NLP helps healthcare professionals make informed decisions and manage complex data more efficiently. Rule-based expert systems, although less prevalent today, have played a role in clinical decision support by providing sets of rules for specific knowledge areas. However, machine learning approaches are gradually replacing rule-based systems, offering more flexibility and accuracy in healthcare analytics and decision-making. AI within healthcare has diverse applications, including diagnosis and treatment, administrative tasks and data analysis. By automating administrative processes, AI reduces human error, saves time and allows medical professionals to focus on patient care. Challenges associated with AI adoption in healthcare include data privacy and security, patient safety and accuracy, integration with existing IT systems, physician acceptance and trust and compliance with regulations. Addressing these challenges is crucial to ensure ethical and responsible use of AI in healthcare.
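A minimal sketch of how NLP can turn free-text notes into features for decision support follows, using a TF-IDF bag-of-words representation with a logistic regression classifier; the notes, labels and the "urgent review" flag are invented purely for illustration.

```python
# Minimal sketch: a bag-of-words NLP pipeline that flags clinical notes,
# illustrating how free text can become features for decision support.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

notes = [
    "patient reports chest pain and shortness of breath",
    "routine follow-up, no acute complaints",
    "severe headache with photophobia and nausea",
    "annual physical, labs within normal limits",
]
labels = [1, 0, 1, 0]                   # hypothetical "needs urgent review" flag

nlp_model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
nlp_model.fit(notes, labels)
print(nlp_model.predict(["new onset chest pain at rest"]))
```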

Looking forward, AI in healthcare holds tremendous potential for further innovation and advancement.

The use of AI-powered tools and algorithms can enable faster disease detection, personalized treatments and automation of processes such as drug discovery and diagnostics. The future of AI in healthcare promises improved patient outcomes, increased safety and reduced costs. However, the successful adoption of AI in healthcare relies on overcoming challenges and ensuring collaboration between AI technologies and medical professionals. Moreover, AI has transformed healthcare by enhancing diagnosis accuracy, personalizing treatments, streamlining processes and improving patient care. As AI continues to advance, its impact on healthcare is expected to grow, leading to further advancements, better health outcomes and improved patient experiences. Artificial Intelligence (AI) is making significant strides in various clinical applications, revolutionizing healthcare practices. In the field of cardiovascular medicine, AI algorithms are being developed to aid in diagnosing and risk-stratifying patients with conditions such as coronary artery disease. Wearable devices and internet-based technologies are also being used to monitor cardiac data, enabling early detection of cardiac events outside of the hospital. AI has shown promise in dermatology for diagnosing skin cancer and classifying various skin diseases. It has achieved high accuracy levels in skin cancer detection, surpassing human dermatologists in some cases. Gastroenterology is another area where AI can enhance endoscopic procedures, allowing for faster disease identification and visualization of blind spots.

Infectious diseases are being tackled with the help of AI, with applications ranging from predicting treatment outcomes and identifying antimicrobial resistance to diagnosing diseases such as malaria, meningitis and tuberculosis. Musculoskeletal applications of AI include identifying causes of knee pain, particularly in underserved populations, to improve diagnosis and management. Oncology has seen significant progress in using AI for cancer diagnosis, risk stratification, molecular characterization of tumors and drug discovery. AI algorithms have shown promise in detecting breast cancer and prostate cancer with high accuracy rates. To illustrate the significance of these developments, Figure 6 provides a global view of AI-in-healthcare trends. Ophthalmology benefits from AI applications for screening eye diseases, with FDA approval granted for the use of AI algorithms in diagnosing diabetic retinopathy. AI can assist pathologists in analyzing digital pathology images, aiding in the diagnosis of diseases such as breast cancer, gastric cancer and colorectal cancer. Primary care is also utilizing AI for decision support, predictive modeling and business analytics to improve patient care and treatment outcomes. In psychiatry, AI is being explored for predictive modeling of diagnoses and treatment outcomes, as well as the development of chatbot therapy for conditions such as anxiety and depression.

Figure 6:AI in healthcare trends global view insights.


Figure 7:A History of AI in healthcare.


Radiology is an area where AI is making significant strides, particularly in interpreting medical imaging scans. Deep learning models have demonstrated accuracy comparable to that of human experts in identifying diseases through CT and MR imaging. AI also offers non-interpretive benefits to radiologists, such as reducing noise in images, enhancing image quality and automatically assessing image quality. Disease diagnosis and classification benefit from AI techniques, including artificial neural networks and Bayesian networks. AI-assisted diagnosis based on Electronic Health Records (EHRs) is helping physicians make more accurate diagnoses and treatment decisions by leveraging mass data and identifying similar cases. Telemedicine is another area where AI is gaining traction, enabling remote patient monitoring and providing real-time alerts to physicians based on sensor data. EHRs are being analyzed and interpreted using natural language processing, making reports more concise and standardized. AI algorithms are also used to predict disease risk based on patient records and family history. Drug interactions pose a threat to patients taking multiple medications, and AI is being utilized to identify potential drug-drug interactions by analyzing medical literature and user-generated content such as adverse event reports. For an overall view of these developments in the medical domain, Figure 7 outlines the history of AI in healthcare.

While AI holds great potential in these clinical applications, challenges remain. Validation of AI models against human performance is essential, as is addressing issues of bias, interpretability and privacy. Further research and clinical trials are needed to assess the true clinical utility of AI in various healthcare settings. Overall, AI has the potential to revolutionize clinical practices, improve patient outcomes and enhance the efficiency and accuracy of healthcare delivery. The healthcare industry is witnessing the implementation of Artificial Intelligence (AI) through the collaboration and mergers of large health companies, allowing for greater accessibility to health data. These partnerships provide a foundation for the development and integration of AI algorithms. Many companies are exploring the incorporation of big data in healthcare, focusing on data assessment, storage, management and analysis technologies. Several prominent companies have contributed to the advancement of AI algorithms in healthcare. IBM’s Watson Oncology is being developed in partnership with leading cancer centers to assist in personalized cancer treatment. Microsoft’s Hanover project analyzes medical research to predict highly effective cancer drug treatments. Google’s DeepMind platform is being used by the UK National Health Service for risk detection and cancer tissue analysis. Tencent is working on various medical systems and services, including AI-powered diagnostic imaging and intelligent healthcare through their WeChat platform.

Intel has invested in startups like Lumiata, which uses AI to identify at-risk patients and develop care options. Neuralink, founded by Elon Musk, has developed a next-generation neuroprosthetic that interfaces with neural pathways in the human brain. AI is also transforming healthcare delivery in developing nations by improving access to diagnosis and treatment. With the increasing capabilities of AI over the internet, machine learning algorithms can accurately diagnose life-threatening diseases in areas where healthcare resources are limited. AI enables a level of personalized care that is often lacking in developing countries. The regulatory landscape for AI in healthcare is evolving. Regulations such as the Health Insurance Portability and Accountability Act (HIPAA) and the European General Data Protection Regulation (GDPR) protect patient data and privacy. The US FDA has published an action plan for the regulation of medical devices incorporating AI. The US Department of Health and Human Services has issued guidance on the ethical use of AI, emphasizing principles such as respect for autonomy, beneficence, non-maleficence and justice. Similar regulations and guidelines exist in other jurisdictions, such as Denmark and the European Union, to ensure responsible data use and protect individual rights. AI is revolutionizing the healthcare industry by improving clinical decision support systems, expanding access to care and enhancing patient outcomes. Large companies are investing in AI research and development, and regulations are being developed to address ethical concerns and protect patient data. The implementation of AI in healthcare holds great promise for improving healthcare delivery, particularly in underserved areas and developing nations.

Results and Findings

This part of the research investigation presents the results and findings related to the application of Deep Learning (DL) in genomics, acknowledging both the promise and the current limitations of this approach. While DL models are considered state-of-the-art for classification and clustering with big data such as omics data, their translation to clinical practice for precision medicine is still in its early stages. The success of DL hinges on selecting the right architecture for the specific research question and data, a process that can be challenging given the variety of available DL methods (e.g., LSTMs, CapsuleNets, GANs). Figures 8-10 visualize the research results and findings, and Table 1 provides an overall overview.

Table 1:An overview of ML and DL perspectives in functional genomics.


Figure 8:A visual representation of the findings from the research results 1.


Figure 9:A visual representation of the findings from the research results 2.


Figure 10:A visual representation of the findings from the research results 3.


Limitations of DL in genomics

The findings and results identify five key limitations hindering the widespread adoption of DL in genomics.

Model interpretation (black box): DL models are often difficult to interpret, making it hard to understand the reasoning behind their predictions. This “black box” nature is a significant concern in bioinformatics, where researchers often prefer “white box” approaches that offer transparency and explainability. While explainable AI techniques are gaining traction in genomics, this remains a challenge.
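One model-agnostic way to probe such a black box is permutation importance, sketched below on a synthetic genomics-style dataset; the model, data and feature count are illustrative assumptions rather than a prescription for explainable AI in genomics.

```python
# Minimal sketch: permutation importance as one way to peek inside a
# "black box" genomics classifier. Data are synthetic placeholders.
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.inspection import permutation_importance
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(5)
X = rng.normal(size=(400, 100))                       # 400 samples x 100 gene features
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.5, size=400) > 0).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
clf = GradientBoostingClassifier(random_state=0).fit(X_tr, y_tr)

# Shuffle each feature on held-out data and measure the drop in performance.
result = permutation_importance(clf, X_te, y_te, n_repeats=10, random_state=0)
top = np.argsort(result.importances_mean)[::-1][:5]   # most influential features
print("top features:", top)
```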

Curse of dimensionality: Omics data, despite being large in volume, often suffers from the “curse of dimensionality.” This means there are many more variables (e.g., genes) than samples (e.g., patients), making it difficult to train robust DL models. While public data repositories offer a potential solution, combining datasets requires extensive preprocessing and harmonization.
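A minimal sketch of one common mitigation follows, assuming a synthetic "wide" matrix with far more genes than samples: low-variance genes are filtered out and the remainder is projected onto a small number of principal components before any model is fitted.

```python
# Minimal sketch: shrinking a wide omics matrix (many more genes than samples)
# via variance filtering followed by PCA, before model fitting.
import numpy as np
from sklearn.feature_selection import VarianceThreshold
from sklearn.decomposition import PCA

rng = np.random.default_rng(6)
X = rng.normal(size=(80, 20000))        # 80 patients x 20,000 genes (synthetic)

X_filtered = VarianceThreshold(threshold=0.5).fit_transform(X)   # drop near-constant genes
X_reduced = PCA(n_components=20, random_state=0).fit_transform(X_filtered)
print(X.shape, "->", X_reduced.shape)   # final shape: (80, 20)
```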

Imbalanced classes: Genomics datasets, particularly those used for classification tasks (e.g., disease vs. healthy), are frequently imbalanced. DL models struggle with imbalanced classes, requiring a sufficient number of instances per class for effective training. Transfer learning, where models are pre-trained on larger, more balanced datasets, is a potential strategy to address this.
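A minimal sketch of class re-weighting as one mitigation for imbalance follows, assuming a synthetic dataset with 5% disease cases; the weighting scheme shown is a standard scikit-learn option, not the only remedy (resampling and the transfer learning mentioned above are alternatives).

```python
# Minimal sketch: handling an imbalanced disease-vs-healthy dataset with
# class weighting; sample counts are illustrative placeholders.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.utils.class_weight import compute_class_weight

rng = np.random.default_rng(7)
X = rng.normal(size=(1000, 50))
y = np.array([1] * 50 + [0] * 950)      # 5% disease cases, 95% healthy controls

weights = compute_class_weight(class_weight="balanced", classes=np.array([0, 1]), y=y)
print("per-class weights:", dict(zip([0, 1], weights)))

# class_weight="balanced" up-weights the minority class during training.
clf = LogisticRegression(class_weight="balanced", max_iter=1000).fit(X, y)
```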

Data heterogeneity: Genomics data is inherently heterogeneous, encompassing various types of information, including gene/transcript sequencing, gene expression profiles, gene variants, genome alterations and gene interactions. Integrating these diverse data types is challenging due to the complex interdependencies between them. While bioinformatics tools exist for analyzing individual data types, combining them effectively remains a significant obstacle.
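As a hedged sketch of the simplest integration strategy, early fusion, the example below standardizes three synthetic omics blocks and concatenates them into one feature matrix for a single classifier; block sizes and labels are invented, and this deliberately ignores the cross-layer dependencies discussed above.

```python
# Minimal sketch: "early fusion" of heterogeneous omics layers by standardizing
# each block and concatenating features before a single classifier.
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(8)
expression = rng.normal(size=(150, 500))          # gene expression block (synthetic)
methylation = rng.uniform(size=(150, 300))        # DNA methylation block (synthetic)
mirna = rng.normal(size=(150, 100))               # miRNA expression block (synthetic)
y = rng.integers(0, 2, size=150)                  # hypothetical cancer subtype label

blocks = [StandardScaler().fit_transform(b) for b in (expression, methylation, mirna)]
X_fused = np.hstack(blocks)                       # one feature matrix across omics layers
clf = LogisticRegression(max_iter=1000).fit(X_fused, y)
```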

Parameter and hyperparameter tuning: Tuning DL models, particularly setting parameters and hyperparameters like learning rate, batch size, momentum and weight decay, is a complex and crucial step. Incorrect settings can lead to underfitting or overfitting. This tuning process often requires careful analysis of initial results and is specific to the dataset and research question.
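A minimal sketch of systematic hyperparameter search with cross-validation follows, shown for an SVM via grid search on synthetic data; the grid, model and data are illustrative, and DL tuning in practice involves additional knobs such as learning rate, batch size, momentum and weight decay.

```python
# Minimal sketch: grid search with cross-validation over a small hyperparameter grid.
import numpy as np
from sklearn.model_selection import GridSearchCV
from sklearn.svm import SVC

rng = np.random.default_rng(9)
X = rng.normal(size=(200, 30))
y = rng.integers(0, 2, size=200)

param_grid = {"C": [0.1, 1, 10], "gamma": ["scale", 0.01, 0.001]}
search = GridSearchCV(SVC(kernel="rbf"), param_grid, cv=5)
search.fit(X, y)
print("best settings:", search.best_params_)
```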

Future directions

The perspectives outlined above highlight promising future directions for DL in genomics.

Multilayer and multi-scale modeling: Inspired by systems biology, multi-layer models are being explored to integrate heterogeneous omics data and capture the complexity of biological systems. Multi-scale dynamic modeling, while challenging, aims to model the human body as a single complex system. DL’s ability to handle multimodal data makes it well-suited for this approach, particularly in the context of precision medicine.

Automated preprocessing: DL has the potential to automate traditionally manual and error-prone preprocessing steps in genomics data analysis. By directly feeding raw data to DL models, the models can learn relevant features, potentially increasing predictive power.
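A minimal sketch of this raw-input idea follows, assuming randomly generated one-hot-encoded DNA sequences with hypothetical regulatory labels: a 1D convolutional network learns its own sequence features without hand-crafted preprocessing.

```python
# Minimal sketch: feeding raw, one-hot-encoded DNA sequence directly to a 1D CNN
# so the network learns its own sequence features; data are random placeholders.
import numpy as np
import tensorflow as tf
from tensorflow.keras import layers

rng = np.random.default_rng(10)
n_seqs, seq_len = 256, 200
X = np.eye(4, dtype="float32")[rng.integers(0, 4, size=(n_seqs, seq_len))]  # one-hot A/C/G/T
y = rng.integers(0, 2, size=n_seqs)                 # hypothetical regulatory label

model = tf.keras.Sequential([
    layers.Input(shape=(seq_len, 4)),
    layers.Conv1D(32, kernel_size=12, activation="relu"),   # motif-like filters
    layers.GlobalMaxPooling1D(),
    layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.fit(X, y, epochs=3, batch_size=32, verbose=0)
```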

Addressing data challenges: Researchers are working on developing DL models that can handle the specific characteristics of genomics data, such as the limited size of some datasets and the heterogeneity of data types. For example, while text mining architectures might seem applicable to SNP analysis, they currently struggle with the vast number of SNPs in the human genome.

Translating research to clinical practice: The translation of genomics research findings into clinical tools remains slow. This is partly due to challenges in validation, standardization and the fragmented nature of genomics research. However, regulatory bodies such as the FDA are exploring frameworks for computational technologies that could accelerate this process while maintaining safety and effectiveness.

Explainable AI for genomics: The need for explainable AI in genomics is emphasized, particularly for integrating heterogeneous data types. Models that can integrate such data but are difficult to interpret are less useful for clinical applications. The development of explainable AI methods for genomics remains a critical area of research.

Discussion

Research in the field of AI and healthcare has been crucial in developing the technology we have today. It has led to breakthroughs in areas such as medical imaging, drug development and personalized medicine. AI algorithms can analyze vast amounts of data, including patient records and clinical trials, to identify patterns and make accurate predictions, enhancing diagnoses and treatment plans. Additionally, the development of natural language processing algorithms has enabled the analysis of unstructured data, such as doctors’ notes and patient records, to extract vital information and trends, further improving medical decisions. Wearable devices are another outcome of AI research in healthcare. These devices can monitor a patient’s health remotely, collecting data on vital signs and transmitting it to healthcare professionals in real-time. This facilitates early intervention and improves patient outcomes significantly. Taken together, AI has the potential to revolutionize healthcare by providing faster and more accurate diagnoses, enhancing treatment plans and improving patient care and safety. Ongoing research in the field of AI and healthcare will continue to push the boundaries, leading to further advancements and a healthier world. The future of AI in healthcare looks promising, and its applications are poised to make a significant positive impact on the industry.

Conclusion

Accelerated computing has fundamentally changed how we view and visualize information. Despite the rapid development and deployment of data devices, peripherals and AI integration, significant concerns remain wherever human health is involved. The more capable the system, the more complex the scenarios and outcomes become. As a civilization, we continue to evolve, and computing technology keeps drawing our interest further into the unknown. But these questions cannot be ignored when they concern life and death. Proper ethics and moral integrity are crucial in a context like human health. Without careful oversight and authoritative guidelines, even the best systems can produce serious errors, and when such errors concern health they become matters of life and death. The COVID-19 pandemic was a prime example of how devastating the consequences can be for human lives on an enormous scale. In the years ahead, both engineering and medical science will undergo wide-ranging shifts that will change, to a great extent, how humans deal with machinery and technology.

Remarkable innovations and extraordinary achievements will follow, but amid this progress we must not lose sight of what is truly needed in terms of health. As the field approaches its peak, even unorthodox questions may arise concerning human mortality and whether paths beyond today’s limits could become possible. There is therefore much to consider from a diversity of perspectives. Above all, one thing must be understood: every human being is liable to error. We close this retrospective on that note and leave the rest to time and to human society in the years to come.

Supplementary Information

Some of the original data sources are not publicly available because they contain private information. The platform-provided data sources that support the findings of this research investigation are referenced where appropriate.

Acknowledgment

The authors would like to acknowledge and thank Google DeepMind and its associated preprint access platforms. This research exploration was investigated using the platform provided by Google DeepMind, with support from Google Research, the Google Research Publications, and the Google Gemini platform. Their datasets, database-associated files and digital software layouts, including free web access to a large collection of recorded models available through research access and related open-source software distributions, formed the implementation basis for the research exploration undertaken here. Many data sources were also resourced and retrieved from a wide variety of Google service domains. All data sources included and retrieved for this research are identified, mentioned and referenced where appropriate.

References

  1. Akhtar ZB, Rozario VS (2025) AI perspectives within computational neuroscience: EEG integrations and the human brain. Artificial Intelligence and Applications 3(2): 145-160.
  2. Akhtar ZB (2025) Exploring AI for pain research management: A deep dive investigative exploration. J Pain Res Manag 1(1): 28-42.
  3. Zarif BA (2025) Artificial intelligence within medical diagnostics: A multi-disease perspective. Artificial Intelligence in Health.
  4. Akhtar ZB (2024) Generative Artificial Intelligence (GAI): From Large Language Models (LLMs) to multimodal applications towards fine tuning of models, implications, investigations. Computing and Artificial Intelligence 3(1): 1498.
  5. Zarif BA (2024) Computer vision and beyond: A deep dive exploration and investigation. Trends Tech Sci Res 7(3): 555711.
  6. Akhtar ZB (2024) Unveiling the evolution of Generative AI (GAI): A comprehensive and investigative analysis toward LLM models (2021-2024) and beyond. Journal of Electrical Systems and Inf Technol 11(22):
  7. Akhtar ZB (2024) The design approach of an Artificial Intelligent (AI) medical system based on Electronical Health Records (EHR) and priority segmentations. J Eng 2024(4): e12381.
  8. Zarif BA (2023) Accelerated computing a biomedical engineering and medical science perspective. Annals of the Academy of Romanian Scientists Series on Biological Sciences 12(2): 138-164.
  9. Weerarathna IN, Kumar P, Luharia A, Mishra G (2024) Engineering with biomedical sciences changing the horizon of healthcare-a review. Bioengineered 15(1): 2401269.
  10. Mishra A, Omoyeni T, Singh PK, Anandakumar S, Tiwari A (2024) Trends in sustainable chitosan-based hydrogel technology for circular biomedical engineering: A review. International Journal of Biological Macromolecules 276(1): 133823.
  11. Ngiejungbwen LA, Hamdaoui H, Chen MY (2024) Polymer optical fiber and fiber bragg grating sensors for biomedical engineering applications: A comprehensive review. Optics and Laser Technology 170: 110187.
  12. Khan NR, Sharmin T, Rashid AB (2024) Exploring the versatility of aerogels: Broad applications in biomedical engineering, astronautics, energy storage, biosensing and current progress. Heliyon 10(1): e23102.
  13. Li X, Wang S, Zheng M, Ma Z, Chen Y, et al. (2024) Synergistic integration of MXene nanostructures into electrospun fibers for advanced biomedical engineering applications. Nanoscale Horizons 9(10): 1703-1724.
  14. Wu C, Wan B, Entezari A, Fang J, Xu Y, et al. (2024) Machine learning-based design for additive manufacturing in biomedical engineering. International Journal of Mechanical Sciences 266: 108828.
  15. Akhtar ZB, Stany RV (2020) The design approach of an artificial human brain in digitized formulation based on machine learning and neural mapping. 2020 International Conference for Emerging Technology (INCET) pp. 1-7.
  16. (2016) Convention on biological diversity.
  17. Dehchani AJ, Jafari A, Shahi F (2024) Nanogels in biomedical engineering: Revolutionizing drug delivery, tissue engineering and bioimaging. Polymers for Advanced Technologies 35(10): e6595.
  18. (2023) Systems biology. Britannica.
  19. Mariatheresa C, Kelvin A, Adelodun MO, Igwama GT, Anyanwu EC (2024) Advancements in biomedical device implants: A comprehensive review of current technologies. International Journal of Frontiers in Medicine and Surgery Research 6(1):
  20. Islam A, Seth S, Bhadra T, Mallik S, Roy A, et al. (2024) Feature selection, clustering and IoMT on biomedical engineering for COVID-19 pandemic: A comprehensive review. Journal of Data Science and Intelligent Systems 2(4): 191-204.
  21. Wang H, Mayhew D, Chen X, Johnston M, Mitra RD (2011) Calling cards enable multiplexed identification of the genomic targets of DNA-binding proteins. Genome Research 21(5): 748-755.
  22. Kwasnieski JC, Fiore C, Chaudhari HG, Cohen BA (2014) High-throughput functional testing of ENCODE segmentation predictions. Genome Research 24(10): 1595-1602.
  23. Arnold CD, Gerlach D, Stelzer C, Boryń ŁM, Rath M, et al. (2013) Genome-wide quantitative enhancer activity maps identified by STARR-seq. Science 339(6123): 1074-1077.
  24. (2025) Unveiling the Evolution of Generative AI (GAI). Elivabooks.
  25. Hart T, Chandrashekhar M, Aregger M, Steinhart Z, Brown KR, et al. (2015) High-resolution CRISPR screens reveal fitness genes and genotype-specific cancer liabilities. Cell 163(6): 1515-1526.
  26. Gilbert LA, Horlbeck MA, Adamson B, Villalta JE, Chen Y, et al. (2014) Genome-scale CRISPR-mediated control of gene repression and activation. Cell 159(3): 647-661.
  27. Battle A, Brown CD, Engelhardt BE, Montgomery SB (2017) Genetic effects on gene expression across human tissues. Nature 550(7675): 204-213.
  28. (2018) GTEx creates a reference data set to study genetic changes and gene expression. National Institutes of Health: Office of Strategic Coordination-The Common Fund.
  29. Li Y, Shi W, Wasserman WW (2018) Genome-wide prediction of cis-regulatory regions using supervised deep learning methods. BMC Bioinformatics 19(1): 202.
  30. Diss G, Lehner B (2018) The genetic landscape of a physical interaction. Elife 7: e32472.
  31. Mardis ER (2008) Next-generation DNA sequencing methods. Annual Review of Genomics and Human Genetics 9: 387-402.
  32. Nirenberg M, Leder P, Bernfield M, Brimacombe R, Trupin J, et al. (1965) RNA codewords and protein synthesis, VII. On the general nature of the RNA code. Proceedings of the National Academy of Sciences of the United States of America 53(5): 1161-1168.
  33. Fiers W, Contreras R, Haegemann G, Rogiers R, Van VA, et al. (1978) Complete nucleotide sequence of SV40 DNA. Nature 273(5658): 113-120.
  34. Buchanan BG, Shortliffe EH (1984) Rule-based expert systems: The MYCIN experiments of the Stanford heuristic programming project. pp. 754.
  35. Duda RO, Shortliffe EH (1983) Expert systems research. Science 220(4594): 261-268.
  36. Miller RA (1994) Medical diagnostic decision support systems-past, present and future: A threaded bibliography and brief commentary. Journal of the American Medical Informatics Association 1(1): 8-27.
  37. Baxt WG (1991) Use of an artificial neural network for the diagnosis of myocardial infarction. Annals of Internal Medicine 115(11): 843-848.
  38. Koomey J, Berard S, Sanchez M, Wong H (2010) Implications of historical trends in the electrical efficiency of computing. IEEE Annals of the History of Computing 33(3): 46-54.
  39. Jha AK, Catherine MD, Campbell EG, Donelan K, Rao SR, et al. (2009) Use of electronic health records in US hospitals. The New England Journal of Medicine 360(16): 1628-1638.
  40. Dougherty G (2009) Digital image processing for medical applications. Cambridge University Press, Cambridge, United Kingdom.
  41. (2020) Artificial intelligence and machine learning for healthcare. Sigmoidal.
  42. Power B (2015) Artificial intelligence is almost ready for business. Harvard Business Review.
  43. Bahl M, Barzilay R, Yedidia AB, Locascio NJ, Yu L, et al. (2018) High-risk breast lesions: A machine learning model to predict pathologic upgrade and reduce unnecessary surgical excision. Radiology 286(3): 810-818.
  44. Bloch BS (2016) NHS using google technology to treat patients. BBC News.
  45. Wang H, Zu Q, Chen J, Yang Z, Ahmed MA (2021) Application of artificial intelligence in acute coronary syndrome: A brief literature review. Advances in Therapy 38(10): 5078-5086.
  46. Infante T, Cavaliere C, Punzo B, Grimaldi V, Salvatore M, et al. (2021) Radiogenomics and artificial intelligence approaches applied to cardiac computed tomography angiography and cardiac magnetic resonance for precision medicine in coronary heart disease: A systematic review. Circulation Cardiovascular Imaging 14(12): 1133-1146.
  47. Sotirakos S, Fouda B, Mohamed RN, Cribben N, Mulhall C, et al. (2022) Harnessing artificial intelligence in cardiac rehabilitation, a systematic review. Future Cardiology 18(2): 154-164.
  48. Chen W, Sun Q, Chen X, Xie G, Wu H, et al. (2021) Deep learning methods for heart sounds classification: A systematic review. Entropy 23(6): 667.
  49. Chan S, Reddy V, Myers B, Thibodeaux Q, Brownstone N, et al. (2020) Machine learning in dermatology: Current applications, opportunities and limitations. Dermatology and Therapy 10(3): 365-386.
  50. (2020) COVID-19 pandemic impact: Global R&D spend for AI in healthcare and pharmaceuticals will increase US $1.5 billion by 2025.
  51. Joshi A, Kumar A, Kaushik V (2024) Functional genomics and network biology. Advances in Bioinformatics pp. 71-96.
  52. Ye Q, Zhou C, Lin H, Luo D, Jain D, et al. (2024) Medicago2035: Genomes, functional genomics and molecular breeding. Molecular Plant 18(2): 219-244.
  53. Mahgoub EO, Cho WC, Sharifi M, Falahati M, Zeinabad HA, et al. (2024) Role of functional genomics in identifying cancer drug resistance and overcoming cancer relapse. Heliyon 10(1): e22095.
  54. Ye P, Bai W, Ren Y, Li W, Qiao L, et al. (2024) Genomics-FM: Universal foundation model for versatile and data-efficient functional genomic analysis.
  55. (2025) Exploring the role of molecular engineering in regenerative medicine.

© 2025 Zarif Bin Akhtar. This is an open access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and building upon the work non-commercially.
