Blog

Mechanistic Approach in Engineering Problem Solving

Prabhakaran Veeramani, Lead Project Engineer

Engineering problems can be highly complex. While some simpler problems can be solved through analytical calculations, the advent of computational technology gave rise to Finite Element Modeling (FEM), which uses numerical methods to solve complex problems that were previously intractable with basic mathematical computation.

Mechanistic modeling is a fusion of modern data science techniques and mathematical science employed to tackle complex engineering problems.

The following illustration provides a clearer explanation:

illustration

To build a Mechanistic model, the following steps should be taken:

Step 1: Identify the problem to be solved.
Step 2: Perform a root cause analysis to determine the underlying factors contributing to the problem.
Step 3: Identify the major influencing factors among the contributing factors.
Step 4: Extract mechanistic features, in the form of a dataset, corresponding to the major influencing factors.
Step 5: Perform exploratory data analysis to gain insights into the data.
Step 6: Perform feature engineering to prepare the data for machine learning models.
Step 7: Build and evaluate appropriate machine learning models using the features extracted with the underlying physics in mind.
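The steps above can be sketched in a few lines of Python. This is a minimal illustration, not a production pipeline: the physics relationship (deflection driven by stress, where stress = load / area) and all numbers are invented for the example, and an ordinary least-squares fit stands in for a full machine learning model:

```python
import random

# Step 4: mechanistic features. A synthetic dataset where a physics-informed
# feature (stress = load / cross-sectional area) drives the response.
random.seed(0)
rows = []
for _ in range(100):
    load = random.uniform(1.0, 10.0)   # design input
    area = random.uniform(0.5, 2.0)    # process input
    stress = load / area               # mechanistic feature (Steps 4 and 6)
    deflection = 0.8 * stress + random.gauss(0.0, 0.1)
    rows.append((stress, deflection))

# Step 5: quick exploratory summary of the feature.
stresses = [x for x, _ in rows]
print("stress range:", round(min(stresses), 2), "to", round(max(stresses), 2))

# Step 7: fit y = a*x + b by least squares on the mechanistic feature
# rather than on the raw design and process inputs.
n = len(rows)
sx = sum(x for x, _ in rows)
sy = sum(y for _, y in rows)
sxx = sum(x * x for x, _ in rows)
sxy = sum(x * y for x, y in rows)
a = (n * sxy - sx * sy) / (n * sxx - sx * sx)
b = (sy - a * sx) / n
print("fitted slope:", round(a, 2))  # recovers the true slope of 0.8
```

Because the model is fit on a feature with physical meaning, the fitted slope can be checked against the known mechanics, which is exactly the reliability argument made below.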

Since the machine learning models are built with features extracted from the underlying physics, a mechanistic model is more reliable than a purely statistical one. System characterization can be classified into three categories: single-system characterization, homogeneous multi-system characterization, and heterogeneous multi-system characterization.

The major advantage of mechanistic modeling is its ability to capture characterization from different forms of inputs, such as design inputs, process inputs, and manual inputs, in engineering problems. In a Finite Element Analysis (FEA), it is nearly impossible to study the combined effect of all these kinds of inputs; mechanistic modeling, however, can capture this effect in the form of a data model.

Graph

Another advantage of mechanistic modeling is that, unlike traditional methods, the scientific efforts required are concentrated primarily in the initial stages of the product or project. As the project progresses, the scientific effort required significantly reduces. In contrast, traditional methods require constant effort throughout the product or project lifetime. Thus, mechanistic modeling is a powerful approach for solving complex engineering problems.

Read more...

Quantum Chemistry, AI and Drug Development

Dr. Dhatchana Moorthy, Director, EinNext Biosciences

Computational chemistry and quantum chemistry are scientific fields that use computer simulations and calculations to study molecules and materials at the atomic and molecular levels. Computational chemistry predicts chemical systems’ properties and behaviour, while quantum chemistry applies quantum mechanics to understand the electronic structure and properties of molecules. These fields play a crucial role in drug discovery, material science, and environmental science by providing insights into the fundamental principles that govern chemical phenomena.

Computational chemistry is essential in drug development, as it helps scientists model the behaviour of solvents and active pharmaceutical ingredients (APIs) at a quantum level. Understanding the quantum properties of these molecules is crucial for designing safe, effective drugs with the desired properties. Computational chemistry is also crucial in developing new drug formulations and optimizing existing ones by improving the solubility and stability of drugs.

However, quantum chemistry can be complex, making it challenging for bench-level research scientists to apply it to drug solubility problems. To alleviate this challenge, a user-friendly dashboard interface with quantum chemical calculations running in the background can provide a comprehensive view of a drug’s molecular properties and thermodynamic parameters, making it easier to predict and compare the solubility of different drugs in a solvent.

UI

A screenshot of a user interface prototype for calculating the total activity of Aspirin in different solvent mixtures.

EinNext Biosciences offers customized dashboard solutions powered by advanced AI/ML techniques to cater to various industries such as pharmaceuticals, chemicals, paints, perfumeries, petrochemicals, and distilleries. These solutions are built using Python frameworks, task management libraries, and a broker service that employs implicit solvation models to compute quantum chemical properties. The dashboards display valuable insights, such as the activity and partition coefficients of a given API in a solvent mixture, presented through visually engaging data visualizations.
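The broker-and-worker pattern behind such a dashboard can be sketched with Python's standard library. This is a schematic stand-in only: an in-process queue plays the role of a real broker service, and `compute_partition_coefficient` with its lookup table is a hypothetical placeholder for an implicit solvation calculation; the value shown is invented, not a real property of aspirin:

```python
import queue
import threading

# Hypothetical placeholder for the quantum chemical step. In a real system
# this would invoke an implicit solvation model, not a lookup table.
HYPOTHETICAL_LOGP = {("aspirin", "water/ethanol"): -1.2}

def compute_partition_coefficient(api, solvent_mix):
    return HYPOTHETICAL_LOGP.get((api, solvent_mix), 0.0)

tasks = queue.Queue()   # stand-in for the broker service
results = {}            # stand-in for the dashboard's data store

def worker():
    # Pull jobs from the queue until a None sentinel arrives.
    while True:
        job = tasks.get()
        if job is None:
            break
        api, solvent_mix = job
        results[(api, solvent_mix)] = compute_partition_coefficient(api, solvent_mix)
        tasks.task_done()

t = threading.Thread(target=worker)
t.start()
tasks.put(("aspirin", "water/ethanol"))  # a dashboard request
tasks.join()                             # wait for the calculation
tasks.put(None)                          # shut the worker down
t.join()
print(results)
```

In production the queue would be replaced by a message broker and the worker pool would run the actual quantum chemical calculations, but the request/compute/display flow is the same.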

Our EinNext Biosciences team has recently completed a few projects for our clients, and it is evident that our dashboard solutions will drastically reduce the time and cost associated with standardizing solutions and formulations.

Read more...

National Symposium on Health Data and AI

Prof. Dr. Andre Dekker with EinNext Biosciences team

Our team from EinNext Biosciences participated in the National Symposium on Health Data and AI held on March 17-18, 2023, at Christian Medical College, Vellore, a premier medical institution in Tamil Nadu, India. The symposium brought together experienced physicians, researchers, academicians, and policymakers to discuss and debate the emerging role of AI in healthcare and medical sciences. The opening speaker, Dr. Andre Dekker, a Professor of Clinical Data Science and Medical Physicist from the Netherlands, spoke about the strengths and limitations of AI. “AI is not intelligent,” he said; it is completely dependent on its training. Replying to a question, he strongly advocated the inclusion of AI in medical education.

Prof. Balaraman Ravindran from IIT-M delivered an enlightening talk on demystifying AI and its applications in healthcare.

Dr. John Oommen, an alumnus of CMC and a Community Health Physician in the state of Odisha, spoke about the digital divide between India (the haves) and Bharat (the have-nots). He emphasized the need to carefully navigate the ethical implications of AI technologies.

National Symposium

National Symposium on Health Data and AI

Ensuing sessions covered topics such as telehealth, AI for better health, bringing pipelines together for AI-based research, network and cyber security in hospitals, and the legislative implications of health data management. The conference featured speakers from reputed universities and hospitals across the globe. As a leading bioscience team working with AI, we used the symposium as an opportunity to establish relationships with medical institutions and universities. While networking with the physicians, we gained valuable insights into the medical fraternity's need for AI-based solutions to deal with lifestyle disorders, cancer, and infectious diseases. The conference also served as a stark reminder of the ethical considerations and responsibilities that arise from harnessing the immense power of artificial intelligence (AI).

Read more...

Success stories

Behavior Imaging, a company based in Boise, Idaho, uses a system called the Naturalistic Observation Diagnostic Assessment. In the privacy and comfort of their own homes, families use a smartphone app to capture and upload videos of their child’s behaviors in specified situations.

Clinicians watch the videos to make remote diagnoses. More recently, the company has started training AI algorithms to observe and categorize behaviors. Although the algorithms would not diagnose the children themselves, they could point clinicians to specific behaviors that might otherwise have been missed.

Another example of AI-aided diagnosis is an autism screening tool created by Cognoa in Palo Alto, California. The tool uses clinically validated artificial intelligence (AI) technology to aid physicians in diagnosing ASD in children between the ages of 18 and 72 months who are at risk of developmental delay.

AI thus reduces the clinician's workload and speeds up the ASD diagnosis pipeline. We can conclude that AI is of great value in ASD research and that the challenging problems related to ASD are ripe for the application of AI/ML technologies. EinNext R&D aims to develop promising screening tools to help reduce the time and cost required for a diagnosis of ASD.

Read more...

Artificial Intelligence in ASD

EinNext Biosciences

Autism Spectrum Disorders (ASD) comprise a group of neurodevelopmental abnormalities that begin in early childhood, characterized by difficulty with social communication and interaction, restricted interests, and repetitive behaviors. According to the latest international data, the prevalence of ASD has risen from 1 in 150 in 2000 to 1 in 36 in 2017, making it the leading cause of disability in children. With an estimated 2% of children in the US diagnosed with ASD, researchers should investigate treatment methods, intensity, duration, and outcomes.

The Challenge:

Although the exact cause of ASD remains unclear, current studies suggest that it may be associated with genetic factors, abnormal brain structure, and environmental factors. Since autism spectrum disorder varies widely in symptoms and severity, and there is no specific medical test for the disorder, diagnosing ASD can be difficult. Healthcare providers diagnose the condition based on standardized assessments of the patient’s history and behavior. Although ASD can be diagnosed as early as 15-18 months of age, many children do not receive a final diagnosis until adolescence or adulthood.

Early diagnosis and intervention are imperative for timely, effective treatment to minimize symptoms and to improve long-term outcomes related to cognition, language, adaptive behavior, daily living skills, and social behavior for children with ASD. Additionally, accurate identification is challenging because ASD is often enmeshed with other neurodevelopmental disorders and medical comorbidities.

AI in Autism Research:

Existing solutions for diagnosing ASD are both resource- and cost-intensive, and artificial intelligence (AI) offers a potential way forward. Autism spectrum disorder research has yet to leverage big data on the same scale as other fields, but advancements in easy, affordable data collection and analysis may soon make this a reality.

The high prevalence rate and heterogeneous nature of ASD have led some researchers to turn to machine learning over traditional statistical methods for data analysis. In the last decade, research on AI in autism has increased remarkably. The availability of machine learning toolkits and frameworks, such as Hadoop, TensorFlow, Spark, and R, has created unique opportunities for researchers to leverage machine learning algorithms.

Hotspots and Research Fronts:

To improve screening and diagnosis, machine learning algorithms such as support vector machines (SVMs) have been used in ASD research. These models have been shown to improve the accuracy of diagnoses and provide insight into how different characteristics (such as standardized assessments, eye movement data, upper limb and general kinesthetic data, and neuroimaging data) can aid in the differential diagnosis of ASD. Standardized assessments currently in use carry a potential for misdiagnosis, particularly when distinguishing one disorder from another. Machine learning procedures can discriminate individuals with ASD from individuals with attention deficit hyperactivity disorder (ADHD) with a high level of accuracy.
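As an illustration of the kind of classifier involved, a linear SVM can be trained by subgradient descent on the hinge loss. This is a sketch only: the two-dimensional "assessment scores" below are synthetic, clearly separable stand-ins invented for the example, not clinical data, and real studies use far richer features:

```python
import random

random.seed(1)

# Synthetic 2-D "assessment scores" for two well-separated groups.
data = []
for _ in range(50):
    data.append(([random.gauss(1.0, 0.3), random.gauss(1.0, 0.3)], 1))    # group A
    data.append(([random.gauss(-1.0, 0.3), random.gauss(-1.0, 0.3)], -1)) # group B

# Linear SVM: minimize hinge loss plus an L2 penalty by subgradient descent.
w, b = [0.0, 0.0], 0.0
lr, lam = 0.01, 0.01
for _ in range(200):
    for x, y in data:
        margin = y * (w[0] * x[0] + w[1] * x[1] + b)
        if margin < 1:  # point inside the margin: hinge loss is active
            w[0] += lr * (y * x[0] - lam * w[0])
            w[1] += lr * (y * x[1] - lam * w[1])
            b += lr * y
        else:           # outside the margin: only the penalty term acts
            w[0] -= lr * lam * w[0]
            w[1] -= lr * lam * w[1]

correct = sum(1 for x, y in data
              if y * (w[0] * x[0] + w[1] * x[1] + b) > 0)
accuracy = correct / len(data)
print("training accuracy:", accuracy)
```

In practice one would use a tested library implementation with cross-validation rather than this hand-rolled loop; the sketch only shows the mechanics of the margin-based decision boundary.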

Machine learning has also been applied to neuroimaging data. Support vector regression (SVR) and ENet-penalized linear regression have been used to study the symptom severity of individuals with ASD based on cortical thickness, and machine learning models, including random forests (RF), have been implemented to analyze neuroimaging data for diagnostic classification.

Longitudinal data captured at multiple points in development (8 and 14 months of age) for high-risk siblings has been used to improve the accuracy of predicting an ASD diagnosis at 36 months. Deep learning has been used to study predictors of challenging behavior and to analyze neuroimaging in individuals with ASD.

Electronic health records of decedents with ASD have been used to build a random forest classifier examining the lifetime health problems of those with ASD, while the impact of parental age on the risk of developing ASD has been evaluated using logistic regression.

Machine learning has also been applied in ASD genetics research, allowing researchers to determine which genes are related to ASD. These findings highlight the potential of machine learning to improve understanding of the role genes play in the development of ASD.

Read more...