Applied AI/ML with Deep Learning for Earth Observation Data Analysis
Over the last two years, I have been dedicated to promoting the adoption of AI among scientists at Plymouth Marine Laboratory (PML), a globally renowned marine research institution. PML's core research programme focuses on critical areas such as climate change, marine pollution, and sustainability. In a research study published in July 2023, I developed Machine Learning approaches to identify the primary factors influencing the phytoplankton carbon-to-chlorophyll ratio in the Atlantic Basin. These methods provided valuable insights into the environmental factors driving phytoplankton carbon and chlorophyll levels, as illustrated in the study's figures.
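The study's actual methods are not detailed here, but the idea of ranking environmental drivers can be sketched in a few lines. The snippet below uses synthetic data with illustrative variable names (sea-surface temperature, light, nitrate are assumptions, not the study's real features) and ranks them by absolute Pearson correlation with the target ratio, a crude first pass before fitting a proper ML model:

```python
import numpy as np

rng = np.random.default_rng(42)

# Synthetic stand-in: environmental drivers vs. a target ratio.
# Variable names are illustrative, not the study's actual features.
n = 500
sst = rng.normal(size=n)       # sea-surface temperature (scaled)
light = rng.normal(size=n)     # irradiance (scaled)
nitrate = rng.normal(size=n)   # nutrient concentration (scaled)
ratio = 0.8 * sst + 0.3 * light + 0.05 * nitrate + rng.normal(scale=0.2, size=n)

# Rank drivers by absolute correlation with the target.
features = {"sst": sst, "light": light, "nitrate": nitrate}
ranking = sorted(
    features,
    key=lambda name: abs(np.corrcoef(features[name], ratio)[0, 1]),
    reverse=True,
)
print(ranking)
```

In practice a tree-based model with feature importances would capture non-linear drivers that plain correlation misses; this only illustrates the shape of the question.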
ML and DL Solutions for Banking Anomaly Detection
In the dynamic realm of banking, effective anomaly detection is paramount. Drawing on state-of-the-art ML algorithms and advanced Deep Learning techniques, I tackled this challenge at Deutsche Bank, a globally recognized leader in financial services. The solution involved several critical stages:
Identifying relevant data from various sources.
Researching and testing optimal Machine Learning algorithms for tabular data analysis.
Exploring and validating appropriate Deep Learning models for extracting insights from textual data (Natural Language Processing).
Integrating the resulting models into the rigorous banking environment, complete with all necessary software engineering components: Python, Object-Oriented Programming, Algorithm Design and Complexity, Deployment, and more.
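The specific algorithms used at the bank are not named above, so as a minimal illustration of anomaly detection on tabular data, the sketch below flags rows that deviate strongly from the column means (a simple z-score rule; the data and threshold are invented for the example):

```python
import numpy as np

def zscore_anomalies(X, threshold=3.0):
    """Flag rows that exceed `threshold` standard deviations
    from the column mean in any feature (illustrative only)."""
    X = np.asarray(X, dtype=float)
    mu = X.mean(axis=0)
    sigma = X.std(axis=0)
    sigma[sigma == 0] = 1.0  # avoid division by zero on constant columns
    z = np.abs((X - mu) / sigma)
    return np.where((z > threshold).any(axis=1))[0]

# Toy tabular data: small transactions plus one extreme outlier.
transactions = [[10.0, 1], [12.0, 1], [11.0, 2], [9.5, 1], [5000.0, 40]]
print(zscore_anomalies(transactions, threshold=1.5))
```

Production systems would use something more robust (isolation forests, autoencoders), but the interface, tabular rows in, suspicious row indices out, is the same.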
Development of a High-Accuracy Medical Diagnosis Model Using Machine Learning
In collaboration with a fellow Data Scientist, I developed a machine learning model that proved highly effective at analyzing medical data and predicting whether a patient had a given condition, achieving an accuracy of over 90%. The model was built in Python with TensorFlow; data analysis and data visualization techniques gave us valuable insights into the medical data, while careful implementation and fine-tuning drove its predictive accuracy.
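The original model used TensorFlow and real medical data, neither of which is reproduced here. As a self-contained stand-in, the sketch below trains a from-scratch logistic-regression classifier with gradient descent on synthetic two-feature data (the features and labels are invented for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for medical tabular data: two hypothetical features.
n = 200
X = rng.normal(size=(n, 2))
y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(float)  # separable labelling rule

# Logistic regression trained with plain gradient descent.
w = np.zeros(2)
b = 0.0
lr = 0.5
for _ in range(500):
    p = 1.0 / (1.0 + np.exp(-(X @ w + b)))  # sigmoid predictions
    w -= lr * (X.T @ (p - y) / n)
    b -= lr * (p - y).mean()

accuracy = ((p > 0.5) == y).mean()
print(f"training accuracy: {accuracy:.2f}")
```

A Keras model would replace the hand-rolled loop in a real project; a held-out test split, not training accuracy, is what a clinical claim like "over 90%" should rest on.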
Strategic Partnerships for Market Expansion and Value Growth
◦ Led a portfolio of more than 8 consulting projects and M&A transactions with a combined value of over $120 million
◦ Facilitated a strategic partnership between a Fortune 100 FMCG company and an Eastern European coffee producer, resulting in a doubling of the distribution chain
◦ Prepared exit strategy from a FinTech portfolio business into a leading financial ecosystem with a 3x value growth
◦ Directed a cross-functional team of 6 to develop and implement a 5-year fashion category strategy for a leading Eastern European marketplace to reach top positions in fashion e-commerce
◦ Established a new partnership pipeline with 5 non-competitors, resulting in 8 prospective leads over a 3-month period
◦ Provided ongoing support to the leadership team, including shareholders and the CEO, through comprehensive research and ad-hoc strategy development
Advised multiple startup ventures and developed solutions to the challenges of the audio technology sector
With a 9-year tenure as a freelance Audio Technology Consultant, I crafted a wide array of technical solutions. These encompassed the development and implementation of new real-time audio effects for Arm Cortex M4 hardware, the creation of VST plugins with the JUCE framework, the design and integration of a real-time processing system using Beagle Bone Black and the Bela realtime platform, as well as contributions in musical sound synthesis and Machine Learning consultancy for an interactive musical project.
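The embedded effects themselves ran in C/C++ on Arm Cortex M4 and Bela hardware; as an illustration of the kind of per-sample building block involved, here is a one-pole low-pass filter in Python (parameters invented for the example):

```python
import math

def one_pole_lowpass(samples, cutoff_hz, sample_rate=48000):
    """Minimal one-pole low-pass filter, the sort of per-sample
    building block a real-time audio effect runs (illustrative
    Python; embedded versions would be C/C++)."""
    # Standard one-pole coefficient from the cutoff frequency.
    a = math.exp(-2.0 * math.pi * cutoff_hz / sample_rate)
    out, y = [], 0.0
    for x in samples:
        y = (1.0 - a) * x + a * y  # y[n] = (1-a)*x[n] + a*y[n-1]
        out.append(y)
    return out

# A step input settles toward 1.0 as the filter state charges up.
filtered = one_pole_lowpass([1.0] * 200, cutoff_hz=1000.0)
print(round(filtered[-1], 3))
```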
Generated educational programming content and developed a custom GPT API-driven approach to data analysis
At WildFoundry, I held the role of developer advocate, producing educational programming content through articles and YouTube videos while offering client support. A prominent GPT use case from my experience involved analyzing the apps that users ran on their devices. To avoid manually examining tens of thousands of user pages, I developed a script that scraped each page, analyzed its content via the GPT API, and stored the results in a pandas DataFrame. This enabled an unbiased and comprehensive analysis of user applications, yielding accurate numbers and insights that outperformed traditional survey methods. The resulting report identified target customer groups and guided the company in shaping its customer relations strategy.
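The pipeline's shape can be sketched as follows. The real script fetched pages over HTTP and sent their text to the GPT API; here the model call is stubbed with a keyword lookup so the example runs offline, and the app names and user IDs are invented:

```python
import pandas as pd

def classify_app_list(page_text):
    """Stand-in for the GPT API call: the real script sent each
    scraped page to the model and parsed app names from its reply.
    Faked here with a keyword lookup so the sketch runs offline."""
    known_apps = ["sensor-logger", "mqtt-bridge", "web-dashboard"]
    return [app for app in known_apps if app in page_text]

def analyse_users(pages):
    """Scrape-and-classify loop: `pages` maps user id -> page text
    (the real version fetched each page over HTTP first)."""
    rows = []
    for user_id, text in pages.items():
        for app in classify_app_list(text):
            rows.append({"user": user_id, "app": app})
    return pd.DataFrame(rows)

pages = {
    "u1": "runs sensor-logger and mqtt-bridge on a remote device",
    "u2": "mostly uses the web-dashboard",
}
df = analyse_users(pages)
print(df["app"].value_counts())
```

Accumulating one row per (user, app) pair means `value_counts` on the `app` column directly yields the adoption numbers the report was built from.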
Machine Learning and Deep Learning Solutions (NLP) for Automating Market Research Processes
Market research is a domain teeming with diverse data types, including tabular, audio, and particularly text. This wealth of data can be utilised not only to enhance internal processes through automation but also to equip the end client with valuable new functionalities and use cases.
The objective of this use case was to automate the labelling of open-ended questions from online market research surveys. Open-ended questions, which receive text responses, require further post-processing to extract useful information. I automated this process for one of the world's largest market research firms using a series of interconnected techniques:
Utilising various forms of word embeddings, leveraging both Neural Network models and general techniques.
Implementing clustering to group similar responses.
Applying summarization and content extraction methods to distil and highlight key insights.
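The embedding-and-clustering steps above can be sketched in miniature. The real pipeline used neural word embeddings; this sketch substitutes a toy bag-of-words embedding and a greedy single-pass clusterer (responses and threshold invented for the example):

```python
import math
from collections import Counter

def embed(text):
    """Toy bag-of-words embedding (the real pipeline used neural
    and classical word embeddings)."""
    return Counter(text.lower().split())

def cosine(a, b):
    dot = sum(a[t] * b[t] for t in a if t in b)
    norm = lambda v: math.sqrt(sum(c * c for c in v.values()))
    return dot / (norm(a) * norm(b)) if a and b else 0.0

def cluster(responses, threshold=0.5):
    """Greedy single-pass clustering: attach each response to the
    first cluster whose seed is similar enough, else start a new one."""
    clusters = []  # list of (seed_embedding, [responses])
    for r in responses:
        e = embed(r)
        for seed, members in clusters:
            if cosine(e, seed) >= threshold:
                members.append(r)
                break
        else:
            clusters.append((e, [r]))
    return [members for _, members in clusters]

answers = [
    "delivery was too slow",
    "slow delivery",
    "great customer service",
    "customer service was great",
]
print(cluster(answers))
```

With dense neural embeddings the same cosine-threshold idea groups paraphrases that share no surface words, which is what makes automated labelling of open-ended answers viable.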
Improved Customer Retention for Dojah with No-Code KYC
PROBLEM:
Founders bootstrapping their startups typically take a hands-on role early on, which meant performing manual KYC documentation checks on the Dojah platform. Some founders lacked the technical knowledge to consume the APIs of the identity verification widget, which created ambiguity, drop-off, and lagging customer retention for Dojah.
SOLUTION:
Using the identity verification widget API, I created a flow chart, a PRD, and use-case scenarios that captured the inquiry parameters a customer would need to call the Dojah ID widget APIs directly, then used those inquiry parameters as the information fields for the no-code tools.
With the no-code tools in place, when a founder enters the inquiry parameters using information provided by their end users, the backend automatically calls the ID widget APIs without the founder or operator needing to know. On the frontend, the API response is presented to them directly, so they never have to consume the APIs themselves.
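The core of the no-code layer is a mapping from form fields to an API payload. The sketch below illustrates that mapping; the field names, endpoint route, and validation rules are all hypothetical, not Dojah's actual API:

```python
def build_widget_request(form_fields):
    """Map no-code form inputs to an ID-widget API payload.
    Field and parameter names are illustrative, not Dojah's real API."""
    required = ["first_name", "last_name", "id_number", "id_type"]
    missing = [f for f in required if not form_fields.get(f)]
    if missing:
        raise ValueError(f"missing inquiry parameters: {missing}")
    return {
        "endpoint": "/api/v1/kyc/verify",  # hypothetical route
        "payload": {f: form_fields[f] for f in required},
    }

request = build_widget_request({
    "first_name": "Ada",
    "last_name": "Obi",
    "id_number": "A1234567",
    "id_type": "passport",
})
print(request["endpoint"])
```

Validating the inquiry parameters before the call is what lets the frontend surface a friendly error instead of a raw API failure, the main ergonomic win over asking founders to consume the API themselves.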
GTM for online courses in Asia Pacific
Tested hypotheses for entering different markets with online courses, experimented with several monetization models, and identified the paying segments, target audience, and product positioning.
Building a recruitment platform
As the CEO of Intrallect Recruit, I am dedicated to pushing the boundaries of recruitment practices and fostering a future where skills-based hiring is at the forefront. Our innovative tech platform, Intrallect Recruit, disrupts traditional recruitment norms, empowering recruiters to make informed decisions and elevating the visibility of talented candidates.
I take a holistic approach, overseeing all aspects of the business, from product development to strategic partnerships. By leveraging research-based assessment tools, we offer candidates a platform to showcase their unique skill sets, promoting diversity, inclusion, and agile work structures in the process.
Collaboration and effective communication are key elements of our success. As we integrate our platform into existing processes, we work closely with stakeholders to ensure a seamless transition, gradually breaking down resistance to change.