Client: OHS Auditing Firm
Description: This project was done for an OHS auditing company in South Africa, which specialises in health and safety auditing
and training in the country. Audits were completed on paper, with data capturing and reporting
done at the office once the auditors returned from site.
My company was contracted to digitise this process and reduce the time from the auditors
arriving on site to the submission of the final report to the client.
We built a cloud-based application to meet this need and ended up cutting the overall turnaround time by 80%.
Role: Software Engineer
Toolset and Languages: Mendix Cloud Platform, Java, XPath
Summary of work Done:
- Building and integrating the various components (front end, back end and database).
- Analysing the current processes in order to build the components.
- Supporting the end users.
Client: OHS Auditing Firm
Description: After completing the OHS auditing application, the client wanted to further digitise their organisation.
They asked us to analyse their existing training solution, which was desktop software distributed to clients
on CD-ROM, and propose a way forward. It was decided that a cloud-based solution would allow them to reach more clients
and make the training content easier to use, maintain and manage.
Role: Business / Systems analyst
Toolset and Languages: MS Visio
Summary of work Done:
- Assisted with the business and systems analysis.
- Documenting process flow and application logic.
- Diagramming the to-be process flow and logic.
- Mocking up the user interface.
Client: Large Bank in Africa
Description: The bank owns and manages over 2,500 branches across Africa. One of their requirements is to comply with the
health and safety regulations in the countries where they operate. An OHS auditing firm has been auditing the bank's branches across Africa for many years.
During one of these audits, the bank noticed the system we had built for the auditing firm and felt that a similar system
could be used to manage their own OHS compliance and run their own internal audits. My company was contracted to build this system for the organisation.
Role: Technical Lead / Solution Architect / Software Engineer
Toolset and Languages: Mendix Cloud Platform, Java, XPath
Summary of work Done:
- Designing the overall solution architecture, defining how all the components function together.
- Leading the development team on the application build.
- Managing stakeholder engagement from a technical perspective.
- Building and integrating components.
- Supporting the end users.
Client: Large Financial Services Provider
Description: The financial services provider group is a very large organisation with stakes and ownership in many subsidiaries.
I was tasked with working with various parts of the business to build a combined dataset that captures the
full picture of the group's subsidiaries alongside the percentage ownership in each. This dataset then needed to be displayed on a dashboard.
Role: Business Analyst / Dashboard Developer
Toolset and Languages: Excel, Power BI
Summary of work Done:
- Worked with various parts of the business to gather data that they managed.
- Combined the datasets into a uniform source (sketched below).
- Built the dashboard to display the legal entity structure.
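As an illustration of the consolidation step above, a minimal pandas sketch of merging ownership extracts from different business units into one uniform dataset (file, column and unit names are hypothetical; the actual work was done in Excel and Power BI):

    import pandas as pd

    # Hypothetical extracts from two business units, each with its own layout.
    life = pd.read_excel("life_subsidiaries.xlsx").rename(
        columns={"Entity": "subsidiary", "Holding %": "ownership_pct"}
    )
    banking = pd.read_excel("banking_subsidiaries.xlsx").rename(
        columns={"CompanyName": "subsidiary", "PercentOwned": "ownership_pct"}
    )

    # Standardise into one uniform structure for the dashboard model.
    combined = pd.concat(
        [life.assign(business_unit="Life"), banking.assign(business_unit="Banking")],
        ignore_index=True,
    )[["business_unit", "subsidiary", "ownership_pct"]]

    combined.to_csv("group_legal_entity_structure.csv", index=False)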
Automated Timesheet Submission
Client: Large Financial Services Provider
Description: The FSP had many contractors working across multiple different workloads.
This made timesheet tracking difficult, as contractors needed to work out which stakeholders to email their timesheets to depending on which
workloads they had worked on during the week. It was a tedious process: contractors first had to determine the workloads they had worked on,
then determine who was in charge of managing each workload, and then email their Excel timesheet to all the relevant workload managers.
I was tasked with finding a way to make this process easier, as well as implementing it.
Role: Systems Analyst / Developer / Data Engineer
Toolset and Languages: Excel, VBA, SAP HANA, Power BI, SQL
Summary of work Done:
- Adjust the Excel timesheet template to make it more data-ingestion friendly.
- Code VBA functions that let users push their timesheets to SAP HANA at the click of a button (the push step is sketched below).
- Build a data warehouse (Kimball methodology) for the timesheet data, ensuring multiple submissions and similar cases are catered for.
- Build a cube on top of that data warehouse.
- Consume this in Power BI and build a dashboard with various filtering mechanisms so that each stakeholder gets a view specific to their needs.
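For illustration, a minimal sketch of the push step (the production version was implemented in VBA behind a button; this Python equivalent uses the hdbcli driver, and the table and column names are hypothetical):

    from hdbcli import dbapi

    def push_timesheet(rows, host, port, user, password):
        """Insert one week's timesheet lines into a staging table in SAP HANA."""
        conn = dbapi.connect(address=host, port=port, user=user, password=password)
        try:
            cur = conn.cursor()
            cur.executemany(
                'INSERT INTO "TIMESHEETS"."STG_TIMESHEET" '
                '(EMPLOYEE_ID, WORK_DATE, WORKLOAD, HOURS) VALUES (?, ?, ?, ?)',
                rows,  # e.g. [("C123", "2020-03-02", "Project A", 6.5), ...]
            )
            conn.commit()
        finally:
            conn.close()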
Investments Portfolio Construction
Client: Large Financial Services Provider
Description: The organisation is the investment and asset management arm of a large financial services provider. I was contracted to put
together a warehouse and assist with building some advanced analytics functionality on top of it. The purpose of the warehouse was to enable
portfolio construction: asset managers could construct portfolios on the fly from various asset classes and then run the
portfolio through an efficient frontier analysis to see where it lands in terms of risk versus reward, using historic data for the various instruments to make a
calculated prediction.
Role: Data Engineer
Toolset and Languages: SAP HANA, Teradata, SAP Data Services, SQL, R, Java
Summary of work Done:
- Data Engineering:
- Coding in the various business rules and logic.
- Data Modelling.
- Building ETL flows.
- Data Analysis:
- Analysing data to determine business rules.
- Analysing existing solutions to surface business rules and calculations.
- Assisting with building the advanced analytics functionality (a simplified risk/return calculation is sketched below).
- Assisting with the build of an API to allow us to consume the data in MATLAB.
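As a simplified illustration of the risk-versus-reward calculation that sits behind an efficient frontier (the production logic ran on SAP HANA with R; this NumPy sketch assumes daily historic returns):

    import numpy as np

    def portfolio_risk_return(weights, historic_returns):
        """Estimate annualised return and volatility for a candidate portfolio.

        weights          : portfolio weight per instrument (sums to 1)
        historic_returns : 2-D array, rows = periods, columns = instruments
        """
        weights = np.asarray(weights, dtype=float)
        historic_returns = np.asarray(historic_returns, dtype=float)
        mean_returns = historic_returns.mean(axis=0)           # expected return per period
        cov = np.cov(historic_returns, rowvar=False)           # instrument covariance
        expected_return = float(weights @ mean_returns) * 252  # annualised (daily data assumed)
        volatility = float(np.sqrt(weights @ cov @ weights) * np.sqrt(252))
        return expected_return, volatility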
Integrating the information governance toolset with the stack
Client: Large Financial Services Provider
Description: The organisation opted to use the Informatica toolset to manage information. One of the Informatica tools is used to review
business logic; to do so, it needs to be integrated with the various tools on which we build that business logic.
I was tasked with integrating the Informatica toolset with these tools.
Role: Integration Developer
Toolset and Languages: Informatica Analyst Tool, SAP HANA, Teradata, Cloudera, XML, JSON
Summary of work Done:
- Building the integration layer between these toolsets.
- Building a parser to transform XML to JSON.
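A minimal sketch of the kind of XML-to-JSON transformation involved, using only the Python standard library (the actual parser handled the Informatica-specific structures):

    import json
    import xml.etree.ElementTree as ET

    def element_to_dict(element):
        """Recursively convert an XML element into a plain dict."""
        node = dict(element.attrib)
        for child in element:
            node.setdefault(child.tag, []).append(element_to_dict(child))
        text = (element.text or "").strip()
        if text:
            node["text"] = text
        return node

    def xml_to_json(xml_string):
        root = ET.fromstring(xml_string)
        return json.dumps({root.tag: element_to_dict(root)}, indent=2)

    print(xml_to_json("<rule id='1'><name>Check balance</name></rule>"))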
Investments Assets Under Management
Client: Large Financial Services Provider
Description: The organisation manages assets for many different clients and organisations. In order to service their clients better,
they need to be able to analyse and report on their data effectively. I was part of a team that focused on building the AUM
(Assets Under Management) data warehouse.
Role: Data Engineer
Toolset and Languages: Teradata, Cloudera, Hive, SAP HANA, Sqoop, SAP Data Services, Power BI, SQL, Bash
Summary of work Done:
- Data Engineering:
- Coding in the various business rules and logic.
- Data Modelling.
- Building ETL flows.
- Data Analysis:
- Analysing data to determine business rules.
- Analysing existing solutions to surface business rules and calculations.
- Assisting with troubleshooting Power BI Dashboard issues.
- Assisting with troubleshooting Excel Power Pivot MDX queries.
- Using Sqoop to migrate data to the lab environment.
Design the solution architecture for the generic data pipeline
Client: Large Financial Services Provider
Description: The organisation was looking to streamline the ingestion process for moving data from around the business to the data lake and vice versa.
Methods did exist to do this, but they were cumbersome and fragmented. I was tasked with evaluating a number of different tools and putting together a
solution architecture pattern for a streamlined, generic data pipeline that the business could easily use to move and transform data between the
various business areas.
Role: Solution Architect
Toolset and Languages: Apache Kafka, Apache NiFi, Cloudera, Talend, Apache Airflow, StreamSets, draw.io
Summary of work Done:
- Evaluate the functionality each tool provides.
- POC each tool to figure out how it works and decide whether it is a good fit for the solution.
- Draw up the solution architecture diagram for the data pipeline.
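As an example of the pattern that came out of the evaluation, a minimal Apache Airflow sketch of a generic ingest-transform-publish pipeline (the DAG, task names and callables are hypothetical):

    from datetime import datetime

    from airflow import DAG
    from airflow.operators.python import PythonOperator

    def ingest(**context):
        """Pull data from a source system into the landing zone (placeholder)."""

    def transform(**context):
        """Apply generic, metadata-driven transformations (placeholder)."""

    def publish(**context):
        """Publish the curated dataset to the target environment (placeholder)."""

    with DAG(
        dag_id="generic_data_pipeline",
        start_date=datetime(2021, 1, 1),
        schedule_interval="@daily",
        catchup=False,
    ) as dag:
        ingest_task = PythonOperator(task_id="ingest", python_callable=ingest)
        transform_task = PythonOperator(task_id="transform", python_callable=transform)
        publish_task = PythonOperator(task_id="publish", python_callable=publish)

        ingest_task >> transform_task >> publish_task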
Client: Large Financial Services Provider
Description: The organisation was looking to adopt the Data Vault methodology in order to build an enterprise-wide data warehouse.
I was part of the team that did the initial POC to determine whether the Data Vault methodology would work and how long it would
take to implement for a source.
Role: Data Engineer
Toolset and Languages: Data Vault Methodology, BimlFlex, SAP HANA, Cloudera, SQL
Summary of work Done:
- Analyse the source data and determine the business keys, hubs, links and satellites (hub hashing is sketched below).
- Implement the Data Vault by building it from scratch and then building it again using the BimlFlex automation tool.
- Document findings, timeframes, etc. and feed back to the architecture team.
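For context, hub loading in a Data Vault keys each record on a hash of its business key; a minimal sketch of that hashing step (the source system and column names are hypothetical):

    import hashlib

    def hub_hash_key(*business_key_parts):
        """Data Vault style hash key: normalised, delimited business key parts."""
        normalised = "||".join(str(p).strip().upper() for p in business_key_parts)
        return hashlib.md5(normalised.encode("utf-8")).hexdigest().upper()

    # Example: a customer hub keyed on source system plus customer number (hypothetical).
    row = {"source": "POLICY_ADMIN", "customer_no": "000123"}
    print(hub_hash_key(row["source"], row["customer_no"]))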
Data Academy – Analytical view of the customer
Client: Large Financial Services Provider - Data Academy
Description: The Data Academy is responsible for training the graduates that are onboarded each year. This requires developing and facilitating various
modules in the data engineering space. I was responsible for developing the content and facilitating the Analytical View of the Customer module, which
focuses on data modelling and data visualisation.
Role: Content developer, Facilitator
Toolset and Languages: SAP HANA, SAP BusinessObjects (Webi)
Summary of work Done:
- Develop the content for the course.
- Facilitate the course.
- Support the students with their final project.
- Grade the students.
Client: Large Financial Services Provider
Description: The Sales Data Hub focuses on analysing and reporting on the raw sales data. This data is used to provide the business with insights,
which are then used to pay out commissions and to make business decisions about products, marketing and much more. The system is referred to as the
MIS system; it was built over the past 38 years and currently resides on a mainframe. I have been placed in a team that is responsible for migrating this
system onto a newer technology stack in order to cut mainframe costs, make it easier to maintain and operate, and allow more to be built on top of it.
I currently play the role of technical lead for some of the streams of the solution, as well as senior data engineer.
Role: Technical lead / Data Engineer
Toolset and Languages: Cloudera, Hive, WhereScape, SAP HANA, Bitbucket (Git), SourceTree (Git), XL Deploy (Digital.ai Deploy), Bamboo, EBX, Automic (Enterprise Scheduler)
Summary of work Done:
- Data Engineering:
- Coding in the various business rules and logic.
- Data Modelling.
- Building ETL flows.
- Data Analysis:
- Analysing data to determine business rules.
- Analysing existing solutions to surface business rules and calculations.
- Technical Leadership:
- Establishing and architecting technical patterns.
- Integrating the various toolsets above.
- Managing code deployments across the landscape.
- Managing the environments for the toolsets.
- Supporting the team members.
- Approving and overseeing builds.
- Managing external team dependencies.
- Managing Stakeholders.
Evolve Academy Data Engineering Module
Client: Internal - Deloitte
Description: I was put in charge of generating the content for, and facilitating, the data engineering module used to upskill our junior team members. I put together a 5-day
course that covers the end-to-end spectrum of the data engineering role.
The course consisted of both theory and practical sessions. In the theory sessions, we covered data warehousing, data modelling and the data engineering role in general.
In the practical sessions, we focused on building a data pipeline in GitHub Codespaces using Python and Snowpark, landing the data in Snowflake (a minimal sketch of this step follows the summary below).
The pipeline included CI/CD code deployment and orchestration. Once the data had landed in Snowflake, we ran a SQL course to show the attendees how to write
some advanced SQL, and then had them build a data warehouse using the provided data. They then consumed the warehouse in Power BI to build a dashboard.
Role: Content Creator / Facilitator
Toolset and Languages: GitHub Codespaces, Snowflake, Power BI, Python, SQL, Linux Terminal
Summary of work Done:
- Develop the content.
- Write the code as well as the quick create scripts.
- Facilitate the course.
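A minimal sketch of the Snowpark landing step used in the practical sessions (the connection parameters, database objects and sample data below are placeholders):

    from snowflake.snowpark import Session

    # Placeholder connection parameters; in the course these were injected via CI/CD secrets.
    connection_parameters = {
        "account": "<account>",
        "user": "<user>",
        "password": "<password>",
        "role": "SYSADMIN",
        "warehouse": "COMPUTE_WH",
        "database": "TRAINING_DB",
        "schema": "RAW",
    }

    session = Session.builder.configs(connection_parameters).create()

    # Land a small extract into a raw table that the SQL and warehousing exercises build on.
    df = session.create_dataframe(
        [(1, "2024-01-01", 120.50), (2, "2024-01-02", 99.90)],
        schema=["order_id", "order_date", "amount"],
    )
    df.write.mode("overwrite").save_as_table("RAW_ORDERS")

    session.close()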
Client: Internal - Deloitte
Description: We required a generic data lab that we could quickly spin up when we wanted to run POCs or give our clients a space to do exploration / experiments
on their data. The lab needed to cater for three main use cases: ingesting data from a variety of sources, allowing transformations to be performed on the ingested data, and
allowing reports / dashboards to be built on the data. I was put in charge of creating the solution architecture for how this lab should be put together.
Role: Solution Architect
Toolset and Languages: AWS, Draw IO
Summary of work Done:
- Evaluate various components / services of AWS.
- Determine how well these services integrate with one another.
- Put together a design for what the lab should look like.
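Purely as an illustration of the ingest and reporting use cases, a short boto3 sketch of the path such a lab might expose (the bucket, database and query are hypothetical; the deliverable here was the architecture design, not code):

    import boto3

    # Ingest: land a raw file in the lab's S3 landing zone (hypothetical bucket and key).
    s3 = boto3.client("s3")
    s3.upload_file("extract.csv", "data-lab-landing", "raw/extract.csv")

    # Explore / report: run an ad hoc Athena query over the catalogued data
    # (hypothetical database and table).
    athena = boto3.client("athena")
    response = athena.start_query_execution(
        QueryString="SELECT * FROM raw_extract LIMIT 10",
        QueryExecutionContext={"Database": "data_lab"},
        ResultConfiguration={"OutputLocation": "s3://data-lab-landing/athena-results/"},
    )
    print(response["QueryExecutionId"])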
AI and Gen AI POC Team Lead
Client: Internal - Deloitte
Description: When submitting a proposal for work, we are often able to show the value the solution will add more clearly if we include a POC. I am tasked with managing
the team that puts together these POCs.
Toolset and Languages: Mainly the big 3 cloud providers.
Summary of work Done:
- Track engagements where a POC would be applicable.
- Determine what should be built for the POC and the toolset that would best fit it.
- Work with / guide the team in building the POC.