Solution-oriented technology leader with extensive experience developing complex IT solutions and building high-performing technology organisations. Data-driven, detail-oriented and skilled at defining corporate IT strategy. Polyglot developer background; expert in designing complex distributed systems. Entrepreneurial and adaptable, successful in corporate as well as venture-backed start-ups.
Competencies
Technology Leadership, Software Development, Project Management, Technical Architecture, Team Building and Development, Agile, Data Architecture, Data Engineering, Cloud System Design, Product Development, Strategy, Investor Relations
BP
BP is a multinational energy company going through a massive change programme.
Consultant Senior AWS Architect
I lead a team of developers responsible for the delivery of a proprietary global fleet card payments system. The system is deployed to multiple regions across the globe and is event-driven, to accommodate interfaces to legacy systems and maintain 24/7 uptime.
Management
Lead an agile delivery team using the Scrum methodology. Responsible for the hiring of development staff. Defined ways of working. Built a strong and supportive team. Provided support to the BP apprentice programme. Negotiated with Product Owners and Product Managers on the direction and priority of development. Liaised with Enterprise Architects on rolling out changes to the Division.
Hands-on development
Lead technology decisions. Requirements gathering, business and data analysis. Architecture design: AWS and event sourcing. Database schema design and build.
React single-page application with micro front ends. API development. Development of a BFF (backend-for-frontend) layer for front-end to back-end communication.
Built Proofs of Concept for several technologies that were adopted by the project.
Managed the development of a common code repository for rolling out to the programme development community, facilitating faster team start-up times.
Built code and components specifically to address non-functional requirements, e.g. penetration testing, code quality and security.
Developed continuous integration pipelines to test and deploy code. Introduced a security threat modelling process to assess risk to feature development.
Upp.ai
January 2021 - present - London, UK
Upp uses Artificial Intelligence to optimise Google Shopping for retailers.
Chief Technology Officer / Director of Engineering
Through a critical review and re-engineering of the technology stack and identification of wasted spend, I reduced operational spend by half.
Introduced best-practice security protocols (ISO 27001).
Conducted investor presentations outlining Upp's technology strategy and AI.
Hands-on development
Personally designed and built an artificial intelligence model to predict product-level revenue changes driven by changes in advertising spend: an unsupervised-learning neural network taking more than a dozen features, built with Python and TensorFlow.
Re-architected the Upp system infrastructure to eliminate obsolete code and reduce the complexity of the tech stack (migrating from Java to Node.js).
Designed and implemented a Kimball-compliant reporting data warehouse (Postgres).
Management
Enhanced team performance and morale. Introduced team and staff development programmes, including a team learning programme and regular development reviews.
Filled gaps in the Engineering team. Built out the Data Science function.
tinkr (formerly JamieAi)
June 2018 - January 2021 - London, UK
tinkr is an innovative tech company that uses Artificial Intelligence to assist candidates in their job search and companies in their candidate search.
Chief Technology Officer
Hands-on development
Architected and oversaw the implementation of a multi-tier, real-time, scalable AI application comprising a React web application, a React Native app, a Postgres database, an AWS cloud-based API and real-time AI feedback (Python).
Designed a sophisticated relational data model (Postgres) for storing user data, optimising data retrieval times and data versioning using Kimball Type 2 dimensions.
Storing user data required some lateral thinking. As users made changes to their CV, the updates to CV components would be scored and added to the rest of the CV score. Scoring CV deltas saved processing time in the back-end AI model. As the CV changed over time, the user would soon become aware of the changes that brought a positive benefit to their CV score.
Each individual score component was time-stamped. Using the Kimball Type 2 dimension methodology, the user's CV can be "rolled back" to any point in time. Additionally, we could use the time series of changes to analyse user interaction with the application.
The database used was Postgres. Enhancements were made to the Sequelize Object Relational Mapper (ORM) node module to build and access the database model; these enhancements enabled the inclusion of Type 2 dimensionality in database tables.
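As a simple illustration of the idea, a minimal Python sketch of the Type 2 versioning and delta scoring (names are hypothetical, not the production Sequelize schema):

    from dataclasses import dataclass
    from datetime import datetime
    from typing import Callable, Optional

    @dataclass
    class ComponentVersion:
        """One row of a hypothetical Type 2 dimension table for CV components."""
        component_id: str
        text: str
        score: float
        valid_from: datetime
        valid_to: Optional[datetime] = None   # None marks the current version

    def apply_change(history: list, component_id: str, new_text: str,
                     score_fn: Callable[[str], float]) -> None:
        """Type 2 update: expire the current row and append a new version.

        Only the changed component (the delta) is re-scored; all other
        components keep their previously stored scores.
        """
        now = datetime.utcnow()
        for row in history:
            if row.component_id == component_id and row.valid_to is None:
                row.valid_to = now                      # close out the old version
        history.append(ComponentVersion(component_id, new_text,
                                        score_fn(new_text), valid_from=now))

    def cv_score_as_of(history: list, when: datetime) -> float:
        """'Roll back' the CV to any point in time via the validity intervals."""
        return sum(r.score for r in history
                   if r.valid_from <= when and (r.valid_to is None or when < r.valid_to))

The total CV score at any date is then just the sum of the component scores whose validity intervals cover that date.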
Designed and built an auto-scaling, multi-tiered machine learning (NLP) data processing pipeline. The challenge was to process one hundred million text documents (CVs and job descriptions) to build a corpus for training a word embedding model (the AI). The pipeline was designed to accommodate the differing hardware demands of the different layers of processing: some processes required intensive disk access whereas others were RAM-intensive.
The solution included a dashboard for configuration and monitoring and a back-end controller service. The dashboard gave the user access to the controller: the user would define the processing code to be used in each processing layer and the number and type of compute resources for each layer, then start, stop and monitor running jobs. The system would dynamically build layers of processing instances (EC2). A bespoke Linux (Ubuntu) EC2 image was built for this process, optimised for specific processing requirements and pre-loaded with the required code libraries.
Each processing instance would use layer-specific code written in Python, imported at runtime from our GitHub repository. Between each processing layer is a Kinesis stream (AWS's managed data streaming service) to collate results and distribute them to the next processing layer.
I wrote the web app and controller in Node.js. The controller would orchestrate the building and destroying of AWS resources via API calls using the Node AWS SDK. The dashboard is an Express app, the controller a service written in Node.js, and the pipeline processing components are written in Python; modules used include pandas, NumPy and spaCy.
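A simplified sketch of how such a controller can stand up one processing layer, written here in Python with boto3 rather than the original Node.js AWS SDK; the AMI ID, repository URL and script paths are placeholders:

    import boto3

    ec2 = boto3.client("ec2", region_name="eu-west-1")

    def launch_layer(layer: str, ami_id: str, instance_type: str,
                     count: int, code_repo: str) -> list:
        """Start a layer of worker instances sized for that layer's workload.

        Each instance boots from the bespoke Ubuntu image, pulls its
        layer-specific Python code and starts consuming from the upstream
        Kinesis stream, writing results to the downstream one.
        """
        user_data = (
            "#!/bin/bash\n"
            f"git clone {code_repo} /opt/pipeline\n"
            f"python3 /opt/pipeline/{layer}/worker.py\n"
        )
        resp = ec2.run_instances(
            ImageId=ami_id,                # bespoke Ubuntu image with libraries baked in
            InstanceType=instance_type,    # RAM-heavy or disk-heavy, per layer
            MinCount=count,
            MaxCount=count,
            UserData=user_data,
            TagSpecifications=[{
                "ResourceType": "instance",
                "Tags": [{"Key": "pipeline-layer", "Value": layer}],
            }],
        )
        return [i["InstanceId"] for i in resp["Instances"]]

    def tear_down(instance_ids: list) -> None:
        """Destroy a layer once its stream has been drained."""
        ec2.terminate_instances(InstanceIds=instance_ids)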
Provided leadership of the company's technology function, with full budgetary control.
Audited current company technical capabilities and direction. Defined new IT strategy.
Restructured existing development policies and practices [Agile] to align with new IT strategy.
Developed interdepartmental communication policies and processes.
Built Development and Data Science teams to deliver the IT strategy. Created a personal development
and improvement culture. Implemented a mentoring system for junior staff.
Assessed the suitability of the current product offering and determined that it was not sustainable. Researched and designed new products for delivery to both the B2C and B2B markets.
Pivoted company to deliver a new and innovative product suite.
Delivered a superior product offering whilst reducing burn rate by over 50%.
Defined and managed on-going support function.
Developed presentations and presented to investors.
Doubled investment in the firm.
Compliance as a Service (CaaS)
January 2016 - June 2018 - London, UK
CaaS is a boutique consultancy providing services to the finance industry, with innovative services and solutions for multiple regulatory requirements.
Chief Technology Officer
Led the company's technology department and defined its IT strategy.
Architected systems, managed IT infrastructure. Designed and built internal support dashboard.
Designed and developed a daily position breach notification system for the Short Selling Directive regulatory requirement. The product improved client companies' efficiency in meeting new regulatory requirements, cutting processing time from hours to minutes.
The CaaS Short Selling service is a managed service for companies to track their exposure to short positions that might breach the regulators' minimum allowed thresholds. The user delivers a positions file via FTP or email on a daily basis. For derivatives and indices we source reference data from across the internet and break the instruments down to find the underlying equities.
The service uses the AWS Data Pipeline service to process data, with an AWS RDS (MySQL) database. Web crawlers scrape reference data from publicly available sources. Each user is isolated in their own VPC; reference data is stored in a central database accessible from each private network.
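Conceptually, the daily check reduces to netting each client's positions per issuer and comparing the short exposure against a threshold; a minimal Python sketch (column names and the threshold parameter are illustrative, not the production schema):

    import pandas as pd

    def find_breaches(positions: pd.DataFrame, issued_shares: pd.Series,
                      threshold: float) -> pd.DataFrame:
        """Flag net short positions that cross a notification threshold.

        positions:     one row per position with columns ['issuer', 'quantity']
                       (longs positive, shorts negative), after derivatives and
                       indices have been decomposed into underlying equities.
        issued_shares: total shares in issue, indexed by issuer.
        threshold:     short exposure, as a fraction of issued shares, above
                       which a notification is required.
        """
        net = positions.groupby("issuer")["quantity"].sum()
        short_fraction = (-net.clip(upper=0)) / issued_shares
        breaches = short_fraction[short_fraction >= threshold]
        return breaches.rename("net_short_fraction").reset_index()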
Designed and developed a fully indexed and searchable library of financial services regulations. This tool enabled client companies to identify and track their compliance with relevant regulatory demands, saving them headcount.
regulationnavigator.com is an Express web application that details regulations relevant to the financial services industry. It uses Firebase for user authentication and profile storage. The server is written in Node.js with an Apache reverse proxy in front. The UI is bespoke to the project, using Bootstrap Material Design for the components. Data is loaded to the server via a Zoho Creator application. The server runs on an AWS EC2 Linux instance. The user can save and share reports and track compliance with regulations that impact their business.
Technologies: AWS, EC2, S3, Firebase, Node.js
Managed onshore and offshore development and consultancy staff.
Deutsche Bank, Group Finance Change
May 2012 - December 2015 - London, UK
Deutsche Bank AG is a multinational investment bank and financial services company
headquartered in Frankfurt, Germany. The bank spans 58 countries with a large presence
in Europe, the Americas and Asia.
Data Business Analyst [contract]
Designed and built several database models to meet new regulatory requirements.
FINREP and COREP data models - designed prototypes for both systems. The FINREP and COREP projects both required extensive mapping of data from the company's Finance Data Warehouse to the reporting format required by the regulator; this mapping process was not a simple challenge.
I was given my own database with a snapshot copy of the Finance Data Warehouse, with full control of the data. Working closely with Finance Change Business Analysts, I developed a prototype to represent the data flow from the data warehouse to the reports. Proxies needed to be found to split some data warehouse line items across particular line items in the reports. The code for this project was written in Oracle PL/SQL. I built the Business Analysts a mapping tool in Excel.
They would list the mappings in the spreadsheet, and VBA macros would source the data and process it.
Once the prototype was complete, I oversaw its transition into functional requirement documents, from which developers built the solution into the monthly processing run of the Financial Data Warehouse.
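In outline, the mapping logic resembled the following (Python here purely for illustration; the prototype itself was Oracle PL/SQL driven by the Excel/VBA mapping tool, and the column names and proxy ratios are hypothetical):

    import pandas as pd

    def map_to_report(warehouse: pd.DataFrame, mapping: pd.DataFrame) -> pd.Series:
        """Map Finance Data Warehouse line items onto regulatory report lines.

        warehouse: columns ['fdw_line_item', 'amount']
        mapping:   columns ['fdw_line_item', 'report_line_item', 'proxy_ratio'],
                   where a warehouse line item appearing in several rows has its
                   amount split across report lines by the proxy ratio.
        """
        joined = warehouse.merge(mapping, on="fdw_line_item", how="left")
        joined["allocated"] = joined["amount"] * joined["proxy_ratio"]
        return joined.groupby("report_line_item")["allocated"].sum()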
Asset Encumbrance - modelled impact of new regulation.
Financial Sector Exposure - designed and developed a system to calculate the level of exposure to the sector. Managed the offshore development and release. This system saved DB billions of euros in capital requirements.
Large exposures to other financial sector companies were a major cause of the global financial crisis of 2007. Reducing the company's exposure to this risk was the remit of the FSE project.
Complex rules were defined by the regulator specifying how a company may offset long and short positions in other Financial Sector Entities. I was charged with developing a prototype to calculate the overall exposure to the finance sector.
Many positions were held in indices and derivatives. These needed to be broken apart to access the underlying shares and relative weightings. Rules to offset long positions with corresponding short positions were implemented.
I used Oracle PL/SQL to build the prototype, sourcing position data from the Financial Data Warehouse and splitting derivative positions to extract the Financial Sector Entities as defined by sourced reference data.
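The look-through and netting can be sketched as follows (Python for brevity rather than the original PL/SQL; the offsetting rule shown is a simplification of the regulator's, and all column names are illustrative):

    import pandas as pd

    def fse_exposure(positions: pd.DataFrame, constituents: pd.DataFrame,
                     fse_entities: set) -> float:
        """Estimate total exposure to Financial Sector Entities (FSEs).

        positions:    columns ['instrument', 'notional'] (shorts negative)
        constituents: columns ['instrument', 'underlying', 'weight'], mapping
                      indices and derivatives to their underlying equities;
                      direct holdings map to themselves with weight 1.0
        fse_entities: issuers classified as financial sector entities
        """
        look_through = positions.merge(constituents, on="instrument")
        look_through["exposure"] = look_through["notional"] * look_through["weight"]
        per_entity = look_through.groupby("underlying")["exposure"].sum()
        # Simplified netting: longs and shorts in the same entity offset, and
        # only the residual long exposure to each FSE counts towards the total.
        return per_entity[per_entity.index.isin(fse_entities)].clip(lower=0).sum()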
Dodd-Frank - derivative exposure calculation - developed a system prototype. Converted business requirements into functional and technical design documents.
Worked with multiple asset classes: equities, fixed income, bonds, CDOs, FX, repos, CDSs and equity derivatives.
Data modelling, data analysis, technologies: ORACLE PL/SQL.
Royal Bank of Canada, London
September 2007 - May 2012 - London, UK
The Royal Bank of Canada is a Canadian multinational financial services company and the
largest bank in Canada by market capitalization. The bank serves over 16 million clients
and has over 86,000 employees worldwide.
Data Warehouse Manager [contract]
Ownership of Regulatory Reporting Data Warehouse.
Managed a significant change process; designed logical and physical data model changes.
Basel II - Implemented liquidity rules. Managed team of over ten internal and external
developers and analysts.
The data warehouse used Informatica for ETL, storing data in a Sybase database. In taking ownership of the warehouse I inherited a significant backlog of work, including the integration of Basel II liquidity risk requirements. The data model of the data warehouse needed to change: additional data points representing future cash flows needed to be stored against existing positions. This required significant alterations to the Informatica ETL jobs, and new data sources were added to the model.
Scaling up the team with staff from an external consultancy and internal hires, I managed the delivery of the change programme on time and on budget.
Incidentally, I was identified as a "key man risk" to the project and, on contract renewal, offered a completion bonus to stay to the end of the contract.
Designed, developed and implemented a SQL bond Modified Duration (MD) calculation.
More accurate calculation of MD saved thousands of pounds per day.
Amongst other daily reports, determining the sensitivity of bonds held to interest rate movements was an essential part of the Regulatory Reporting Data Warehouse.
In the past, this calculation was performed by a batch script running on a dedicated PC under a desk. The code base, written in C++, used a generic finance library to perform the calculation. This machine was a risk to the daily processing as it was not secure, not backed up, and subject to being accidentally unplugged by the cleaners!
To bring the calculation inside the daily production process, I wrote a Modified Duration calculator in Sybase T-SQL, using a best-practice interpolation technique to perform the calculation as required by the bank's quantitative analysis department (Toronto).
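The core of the calculation is compact; a Python sketch for an annually compounded yield (the production version was Sybase T-SQL and used the bank's prescribed yield interpolation):

    def modified_duration(cash_flows, ytm: float) -> float:
        """Modified duration of a bond from its cash flows.

        cash_flows: (time_in_years, amount) pairs, i.e. coupons plus redemption
        ytm:        yield to maturity, annually compounded (0.04 means 4%)
        """
        price = sum(cf / (1 + ytm) ** t for t, cf in cash_flows)
        macaulay = sum(t * cf / (1 + ytm) ** t for t, cf in cash_flows) / price
        return macaulay / (1 + ytm)    # % price change per unit change in yield

    # Example: a 3-year 5% annual-coupon bond at a 4% yield
    # modified_duration([(1, 5), (2, 5), (3, 105)], 0.04) ≈ 2.75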
ABN AMRO
ABN AMRO Bank N.V. is a Dutch bank with headquarters in Amsterdam. ABN AMRO Bank is the third-largest bank in the Netherlands. At the time it operated in 63 countries and had over 110,000 employees.
Data Architect [contract]
Responsible for the global market risk system's OLAP presentation layer.
Designed and implemented VaR calculation.
Reduced VaR calculation times by 95%.
The Market Risk System (MRS) used Value at Risk (VaR) to measure the sensitivity to interest rate movements. The reporting layer of the MRS used Essbase as the OLAP engine to aggregate the equity risk across the organisational structure.
VaR was the measure; however, VaR cannot simply be summed. For reporting at each level in the organisation, the overall position needed to be aggregated and the VaR then calculated at that level. To solve this problem, and to take advantage of some of the advanced Essbase functionality, I developed a dynamic calculation that utilised pre-aggregated positions and applied the historical simulation calculation.
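The principle, as a simplified Python sketch (the real implementation was a custom Essbase dynamic calculation backed by Java/C++): each position contributes a historical P&L vector, the vectors are summed at the reporting node first, and VaR is read off the aggregated distribution.

    import numpy as np

    def historical_var(pnl_vectors: np.ndarray, confidence: float = 0.99) -> float:
        """Historical-simulation VaR for one reporting node.

        pnl_vectors: array of shape (n_positions, n_scenarios), one historical
                     P&L vector per position belonging to the node.
        Returns VaR as a positive number at the given confidence level.
        """
        node_pnl = pnl_vectors.sum(axis=0)                # aggregate positions first
        return float(-np.percentile(node_pnl, (1 - confidence) * 100))

    # VaR is not additive: summing the children's VaRs generally does not equal
    # the VaR of the combined position, hence positions are aggregated before
    # the percentile is taken.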
The standard Essbase interpolation function did not perform as required by the user, so I had to develop a new calculation to produce the correct result. This calculation was developed in Java and wrapped in C++ to interface with Essbase.
The Essbase calculation engine aggregates calculation results from leaf nodes up to parent nodes along each dimension hierarchy. Where positions are held at parent level, the totals of the child calculations overwrite the parent's own calculation result; previous attempts to design the OLAP cube suffered from this problem. When I arrived on the team, I immediately identified this issue and altered the calculation code to correct for this design error, resulting in correct totals being aggregated throughout the OLAP model.
Designed data mart, data archiving subsystem, technologies: ESSBASE, ORACLE PL/SQL, JAVA
Deutsche Bank
May 2000 - December 2001 - London, UK
Deutsche Bank AG is a multinational investment bank and financial services company
headquartered in Frankfurt, Germany. The bank spans 58 countries with a large presence
in Europe, the Americas and Asia.
ESSBASE Contractor [contract]
Charged with the ongoing improvement of a global Equities P&L reporting subsystem.
Designed and built DB's famous 'cube factory', a system rolled out across the division to simplify the building of OLAP solutions. Saved months of development time for subsequent projects.
Prototyping and development of new data models for liquidity and risk systems, technologies: ESSBASE, ORACLE PL/SQL, JAVA, VB
KPMG Consulting
Dec 1999 - May 2000 - Melbourne, Australia
KPMG is a multinational professional services network, and one of the Big Four accounting
organizations. The Consulting division became Bearing Point in 2001.
Business Intelligence Analyst
Provided consulting services to clients, specialising in management information systems. Technologies: ESSBASE, ORACLE PL/SQL, VB
Telstra
Sep 1989 - May 2000 - Melbourne, Australia
Telstra Corporation Limited is an Australian telecommunications company. Telstra has transitioned from
a state-owned enterprise to a fully privatised company and has recently focused on diversified
products and emerging technologies.
Various positions
Financial audit, treasury functions, management accounting, business reporting, OLAP system
development. Technologies: ESSBASE, ADL, DOS