Show 100 – Fintech Forum 2020 – Part 2
PRCA members receive 10 CPD points for listening to this podcast if they log it on the PRCA CPD programme.
The second of two episodes recorded in partnership with CFA UK on the subject of Fintech for 2020, where we interviewed a number of the speakers who were due to present at their Fintech Forum.
The event itself was meant to have taken place in London in March but, as a precautionary measure to minimise the risk of Coronavirus, CFA UK took the decision to cancel it. However, they weren't prepared to let COVID-19 beat the podcast, so we recorded all our interviews online.
In this episode:
- Andrew Eisen, Global Head of Solutions for Financial Services, IHS Markit – New opportunities and age-old challenges
- Independent Consultant Rehan Islam & Raj Gupta, Lead Data Scientist at Location Sciences – Alternative Data
- Stacey English, Chief Digital Officer, Corlytics – The future of Regtech
- Bart Vanhaeren, Founder, Investsuite – Emerging Ecosystems in Wealth Management
- Tom McGillycuddy, Co-Founder, tickr – Launch of tickr
For more information from CFA UK, visit www.cfauk.org
Andrew began by highlighting two big questions both related to data.
- Do we have enough of the right data?
- Do we have enough of the technology or the capacity or the tools to draw insights out of that data to make better business decisions?
He said that these are not new questions, but they are fundamental, and when you look at the world of fintech we're getting closer and closer to 'Yes' on both. He highlighted Moore's Law, under which computing capacity doubles roughly every 18 months.
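As a rough illustration of the growth rate Andrew cites, the 18-month doubling can be expressed as a simple formula. This is a hypothetical sketch added for illustration, not anything from the episode:

```python
# Moore's Law as described: capacity doubling roughly every 18 months.
def capacity_multiplier(years: float, doubling_months: float = 18.0) -> float:
    """Return the factor by which capacity grows over the given period."""
    return 2 ** (years * 12 / doubling_months)

# After 18 months capacity has doubled; after 3 years it has quadrupled.
print(capacity_multiplier(1.5))  # 2.0
print(capacity_multiplier(3.0))  # 4.0
```

Over a decade, the same formula gives a factor of roughly a hundred, which is the scale of change behind Andrew's Vietnam-era comparison.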
Andrew used an example from the Vietnam War and talked about Operation Igloo White. He explained that the US government tracked the troop movements and supply chain of the Viet Cong to plan how to interrupt or take action on those movements. They dropped sensors on the Ho Chi Minh trail to pick up infrared and sonar signatures from troop movements and supply trucks, used radio to connect all these sensors, and pushed the data to IBM mainframes with about 8MB of RAM, which was huge in those days. They did this over three years and optimised their tactics to take advantage of what those sensors observed; at the time that cost somewhere in the region of $1bn per year. Compare that with today: 8MB of RAM picking up a few sensors on the Internet of Things, versus a single GE engine in an airplane with around 3,000 sensors producing real-time information, at a cost that could be measured in dollars, maybe tens of dollars, to operate.
Andrew went on to explain that we now have the data ingestion capacity and, with quantum computing on the verge of becoming mainstream in the next five years, the capacity to collect and analyse data – the physical side. Then, on the second part of the question, you've got a volume of data, whether it's satellite information, sensor data or the Internet of Things; the collection of data in manufacturing and data-creation processes is just huge. Looking at open source technology, shared tools for the creation and collection of data are beginning to emerge, which just accelerates that. He added that we now have tons of data that we never had before to ingest, matched with the capacity to drive decision making. In a fintech world, that means market participants can use more technology and more data to try and achieve operational alpha or investment alpha.
Andrew said that if we take a step back and look at fintech, the data and the technology, the one thing that ties it all together is the assumptions or inputs that we as humans put into the process. Had he been at the conference, he could have taken a sampling of the participants in the room and asked how many of them think the financial markets are random, and then how many think they are not: either way, there is an assumption or a bias inserted. Either the market is completely random, or, using data and technology, you can build a model that predicts it. Any time we appeal to that 'lizard brain' hypothesis – the one that assumes there is a relationship between action and behaviour – we are putting bias into the machines and the tools we use.
Andrew then spoke about IHS Markit's chief data scientist, Yaacov Mutnikas, who he described as one of the most interesting men in the world, and the research they do on bias. He explained a conditioning study which showed that you can train a chicken to be superstitious. The chickens perform a behaviour, then a random feeding button is pressed and produces a pellet, which they eat; they then reproduce the behaviour they performed before the button was pressed, thinking it has an impact on whether or not they get fed. Despite the feeding being completely random, the chicken becomes superstitious and recreates that pattern of behaviour even though it has zero impact on the outcome.
So, when talking about the bias in the model, we typically assume what 'good' looks like for the data, Andrew continued. He said that we often make assumptions, or form hypotheses, that relationships exist and, particularly in fintech, we encode that behaviour into the actual software processes we're running. What's fascinating is that you start to get a collective bias, where you can actually reinforce that bias in the tools and the software. Modern research is showing that diversity in the teams that work together to build these tools will eliminate some of that risk of bias. Ultimately, the tools are there, but they all require human intervention to build or use them, which Andrew said he sees as an interesting input to this conversation about fintech, automation and machine learning, and the nature of the space five years from now.
Linking this theory to the current climate around COVID-19 and the idea of mass panic behaviour, Andrew said that fear is really driving decision making, despite there being little valid data or fact behind it. Commenting on the rise of people wearing surgical masks, Andrew mentioned that when you read the research and the doctors' reports, they all tell you that the effectiveness of these masks is questionable, especially in tubes or subways. However, tying back to that bias, masks appeal to our emotions and instincts, and that is what will cause us to make decisions that otherwise would not be considered rational.
What Andrew looks at in his day-to-day work is operational alpha: how you use the data, the tools and the capacity to make better decisions on how you run your business. He used an IT example of predicting failure candidates in operational systems. If you look at any complex technical working environment, there could be millions of alerts, millions of data points being collected from systems that are core to your trading or operational infrastructure, and you can now collect all of that operational data and analyse it for predictive analytics of pre-failure conditions in a particular server or system. As memory or network traffic spikes, the probability of failure goes much higher. Andrew added that you can see this happening in all aspects of how people do business, whether it's optimising client relationships, outreach, compliance or fraud. On the investment side of fintech, you're seeing huge investment in using those tools, technology and data to find alpha. One of the most interesting things IHS Markit can see is that, because of the depth and breadth of content that's available, a lot of the alternative data sets being ingested into investment decision-making processes aren't alternative to their primary consumers. Andrew used the example of the oil production data that IHS collect and track from over four hundred and fifty thousand oil wells globally, which is really interesting data for energy companies, prospectors or oil drilling firms. On top of this, IHS have GPS data for almost every ship on the planet over 100 tons, with their cargo manifests, and a large number of those are oil tankers. Therefore, you can start to combine oil production data and regional supply data in locations and ports, using satellite imagery to track supply in open wells or open storage containers, and you start to see how much supply is at sea at any given point in time.
It is then possible to start to model supply impacts when there are dramatic changes in supply and ships get diverted to move that supply to where the best price may be paid at that particular moment in time. This can then lead to an investment thesis that can drive not just what stocks to buy, but what’s the risk of a credit event? Where do I have supply chain risk? Where do I have refiner capacity? The inputs into your decision-making process suddenly get a lot more sophisticated than what was available five or 10 years ago. People are now taking all these diverse datasets and ingesting them at scale to drive business decisions.
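The kind of dataset combination Andrew describes can be sketched in miniature. The data, field names and figures below are entirely invented for illustration; they are not IHS Markit's actual feeds or schema:

```python
# Hypothetical sketch: combining toy oil-production and ship-tracking data
# to estimate how much supply is at sea per region. All records are invented.
production = [  # barrels per day, by region
    {"region": "Gulf", "barrels_per_day": 1_200_000},
    {"region": "North Sea", "barrels_per_day": 600_000},
]
tankers = [  # simplified AIS-style records: cargo carried, current status
    {"ship": "A", "region": "Gulf", "cargo_barrels": 2_000_000, "at_sea": True},
    {"ship": "B", "region": "Gulf", "cargo_barrels": 1_500_000, "at_sea": False},
    {"ship": "C", "region": "North Sea", "cargo_barrels": 900_000, "at_sea": True},
]

def supply_at_sea(tankers):
    """Sum cargo currently in transit, grouped by region."""
    totals = {}
    for t in tankers:
        if t["at_sea"]:
            totals[t["region"]] = totals.get(t["region"], 0) + t["cargo_barrels"]
    return totals

print(supply_at_sea(tankers))  # {'Gulf': 2000000, 'North Sea': 900000}
```

A real pipeline would join many more sources (satellite storage imagery, port data, cargo manifests), but the principle is the same: each extra dataset narrows the estimate of where supply is and where it is heading.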
On the investment models, there’s some real opportunities on building better portfolio risk management, better portfolio risk capabilities to understand the relationships between content and data according to Andrew. He said that we can also see a huge amount of investment in the ESG space as well as real thought leadership in modelling climate and climate risk into your portfolio.
He also added that as a fintech or an asset owner or asset manager, if you start to go into other industries, you can start to use climate risk to drive your decision making on where you invest capital, where you build your plants, where you build your machines and how you line up your supply chain. This is where there is opportunity in this space to now ingest all of this data that you couldn’t get before, combine it with that compute and leverage those capabilities on models and data science to really understand a whole new facet that’s very relevant. Andrew continued by saying that whilst we’ve all seen the pictures of climate change and Big Ben underwater or the scene from Venice where everything’s flooded these new opportunities allow us to have the capability to model that risk and instead of it appealing to fear, companies can optimise their capital and decisions based on those outcomes. He ended by adding that climate risk as a profession or as a discipline can help to drive portfolio optimisation, whether that’s investment portfolios or more capital investment.
Next to talk to us were Rehan Islam, an independent fintech consultant who, until recently, headed up the innovation unit at one of the leading global active asset managers, Janus Henderson Investors, and with him Raj Gupta, lead data scientist at Location Sciences, an independent third-party data intelligence company that verifies the accuracy and quality of location data.
Rehan began by explaining that alternative data is a term primarily used in financial markets. Traditionally, investors have used conventional metrics such as market prices, volumes, revenues and earnings when making investment decisions but, over the last decade or so, advances in technology have led to new sources of data, which in turn have created a new category of alternative data that can supplement investment analysis. Rehan pointed out that what's called alternative data today is likely to become standard data in five years' time, making it a shifting boundary. Alternative data includes data sets such as sentiment analysis of companies from social media and the web, satellite imagery of supply chains and crop fields, geo-location data from apps, and transaction data from email receipts or debit cards.
Raj then went on to explain how, in the specific case of Location Sciences, they provide store visitation data for most FTSE retail tickers, which is used by fundamental investors looking at alt-data to support their investment hypotheses and by quant funds looking for non-market-generated features to improve the accuracy of their trading models. He added that the buy side typically wants the data sets to be as structured as possible, but the danger is that data providers are not aware of the nuances of the financial markets and may filter out signals in the process of noise elimination; for example, with location data, indiscriminate filtering by speed or dwell can lead to information being thrown out.
Rehan explained that Europe is ahead of the game in terms of regulating what companies can do with data. He said that in a lot of circumstances data is collected for one use and then sold on to investors, which can raise issues if people start to feel uncomfortable. He used the example of a money management app that could sell on data about what people are spending money on; maybe those people didn't realise this because they didn't go through the terms and conditions. In other cases, some companies started off just providing business intelligence for a certain industry and later shifted to providing the same data to investors. So, as Rehan explained, if you started collecting data with one use in mind and now have another use in mind, it can raise questions about what you're doing with the data and whether that's allowed. He believes regulation will become stricter as this area grows.
Raj commented that we do know GDPR is pretty serious about the privacy rights of EU citizens, and that the concept of Personally Identifiable Information (PII) is likely to evolve over time. For example, in the location space where Raj works, all data that's collected must be properly consented, opted in and anonymised before data sets can be generated for the buy side. He said that he can see vendor due diligence becoming more and more stringent as alt-data goes mainstream.
Rehan then went on to say that he believes vendors need to find a way to make their offerings fit with the investment process of the investors they're selling to, and that they also need to do more to help investors identify the signals in this data. He pointed out three questions investors ask of alternative data:
- First, is there an investment signal on the particular asset they're looking at?
- Secondly, is that signal already priced in? Because if it is, it's not that useful.
- Thirdly, what's the time-frame of that signal? If it only gives you a half-hour advantage on something, for a lot of long-term investors that doesn't really move the needle.
Vendors, therefore, need to help investors answer these questions.
Rehan made the point that there is a difference between how systematic investors and fundamental investors use alternative data. Systematic investors are quite used to looking at lots of different data sets and making small bets across a range of different signals that they find in them, whereas fundamental investors haven't historically done things in a very quantitative way, so it may be much harder for them to work out how alternative data fits into their process.
Raj added that in his experience working with location data, he finds that fundamental investors look at alt-data sets to support their investment hypotheses, as opposed to basing their hypotheses on this kind of data. They also find a lot of quant funds looking for these kinds of data sets to improve the accuracy of their trading models; effectively, visitation data becomes a feature to be added to their models to make predictions better.
Raj also stated that alt-data is typically not as clean as market data, so a fair amount of wrangling is needed before something useful can be generated. For example, in the case of location signals, a speed filter would typically be used to eliminate people driving past a store, and dwell distributions – that is, time spent in a store – would be built and then calibrated against similar stores. Third-party data sets, such as transaction data, would then be used to create truth sets, and sentiment analysis used to understand context. All these data sets are typically very unstructured, and data cleaning has, in Raj's experience, been around 60 to 70 per cent of the job.
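The cleaning steps Raj outlines – a speed filter followed by dwell calculation – might look something like the following sketch. The thresholds, field names and data are invented for illustration and are not Location Sciences' actual pipeline:

```python
# Hypothetical sketch of cleaning raw location pings: drop drive-bys with a
# speed filter, then compute dwell (time between first and last in-store ping).
WALKING_SPEED_MAX = 2.0  # metres/second; above this we assume a drive-by

def clean_visits(pings):
    """Keep pings slow enough to be on foot, then compute dwell per visit."""
    on_foot = [p for p in pings if p["speed_mps"] <= WALKING_SPEED_MAX]
    spans = {}
    for p in on_foot:
        key = (p["device_id"], p["store_id"])
        first, last = spans.get(key, (p["ts"], p["ts"]))
        spans[key] = (min(first, p["ts"]), max(last, p["ts"]))
    # Dwell = seconds between first and last in-store ping for each device
    return {k: last - first for k, (first, last) in spans.items()}

pings = [
    {"device_id": "d1", "store_id": "s1", "ts": 0,   "speed_mps": 0.5},
    {"device_id": "d1", "store_id": "s1", "ts": 600, "speed_mps": 1.0},
    {"device_id": "d2", "store_id": "s1", "ts": 10,  "speed_mps": 12.0},  # drive-by
]
print(clean_visits(pings))  # {('d1', 's1'): 600}
```

Raj's caution about indiscriminate filtering applies directly here: set `WALKING_SPEED_MAX` too low and genuine visitors (say, those on bicycles in a car park) get thrown out along with the noise.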
When asked how alt-data can be used to generate insights for ESG investments, Rehan gave the example of Tesla. Two different ESG ratings on Tesla were given, where one was zero and one was one hundred; the reason was that one was just looking at Tesla's activities itself, while the other was looking at where the batteries go and where they come from – the wider societal impact. That raises the question of how broad you go when looking at impact, and the broader you want to go, the more scope there is for alternative data sets.
Raj said that the footfall trends they find are a leading indicator for the financial KPIs of these stores and brands. They also look into the demographics of the people who actually visit these stores, the distance they travel to get there and the various other brands and stores they visit. Together with data from social media and other sources, this starts to give a picture of what the audience looks like and how it's evolving over time, which he believes would be of great value to investors in this area.
Next to speak to us was Stacey English, Chief Digital Officer at Corlytics. Stacey defined Regtech as a very specific subset of fintech: technology that enables firms to meet their regulatory obligations much more effectively and efficiently than their existing capabilities and how they do things today. She added that the Regtech we're seeing evolve now is able to leverage a whole range of technologies and tools, such as artificial intelligence and analytics, through things like APIs and the cloud. The reason it's becoming so important is simply the volume and pace of regulatory change that firms need to comply with. She said that regulators are publishing at least 57,000 alerts a year, which equates to over 220 publications every single day, all of which firms need to be aware of, review and implement where relevant. At Corlytics, Stacey said they have tracked an increase in global enforcement activity over the last year, and in the last five years alone there have been more than $90 billion worth of monetary fines levied around the world. Personal liability is increasing too, making it a very, very challenging regulatory environment for firms to work in.
Stacey said that firms have been increasing the size of their compliance teams and budgets year on year, which isn't sustainable from either a cost or a recruitment point of view. What firms are looking for help with is automating manual tasks, so they can free up their valuable and skilled compliance and risk resources to focus on much more value-added things, like advising the business and ensuring that their customers are protected. She added that we're also seeing great demand for insight and metrics. The volume and complexity of information being generated by regulators is outstripping the capacity of human teams to interpret it, so firms are looking for help getting analytics and measures that tell them which regulatory risks they should prioritise, where they're most exposed and what's relevant to their business. We're also seeing regulators encouraging firms to adopt Regtech: as one example, one of the Australian regulators has recently said that we're reaching a tipping point where firms that don't invest in Regtech research and development are going to be at a disadvantage in the long run.
Stacey explained that at Corlytics they do a lot of work for customers to structure regulation, which means adding tags and taxonomies, which they then use to map rules to a firm's own entity and risk frameworks. In practice, that means firms can automate many processes which might otherwise require teams of hundreds of staff: filtering regulatory rules and updates, and determining which are relevant and where they need to be directed. The other thing Corlytics can do is risk-rate regulation: they have a single source of global enforcement activity from which they can extract insight and analytics, for example the relative risk of rules and regulations. They quantify the impact of breaches and identify which controls are causing those breaches, enabling firms to determine where they're most exposed based on the type of business they are, the jurisdictions they operate in and the products and services they sell. That helps them demonstrate to regulators that they're learning the lessons from their peers in the industry, and that they're focusing their compliance monitoring and assurance activities on the greatest areas of risk to their business. Corlytics also provides visualisation, which in practice means heat maps, graphs and metrics that physically show areas of risk and exposure; that's really helpful for seeing where to focus and where to prioritise.
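The tag-and-map approach Stacey describes can be illustrated with a toy example. The tags, fields and firm profile below are invented for illustration; they are not Corlytics' actual taxonomy or product:

```python
# Hypothetical sketch: filtering tagged regulatory updates down to those
# relevant to a firm's own profile. All names and tags are invented.
firm_profile = {"jurisdictions": {"UK"}, "products": {"retail_banking"}}

updates = [
    {"id": "reg-1", "jurisdiction": "UK", "product": "retail_banking",
     "risk_area": "conduct"},
    {"id": "reg-2", "jurisdiction": "US", "product": "derivatives",
     "risk_area": "market_abuse"},
]

def relevant_updates(updates, profile):
    """Keep only the updates whose tags match the firm's profile."""
    return [u for u in updates
            if u["jurisdiction"] in profile["jurisdictions"]
            and u["product"] in profile["products"]]

print([u["id"] for u in relevant_updates(updates, firm_profile)])  # ['reg-1']
```

Even this trivial filter shows why structured tags matter: once every rule carries machine-readable jurisdiction and product tags, the triage that would otherwise occupy teams of staff becomes a lookup.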
Stacey said that from very early on, Corlytics have been lucky enough to have the opportunity to work with regulators, including the FCA, doing things like tagging their rulebooks and helping to structure their regulation to make it more usable and machine readable, which helps with things like digital reporting and updating policies. However, she also said that as firms start to benefit from all that automation, there's going to be much more demand for prediction, to see what emerging risks lie ahead. She added that, ultimately, she expects it will become the norm for Regtech to be in place to support compliance teams in doing their jobs. Stacey anticipated that the skills needed in those compliance teams are going to change too: while we don't expect or need compliance officers to become developers or coders, they are going to need the confidence to work with new tools and technology. For example, at Corlytics their legal experts sit alongside developers and all have the opportunity to code, because there really is a recognition that, in the longer term, practitioners are not just going to need expertise in the technical detail of rules and regulations; they are going to need to understand things like analytics, artificial intelligence and Regtech so that they themselves can provide guidance to their business and get the most benefit out of the tools. She added that we're going to see growing use of another subset of fintech and Regtech, which is suptech: technology specifically designed to help the regulators themselves be more effective in supervising the industry, because regulators face the same sorts of challenges, with growing volumes of data being generated and their own resource limitations.
Stacey said that, as with all technology, there will be challenges around the implementation of Regtech. She thinks that explainability and auditability are key, because we've been through a financial crisis that demonstrated the problems that basing decisions on poorly understood models can cause. Being able to look inside these tools, and to explain the results and how decisions are being made, is going to be absolutely critical for regulators, auditors and boards. Risks around data security and cyber resilience were also mentioned as potential challenges for Regtech.
Bart Vanhaeren explained that his company, InvestSuite, operates in the wealth management space as a pure B2B player. That means they don't go to any customer directly; rather, their customers are financial institutions like retail banks, brokers, wealth managers and private banks. InvestSuite aims to help these customers in their own digital wealth transformation journeys by offering them solutions that allow them to get to market quicker and cheaper, whilst adding some extra expertise as well.
Bart then went on to paint a picture of a rapidly changing environment within the industry. He said that what we can see today are huge pressures, with a lot of players embarking on their wealth transformation journey. Bart also highlighted a changing ecosystem consisting of the following participants: first, the private banks, or the banks themselves; secondly, core banking platform providers and IP providers, who were there before; and finally, the emergence of fintechs in that space, which he said is all possible because of open banking and APIs – quite different from a few years ago. He also added implementation consultants to that list, making a new ecosystem of players which wasn't even there three or four years ago.
AI and machine learning are of particular relevance and significance. Bart suggested that this is an area where a lot of B2B fintechs can add value, ranging from Know Your Customer (KYC) and fraud protection to the very heart of what they do, i.e. fund construction and portfolio construction, where quant tools and AI tools can enter the game. Bart then gave us an example of how old, traditional methods can work alongside new technological methods, in the form of private banks. He said that they create a hybrid solution so that their customers know there is a human being there to help them, but they merge their traditional approach with new digital tools, thereby giving their clients all the digital access they would have with a fintech. They even go a step further, because they can offer the trust of the brand and the possibility of talking with a real human being if needed. This creates an ecosystem of fintechs offering high-end solutions alongside the private bank.
When asked how he sees this developing in the future, Bart explained that we are currently in what he calls a 'discovery period', whereby some players are already a bit more advanced and others are actively exploring. According to Bart, this may lead to a situation where private banks move into a hybrid mode, where they still leverage what they're good at but add a completely new digital experience for their customers, using AI and other quant methods to further improve their portfolio advice and portfolio construction methods. He also added that research has shown the cost-income ratio of private banks is still on the high side, so they can lower costs by further automating processes such as client onboarding and compliance systems. By offering digital solutions to their clients, banks can also take on more clients per advisor and, as such, enhance their revenues. It was also interesting to hear that Bart believes private banks' traditional model will still exist for quite a while, because of the very high added value of services like tax advice and advice around investment and support, which are very complex, making a personal relationship very important.
For more information, Bart suggested the following places:
- Traditional management consultants are painting very good pictures of the state of the industry. For instance, KPMG Luxembourg issued a report on private banking in Luxembourg, and McKinsey have written about the state of the universe with respect to private banks.
- Events on trends in wealth management in various countries; these conferences are a good way to see what is going on in the world, to scan what a number of the leading players are doing, and to find out who the leading players are in private banking and which fintechs are active in wealth management.
- A number of core banking platform providers, like Avaloq, follow this market very closely and issue trend reports on it.
Finally, we heard from Tom McGillycuddy, co-founder of tickr. The business was founded by Tom and Matt Latham, who spent eight or nine years working in the investment management industry. They were working in an impact investment team, investing money on behalf of pension funds, family offices and wealth managers around the world, but there was no way for them to invest their own money in the way they were doing in their day job. That was the first trigger point. Tom said they thought the app was something worth building because they knew the themes around impact investing appealed even more to their generation – the people that really want to have an impact on the climate or a social issue they care about. They thought there was a real, powerful opportunity to bring this form of investing to a mass market because of the themes, the framing and the narrative around it.
Tom explained that he had a crisis of mission at around twenty-three or twenty-four. He said that he needed to feel like what he was doing day to day was of use beyond his own gains. This is where he tried to match his finance skills with that idea, found impact investment, and then dove into it.
Tom said that the response to tickr has been good so far, as they are approaching 50,000 users just twelve months after launch. He added that the demographic is exactly what they had hoped for: eighty-five per cent are first-time investors, the average age is thirty-one, each invests about two thousand pounds a year, and there is a 50:50 gender split. According to Tom, the best gauge of the response has been the two events a month held at tickr's offices for users to learn about investing and impact. These events have been the best user feedback sessions: everyone comes with an idea, everyone comes with a story to tell.
In terms of challenges, Tom calls it 'good hour, bad hour': there is extremely good news and bad news every single minute. He said that part of the challenge is learning to manage the ups and downs, particularly when it comes to the growth of the business. As the company effectively grows by 30-50 per cent every two or three months, the team needs to change and adapt at the same rate. You therefore have to continually upskill, and there are constantly shifting expectations and goalposts.
Tom explained that there's a hierarchy of things that need to be achieved: they have to choose to let some fires burn, prioritise what to fix and when, and create an experience that ticks a lot of the fundamental, basic boxes. He added that at tickr they have been very fortunate to hire a couple of amazing customer service representatives in Liverpool, who have turned one-star reviews into five stars in the App Store and Play Store, and that's how they have compensated for some of the gaps in the app; however, he said that it's still very much version 1 of the product. Tom said they will continue to grow and hit their targets, understanding that fires will burn and prioritising which ones to put out next.
Tickr are raising another round of funding right now, with the goal of having it done and finished by May. The main priority is to grow their user base, and they're going to fill the app with more impact content, more narrative and more of the investment education they teach in person. The aim is for the in-app experience to become a lot richer, more referable and more content-driven, highlighting what they stand for as a company: impact and education.
Tom's biggest piece of advice is to have a co-founder. He said that having Matt meant they could share the burden and stress-test ideas. He would urge anyone to only go into something like this with a co-founder they would trust with their life.