Executive Summary 
   
    Modern banks such as the Canadian "Big Six" have been "power
      users" of advanced methods in mathematics, statistics, and computing
      in their financial markets arms for many decades, to the point where
      at least Master's-level training in a quantitative discipline has become
      essential for many working in trading and risk management. But perhaps
      less well known is the trend towards highly quantitative work in their
      less glamorous but highly profitable commercial and retail banking arms.
    Commercial bankers must decide to whom to lend, in what amount, and on 
      what terms, both quickly and accurately. Not everyone gets a mortgage from 
      a bank, and of those who do, not everyone pays the same mortgage rate. These 
      decisions are made with a tap of a few keys in a retail banker's office. 
      How are these decisions made? With Big Data and the big analytics that go 
      with it. 
    For example, every time you use your credit card, the bank records when,
      where, to whom, and how much you are paying. Some of us have had the
      experience of getting a call from a bank, concerned that our recent use
      of a card in an unaccustomed way or in a remote place is evidence that
      it has been stolen. What is less well known is how this huge set of
      credit card data can be used to find patterns suggesting that someone's
      spending is calling into question their ability to repay their loans.
    Even the marketing group of the bank's credit card division gets in on
      the action. Many consumers hit the mall armed with a small arsenal of
      credit cards. What few of us know is the degree of concern that banks
      devote to keeping the card they have issued "top of wallet": the first
      card you reach for at the point of sale. Banks can assemble, query, and
      analyze the Big Data in their databases and, by tying it to various
      marketing techniques, use it to decide both how to shed unprofitable
      customers and how to retain profitable ones.
    
     
     
  
  Program Outline
   
    This 2-day program will consist of a 1.5-day workshop on data analytics 
      in banking, capped off by a 1-hour research talk. The workshop leader is 
      Professor Cristián Bravo, Department of Industrial Engineering, Universidad 
      de Talca, Chile. The workshop deals with modern credit scoring and credit 
      risk techniques and includes coverage of probability of default (PD), loss 
      given default (LGD), and exposure at default (EAD). Case studies may include
      social network analysis for credit card fraud, micro-credit, and an application 
      of semi-supervised learning. Participants in the workshop should have a 
      basic knowledge of data mining and an understanding of how to create and 
      use statistical models. Previous knowledge of predictive analytics used 
      in this application area is not required. A 1-hour research talk entitled 
      "State-Dependent Correlations and PD-LGD Correlation" given by 
      Dr. Adam Metzler, Department of Mathematics, Wilfrid Laurier University, 
      will conclude the program.
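    As background for these topics (and not drawn from the workshop materials
      themselves): the three risk parameters combine into the standard
      expected-loss decomposition EL = PD × LGD × EAD. A minimal numerical
      sketch in Python, with all figures assumed purely for illustration:

      # Expected loss for a single exposure: EL = PD * LGD * EAD.
      # All figures below are assumed for illustration only.
      pd_estimate = 0.02     # probability of default over the horizon (2%)
      lgd_estimate = 0.45    # loss given default (45% of the exposure is lost)
      ead = 250_000.0        # exposure at default, in dollars

      expected_loss = pd_estimate * lgd_estimate * ead
      print(f"Expected loss: ${expected_loss:,.2f}")  # Expected loss: $2,250.00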
     
     
  
  Online Registration
   
    Online registration has now closed. We regret that we cannot accept any
      on-site registrations, as the workshop has reached capacity.
    Workshop fees: $200 regular rate, $150 faculty rate, $75 student 
      and postdoctoral fellow rate.
    These fees include workshop participation, lunch and break catering both 
      days, and reception on Wednesday evening (cash bar).
    Funding support: graduate students and postdoctoral fellows interested
      in being considered for a travel award should select and complete the
      "funding application form" option when registering. Awards will be
      given after the workshop.
    
     
     
  
  Workshop Leader
   
    Cristián Bravo is an Instructor Professor at the Universidad de Talca,
      Chile. He is an Industrial Engineer and holds a Master's in Operations
      Research and a PhD in Engineering Systems from the University of Chile.
      He previously served as Research Director of the Finance Center at the
      Department of Industrial Engineering, Universidad de Chile, and worked
      as a Research Fellow at KU Leuven, Belgium.
    He has published in several data mining and operations research journals
      and edited a special issue of Intelligent Data Analysis (2014). His
      research interests cover credit risk, especially as applied to micro,
      small, and medium enterprises, and the development and application of
      data mining and big data models in this area.
     
     
  
  Schedule
   
    Wednesday May 20
    
       
     | 8:30-8:45   | Welcome             |
     | 8:45-10:30  | Workshop            |
     | 10:30-11:00 | Coffee break        |
     | 11:00-12:30 | Workshop            |
     | 12:30-1:50  | Onsite lunch        |
     | 1:50-3:30   | Workshop            |
     | 3:30-4:00   | Coffee break        |
     | 4:00-5:30   | Workshop            |
     | 5:30-6:30   | Reception at Fields |
      
    
    Thursday May 21
    
       
     | 8:30-10:30  | Workshop             |
     | 10:30-11:00 | Coffee break         |
     | 11:00-12:30 | Workshop             |
     | 12:30-1:50  | Onsite lunch         |
     | 1:50-2:50   | Talk by Adam Metzler |
     | 2:50-3:00   | Closing remarks      |
      
    
    
       
  
    
  Talks
   
    Adam Metzler, Wilfrid Laurier University
    State-Dependent Correlations and PD-LGD Correlation - Modeling and Computation
    It is an empirical fact that (i) correlations tend to rise during adverse 
      economic scenarios and (ii) default rates are strongly positively correlated 
      with loss-given-default. Unfortunately, many of the most popular models used
      in risk management applications do not incorporate these phenomena, and 
      the result is overly optimistic predictions. This talk will consist of two 
      parts. In the first part we present an empirically motivated model that 
      allows for state-dependent correlations in linear factor models and contains 
      both the simple mixture and so-called Random Factor Loading models as special 
      cases. We derive a number of tractable asymptotic approximations and illustrate 
      that state-dependence in correlations effectively precludes moderate default 
      rates, exacerbating both good times and bad. In the second part we discuss 
      efficient Monte Carlo methods for computing risk measures in a PD-LGD correlation 
      model developed by Miu and Ozdemir (2006).
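    To make the two ingredients above concrete, here is a minimal Monte Carlo
      sketch in Python of a generic one-factor default model with a
      state-dependent (random) factor loading and an LGD driven by the same
      systematic factor. It is not the model presented in the talk, nor the
      Miu and Ozdemir (2006) model; every parameter value and functional form
      below is an assumption chosen only to illustrate how state-dependent
      correlation and PD-LGD correlation can be simulated.

      # Illustrative sketch only -- not the talk's model.  A one-factor default
      # model whose factor loading depends on the state of the economy, with
      # LGD tied to the same systematic factor so that default rates and
      # severities are positively correlated.  All parameters are assumed.
      import numpy as np
      from scipy.stats import norm

      rng = np.random.default_rng(0)

      n_sims, n_obligors = 20_000, 250   # scenarios x homogeneous obligors
      pd_uncond = 0.02                   # target unconditional default probability
      rho_good, rho_bad = 0.10, 0.35     # asset correlation in good vs. bad states

      Z = rng.standard_normal(n_sims)                                 # systematic factor
      beta = np.where(Z < 0.0, np.sqrt(rho_bad), np.sqrt(rho_good))   # state-dependent loading
      eps = rng.standard_normal((n_sims, n_obligors))                 # idiosyncratic shocks
      X = beta[:, None] * Z[:, None] + np.sqrt(1.0 - beta**2)[:, None] * eps

      # Default barrier; with state-dependent loadings X is no longer exactly
      # standard normal, so a serious implementation would recalibrate this.
      barrier = norm.ppf(pd_uncond)
      defaults = X < barrier

      # LGD link: lower Z (worse economy) -> higher mean severity.
      lgd_mean = 1.0 / (1.0 + np.exp(1.0 + 1.5 * Z))
      lgd = np.clip(lgd_mean[:, None] + 0.10 * rng.standard_normal(X.shape), 0.0, 1.0)

      loss = (defaults * lgd).mean(axis=1)       # portfolio loss per scenario (unit EAD)
      default_rate = defaults.mean(axis=1)

      n_def = defaults.sum(axis=1)
      has_def = n_def > 0
      realized_lgd = (defaults * lgd).sum(axis=1)[has_def] / n_def[has_def]

      print(f"expected loss:     {loss.mean():.4f}")
      print(f"99.9% VaR of loss: {np.quantile(loss, 0.999):.4f}")
      print(f"corr(default rate, realized LGD): "
            f"{np.corrcoef(default_rate[has_def], realized_lgd)[0, 1]:.2f}")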
     
       
         Adam Metzler received his M.Math (2004) and Ph.D. (2008) from 
          the Department of Statistics and Actuarial Science at the University 
          of Waterloo. He was an assistant professor in the Department of Applied 
          Mathematics at the University of Western Ontario from 2008 to 2012 before
          accepting his current position as assistant professor in the Mathematics 
          Department at Wilfrid Laurier, where he is currently Co-ordinator of 
          Financial Mathematics Programs. He maintains an active research program 
          in quantitative finance, with an emphasis on problems related to credit 
          risk.
        
      
    
  
   
   
   
   
   
   
   
   
   
    