Thursday 9 April 2015

Macro Trends April 2015: Forecasts, Rationales, Effects


This post presents our forecasts and rationales for six economic variables that are central to investment strategy and reflected in future valuations: GDP growth, the exchange rate, the RBA cash rate, the unemployment rate, the 10 year bond yield and the inflation rate for Australia, along with a view on the economy overall. Investors should remember that the share market tends to price in perceived and forecast economic strengthening before it happens, but is less likely to acknowledge economic weakening.

Economy: short term decline over 2015, medium term growth due to depreciation

The economy is experiencing below trend growth, rising unemployment and a weak fiscal budget. This mainly reflects falling commodity prices, a lack of reform and an unstable political environment, alongside cuts to public investment, low business confidence (and hence slow credit growth) and falling mining capex. Broadly, the Australian economy needs to transition from a reliance on mining capital expenditure and high commodity prices towards the non-mining sector, property investment and consumption, supported by stronger government spending on infrastructure.
The main risks to the economy stem from the exchange rate and trade weighted index (including iron ore export prices), employment growth, consumer confidence and the timing of both federal and state infrastructure investment. The RBA cash rate appears mainly to be affecting the housing market: consumption and investment are less interest-rate sensitive than in previous cyclical downturns, which is one reason the RBA has set the cash rate at its lowest since the 1960s. I therefore believe the RBA's stimulus misses the bigger picture of addressing the economy's structural and cyclical weaknesses.
Declining growth over 2015 is likely; however, with the Australian dollar forecast to depreciate to the 70-75c mark, growth should accelerate in the medium term, particularly via export competitiveness. The RBA is likely to use its policy levers to target the currency.

GDP growth: 2.4% over 2015 and into early 2016 from end of mining growth and absence of investment

We forecast GDP to grow at 2.4% over the remainder of 2015 and into early 2016. GDP grew 2.5% over 2014 to the December quarter, while gross domestic income fell 0.2% on the declining terms of trade, its second consecutive quarterly contraction. Factors driving the slowdown include inventory rundowns and a lack of both public and private investment, which outweighed growth in consumption and net exports. Constrained wage growth meant households drew down savings to lift consumption. Flat commodity prices and the delayed transition towards non-mining private investment, owing to weak business confidence, mean this underlying trend is set to continue. Industrial and commercial services would be the main beneficiaries of this GDP growth, provided the federal government follows through on infrastructure spending. The financial sector is being supported by low bad debt expenses and improved productivity, so GDP growth should help lending growth, which is currently held back by low consumer and business confidence tied to job security.

Exchange rate: depreciate to 70-75c due to rate hikes in US and terms of trade decline

Current depreciation partly reflects the termination of quantitative easing in the US this year, which slows the carry trade from US dollar denominated capital markets into Australia. In a carry trade, investors borrow in a low interest rate economy, convert the funds into the currency of a higher interest rate economy and invest them in that country's highest rated bonds. A declining terms of trade and a generally unfavourable growth outlook will also weigh on the currency. Nevertheless, the depreciation should be milder than that of the euro and the yen: quantitative easing in Europe and Japan is pushing capital towards higher yielding Australian bonds, and there is still support for the Australian dollar from Chinese investors, reflecting our economy's wealth profile.
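As a rough sketch of the carry trade arithmetic (all numbers below are hypothetical, not forecasts): the trade earns the interest rate differential, less any depreciation of the target currency.

```python
def carry_trade_return(funding_rate, target_rate, fx_change):
    """Approximate one-year return on an unhedged carry trade.

    funding_rate: borrowing cost in the low-rate economy (e.g. USD)
    target_rate:  bond yield earned in the high-rate economy (e.g. AUD)
    fx_change:    move in the target currency against the funding
                  currency (negative = target currency depreciated)
    """
    return target_rate - funding_rate + fx_change

# Hypothetical: borrow USD at 0.25%, earn 2.48% on AUD bonds.
# With a flat currency the rate differential is kept as profit;
# a 5% AUD depreciation more than wipes it out.
flat_fx = carry_trade_return(0.0025, 0.0248, 0.0)
after_depreciation = carry_trade_return(0.0025, 0.0248, -0.05)
```

This is why both expected $A depreciation and the end of US QE discourage the USD-to-AUD carry trade.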
The energy and transportation sectors are most affected by $A depreciation. Depreciation increases the $A value of offshore earnings and US dollar denominated revenue, though it also raises costs for importers whose cost base depends heavily on the overseas value chain.

RBA cash rate: drop to 2% in the next few months to promote depreciation

The current rate sits at 2.25%, though we forecast a cut to 2% in the next few months. Given the lack of fiscal stimulus and structural policy inertia, the RBA must step up to drive the depreciation of the Australian dollar while supporting employment and economic growth. The Australian dollar has been depreciating for some time, but on a trade weighted basis it must fall further against the US dollar. That depreciation would lift inflation and GDP growth, reducing the need to maintain such a historically low cash rate. The industrials, consumer goods, financial and real estate sectors are likely to benefit. Households, which have recently exhibited a low propensity to consume, should increase their consumption, helped by stimulatory interest rates keeping mortgage rates low and by wealth generated from the property price boom. Multiplier effects flow through to all sectors, and a virtuous cycle forms as unemployment falls, lifting earnings for real estate investment trusts.

Unemployment rate: increase to 6.5% due to discouraged jobseekers, subdued GDP growth and lack of new jobs being created by the economy

The current unemployment rate is 6.3%; the unemployed are those without work who are available for and actively seeking it, measured as a proportion of the labour force. We forecast the rate to rise 0.2 percentage points to 6.5%. Although unemployment rose 0.3 points in January, and the 6.3% readings in February and March marked a 0.1 point decline, we expect employment growth to stay weak in parallel with the weak GDP outlook. The participation rate, which trended down through 2014, is still being suppressed by discouraged jobseekers, meaning the real problem is masked by the measured unemployment rate. The economy is not creating enough new vacancies to absorb new entrants into the labour market, and GDP growth of around 3% pa is needed to support stronger employment. Higher unemployment would affect all sectors, especially consumer discretionary, as the marginal propensity to spend is boosted when consumers have job security.
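With made-up round numbers (not ABS figures), the headline rate and the participation rate are computed as:

```python
# Hypothetical labour force survey counts.
employed = 11_700_000
unemployed = 785_000
working_age_population = 19_600_000

labour_force = employed + unemployed
unemployment_rate = unemployed / labour_force * 100
participation_rate = labour_force / working_age_population * 100

# A discouraged jobseeker who stops looking leaves the labour force,
# lowering the participation rate without raising measured unemployment.
```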

10 year bond yield: increase to 2.65% from QE in Japan and Europe

This yield represents the benchmark risk free rate for investment decisions. It currently sits at 2.48% and we forecast a rise to 2.65%. Internationally, bond yields are being suppressed by quantitative easing responses to slow economic growth, as in Japan and Europe. This pushes capital towards more attractive bonds (Australia's), which risks trapping those countries in a cycle of ever-lower yields. Low rates should support our financial sector: bank equity becomes more appealing for its dividend yield, and earnings growth accumulates through low lending rates.

Inflation Rate: increase to 2.5% over 2015 and 3% in 2016 from depreciation and tradeables


The current rate sits at 1.7%, though our 12 month forecast is a rise to 2.5%. Headline inflation fell from 2.3% to 1.7% in the last quarter of 2014. Falling petrol prices are masking inflation elsewhere, which is why our forecast remains well below 3%. Underlying inflation, which strips out volatile items such as fuel, was 2.3%, driven by non-tradeables. Tradeables prices should rise in the next few months and converge with the strong pricing trend in non-tradeables as the Australian dollar depreciates. The depreciation is not only against the US dollar: the $A is also expected to fall against the Chinese yuan, raising the price of imports from China and adding further inflationary pressure. For 2016 we therefore forecast a 3% inflation rate. Inflation cuts both ways for consumer staples and discretionary: it allows cost increases to be passed on to consumers to maintain margins, though overall expenditure is reduced.
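The headline vs underlying distinction can be sketched with a trimmed mean, which discards the largest and smallest price movements before averaging (the numbers and trim fraction below are illustrative only, not the ABS methodology):

```python
def trimmed_mean(changes, trim=0.25):
    """Average the middle portion of sorted price changes,
    dropping a fraction `trim` from each tail."""
    ordered = sorted(changes)
    k = int(len(ordered) * trim)
    kept = ordered[k:len(ordered) - k] if k else ordered
    return sum(kept) / len(kept)

# Hypothetical quarterly price changes (%): a large petrol fall drags the
# simple (headline-style) average well below the trimmed (underlying) one.
price_changes = [-8.0, 0.4, 0.5, 0.6, 0.6, 0.7, 0.8, 3.5]
headline_style = sum(price_changes) / len(price_changes)
underlying_style = trimmed_mean(price_changes, trim=0.25)
```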

Tuesday 7 April 2015

Quantopian Trading Strategy: Fundamentals, Daily

Below are some examples of strategies that take advantage of Morningstar's fundamental data on US companies. 

1. Filter the top 50 companies by market capitalisation
2. Find the top two sectors with the highest avg PE ratio, then invest in the shares that have the lowest PE ratio in those sectors. 
3. Every month exit all positions before entering new ones at the start of the month
4. Record all the positions we enter. 

import pandas as pd
import numpy as np

def initialize(context):
    context.stock_weights = {}
    context.days = 0
    context.sect_numb = 2

The initialize method is compulsory for any algorithm and is called once to set up bookkeeping. context is an augmented Python dictionary passed to every function; it stores any variables we define along with the portfolio/position/account objects. Here we set up a dictionary of shares and their weights, a day counter starting at zero, and the number of sectors (2) we will go long.
    
    context.sector_mappings = {
        101.0: "Basic Materials",
        102.0: "Consumer Cyclical",
        103.0: "Financial Services",
        104.0: "Real Estate",
        205.0: "Consumer Defensive",
        206.0: "Healthcare",
        207.0: "Utilities",
        308.0: "Communication Services",
        309.0: "Energy",
        310.0: "Industrials",
        311.0: "Technology"
    }
    
    schedule_function(rebalance,
                      date_rule=date_rules.month_start(),
                      time_rule=time_rules.market_open())

def rebalance(context, data):
    for stock in context.portfolio.positions:
        if stock not in context.fundamental_df and stock in data:
            order_target_percent(stock, 0)

    log.info("The two sectors we are buying today are %r" % context.sectors)

    weight = create_weights(context, context.stocks)

    for stock in context.fundamental_df:
        if stock in data:
            if weight != 0:
                log.info("Ordering %0.0f%% percent of %s in %s"
                         % (weight * 100,
                            stock.symbol,
                            context.sector_mappings[context.fundamental_df[stock]['morningstar_sector_code']]))
            order_target_percent(stock, weight)

We rebalance on the first trading day of every month at market open, exiting any positions that have dropped out of our universe before entering new ones, then order every share to its target weight. The log.info output is what we read when the transaction occurs, with the sector name at the end of the string: if we are buying a share, we order its target weight and log that we are ordering x% of that share in whichever sector.
    
def before_trading_start(context):
    num_stocks = 50

    fundamental_df = get_fundamentals(
        query(
            fundamentals.valuation_ratios.pe_ratio,
            fundamentals.asset_classification.morningstar_sector_code
        )
        .filter(fundamentals.valuation.market_cap != None)
        .filter(fundamentals.valuation.shares_outstanding != None)
        .order_by(fundamentals.valuation.market_cap.desc())
        .limit(num_stocks)
    )

As the name suggests, before_trading_start is called once a day before the market opens; here it updates the universe with 50 shares and their fundamental data. get_fundamentals runs a SQLAlchemy-style query, whose result we assign to fundamental_df, retrieving each share's PE ratio and sector code from Morningstar. The results are filtered to exclude companies with no reported market capitalisation or shares outstanding (which would suggest they are no longer trading), sorted by market capitalisation in descending order so the largest companies come first, and cut off at 50 shares.

    sector_pe_dict = {}
    for stock in fundamental_df:
        sector = fundamental_df[stock]['morningstar_sector_code']
        pe = fundamental_df[stock]['pe_ratio']

        if sector in sector_pe_dict:
            sector_pe_dict[sector].append(pe)
        else:
            # start the sector's list with this PE; an empty list here
            # would silently drop each sector's first observation
            sector_pe_dict[sector] = [pe]

The algorithm then looks for the sectors with the highest average PE ratio. I assign 'sector' and 'pe' as short variable names for the sector code and PE ratio to save typing. If the sector already has a list, we append the PE ratio to it; otherwise we start a new list for that sector with this first PE ratio.
    
    sector_pe_dict = dict([(sectors, np.average(sector_pe_dict[sectors])) 
                               for sectors in sector_pe_dict if len(sector_pe_dict[sectors]) > 0])
    
    sectors = sorted(sector_pe_dict, key=lambda x: sector_pe_dict[x], reverse=True)[:context.sect_numb]
    
    context.stocks = [stock for stock in fundamental_df
                      if fundamental_df[stock]['morningstar_sector_code'] in sectors]
    
    context.sectors = [context.sector_mappings[sect] for sect in sectors]

    context.fundamental_df = fundamental_df[context.stocks]
    
    update_universe(context.fundamental_df.columns.values)   
    
We compute the average PE ratio for each sector, sort in descending order and keep the top two sectors, then filter the shares down to those sectors. Finally we initialise a context.sectors variable with the sector names, store the selected shares and their PE ratios in context.fundamental_df, and update the universe.
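Outside Quantopian, the same two-step sector ranking can be sketched in plain Python with a toy universe (tickers and PE values are hypothetical):

```python
# Toy universe: (ticker, sector, pe) tuples, all hypothetical.
universe = [
    ("AAA", "Technology", 32.0), ("BBB", "Technology", 28.0),
    ("CCC", "Healthcare", 24.0), ("DDD", "Healthcare", 22.0),
    ("EEE", "Energy", 11.0), ("FFF", "Energy", 9.0),
]

# Step 1: collect PEs per sector and average them.
sector_pes = {}
for _, sector, pe in universe:
    sector_pes.setdefault(sector, []).append(pe)
sector_avg = {s: sum(v) / len(v) for s, v in sector_pes.items()}

# Step 2: take the two sectors with the highest average PE...
top_sectors = sorted(sector_avg, key=sector_avg.get, reverse=True)[:2]

# Step 3: ...and keep only the stocks in those sectors.
selected = [t for t, s, _ in universe if s in top_sectors]
```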

def create_weights(context, stocks):
    if len(stocks) == 0:
        return 0
    else:
        weight = 1.0/len(stocks)
        return weight

We previously coded the portfolio to rebalance to target weights. Here we define an equal weight for each stock: len(stocks) is the number of stocks, so each receives 1/n of the portfolio.

def handle_data(context, data):

    record(num_positions = len(context.portfolio.positions))

handle_data is called whenever a market event occurs for our specified shares. The context parameter stores any state we have defined plus the portfolio object, while data is a snapshot of our universe as of the call; it includes market information about each share and is refreshed when a fundamentals query is run. So whenever our shares trade, we record the number of positions we hold.




This resulted in giving me a 16.9% return, outperforming the market benchmark by 1.7%. 

The PE ratio is market value per share over earnings per share: the price an investor is willing to pay for every dollar of earnings. A low PE ratio generally indicates an undervalued stock, though comparisons should really be made within industries. A high PE ratio is not always a bad thing, as it can reflect expectations of strong earnings growth.
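For instance (hypothetical figures), a share trading at $30 with $2 of earnings per share costs $15 per dollar of earnings:

```python
price_per_share = 30.0    # hypothetical market price
earnings_per_share = 2.0  # hypothetical diluted EPS

pe_ratio = price_per_share / earnings_per_share
earnings_yield = 1 / pe_ratio  # the reciprocal: earnings per dollar invested
```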

In order to factor in future performance, I decided to use diluted EPS growth as the investment criterion instead, keeping all else constant to see its effect on the strategy, and adjusted the direction of the sector sort accordingly: higher EPS growth is generally better, whereas previously I was targeting the highest average PE sectors.

import pandas as pd
import numpy as np

def initialize(context):
    context.stock_weights = {}
    context.days = 0
    context.sect_numb = 2
    
    context.sector_mappings = {
       101.0: "Basic Materials",
       102.0: "Consumer Cyclical",
       103.0: "Financial Services",
       104.0: "Real Estate",
       205.0: "Consumer Defensive",
       206.0: "Healthcare",
       207.0: "Utilites",
       308.0: "Communication Services",
       309.0: "Energy",
       310.0: "Industrials",
       311.0: "Technology"
    }
    
    schedule_function(rebalance,
                      date_rule=date_rules.month_start(),
                      time_rule=time_rules.market_open())
    
def rebalance(context, data):
    
    for stock in context.portfolio.positions:
        if stock not in context.fundamental_df and stock in data:
            order_target_percent(stock, 0)

    log.info("The two sectors we are ordering today are %r" % context.sectors)

    weight = create_weights(context, context.stocks)

    for stock in context.fundamental_df:
        if stock in data:
          if weight != 0:
              log.info("Ordering %0.0f%% percent of %s in %s" 
                       % (weight * 100, 
                          stock.symbol, 
                          context.sector_mappings[context.fundamental_df[stock]['morningstar_sector_code']]))
              
          order_target_percent(stock, weight)
    
def before_trading_start(context): 
  
    num_stocks = 50
    
    fundamental_df = get_fundamentals(
        query(
            fundamentals.earnings_ratios.diluted_eps_growth,
            fundamentals.asset_classification.morningstar_sector_code
        )
        .filter(fundamentals.valuation.market_cap != None)
        .filter(fundamentals.valuation.shares_outstanding != None)
        .order_by(fundamentals.valuation.market_cap.desc())
        .limit(num_stocks)
    )

    sector_eps_dict = {}
    for stock in fundamental_df:
        sector = fundamental_df[stock]['morningstar_sector_code']
        eps = fundamental_df[stock]['diluted_eps_growth']
                
        if sector in sector_eps_dict:
            sector_eps_dict[sector].append(eps)
        else:
            # start the sector's list with this EPS growth figure;
            # an empty list would drop each sector's first observation
            sector_eps_dict[sector] = [eps]
    
    sector_eps_dict = dict([(sectors, np.average(sector_eps_dict[sectors])) 
        for sectors in sector_eps_dict if len(sector_eps_dict[sectors]) > 0])
    
    sectors = sorted(sector_eps_dict, key=lambda x: sector_eps_dict[x], reverse=False)[:context.sect_numb]
    
    context.stocks = [stock for stock in fundamental_df
                      if fundamental_df[stock]['morningstar_sector_code'] in sectors]
    
    context.sectors = [context.sector_mappings[sect] for sect in sectors]

    context.fundamental_df = fundamental_df[context.stocks]
    
    update_universe(context.fundamental_df.columns.values)   
    
def create_weights(context, stocks):

    if len(stocks) == 0:
        return 0 
    else:
        weight = 1.0/len(stocks)
        return weight
        
def handle_data(context, data):
  
    record(num_positions = len(context.portfolio.positions))


This resulted in an even higher return of 20.9%, beating the benchmark by 5.7%. Judging from the Sharpe ratio, returns with this strategy are higher even after adjusting for risk. 

Quantopian Algorithmic Trading Strategy: Multiple Securities

This post demonstrates an algorithmic strategy similar to our initial Basic Trading Strategy, but using multiple securities. We backtest JPMorgan Chase, Bank of America and Citigroup, three of the largest banks in America; all three traded for the full length of our backtest (April 2014 to April 2015).

import datetime
import pytz

def initialize(context):
    context.stocks = symbols('JPM', 'BAC', 'C')
    context.vwap = {}
    context.price = {}

    context.max_notional = 1000000.1
    context.min_notional = -1000000.0

    utc = pytz.timezone('UTC')
    context.d = datetime.datetime(2000, 1, 1, 0, 0, 0, tzinfo=utc)

def handle_data(context, data):
    notional=0
    money = []

    for stock in context.stocks:
        price = data[stock].price
        money.append(price*context.portfolio.positions[stock].amount)
        notional = notional + context.portfolio.positions[stock].amount * price
        tradeday = data[stock].datetime

    for stock in context.stocks:
        vwap = data[stock].vwap(5)
        price = data[stock].price

        if price < vwap * 0.995 and notional > context.min_notional:
            order(stock, -100)
            notional = notional - price*100
        elif price > vwap * 1.005 and notional < context.max_notional:
            order(stock, +100)
            notional = notional + price*100

    if (context.d + datetime.timedelta(days=1)) < tradeday:
        log.debug(str(notional) + ' - notional start ' + tradeday.strftime('%m/%d/%y'))
        context.d = tradeday

    record(jpm_money=money[0], bac_money=money[1], c_money=money[2])

Function meanings: 

  • First we import the required libraries: datetime for constructing timestamps and pytz for timezones (a pytz timezone object or a string conforming to the pytz timezone database)
  • In initialize we set up each security by calling symbols with its ticker
  • vwap(days) calculates the volume weighted average price over the given number of trading days, including today's trailing data
  • context.max_notional and context.min_notional are also set in initialize to cap position size: they limit the absolute dollar value of any position held by the algorithm. As no per-security limit is set, the cap applies across all of them
  • We also initialise a time variable for logging purposes, converting its timezone to UTC
  • In handle_data we start with our position value at zero (notional = 0)
  • For each stock, the algorithm computes our position at the start of each frame, finding the price, notional value (number of shares held x price) and trading day
  • The algorithm then goes through each security again, finding its price and its volume weighted average price over the past 5 days. If the price is more than 0.5% below the vwap and the minimum position limit has not been reached, a sell order is executed and our notional is updated
  • Similarly, if the price is more than 0.5% above the vwap and the maximum position value has not been reached, a buy order is executed to ride the upswing and our notional is updated
  • If this is the first trade of the day, the notional value is logged
  • We also record the dollar amount we hold in each security
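Quantopian supplies vwap(days) as a built-in transform; as a stand-alone sketch, volume weighted average price is simply total traded value over total volume, and the trading rule compares the latest price against it (prices and volumes below are hypothetical):

```python
def vwap(prices, volumes):
    """Volume weighted average price: total traded value / total volume."""
    total_value = sum(p * v for p, v in zip(prices, volumes))
    total_volume = sum(volumes)
    return total_value / total_volume

# Hypothetical last five daily closes and volumes.
prices = [60.0, 61.0, 59.5, 60.5, 62.0]
volumes = [1_000, 3_000, 2_000, 1_000, 3_000]
five_day_vwap = vwap(prices, volumes)

# The trading rule then compares the latest price to this average:
latest = prices[-1]
if latest > five_day_vwap * 1.005:
    signal = "buy"
elif latest < five_day_vwap * 0.995:
    signal = "sell"
else:
    signal = "hold"
```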
This strategy generated a -2.3% return.


However, if we increase the number of shares bought during upswings to 120,000 and the number sold during downswings to 1,000, we get an outstanding 77.3% return. Strangely, further manipulation of these numbers (up and down) produced inconsistent combinations, some of which even dipped into negative returns.


As this strategy is based on each stock's volume weighted average price, increasing the vwap window from 5 to 7 days lifts the return to 74.6%. However, an 8 day window gives a 24.9% return, and a 10 day window generates a negative 64.4% return.

Using a similar strategy on energy sector stocks (BHP Billiton, Anadarko Petroleum Corporation and Exxon Mobil Corporation), a 70% return is generated when 2,000 shares are sold during downswings and 100,000 are bought during upswings.

        if price < vwap * 0.995 and notional > context.min_notional:
            order(stock, -2000)
            notional = notional - price*2000   # scaled to match the order size
        elif price > vwap * 1.005 and notional < context.max_notional:
            order(stock, +100000)
            notional = notional + price*100000  # scaled to match the order size

Comparing stocks from different sectors, Microsoft (technology), Johnson & Johnson (pharmaceuticals) and Toyota (manufacturing), we managed to tailor the strategy to a 185.8% return. Changing parts of the algorithm produced volatile results similar to our previous stocks, but we realised the main cause of the low returns was the algorithm's poor performance when benchmark returns rose early in 2015. To capitalise on those overall gains, we lowered the buy cutoff: instead of waiting for the price to rise 0.5% above vwap, the algorithm now buys when it is 0.2% above.

elif price > vwap * 1.005    # before: buy on a 0.5% rise above vwap
elif price > vwap * 1.002    # after: buy on a 0.2% rise above vwap


Seeing as the algorithm still performed relatively poorly when the benchmark was low, we also adjusted the sell rule to trigger when the stock price fell below 99.95% of vwap rather than 99.5%. This generated the highest return so far: 415.5%!



Monday 6 April 2015

Quantopian Algorithmic Trading Strategy: Basic, Daily

We have recently been using Quantopian, a crowd-sourced hedge fund that gives individuals access to a browser-based algorithmic trading platform. Below is a basic investment strategy. Algorithms are coded in Python and backtested against US stock price and fundamental data.

def initialize(context):
    context.security = symbol('K')

def handle_data(context, data):
    average_price = data[context.security].mavg(5)
    current_price = data[context.security].price
    cash = context.portfolio.cash

    if current_price > 1.01*average_price and cash > current_price:
        number_of_shares = int(cash/current_price)
        order(context.security, +number_of_shares)
        log.info("Buying %s" % (context.security.symbol))

    elif current_price < average_price:
        order_target(context.security, 0)
        log.info("Selling %s" % (context.security.symbol))

    record(stock_price=data[context.security].price)

Function meanings:


  • Every algorithm starts with the initialize function, which sets any data, variables or parameters used in the algorithm, such as the security
  • data is a dictionary representing a snapshot of the algorithm's universe, including market information about the security and any transforms
  • context stores any state we define, along with the portfolio object
  • In this first example we are using Kellogg (K)
  • handle_data runs at a user-specified frequency, either every minute (live trading and minute backtesting) or daily; i.e. it is called whenever a market event occurs for Kellogg
  • We calculate Kellogg's average price as the moving average over the last 5 days
  • The portfolio object tracks our positions, cash, cost basis of holdings, etc. In this basic strategy we are interested in the current amount of cash in the portfolio
  • The algorithm BUYS if the current price is 1% above Kellogg's 5 day average price and there is enough cash in the portfolio, which is why we also calculate how many shares we can afford
  • It SELLS if the current price is below the 5 day average, closing the position to zero shares
  • The record function can track up to five variables; here we record Kellogg's stock price
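The buy/sell conditions above can be sketched outside Quantopian with a simple moving average helper (toy prices, not Kellogg data):

```python
def mavg(prices, window=5):
    """Simple moving average of the last `window` prices."""
    recent = prices[-window:]
    return sum(recent) / len(recent)

prices = [49.0, 49.5, 50.0, 50.5, 51.0, 52.0]  # hypothetical closes
average_price = mavg(prices[:-1], window=5)    # 5-day average up to yesterday
current_price = prices[-1]

# Buy on a 1% rise above the average, sell below it, otherwise hold.
if current_price > 1.01 * average_price:
    action = "buy"
elif current_price < average_price:
    action = "sell"
else:
    action = "hold"
```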
The screenshot shows this strategy generating an 8.3% return from the start of 2014 to April 2015.


However, we identified a potential problem that dragged on returns. We replaced the sell rule with one that sells only if the current price is above the purchase price (cost basis), where cost basis is the volume weighted average price paid per share in the position, including commission.

costBasis = context.portfolio.positions[context.security].cost_basis
elif current_price > costBasis:
    order_target(context.security, 0)
    log.info("Selling %s" % (context.security.symbol))
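Cost basis itself can be sketched as the volume weighted average price paid across fills, including commission (the fills below are hypothetical):

```python
def cost_basis(fills, commission_per_trade=0.0):
    """fills: list of (shares, price) buy fills.
    Returns the average cost per share, including commissions."""
    total_shares = sum(shares for shares, _ in fills)
    total_cost = sum(shares * price for shares, price in fills)
    total_cost += commission_per_trade * len(fills)
    return total_cost / total_shares

# Two hypothetical buys: 100 shares @ $50, then 50 @ $56, $1 commission each.
basis = cost_basis([(100, 50.0), (50, 56.0)], commission_per_trade=1.0)
# The modified rule only sells once the market price exceeds this basis.
```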

This gave us a higher return of 19%, outperforming the benchmark. 


Update (7/4/15): I updated the buying strategy to 

if current_price > 1.005*average_price and cash > current_price:
        number_of_shares = int(cash/current_price)
        order(context.security, +number_of_shares)
        log.info("Buying %s" % (context.security.symbol))

All else remains unchanged. Buying when the price rises 0.5% above the average, rather than 1%, should improve the chances of catching the upswing before the price falls back. This was confirmed by the 28.59% return.



If the same buy strategy (buy on a 0.5% rise) is paired with the original sell rule, selling whenever the current price is below the average price, even if that is below our purchase price:

elif current_price < average_price:
        order_target(context.security, 0)
        log.info("Selling %s" % (context.security.symbol))



the returns are 13.1%. This is higher than our original strategy (which generated an 8.3% return): the 0.5% buy rule alone boosted returns by roughly 5 percentage points. Adding the improved sell rule, selling only when the current price exceeds the purchase price, more than doubles this return.

Using the same strategy on Facebook stock, though with a 7 day moving average, we generate a 22.4% return. We added the set_symbol_lookup_date function, which globally sets a date to disambiguate cases where a symbol historically referred to different securities (FB used to refer to FBR Asset Investment Corp, last traded in 2003).