Non-linear Optimization of Nelson-Siegel model using nloptr R package

This post shows how to use the nloptr R package to solve non-linear optimization problems with or without equality or inequality constraints. The Nelson-Siegel yield curve model is used as a target example.


Nelson-Siegel model using nloptr R package



In this post, a non-linear least squares problem is solved using the nloptr R package. The Nelson-Siegel model serves as a suitable example because it is a representative non-linear problem in financial applications.


Nelson-Siegel model as a non-linear optimization problem


The Nelson-Siegel model is a non-linear least squares problem with four parameters and some inequality constraints.
\[\begin{align} y(\tau) &= \beta_1 + \beta_2 \left( \frac{1-e^{- \tau \lambda }}{\tau \lambda }\right) \\ &+ \beta_3 \left(\frac{1-e^{- \tau \lambda }}{\tau \lambda }-e^{- \tau \lambda }\right) \end{align}\] \[\begin{align} \beta_1 > 0, \beta_1 + \beta_2 >0, \lambda > 0 \end{align}\]
Here, \(\tau\) is a maturity and \(y(\tau)\) is the continuously compounded spot rate with maturity \(\tau\). \(\beta_1, \beta_2, \beta_3\) are coefficient parameters and \(\lambda\) is the time decay parameter.
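As a quick sanity check on the formula, the model can be written as a small R function. `ns_yield` is a hypothetical helper introduced here for illustration, not part of any package; note that long-maturity yields converge to the level parameter \(\beta_1\).

```r
# Nelson-Siegel spot rate for maturity tau
# ns_yield is an illustrative helper, not from the post's script
ns_yield <- function(tau, beta, lambda) {
  h <- (1 - exp(-lambda * tau)) / (lambda * tau)   # slope loading
  beta[1] + beta[2] * h + beta[3] * (h - exp(-lambda * tau))
}

# as tau grows, both loadings vanish and the yield approaches beta[1]
ns_yield(1e6, beta = c(4, -4, -5), lambda = 0.05)
```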



nloptr() function


After installing the nloptr R package, the nloptr() function with the following signature is available.

nloptr(
    x0          initial guess,
    eval_f      objective function, 
    lb          lower bounds,
    ub          upper bounds,
    eval_grad_f gradient function, 
    eval_g_ineq inequality constraints, 
    eval_g_eq   equality constraints,
    opts        optimization options
)
 

The argument names are largely self-explanatory. The initial guess, the objective function, and the options (which must name an algorithm) have to be provided; the remaining arguments are optional.
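For orientation, a minimal call might look like the following. This is a toy one-dimensional problem, not the Nelson-Siegel model, and NLOPT_LN_COBYLA is just one derivative-free algorithm choice among many:

```r
library(nloptr)

# minimize (x - 1)^2 over [-5, 5]; toy example of the calling convention
res <- nloptr(
  x0     = 0,
  eval_f = function(x) (x - 1)^2,
  lb     = -5,
  ub     =  5,
  opts   = list("algorithm" = "NLOPT_LN_COBYLA",
                "xtol_rel"  = 1e-8,
                "maxeval"   = 1000)
)
res$solution   # close to 1
```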



Options


The nloptr() function should take opts as an argument, since at least an algorithm needs to be specified. The nloptr R package provides a list of algorithms, which fall into two classes: with and without gradient information.

As there are many algorithms included in this package, refer to https://nlopt.readthedocs.io/en/latest/ for detailed information on the algorithms and other settings.

In particular, if the number of evaluations is too small, the non-linear optimization result tends to be poor, so "maxeval" should be set larger and "xtol_rel" smaller to help avoid local minima. With this in mind, I set maxeval = 500000 and xtol_rel = 1.0e-16.

opts <- list(
    "algorithm"   = "NLOPT_GN_ISRES",
    "xtol_rel"    = 1.0e-16,
    "maxeval"     = 500000,
    "print_level" = 1 )
 


I used minimal options. For full-fledged optimization work, you may need to apply additional settings in the options list.
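For instance, the option list can be extended with further stopping rules. The entries below are standard NLopt options exposed through nloptr, but the particular values are illustrative placeholders, not tuned settings:

```r
# extended option list; values are illustrative, not tuned
opts_full <- list(
  "algorithm"   = "NLOPT_GN_ISRES",
  "xtol_rel"    = 1.0e-16,   # relative tolerance on parameters
  "ftol_rel"    = 1.0e-12,   # also stop when the objective barely changes
  "maxeval"     = 500000,    # generous evaluation budget for a global method
  "print_level" = 0          # silence per-iteration output
)
```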


R code


The R code below estimates the parameters of the Nelson-Siegel model for two yield curves. It is worth noting that when the objective function takes additional arguments such as data or maturities, these are passed directly to nloptr(), just as with optim().

#========================================================#
# Quantitative Financial Econometrics & Derivatives 
# ML/DL using R, Python, Tensorflow by Sang-Heon Lee 
#
# https://shleeai.blogspot.com
#--------------------------------------------------------#
# Nelson-Siegel using nloptr function
#========================================================#
 
graphics.off(); rm(list = ls())
 
library(nloptr)
 
#-----------------------------------------------
# objective function
#-----------------------------------------------
    objfunc <- function(para, y, m) {
        beta <- para[1:3]; la <- para[4]
        C  <- cbind(
            rep(1,length(m)),    
            (1-exp(-la*m))/(la*m),             
            (1-exp(-la*m))/(la*m)-exp(-la*m))
        return(sqrt(mean((y - C%*%beta)^2)))
    }
 
#-----------------------------------------------
# constraint function ( <= : less than equal )
#-----------------------------------------------
    # beta1 + beta2 > 0, expressed as -beta1 - beta2 <= 0
    constfunc <- function(para, y, m) {
        beta  <- para[1:3]
        lhs   <- -beta[1]-beta[2]
        return(lhs)
    }
 
 
#=======================================================
# 1. Read data
#=======================================================
    
    # b1, b2, b3, lambda, rmse for comparisons
    ns_reg_para_rmse1 <- c(
        4.26219396, -4.08609206, -4.90893865,  
        0.02722607,  0.04883786)
    ns_reg_para_rmse2 <- c(
        4.97628654, -4.75365297, -6.40263059,  
        0.05046789,  0.04157326)
    
    str.zero <- "
        mat     rate1      rate2
        3       0.0781     0.0591
        6       0.1192     0.0931
        9       0.1579     0.1270
        12      0.1893     0.1654
        24      0.2669     0.3919
        36      0.3831     0.8192
        48      0.5489     1.3242
        60      0.7371     1.7623
        72      0.9523     2.1495
        84      1.1936     2.4994
        96      1.4275     2.7740
        108     1.6424     2.9798
        120     1.8326     3.1662
        144     2.1715     3.4829
        180     2.5489     3.7827
        240     2.8093     3.9696"
    
    df <- read.table(text = str.zero, header=TRUE)
    m  <- df$mat
    y1 <- df$rate1; y2 <- df$rate2
 
 
#=======================================================
# 2. Nonlinear least squares with constraints
#=======================================================
    
    # Set optimization options.
    opts <- list("algorithm"   = "NLOPT_GN_ISRES",
                 "xtol_rel"    = 1.0e-16,
                 "maxeval"     = 500000,
                 "print_level" = 1 )
 
    #---------------------------------------------
    # NS estimation with 1st data
    #---------------------------------------------
    y <- y1
    
    x_init <- c(y[16], y[1]-y[16], 
                2*y[6]-y[1]-y[16], 0.0609)
    
    m_nloptr <- nloptr(
        x0 = x_init, y = y, m = m,
        eval_f      = objfunc, 
        eval_g_ineq = constfunc,
        # beta1 > 0, 0.001 < lambda < 0.1
        lb   = c( 0, -30, -30, 0.001), 
        ub   = c(30,  30,  30, 0.1),
        opts = opts
    )
    
    ns_nloptr_out1 <- c(m_nloptr$solution, 
                        m_nloptr$objective)
    
    #---------------------------------------------
    # NS estimation with 2nd data
    #---------------------------------------------
    y <- y2
    
    x_init <- c(y[16], y[1]-y[16], 
                2*y[6]-y[1]-y[16], 0.0609)
    
    m_nloptr <- nloptr(
        x0 = x_init, y = y, m = m, 
        eval_f      = objfunc, 
        eval_g_ineq = constfunc,
        # beta1 > 0, 0.001 < lambda < 0.1
        lb   = c( 0, -30, -30, 0.001), 
        ub   = c(30,  30,  30, 0.1),
        opts = opts
    )
    
    ns_nloptr_out2 <- c(m_nloptr$solution, 
                        m_nloptr$objective)
 
    
#=======================================================
# 3. Results and Comparisons
#=======================================================
    
    ns_reg_para_rmse1
    ns_nloptr_out1
    
    ns_reg_para_rmse2
    ns_nloptr_out2
    


The following estimation results show that the non-linear optimization results from nloptr are close to the reference estimates (ns_reg_para_rmse1 and ns_reg_para_rmse2 above).

(Figure: Estimation of Nelson-Siegel model using the nloptr R package)


Using these estimated parameters at two dates, fitted yield curves are generated as follows.

(Figure: Fitted Nelson-Siegel yield curves from non-linear least squares estimation with nloptr)
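The fitted curves can be reproduced along these lines. `ns_fit` is a hypothetical helper, and the parameter vectors below are taken from the reference estimates quoted earlier in the script (in a full run you would use the first four entries of ns_nloptr_out1 and ns_nloptr_out2 instead):

```r
# evaluate a fitted Nelson-Siegel curve on a maturity grid
# para = c(beta1, beta2, beta3, lambda); ns_fit is an illustrative helper
ns_fit <- function(m, para) {
  la <- para[4]
  h  <- (1 - exp(-la * m)) / (la * m)
  para[1] + para[2] * h + para[3] * (h - exp(-la * m))
}

para1 <- c(4.26219396, -4.08609206, -4.90893865, 0.02722607)  # curve 1
para2 <- c(4.97628654, -4.75365297, -6.40263059, 0.05046789)  # curve 2

m_grid <- seq(3, 240, by = 3)
plot(m_grid, ns_fit(m_grid, para1), type = "l",
     xlab = "maturity (months)", ylab = "spot rate (%)")
lines(m_grid, ns_fit(m_grid, para2), lty = 2)
```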


Concluding Remarks


This post showed how to use the nloptr R package to solve a non-linear optimization problem with or without equality or inequality constraints. Since non-linear optimization is sensitive to the type and magnitude of the constraints, careful consideration is needed when handling this kind of problem. \(\blacksquare\)

