TensorFlow: Weighted Average Layer with Fixed Weights

This post demonstrates how to implement a custom TensorFlow layer that calculates the weighted average of the previous layer's output. This simple exercise serves as a practical guide to creating a custom layer in TensorFlow.


Weighted Average Layer in TensorFlow




Why not use a simple weighted average calculation? The reason is that the inputs to the weighted average are outputs that have passed through multiple deep learning layers, such as dense layers, derived from the input data (country yield curves). Hence the need for a custom layer.


Python Jupyter Notebook Code


The following TensorFlow code declares the weighted average layer and applies it to three time series with trigonometric frequencies using specific fixed weights.

import numpy as np
import tensorflow as tf
import matplotlib.pyplot as plt
 
# Define the user defined custom layer
class WeightedAverageLayer(tf.keras.layers.Layer):
    
    def __init__(self, custom_weights, **kwargs):
        super(WeightedAverageLayer, self).__init__(**kwargs)
        self.custom_weights = tf.constant(custom_weights, 
                                          dtype=tf.float32)
    def call(self, inputs):
        weighted_sum = tf.reduce_sum(inputs*self.custom_weights, 
                                     axis=-1, keepdims=True)
        return weighted_sum
 
# Generating time values
t = np.linspace(0, 10, num=100, endpoint=False, dtype=np.float32)
t = tf.constant(t, dtype=tf.float32)
 
# Defining different trigonometric time series
series0 = 0.3*tf.sin(t*0.5*np.pi) + t*0.05
series1 = 0.5*tf.sin(t*1*np.pi)
series2 = 0.2*tf.sin(t*5*np.pi) + 0.1
 
# Concatenating 3 time series to create the tensor
m_series = tf.stack([series0, series1, series2], axis=1)
 
# Define the fixed weights
wgt = [0.5, 0.3, 0.2]
 
# Instantiate an object of the WeightedAverageLayer class
weighted_avg_layer = WeightedAverageLayer(custom_weights=wgt)
 
# Pass 'm_series' through the WeightedAverageLayer
output = weighted_avg_layer(m_series)
 
# Plotting the time series data and WeightedAverageLayer output
plt.figure(figsize=(7, 4))
plt.plot(m_series[:,0],label='Feature 1: '+str(wgt[0]*100)+'%')
plt.plot(m_series[:,1],label='Feature 2: '+str(wgt[1]*100)+'%')
plt.plot(m_series[:,2],label='Feature 3: '+str(wgt[2]*100)+'%')
plt.plot(output[:,0], label='Weighted Average', color="black", lw=4)
plt.title('Time Series Data and Weighted Average')
plt.legend(loc='lower right')
plt.tight_layout()
plt.show()
 


Two aspects of the WeightedAverageLayer deserve explanation for the sake of clarity.

  1. self.custom_weights: This is an attribute of the WeightedAverageLayer class. It holds the custom weights that are used to compute the weighted sum. These weights are applied element-wise to the inputs.
  2. tf.reduce_sum(..., axis=-1, keepdims=True): axis=-1 indicates that the summation will be performed along the last axis of the tensor, effectively summing the products obtained from the element-wise multiplication. The keepdims=True argument ensures that the resulting tensor retains the reduced axis with length 1, so the output has shape (time_steps, 1) rather than (time_steps,).

In particular, when you perform tf.reduce_sum along axis=-1 (the last axis which represents the num_features), it effectively reduces or collapses the tensor along the num_features dimension. This reduction operation sums the values across the num_features for each time step, leaving only the time steps as the remaining dimension in the resulting tensor. Therefore, after the reduction, you'll have a tensor of shape (time_steps, 1) where each value corresponds to the sum across the num_features dimension for each time step.
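To make this shape behavior concrete, here is a minimal sketch with toy values (the small tensor and the 0.5/0.3/0.2 weights below are illustrative, not data from the post):

```python
import tensorflow as tf

# Toy tensor: 4 time steps, 3 features
x = tf.constant([[1.0, 2.0, 3.0],
                 [4.0, 5.0, 6.0],
                 [7.0, 8.0, 9.0],
                 [1.0, 1.0, 1.0]])
w = tf.constant([0.5, 0.3, 0.2])

# Element-wise multiplication broadcasts w across each row;
# reduce_sum then collapses the feature axis (the last axis)
out = tf.reduce_sum(x * w, axis=-1, keepdims=True)

print(out.shape)      # (4, 1): one weighted average per time step
print(out[0, 0])      # first row: 0.5*1 + 0.3*2 + 0.2*3 = 1.7
```

Without keepdims=True the result would instead have shape (4,), which can complicate concatenation or further layer operations downstream.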


This exercise centers on understanding how to create custom layers through a simple example. Building on this method, we can later advance to implementing more practical filters.
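As motivated earlier, the layer is meant to sit after dense layers rather than operate on raw inputs. The following is a hypothetical sketch of that setup; the input size (10), the Dense layer widths, and the 0.5/0.3/0.2 weights are all illustrative assumptions, not a specification from the post:

```python
import tensorflow as tf

# Same fixed-weight layer as defined in the post
class WeightedAverageLayer(tf.keras.layers.Layer):
    def __init__(self, custom_weights, **kwargs):
        super().__init__(**kwargs)
        self.custom_weights = tf.constant(custom_weights, dtype=tf.float32)

    def call(self, inputs):
        return tf.reduce_sum(inputs * self.custom_weights,
                             axis=-1, keepdims=True)

# Illustrative architecture: raw features (e.g. yield-curve points)
# -> Dense layers producing 3 intermediate outputs
# -> fixed weighted average of those outputs
inputs = tf.keras.Input(shape=(10,))
h = tf.keras.layers.Dense(16, activation='relu')(inputs)
h = tf.keras.layers.Dense(3)(h)
outputs = WeightedAverageLayer([0.5, 0.3, 0.2])(h)

model = tf.keras.Model(inputs, outputs)
print(model.output_shape)  # (None, 1)
```

Here the weighted average is applied to learned intermediate features rather than to the raw series, which is exactly why a plain NumPy average would not fit inside the computation graph.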

