Blocks

DenseBlock

QuantizedNetworks.DenseBlock — Type

The DenseBlock type defines a custom block structure for neural networks. It wraps a chain of layers consisting of a quantized dense layer, batch normalization, and output quantization, making it suitable for building dense blocks in deep quantized networks. The struct is a functor, so an instance can be applied directly to input data.

Constructor

The constructor creates a DenseBlock object containing a chain of layers. It takes an input-output pair and several optional arguments:

  • (in, out) is a pair in => out specifying the input and output dimensions.
  • σ is an activation function (default is the identity function).
  • weight_quantizer sets the quantizer for weights (default is a ternary quantizer).
  • output_quantizer sets the quantizer for the layer's output (default is a sign quantizer).
  • batchnorm determines whether batch normalization is applied (default is true).

It constructs a chain of layers including a quantized dense layer, optional batch normalization, and an output quantizer.
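
For illustration, the sketch below constructs a block with a non-default activation and without batch normalization. It assumes that σ is the second positional argument and that the keyword names match the list above; it is a sketch rather than a doctested example.

using QuantizedNetworks

# Hypothetical constructor call: σ passed positionally, keywords as listed above.
db = DenseBlock(2 => 4, tanh; output_quantizer = Sign(), batchnorm = false)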

source

Standard Dense Layer

Flux.Dense — Method

This method extends Flux.Dense from the Flux package. It converts a DenseBlock into a standard Flux.Dense layer with quantized weights, an adjusted bias, and the specified output quantization.
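
For example, assuming the conversion is performed by calling the Flux.Dense constructor on a block, usage might look like the following sketch:

using Flux, QuantizedNetworks

db = DenseBlock(2 => 2)    # block as in the Examples section below
dense = Flux.Dense(db)     # standard Flux.Dense layer with quantized weights and adjusted bias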

Extractor functions retrieve specific components from a DenseBlock:

  • extract_dense returns the quantized dense layer.
  • extract_batchnorm returns the optional batch normalization layer.
  • extract_quantizer returns the output quantization function.
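
A minimal sketch, assuming each extractor takes the block as its only argument:

using QuantizedNetworks

db = DenseBlock(2 => 2)
extract_dense(db)      # the QuantizedDense layer
extract_batchnorm(db)  # the BatchNorm layer (present when batchnorm = true)
extract_quantizer(db)  # the output quantization function, e.g. Sign(STE(2))
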
source

Examples

julia> using Random, QuantizedNetworks; Random.seed!(3);

julia> db = DenseBlock(2=>2)
DenseBlock(Chain(QuantizedDense(2 => 2; bias=false, quantizer=Ternary(0.05, STE(2))), BatchNorm(2), Sign(STE(2))))

julia> x = rand(Float32, 2, 4)
2×4 Matrix{Float32}:
 0.940675  0.100403   0.789168  0.582228
 0.999979  0.0921143  0.698426  0.496285

julia> db(x)
2×4 Matrix{Float32}:
 -1.0   1.0   1.0   1.0
  1.0  -1.0  -1.0  -1.0

FeatureBlock

QuantizedNetworks.FeatureBlock — Type

A custom struct representing a feature block. It holds a collection of layers, consisting of a feature quantization layer and an optional extra quantizer for missing values, where each layer performs a specific operation on the input data. The struct is also a functor, so an instance can be called as a function.

Constructor

You specify the dimensionality of the input features dim and the number of quantization levels k. Additionally, you can choose to include an extra quantizer layer for handling missing data by setting output_missing to true. The quantizer argument lets you specify the quantization function to be used (with a default of Sign()), and any additional keyword arguments are passed to the FeatureQuantizer constructor.
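
For instance, a block with an extra quantizer for missing values might be constructed as in the following sketch, assuming the argument names described above:

using QuantizedNetworks

# Hypothetical call: dim = 2 input features, k = 2 quantization levels,
# plus an extra quantizer layer for missing-value handling.
fb = FeatureBlock(2, 2; output_missing = true, quantizer = Sign())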

source

Examples

julia> using Random, QuantizedNetworks; Random.seed!(3);

julia> fb = FeatureBlock(2, 2)
FeatureBlock(Parallel(vcat, FeatureQuantizer(2 => 4; quantizer=Sign(STE(2)))))

julia> x = rand(Float32, 2, 1)
2×1 Matrix{Float32}:
 0.8521847
 0.7965402

julia> fb(x)
4×1 Matrix{Float32}:
  1.0
  1.0
  1.0
 -1.0