L0 gate
QuantizedNetworks.L0Gate
— Type
struct L0Gate{T, S}
Represents an L0Gate which applies L0 regularization during neural network training.
L0Gate(logα = 10; β = 2/3, dims = :, active = nothing)
Fields
logα::T: Controls the strength of the L0 regularization. (defaults to 10)
β::S: Controls the "temperature" of the sigmoid function. (defaults to 2/3)
dims::Union{Colon, Int}: Specifies the dimensions to which L0 regularization is applied. (defaults to :)
active::RefValue{Union{Bool, Nothing}}: Indicates whether the L0Gate is active, i.e. whether regularization is applied. (defaults to nothing)
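To make the fields and defaults concrete, here is a minimal, self-contained sketch that mirrors the documented struct and constructor (the name `L0GateSketch` is hypothetical; this is not the package's actual code):

```julia
# Illustrative mirror of the documented L0Gate fields and defaults.
struct L0GateSketch{T, S}
    logα::T
    β::S
    dims::Union{Colon, Int}
    active::Base.RefValue{Union{Bool, Nothing}}
end

# Keyword constructor matching L0Gate(logα = 10; β = 2/3, dims = :, active = nothing).
L0GateSketch(logα = 10; β = 2/3, dims = :, active = nothing) =
    L0GateSketch(logα, β, dims, Ref{Union{Bool, Nothing}}(active))

g = L0GateSketch()        # all defaults: logα = 10, β = 2/3, dims = :, active unset
g2 = L0GateSketch(5; dims = 2)   # custom strength, gate applied along dimension 2
```

Wrapping `active` in a `Ref` lets the mode flag be flipped in place even though the struct itself is immutable.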
Functions
QuantizedNetworks.isactive
— Method
isactive(c::L0Gate)
Checks if the L0Gate is active (applies regularization).
Flux.testmode!
— Function
Flux.testmode!(c::L0Gate, mode=true)
Sets the test mode for the L0Gate object. If mode is true, the active field is set to nothing, effectively turning off L0 regularization during testing. If mode is false, the active field is set to true, re-enabling L0 regularization. If mode is :auto, the active field is toggled.
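The three modes above can be sketched as a pure function over the `active` flag. This is an illustrative sketch only, not the package's implementation, and the exact behavior of the `:auto` toggle is an assumption based on the wording above:

```julia
# `active` stands in for the gate's active[] field; returns the new flag value.
function testmode_sketch(active::Union{Bool, Nothing}, mode = true)
    mode === true  && return nothing   # testing: regularization effectively off
    mode === false && return true      # regularization back on
    # :auto, assumed to flip the current flag when one is set
    mode === :auto && return active === nothing ? nothing : !active
    return active
end
```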
Helper functions
QuantizedNetworks._shape
— Method
_shape(s, ::Colon)
Computes the size of an array when applying L0 regularization to all dimensions.
QuantizedNetworks._shape
— Method
_shape(s, dims)
Computes the size of an array when applying L0 regularization to specified dimensions.
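A plausible sketch of these two helpers, modeled on the analogous dropout mask-shape helper in Flux (an assumption, the package's actual logic may differ): the gate's shape matches the array along the selected dimensions and is 1 elsewhere, so the gate broadcasts across the remaining dimensions.

```julia
# Sketch (assumption): gate/mask shape for L0 regularization.
_shape_sketch(s, ::Colon) = size(s)                                     # full shape
_shape_sketch(s, dims) = ntuple(i -> i in dims ? size(s, i) : 1, ndims(s))

x = ones(2, 3, 4)
_shape_sketch(x, :)   # (2, 3, 4)
_shape_sketch(x, 2)   # (1, 3, 1)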
QuantizedNetworks.shift
— Method
shift(x::T, lo::Real = -0.1, hi::Real = 1.1) where {T}
Shifts the input x to a specified range [lo, hi].
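A sketch of the affine map this describes, the "stretch" step of the hard-concrete gate of Louizos et al. (2018); the body below is an assumption consistent with the signature and defaults, not necessarily the package's exact code:

```julia
# Sketch (assumption): map x ∈ [0, 1] linearly onto [lo, hi].
shift(x::T, lo::Real = -0.1, hi::Real = 1.1) where {T} = x * (hi - lo) + lo

shift(0.0)   # ≈ -0.1  (lower edge of the stretched interval)
shift(1.0)   # ≈  1.1  (upper edge)
```

Stretching past [0, 1] and later clamping back is what lets the gate reach exactly 0 or 1 with nonzero probability.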
QuantizedNetworks.l0gate_train
— Method
l0gate_train(x::AbstractArray{T}, logα, β; dims = :) where {T}
Applies L0 regularization during training.
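A self-contained sketch of what a hard-concrete training gate typically computes, following Louizos et al. (2018); this is an assumption, and the package's actual code likely uses _shape to draw noise only along dims: sample uniform noise, pass its logit (shifted by logα and tempered by β) through a sigmoid, stretch, clamp to [0, 1], and scale x.

```julia
σ(x) = 1 / (1 + exp(-x))
shift_sk(x, lo = -0.1, hi = 1.1) = x * (hi - lo) + lo   # stretch step

# Sketch (assumption): stochastic hard-concrete gate applied to x.
function l0gate_train_sketch(x::AbstractArray{T}, logα, β; dims = :) where {T}
    u = rand(T, size(x))   # simplification: the real code likely uses the `dims` mask shape
    s = σ.((log.(u) .- log.(1 .- u) .+ logα) ./ β)
    z = clamp.(shift_sk.(s), 0, 1)   # gate values in [0, 1], exactly 0 or 1 possible
    return x .* z
end

y = l0gate_train_sketch(ones(4, 4), 10.0, 2/3)
```

With a large logα the gate is open almost surely, so most entries of `y` equal the input; driving logα negative pushes gates to exactly 0, which is what produces sparsity.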
QuantizedNetworks.l0gate_test
— Method
l0gate_test(::AbstractArray{T}, logα, β) where {T}
Applies L0 regularization during testing.
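The signature above discards its first argument, which suggests a deterministic gate at test time. A sketch under that assumption, using the noise-free hard-concrete gate clamp(shift(σ(logα)), 0, 1); not necessarily the package's exact code:

```julia
σ(x) = 1 / (1 + exp(-x))
shift_sk(x, lo = -0.1, hi = 1.1) = x * (hi - lo) + lo

# Sketch (assumption): deterministic test-time gate; the array argument is
# unused and β is kept only for signature parity with the training method.
l0gate_test_sketch(::AbstractArray{T}, logα, β) where {T} =
    clamp.(shift_sk.(σ.(logα)), 0, 1)

l0gate_test_sketch(ones(3), 10.0, 2/3)    # 1.0  (large logα: gate fully open)
l0gate_test_sketch(ones(3), -10.0, 2/3)   # 0.0  (very negative logα: gate closed)
```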