I’m trying to implement projected gradient descent with projected line searches as described here. Unfortunately, when I pass a piecewise-defined objective function and its gradient (built with the clamp function) to Optim.jl's optimize, I get the error below.
I realise this might be a big question and I haven’t fully thought it through, but I’m not sure where to start and would be very grateful for any broad pointers in the right direction.
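For concreteness, here is a stripped-down sketch of the kind of setup I’m running. The objective, gradient, and starting point are placeholders rather than my real ones; the line-search settings match the GradientDescent type parameters shown in the trace:

```julia
using Optim, LineSearches

# Placeholder piecewise objective. clamp is called on the whole vector here,
# which dispatches to the scalar clamp(x, lo, hi) in Base.Math and (judging
# by the stack trace) fails when it compares an Int against a Vector.
f(x) = sum(clamp(x, 0, 1) .^ 2)

# Matching in-place gradient (also a placeholder); zero wherever the clamp
# saturates.
g!(gx, x) = (gx .= 2 .* clamp.(x, 0, 1) .* (0 .<= x .<= 1))

x0 = [0.5, 1.5]  # placeholder starting point

# Static line search seeded with the previous step, as in the projected
# line-search setup I'm following.
method = GradientDescent(alphaguess = LineSearches.InitialPrevious(),
                         linesearch = LineSearches.Static())
optimize(f, g!, x0, method)
```

Running this produces: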
MethodError: no method matching isless(::Int64, ::Vector{Float64})
Closest candidates are:
  isless(!Matched::Missing, ::Any)
   @ Base missing.jl:87
  isless(::Any, !Matched::Missing)
   @ Base missing.jl:88
  isless(::Integer, !Matched::ForwardDiff.Dual{Ty}) where Ty
   @ ForwardDiff C:\Users\Owner\.julia\packages\ForwardDiff\PcZ48\src\dual.jl:145
  ...
Stacktrace:
  [1] <(x::Int64, y::Vector{Float64})
    @ Base .\operators.jl:352
  [2] >(x::Vector{Float64}, y::Int64)
    @ Base .\operators.jl:378
  [3] clamp(x::Vector{Float64}, lo::Int64, hi::Int64)
    @ Base.Math .\math.jl:98
  [4] f(x::Vector{Float64})
    @ Main c:\Users\Owner\Desktop\MastersProject\julia_notebooks\Testing\test_projected_derivatives.ipynb:2
  [5] (::NLSolversBase.var"#fg!#8"{typeof(f), typeof(g!)})(gx::Vector{Float64}, x::Vector{Float64})
    @ NLSolversBase C:\Users\Owner\.julia\packages\NLSolversBase\kavn7\src\objective_types\abstract.jl:14
  [6] value_gradient!!(obj::OnceDifferentiable{Float64, Vector{Float64}, Vector{Float64}}, x::Vector{Float64})
    @ NLSolversBase C:\Users\Owner\.julia\packages\NLSolversBase\kavn7\src\interface.jl:82
  [7] initial_state(method::GradientDescent{InitialPrevious{Float64}, Static, Nothing, Optim.var"#14#16"}, options::Optim.Options{Float64, Nothing}, d::OnceDifferentiable{Float64, Vector{Float64}, Vector{Float64}}, initial_x::Vector{Float64})
    @ Optim C:\Users\Owner\.julia\packages\Optim\KIRzg\src\multivariate\solvers\first_order\gradient_descent.jl:59
  [8] optimize(d::OnceDifferentiable{Float64, Vector{Float64}, Vector{Float64}}, initial_x::Vector{Float64}, method::GradientDescent{InitialPrevious{Float64}, Static, Nothing, Optim.var"#14#16"}, options::Optim.Options{Float64, Nothing}, state::Optim.GradientDescentState{Vector{Float64}, Float64})
    @ Optim C:\Users\Owner\.julia\packages\Optim\KIRzg\src\multivariate\optimize\optimize.jl:36 [inlined]
  [9] optimize(f::Function, g::Function, initial_x::Vector{Float64}, method::GradientDescent{InitialPrevious{Float64}, Static, Nothing, Optim.var"#14#16"}, options::Optim.Options{Float64, Nothing}; inplace::Bool, autodiff::Symbol)
    @ Optim C:\Users\Owner\.julia\packages\Optim\KIRzg\src\multivariate\optimize\interface.jl:156
 [10] optimize(f::Function, g::Function, initial_x::Vector{Float64}, method::GradientDescent{InitialPrevious{Float64}, Static, Nothing, Optim.var"#14#16"}, options::Optim.Options{Float64, Nothing})
    @ Optim C:\Users\Owner\.julia\packages\Optim\KIRzg\src\multivariate\optimize\interface.jl:151
 [11] top-level scope
    @ c:\Users\Owner\Desktop\MastersProject\julia_notebooks\Testing\test_projected_derivatives.ipynb:1