Optimization Problem 3.0.1


Problem

Given a point $x_k \in \mathbb{R}^n$, a function $f: \mathbb{R}^n \to \mathbb{R}$, an $n \times n$ symmetric matrix $B_k$, and a trust region radius $\Delta_k$, state the trust region subproblem in terms of $f_k$, $\nabla f_k$, and $B_k$. What choice of $B_k$ results in a method equivalent to steepest descent, $x_{k+1} = x_k - \alpha_k \nabla f_k$?

Solution:

  1. State the quadratic model and the trust region subproblem in the terms defined above:
$$\min_{p \in \mathbb{R}^n} \; m_k(p) = f_k + \nabla f_k^T p + \frac{1}{2} p^T B_k p \quad \text{subject to } \|p\| \le \Delta_k$$

  2. The method reduces to steepest descent when $B_k$ is the zero matrix: the quadratic term vanishes, the model becomes linear, and its constrained minimizer lies on the trust region boundary in the direction $-\nabla f_k$, i.e. $p_k = -\frac{\Delta_k}{\|\nabla f_k\|}\nabla f_k$, which is a steepest descent step with $\alpha_k = \Delta_k / \|\nabla f_k\|$ (see the numerical sketch below).
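As a quick numerical check, here is a minimal NumPy sketch. The helper names and the test gradient are illustrative assumptions, not part of the original problem; it evaluates the model $m_k(p)$ and confirms that with $B_k = 0$ the subproblem's minimizer is a scaled steepest descent step.

```python
import numpy as np

def model(p, f_k, grad_k, B_k):
    """Quadratic model m_k(p) = f_k + grad_k^T p + 0.5 p^T B_k p."""
    return f_k + grad_k @ p + 0.5 * p @ B_k @ p

def solve_subproblem_zero_B(grad_k, delta_k):
    """Minimizer of the model when B_k = 0: the linear model decreases
    fastest along -grad_k, so the constrained minimizer sits on the
    trust-region boundary in that direction."""
    return -(delta_k / np.linalg.norm(grad_k)) * grad_k

# Illustrative data (assumed for this sketch)
f_k = 2.0
grad_k = np.array([3.0, -4.0])   # gradient at x_k, norm = 5
B_zero = np.zeros((2, 2))        # B_k = 0
delta_k = 1.0                    # trust-region radius

p_star = solve_subproblem_zero_B(grad_k, delta_k)
print(p_star)                              # [-0.6  0.8], points along -grad_k
print(model(p_star, f_k, grad_k, B_zero))  # f_k - delta_k * ||grad_k|| = -3.0

# Same step as steepest descent with alpha_k = delta_k / ||grad_k||
alpha_k = delta_k / np.linalg.norm(grad_k)
print(np.allclose(p_star, -alpha_k * grad_k))  # True
```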