Preface

Part I: Invited Lectures

1. The Use of Low-Rank Updates in Interior-Point Methods
2. Study on Approximate Proximal Point Algorithms for Monotone Variational Inequalities
3. Some Optimality Conditions for Minimax Problems and Nonlinear Programs
4. On Updating the Inverse of a KKT Matrix
5. Low-Cost Methods for Nonlinear Large-Scale Matrix Equations
6. Extended Conjugate Residual Methods for Solving Nonsymmetric Linear Systems
7. Necessary Conditions and Sufficient Conditions for Non-Decomposable Two-Stage Min-Max Optimizations
8. Some Recent Progress in Unconstrained Nonlinear Optimization
9. Superlinear Convergence of Algorithms for D.C. Minimization
10. Principal Manifolds and Nonlinear Dimensionality Reduction via Tangent Space Alignment

Part II: Contributed Papers

11. Preconditioned Multiple Search Direction Conjugate Gradient Method
12. Existence and Uniqueness of the AHO Search Direction in Primal-Dual Interior-Point Algorithms for Quadratic Semidefinite Programming
13. A Multisplitting and Schwarz Iteration Scheme for Solving Implicit Complementarity Problems
14. New Insights into Penalty Functions
15. Global Convergence Properties of the Conjugate Descent Method with Armijo-Type Line Searches
16. A New Class of Memory Gradient Methods with Inexact Line Searches
17. Convergence Properties of Nonmonotone Spectral Projected Gradient Methods
18. A Subspace Trust Region Method for Large-Scale Unconstrained Optimization
19. A Trust Region Method for Large-Scale Inverse Problems in Atmospheric Image Restoration
20. Number-Theoretic Global Optimization Searching Algorithm Based on Evolution