A limited-memory q-BFGS (Broyden–Fletcher–Goldfarb–Shanno) method is presented for solving unconstrained optimization problems. It is derived from a modified BFGS-type update based on the q-derivative (quantum derivative). The use of Jackson's derivative provides an effective mechanism for escaping from local minima. The method is complemented by a q-gradient scheme that generates the parameter q used in computing the step length, so that the search gradually shifts from global exploration at the beginning to an almost local search at the end. Furthermore, global convergence is established under the Armijo-Wolfe conditions even when the objective function is not convex. Numerical experiments show that the proposed method is potentially efficient.
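For reference, the q-derivative mentioned above is Jackson's q-derivative; its standard one-variable form (the textbook definition, not necessarily the paper's exact multivariate formulation) is
\[
D_q f(x) \;=\; \frac{f(qx) - f(x)}{(q-1)\,x}, \qquad x \neq 0,\; q \neq 1,
\]
which recovers the classical derivative $f'(x)$ in the limit $q \to 1$. This limiting behavior is what allows a q-gradient-based search to act globally when $q$ is far from $1$ and to behave like a classical gradient method as $q$ approaches $1$.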