In this paper, we present algorithms for unconstrained convex optimization problems. The development and analysis of these methods are carried out in a Banach space setting. We begin by introducing a general framework for achieving global convergence without the Lipschitz conditions on the gradient that are usual in the current literature. This paper extends to Banach spaces earlier analyses of the steepest descent method for convex optimization, most of which were carried out in less general spaces.
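As a concrete point of reference for the steepest descent method discussed above, the following is a minimal finite-dimensional sketch using an Armijo backtracking line search, one standard way to obtain decrease without assuming a global Lipschitz constant for the gradient. It is only an illustration in Euclidean space, not the paper's method; the function names, step parameters, and the quadratic test problem are all choices made here for the example.

```python
import numpy as np

def steepest_descent(f, grad, x0, tol=1e-8, max_iter=500):
    """Steepest descent with Armijo backtracking line search.

    Illustrative sketch only: backtracking adapts the step size
    at each iterate, so no global Lipschitz constant for the
    gradient is required in advance.
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:
            break
        t, beta, sigma = 1.0, 0.5, 1e-4  # initial step, shrink factor, Armijo parameter
        # Armijo condition: require sufficient decrease along -g
        while f(x - t * g) > f(x) - sigma * t * np.dot(g, g):
            t *= beta
        x = x - t * g
    return x

# Example: minimize the convex quadratic f(x) = 0.5 x^T A x - b^T x,
# whose unique minimizer solves A x = b.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
f = lambda x: 0.5 * x @ A @ x - b @ x
grad = lambda x: A @ x - b
x_star = steepest_descent(f, grad, np.zeros(2))
```

The backtracking loop is what lets this scheme run on functions whose gradient Lipschitz constant is unknown or unbounded on the whole space, which is the situation the framework above is designed to handle.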