This paper studies a cellular network assisted by an energy- and spectral-efficient unmanned aerial vehicle (UAV), in which the UAV is deployed to serve mobile users and to offload data traffic from a ground base station (GBS) while flying along a circular route. We consider a visible light communication (VLC)-enabled UAV, where a light-emitting diode (LED) is mounted on a rotary-wing UAV to provide communication to the users. Our aim is to optimize the energy efficiency (EE) and spectral efficiency (SE) of the VLC-enabled UAV simultaneously by jointly optimizing the common throughput of all users, the UAV's trajectory, and its flying speed. To obtain an adaptive EE-SE tradeoff, we adopt a unified metric, called resource efficiency (RE), and formulate an RE optimization problem. The resulting problem is non-convex and therefore difficult to solve. Motivated by the remarkable success of deep reinforcement learning (DRL) in complex control problems, we propose a DRL-based approach to solve this non-convex optimization problem. Simulation results show that the proposed framework converges reliably and yields solutions of promising quality.
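As background for the unified metric mentioned above: a common formulation of resource efficiency in the EE-SE tradeoff literature, which this work may adapt, is a weighted sum of the two efficiencies; the symbols and weight $\omega$ below are assumed notation for illustration, not taken from the paper:
$$\eta_{\mathrm{RE}} = \eta_{\mathrm{EE}} + \omega\, \eta_{\mathrm{SE}},$$
where $\eta_{\mathrm{EE}}$ is typically measured in bits/Joule, $\eta_{\mathrm{SE}}$ in bits/s/Hz, and $\omega \ge 0$ steers the tradeoff: $\omega = 0$ recovers pure EE maximization, while a large $\omega$ emphasizes SE.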