A 7B LLM carries a huge amount of knowledge about the world. You don't need that just to reword sentences. You can use a translation model with English as both input and output, or another text2text model such as one for textual style transfer. A purpose-built model for rewording into a fixed style different from the input could easily be 10M parameters or fewer (that's already big enough for translating between two languages, after all), and you can readily find models in the 100M range for text style transfer.
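As a sketch of what that looks like in practice, here's how you might call a mid-sized text2text model through Hugging Face `transformers`. The model name used here is just one example of a paraphrase-tuned T5 checkpoint (~220M parameters), not a specific recommendation; any text2text model with a similar interface would slot in the same way.

```python
# Sketch: rewording with a small seq2seq model instead of a 7B LLM.

def build_prompt(text: str) -> str:
    # Many paraphrase/style-transfer checkpoints expect a task prefix
    # prepended to the input text.
    return f"paraphrase: {text}"

def reword(text: str,
           model_name: str = "humarin/chatgpt_paraphraser_on_T5_base") -> str:
    # Imported lazily so the prompt helper above is usable without the
    # heavy dependency installed.
    from transformers import pipeline

    rewriter = pipeline("text2text-generation", model=model_name)
    return rewriter(build_prompt(text), max_new_tokens=64)[0]["generated_text"]

if __name__ == "__main__":
    print(reword("The meeting has been moved to Thursday afternoon."))
```

The first call downloads the checkpoint; after that it runs comfortably on CPU, which is kind of the whole point compared to serving a 7B model.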