There has been ongoing discussion here regarding the image quality degradation associated with the JPEG file format. A general truism that has emerged in defense of JPEGs is that "you can't really see any difference between a JPEG and any other file format, so as long as you limit the number of edit/save cycles, JPEGs are fine."
I've always disagreed with that (although perhaps not openly), being a "TIFF person" myself who uses JPEGs only as a final step in editing, when outputting a file for use on the Internet or for printing on anything other than my own color laser printer.
To illustrate my point, I've processed a file that was edited as a TIFF (converted from RAW) and saved once as a final-output JPEG. I then opened that JPEG, converted it back into a 16-bit TIFF, and applied a variety of editing processes to accentuate the blocks of pixels produced by the JPEG compression process.
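For anyone who wants to try something similar without an image editor, here is a minimal sketch of one way to make those compression blocks visible: round-trip an image through JPEG in memory, then amplify the difference against the original. This isn't my actual editing workflow (I did mine by hand in a photo editor), just an illustration of the principle; it assumes Pillow and NumPy are installed, and "photo.tif" is a placeholder filename.

```python
import numpy as np
from io import BytesIO
from PIL import Image

# Load the source image (e.g., a TIFF edited from RAW).
# int16 so the subtraction below can go negative without wrapping.
original = np.asarray(Image.open("photo.tif").convert("RGB"), dtype=np.int16)

# Round-trip through JPEG compression entirely in memory.
buf = BytesIO()
Image.fromarray(original.astype(np.uint8)).save(buf, format="JPEG", quality=90)
buf.seek(0)
roundtrip = np.asarray(Image.open(buf).convert("RGB"), dtype=np.int16)

# The per-pixel difference is invisible at normal viewing levels,
# but multiplying it up makes the 8x8 DCT block structure of the
# JPEG compression plainly visible.
diff = np.abs(original - roundtrip)
amplified = np.clip(diff * 20, 0, 255).astype(np.uint8)

Image.fromarray(amplified).save("artifacts.png")
```

Open artifacts.png and the blocky grid pattern jumps right out, even from a quality-90 save that looks flawless to the eye.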
This isn't a scientific experiment, but the newly processed file (saved in JPEG format) is shown below alongside the original JPEG file from which it was derived:
My contention remains: as with the safety protocols that govern cleaning equipment of micro-organisms and other contaminants in the food and drug industries, just because you can't see something doesn't mean it isn't there.
Magnify to 100% and you will see what I mean.