DPI and Line Screen -- Conventional Wisdom vs. Reality

Conventional wisdom says to provide twice the resolution of your line screen. I seem to remember reading an article debunking that myth and promoting a resolution equal to the line screen instead, but I can’t put my hands on it.

Can anyone point me to an authoritative source I can quote?

Many thanks!

I could point you at the source, but I won’t. You can output to a high-end imagesetter/CTP at far less than twice the line screen. But let’s look at some of the problems. You scan a file at, say, 1.25 to 1.5 times the line screen, depending on the device. Now you go through a long design process. Somewhere in that process some type gets deleted and the image gets enlarged, which is a common occurrence (problem 1).

Now the file gets to the printer, and since the job is going onto fairly nice paper, they decide to output at a 175 line screen instead of the planned 150 (problem 2).

The final piece is finally delivered from the printer, and you find that the image you scanned at one and a half times the line screen is pixelated. Now, how many jobs have to go through just a little bit faster to make up for this spoiled work? You probably won’t make it up in your lifetime.
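
To put rough numbers on those two problems, here is a quick back-of-the-envelope sketch. The 115% enlargement and the 150-to-175 screen bump are illustrative values picked to match the story above, not figures from any particular job:

```python
# Quality factor = image resolution at final size (ppi) / line screen (lpi).
# Conventional wisdom wants 2.0; the scan below starts at a leaner 1.5.

def effective_ppi(scan_ppi: float, enlargement: float) -> float:
    """Resolution left over after the image is scaled up in layout."""
    return scan_ppi / enlargement

planned_lpi = 150
scan_ppi = 1.5 * planned_lpi                   # 225 ppi for the planned screen

# Problem 1: some type gets deleted and the image is enlarged to 115%
# (assumed for illustration).
ppi_in_layout = effective_ppi(scan_ppi, 1.15)  # ~196 ppi

# Problem 2: the printer bumps the job to a 175 line screen for the nicer paper.
actual_lpi = 175
quality_factor = ppi_in_layout / actual_lpi    # ~1.12

print(f"effective resolution: {ppi_in_layout:.0f} ppi")
print(f"quality factor at {actual_lpi} lpi: {quality_factor:.2f}")
# Anywhere near 1.0 is where the pixelation described above shows up;
# scanning at 2x (300 ppi) would have left a ~1.49 factor even after both hits.
```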

And the reason you did this? To save some hard drive room. Hard drives are very cheap nowadays. To help the jobs go through the RIP faster? RIPs are very fast these days; the difference between a 25 MB file and a 50 MB file won’t be noticed.

So now that you are out looking for a new job because you spoiled that last one on press, what was it you were trying to save?

Mike

If you are just going to a desktop printer and never send files out, or if you use the scans for photographic purposes, do a Google search for Brian P. Lawler.

He has a paper on this.

I have seen a report, which I believe was done by some RIT students, showing that 1.5 times the line screen was acceptable, but not 1x. I can’t find the reference.

Bret Hesler
L.P. Thebault Company


Thanks for the replies.

Mikec makes excellent points that I would not argue with, and that I have always incorporated into our production workflows (e.g., optimizing for the press as the last step).

My purpose for asking, however, is more academic in nature. We use an Imacon 949 scanner to directly capture images from histologic specimen slides, and it does a wonderful job. The math just works out nicely that at our maximum capture resolution we get 40 times life size (a nice, scientific-sounding round number), only it’s 200 dpi at final size (about 1.5 times a 133 line screen, or 1.3 times a 150 line screen).

200 dpi reproduces nicely on our photo-quality printers, but everyone has been indoctrinated into the 300 dpi mantra for offset print.
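
For what it’s worth, those quality factors fall straight out of the division. A trivial sketch, using only the 200 ppi, 133 lpi, and 150 lpi figures already mentioned:

```python
# Quality factor = final-size image resolution (ppi) / halftone screen (lpi).
final_ppi = 200  # Imacon capture at 40x life size, delivered at final size

for lpi in (133, 150):
    print(f"{final_ppi} ppi at {lpi} lpi -> quality factor {final_ppi / lpi:.2f}")
# 200 ppi at 133 lpi -> quality factor 1.50
# 200 ppi at 150 lpi -> quality factor 1.33

# The 300 dpi mantra is just the 2x rule applied to a 150 line screen:
print(f"2 x 150 lpi = {2 * 150} ppi")
```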

In an article (peterkrogh.com/Pages/digital … gital.html), Richard Anderson argues that digital capture contains more detail than scanned film (since you’re not dealing with film grain) and that, therefore, the rule of thumb no longer applies.

The article seemed to be more opinion than hard evidence, so I was hoping for something a bit more definitive on the subject.

The follow-on question is how AM/FM/hybrid screening affects the equation.

Again, my thanks for your assistance.