Here
The long and short of it is that the reporter is trying to say that a populace of 1.5 million needs 680,000 tons of flour a day.
Using a trick called "math", Martin Kramer was able to determine that the report itself is nonsense, given that it works out to around half a ton of flour a day per person in Gaza.
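The "trick" really is that simple: divide the reported tonnage by the population. A minimal sketch, using only the figures quoted above:

```python
# Sanity check on the reported flour figure.
# Figures from the report: 1.5 million people, 680,000 tons of flour per day.
population = 1_500_000
tons_per_day = 680_000

tons_per_person_per_day = tons_per_day / population
print(tons_per_person_per_day)  # roughly 0.45 tons of flour, per person, per day
```

Any time a "daily need" works out to hundreds of pounds of food per person, the number refutes itself.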
Did I mention that this "statistic" was used to claim that Israel was restricting food supplies to Gaza by 99%?
Read the blog post itself to see how egregious this error is. The figure has been repeated multiple times, and it contradicts the author's own estimates from previous years.
On many hot-button topics, the numbers journalists quote should always be suspect.
One reason is that they prefer the "greater narrative." That's a fancy phrase for "they had the thrust of the story already determined and needed to fiddle with the numbers to make it fit."
Also, it's just prudent to think critically about such things. Questioning authority and the news is a good thing; it helps a person become more informed and better able to tell what's actually going on.
It's important to remember scale. The tonnage in that previous article was approaching the same order of magnitude as the number of people it was supposed to feed. That's the first test that something is fishy.
Often a simple calculation will uncover that the article is BS.
Sometimes it takes a little bit of research.
Take a recent NY Times article that tried to paint veterans returning from Iraq as dangerous savages. The story was full of anecdotal evidence and tales of tragedy.
But when it came time to say exactly how many murders were committed, the Times padded the tally with incidents of manslaughter and automotive accidents (another sign that something fishy was up).
Do the math: the 121 alleged instances of homicide identified by the Times, out of a population of 700,000, works out to a rate of 17 per 100,000--quite a bit lower than the overall national rate of around 27.
But wait! The national rate of 27 homicides per 100,000 is an annual rate, whereas the Times' 121 alleged crimes were committed over a period of six years. Which means that, as far as the Times' research shows, the rate of homicides committed by military personnel who have returned from Iraq or Afghanistan is only a fraction of the homicide rate for other Americans aged 18 to 24. Somehow, the Times managed to publish nine pages of anecdotes about the violence wreaked by returning servicemen without ever mentioning this salient fact.
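Hinderaker's correction can be reproduced in a few lines. A minimal sketch, using only the numbers quoted in the passage above:

```python
# Re-running the Times' own numbers, as in the quoted passage.
homicides = 121        # alleged instances identified by the Times
population = 700_000   # returning military personnel
years = 6              # the period the incidents actually span

# Rate over the whole six-year period, per 100,000 people.
rate_per_100k_total = homicides / population * 100_000

# The national figure (~27 per 100,000 for ages 18-24) is ANNUAL,
# so the fair comparison divides by the number of years.
rate_per_100k_annual = rate_per_100k_total / years

print(round(rate_per_100k_total, 1))   # ~17.3 over six years
print(round(rate_per_100k_annual, 1))  # ~2.9 per year, vs. ~27 nationally
```

The apples-to-apples annual rate is a small fraction of the national rate for the same age group, which is the opposite of the article's thesis.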
Full post here
What John Hinderaker did, with nothing more than some national statistics and a bit of math, was show that the entire point of the article was a flat-out, and clumsy, lie.
John goes into more detail, but it's clear that someone with an agenda posted an article that pointed to one direction, facts be damned.
Also, notice how the NY Times's editors and fact checkers let an article pass that was nothing but fear-mongering nonsense.
This is why you as a news reader have to be suspicious and do these calculations and tests yourself. Sad as it is to say, the news cannot be trusted to get basic facts right.
For another of many examples: what's wrong with the pictures in this report?
This kind of clumsy ineptness shows how little the media thinks of its customers.
Just ask Dan Rather, a man who still rants about a series of 70's-era documents that just happen to be an exact match for the default settings of Microsoft Word. You'd think the forgers in that case would have at least bothered to find an old typewriter...