South African Skeptics
Topic: Digitisation, interpolation and information  (Read 2773 times)
bluegray (Administrator)
« on: September 19, 2009, 19:34:42 PM »

I agree with Peter: the information gained by interpolation might not be zero, but it's not very useful either. You will have more values and pixels, but not more detail. There are various interpolation methods, of course, some more useful than others.
The issue isn't usefulness. If you think so, please define "usefulness" in relation to information. The issue is the amount of information regardless of one's tastes. If you add an algorithm to interpolate between samples, you have added information whether it's useful or not.

ETA: Thanks Petey, you're a real prince.
Yes, that is true - my use of the word 'usefulness' is maybe not technically right. What I mean is that the extra values, while obviously there, don't tell us much more about the original source. They are values derived from information that was already there.

If we look at a bitmap picture of text - let's say 100x100 px - and resize that image to twice the width and height using linear interpolation, you will have a bitmap that is 200x200 px. That is four times the number of values.
The text will not be any clearer though, just bigger and blurred.
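[Editor's note] The resizing example above can be sketched in Python with NumPy. This is a minimal bilinear upscaler, not any particular library's implementation; random values stand in for the 100x100 text bitmap.

```python
import numpy as np

def upscale_bilinear(img, factor=2):
    """Upscale a 2-D grayscale image by bilinear interpolation."""
    h, w = img.shape
    # Map each output pixel back to fractional source coordinates.
    ys = np.linspace(0, h - 1, h * factor)
    xs = np.linspace(0, w - 1, w * factor)
    y0 = np.floor(ys).astype(int)
    x0 = np.floor(xs).astype(int)
    y1 = np.minimum(y0 + 1, h - 1)
    x1 = np.minimum(x0 + 1, w - 1)
    wy = (ys - y0)[:, None]    # vertical blend weights
    wx = (xs - x0)[None, :]    # horizontal blend weights
    # Each output pixel is a weighted average of its 4 source neighbours.
    top = img[np.ix_(y0, x0)] * (1 - wx) + img[np.ix_(y0, x1)] * wx
    bot = img[np.ix_(y1, x0)] * (1 - wx) + img[np.ix_(y1, x1)] * wx
    return top * (1 - wy) + bot * wy

img = np.random.rand(100, 100)     # stand-in for the 100x100 text bitmap
big = upscale_bilinear(img, 2)
print(big.shape)                   # (200, 200)
```

The output has four times as many values, but every new pixel is just a weighted average of existing ones, which is why the text gets bigger and blurrier rather than sharper.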
Irreverend
« Reply #1 on: September 20, 2009, 13:55:23 PM »

Interpolation can take many forms, including such things as edge detection, FFTs, assorted filters, anti-aliasing, etc. If you run one or more of them on a given image, you will generally add info, because in order to recreate the processed image from the starting one, you have to know a certain minimum of detail about the algorithm(s) used. In other words, the new image takes more detail to describe fully or produce. That's the point.

Please read carefully the Wikipedia link to "Information theory" that I posted earlier. The important point to note is that "information" has a precise scientific definition.

Also, that doesn't mean that you can't reduce info. You can do grayscale reductions or color separations that effectively reduce info.

ETA: To clarify the last point, the info added by whatever algorithm is applied is more than offset by the info that is removed by it, hence a net info subtraction.
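[Editor's note] The precise definition alluded to above is Shannon's: the information content of a message depends only on the probabilities of its symbols, not on what they mean. A minimal sketch:

```python
import math
from collections import Counter

def shannon_entropy(data: bytes) -> float:
    """Shannon entropy in bits per symbol: H = -sum(p_i * log2(p_i))."""
    counts = Counter(data)
    n = len(data)
    return sum(-(c / n) * math.log2(c / n) for c in counts.values())

print(shannon_entropy(b"aaaa"))    # 0.0  (one symbol, no uncertainty)
print(shannon_entropy(b"abcd"))    # 2.0  (four equally likely symbols)
```

The same byte string scores the same entropy whether it comes from a photograph, a sound file, or program code, which is the sense in which information theory ignores meaning.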
Peter Grant
« Reply #2 on: September 20, 2009, 18:38:03 PM »

Apologies are nice, but I prefer explanations. Been reading Information Theory again and can now see how the work done in running the algorithms would actually increase the complexity of the data and thus the amount of digital information as well. This, however, still leaves the question of how much this new information has in common with the original analogue version. Sure the new file contains more information, strictly speaking, but what does this new information have to do with the original?
bluegray (Administrator)
« Reply #3 on: September 20, 2009, 20:14:30 PM »

Peter, I think we both misread Irreverend's original argument - we are actually talking about different cases. Will clarify later when I have more time. Embarrassed
Peter Grant
« Reply #4 on: September 20, 2009, 21:26:39 PM »

OK, whenever you get a chance.  Smiley

For the record, here is where this discussion actually starts:

Irreverend on September 16, 2009, 20:54:05 PM
Mefiante
« Reply #5 on: September 21, 2009, 15:06:40 PM »

This, however, still leaves the question of how much this new information has in common with the original analogue version. Sure the new file contains more information, strictly speaking, but what does this new information have to do with the original?
Not to pre-empt Irreverend’s answer, but your question actually relates to the meaning of the information, not the information itself, so here are a few relevant points:

  • Information theory is not concerned with meaning.  How could it when exactly the same string or substring of symbols can occur in a digital photograph, a music file, a data file, instructions in a program, and so on?  The meaning is determined by the context in which the information is used.
  • The information added by running certain interpolations is related to the original precisely by virtue of the fact that the original provided the necessary initial information that produced that – and only that – additional information.  Had the original been different, so too would the output be.
  • Many interpolations precisely preserve large parts of the original.  In such cases most of the new information simply adds to the existing information without altering what was already present in the analogue version.
  • Your original claim was that “For storing and transmitting information, digital is far more efficient. But, it is more efficient because it actually contains less information. An analogue photograph or recording contains far more information than your eyes or ears are capable of processing”  (Emphases added).  Irreverend has shown that this claim is not necessarily true.  One criticism that has not been raised is the A-to-D conversion itself, which often does result in a loss of information.  However, there is no in-principle reason why, for example, colours cannot be digitised to precisions that exceed the quantisation levels of photons.  In such a case, the digital rendition, while hugely wasteful, will again contain more information than the analogue version (and some of it will necessarily be spurious).  And what of photographs taken with a CCD camera?  They are digital from the word go because CCDs do not support a continuum of colour states.  They effectively “snap” to the closest supported one.
  • For many digital renditions, it is simply not necessary to preserve everything fully (if that is even possible – many analogue storage methods deteriorate over time and tend to attract noise).  Several digital image and sound storage formats make use of so-called “lossy” data compression techniques, and they are for all practical purposes as good as any original.
  • “Digital” does not have to be binary.  Binary (two-state) is just the most convenient implementation from a technical perspective.  DNA uses a quaternary (four-state) system (ACGT) for encoding information.

'Luthon64
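[Editor's note] The A-to-D point above can be illustrated with a toy uniform quantiser (a sketch only; real converters differ in detail). Coarse quantisation discards information, while enough bits push the rounding error below any level of interest:

```python
import numpy as np

def quantize(signal, bits):
    """Uniformly round samples in [-1, 1] to 2**bits evenly spaced levels."""
    step = 2.0 / (2 ** bits - 1)          # spacing between adjacent levels
    return np.round(signal / step) * step

t = np.linspace(0.0, 1.0, 1000)
analogue = np.sin(2 * np.pi * 5 * t)      # stand-in for the analogue source
coarse = quantize(analogue, 4)            # 16 levels: rounding error up to step/2
fine = quantize(analogue, 16)             # 65536 levels: error below ~1.6e-5
print(np.abs(analogue - coarse).max(), np.abs(analogue - fine).max())
```

Adding bits of precision shrinks the loss geometrically; past the resolution of the source (or of photon quantisation, per the bullet above), the extra digital precision records nothing real.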
bluegray (Administrator)
« Reply #6 on: September 21, 2009, 15:26:01 PM »

Thanks for that extensive post, Luthon Wink
I misread Irreverend's original post. I thought he meant interpolating a lower-resolution digital image to get a higher-resolution image. What he actually described was interpolating a digitised image sampled at higher than the physical resolution of the analogue source.
With that I completely agree.
Peter Grant
« Reply #7 on: September 21, 2009, 15:45:12 PM »

Thanks Mefiante, glad you're still talking to me!  Grin

Still have a few questions, but will wait till later.
Peter Grant
« Reply #8 on: September 21, 2009, 19:00:49 PM »

This, however, still leaves the question of how much this new information has in common with the original analogue version. Sure the new file contains more information, strictly speaking, but what does this new information have to do with the original?
Not to pre-empt Irreverend’s answer, but your question actually relates to the meaning of the information, not the information itself, so here are a few relevant points:


Well, I suppose you could call it meaning, but not in the usual sense of the word. I'm talking more about the source of the information: empirical measurement versus theoretical extrapolation.

  • Information theory is not concerned with meaning.  How could it when exactly the same string or substring of symbols can occur in a digital photograph, a music file, a data file, instructions in a program, and so on?  The meaning is determined by the context in which the information is used.


Or, perhaps more often, by the original source of the information?

  • The information added by running certain interpolations is related to the original precisely by virtue of the fact that the original provided the necessary initial information that produced that – and only that – additional information.  Had the original been different, so too would the output be.


Agreed, the original information is essentially all we have to work with.

  • Many interpolations precisely preserve large parts of the original.  In such cases most of the new information simply adds to the existing information without altering what was already present in the analogue version.


Yes, let's stick with strict interpolation, where "the function must go exactly through the data points".



  • One criticism that has not been raised is the A-to-D conversion itself, which often does result in a loss of information.  However, there is no in-principle reason why, for example, colours cannot be digitised to precisions that exceed the quantisation levels of photons.  In such a case, the digital rendition, while hugely wasteful, will again contain more information than the analogue version (and some of it will necessarily be spurious).


I'm glad you took this example even further than the quantum level. I think this markedly shows the difference between analogue and digital information. Analogue is essentially empirical in nature whilst digital is theoretical and predictive.

Here is an example which doesn't need to go quite as far. Say one has an analogue signal of which only a few data points are initially sampled. One takes this relatively small file, runs an interpolation algorithm on it, and is presented with a nice smooth curve. The process is repeated, but this time more samples are taken and the results are observed again. The new curve follows roughly the same path, but on closer examination it is more wiggly. The process can be repeated until the limits of the measuring equipment are reached and the curve will get progressively more wiggly each time, though the wiggles will become harder to spot. Each new wiggle represents more information which has been digitally captured from the original analogue signal, but just because we can't capture any more information doesn't mean that there isn't more information there to be captured. I think one might argue that this information has to be there, otherwise the analogue signal wouldn't behave the way it does.
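[Editor's note] The "progressively more wiggly" observation can be sketched numerically. Assuming a made-up source signal (a slow carrier plus a small fast wiggle), denser sampling followed by plain linear interpolation yields a curve whose total variation - a crude wiggliness measure - keeps growing until the sampling resolves the fastest wiggle:

```python
import numpy as np

def source(t):
    """Hypothetical 'analogue' signal: a slow carrier plus a small fast wiggle."""
    return np.sin(2 * np.pi * t) + 0.05 * np.sin(2 * np.pi * 40 * t)

def total_variation(y):
    """Sum of absolute successive differences: a crude 'wiggliness' measure."""
    return float(np.abs(np.diff(y)).sum())

grid = np.linspace(0.0, 1.0, 10_000)          # dense grid for tracing the curve
for n in (8, 32, 256):                        # progressively denser sampling
    t = np.linspace(0.0, 1.0, n)
    curve = np.interp(grid, t, source(t))     # linear interpolation through samples
    print(n, round(total_variation(curve), 2))
```

Each jump in wiggliness corresponds to structure in the source that the previous, coarser sampling simply could not capture, which matches the point that exhausting the measuring equipment does not mean exhausting the signal.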

  • For many digital renditions, it is simply not necessary to preserve everything fully (if that is even possible – many analogue storage methods deteriorate over time and tend to attract noise).  Several digital image and sound storage formats make use of so-called “lossy” data compression techniques, and they are for all practical purposes as good as any original.


Agreed, for storage and transmission of information, digital rules.

  • “Digital” does not have to be binary.  Binary (two-state) is just the most convenient implementation from a technical perspective.  DNA uses a quaternary (four-state) system (ACGT) for encoding information.

'Luthon64


Good point, I think language would be another example.
« Last Edit: September 21, 2009, 19:33:21 PM by Peter Grant »
Irreverend
« Reply #9 on: September 21, 2009, 20:34:31 PM »

Mefi's last post here covers most of it, in much more detail than I'd have given. Thanks.

Just one thing: you can up the info content of low-res images to produce higher-res ones, provided you know more or less what you're looking for or at. A variety of techniques are available for this, e.g. getting crisp stills from old, grainy, blurry videotape.
Peter Grant
« Reply #10 on: September 22, 2009, 09:03:42 AM »

This is not really surprising considering how much redundant information video contains, as opposed to stills.
BoogieMonster
« Reply #11 on: September 22, 2009, 16:24:21 PM »

I fly in my spare time, and interpolation is used there...

The aircraft manuals have tables of air density, weight, ambient air temperature, etc., and provide expected performance figures at certain intervals of these variables. From these tables one is required to learn how to interpolate in two dimensions, much as with a rasterized image.

This is because the table cannot be exhaustive, and your particular conditions may lie somewhere between four of the values given. Let's say you're trying to solve for airspeed. You will then use a simple averaging technique to find the approximate airspeed that will give you the desired performance at your particular ambient conditions.

Now, this information may not be exact, but it is close enough that you can fly that speed and get the performance you desire, the difference in this case being negligible. And thus you have just gained useful information. Herein lies the usefulness of the technique: the bandwidth, or let's say storage capacity, of the manual is limited; it cannot provide figures for every possible scenario. But through interpolation you can extract useful information from that limited dataset, and safely use it knowing that it's close enough to the original to be of use. Similarly for digital images: the interpolated values are "close enough" to be more pleasing to the eye than the original raw data. Hence, as far as your eye is concerned, information has been added, even though it doesn't exactly reproduce the values you might have found had your initial sampling rate been higher. However, had it been, you may not have had the necessary storage capacity to usefully store the image(s).
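[Editor's note] This two-dimensional table lookup is bilinear interpolation. A sketch with hypothetical numbers - the table values below are invented for illustration, not taken from any real aircraft manual:

```python
def bilinear(x, y, x0, x1, y0, y1, q00, q01, q10, q11):
    """Bilinear interpolation between the 4 surrounding table entries.
    q_ij is the tabulated value at (x_i, y_j)."""
    tx = (x - x0) / (x1 - x0)   # fractional position along the x axis
    ty = (y - y0) / (y1 - y0)   # fractional position along the y axis
    return (q00 * (1 - tx) * (1 - ty) + q10 * tx * (1 - ty)
            + q01 * (1 - tx) * ty + q11 * tx * ty)

# Invented performance-table excerpt:
# takeoff distance (m) by pressure altitude (ft) and temperature (degC)
# alt\temp     10     20
#    0        480    520
# 2000        560    610
dist = bilinear(x=1000, y=15, x0=0, x1=2000, y0=10, y1=20,
                q00=480, q01=520, q10=560, q11=610)
print(dist)   # 542.5, the blend of the four surrounding entries
```

At a table corner the formula returns the tabulated value exactly; in between, it blends the four neighbours in proportion to distance, which is precisely the "simple averaging technique" described above.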

As others have stated, analogue media may in theory provide a more perfect representation. But in practice this is dogged by analogue media deteriorating over time, or having "background noise" to begin with. Additionally, making an exact copy of analogue media may require a lot of effort, if it is possible at all, whereas digital copies maintain the exact quality of the (albeit slightly imperfect) master, even after many generations of copies.
st0nes
« Reply #12 on: September 23, 2009, 06:21:17 AM »

Exactly the same with tide tables in the nautical sphere: interpolate between ports and times to get the height of tide for your location and time. Expensive if you don't get it right.
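[Editor's note] For the tide-table case the arithmetic is plain one-dimensional linear interpolation between two tabulated entries (the figures below are invented for illustration):

```python
def lerp(t, t0, t1, h0, h1):
    """Linear interpolation of tide height between two tabulated times."""
    return h0 + (h1 - h0) * (t - t0) / (t1 - t0)

# Invented tide-table extract: height 0.8 m at 06:00, 2.4 m at 12:00.
# What is the approximate height at 07:30 (t = 7.5 hours)?
print(round(lerp(t=7.5, t0=6.0, t1=12.0, h0=0.8, h1=2.4), 2))   # 1.2
```

As with the aircraft tables, the interpolated height is not an extra measurement, but it is close enough to the true value to navigate by when the tide changes smoothly between the tabulated times.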
Irreverend
« Reply #13 on: September 23, 2009, 08:28:39 AM »

Sure enough, those last two examples are useful. They are examples of the simplest kinds of interpolation: straight averaging and distance-weighted linear interpolation. These work well on smoothly changing quantities, but there are far more sophisticated methods for cases where transitions aren't smooth. Think of CAT scans and various types of tomography.
Peter Grant
« Reply #14 on: September 23, 2009, 19:28:20 PM »

I hope you're not suggesting that a CAT scan contains more information than a patient's brain.