Peter Grant


« Reply #15 on: September 08, 2009, 20:50:58 PM » 

Thanks for the lengthy response, quite illuminating.

I wouldn't argue that human brains don't have a tendency to sometimes spit out garbage…

True enough, but quantum computing is more akin to individual neurons occasionally firing haphazardly, rather than the whole brain going off the rails. As long as some majority (above a certain threshold) of the neurons fire in concert, the brain will produce a definite answer (or so neuroscientists are currently inclined to believe). The correctness of the answer will depend on several other factors which can be loosely grouped under the heading “inputs.”

OK, so in techie terms it's a bit like error handling through redundancy?

… but how quickly and accurately do we actually solve these BQP problems?

Theoretically, a quantum computer would solve certain general instances (NB! not all of them) of BQP (and P) problems very, very fast – much faster than even the most powerful modern supercomputers and superclusters, which are presently capable of speeds in the order of ten teraflops. However, there are some tricky technical problems to overcome before quantum computing becomes a reality. Execution time on a quantum computer will grow sublinearly with the desired accuracy.

Makes sense. Wouldn't ordinary deterministic computing provide sufficient explanation for our somewhat limited abilities?

It seems unlikely. If it were so, we would probably be able to simulate a human brain fairly decently, which is something we still cannot do. Moreover, the brain can exhibit problem-solving behaviour that is outside the P complexity class. For example, often when you solve a crossword and are looking for the answer to a particularly vexing clue, you don’t systematically run through all the possibilities, although naturally you’ll try a few. The answer, when it comes, suddenly pops clearly into your head and is almost instantly recognised.

Not sure I understand this P complexity thing, but the crossword example doesn't seem all that complex when compared with problems like working out prime factors, for instance. Why could a clever subliminal search algorithm, or an exceptionally ingenious indexing system, not account for this? Does proposing a quantum brain offer any explanatory value?

Not at present, but it may do one day, although it does seem rather unlikely. Consciousness is an essential dimension of a functioning human brain, but nobody knows what consciousness is. The squabble in this thread is over just this crucial issue: the complete QM-consciousness model is still a wild guess because it is missing several key ingredients. It is scientific only insofar as some aspects of it can be tested, at least theoretically, but the essential difficulty of how QM effects produce brain activity (and/or consciousness) remains entirely obscure and a matter of considerable speculation. Also, a large part of the problem is that “quantum” has in many quarters (especially in New Age ones) become the next Supremely Transcendent Universal Principle of Ignorant Deduction (STUPID), much like god/gods was/were in the past: “I don’t quite know how this works, so it must be quantum. Hallelujah!”

'Luthon64

Yeah, this worries me too. Assuming for a moment that we and most other vertebrates have, as I think one would have to assume, evolved this remarkable quantum computing ability, what does it have to do with consciousness? If these quantum effects are taking place, they are necessarily subliminal, occurring in tiny quantum intervals far too brief for us to be consciously aware of.
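The “error handling through redundancy” idea raised above is easy to make concrete. The sketch below is my own illustration, not from the thread: if each unreliable “neuron” answers correctly only 70% of the time, a majority vote over many of them produces a near-definite answer.

```python
import random

def majority_vote(n_units: int, p_correct: float, rng: random.Random) -> bool:
    """One 'decision': n_units unreliable voters, each right with p_correct.
    Returns True if the majority got it right."""
    votes = sum(rng.random() < p_correct for _ in range(n_units))
    return votes > n_units / 2

def reliability(n_units: int, trials: int = 2000, seed: int = 1) -> float:
    """Estimate how often the majority vote is correct."""
    rng = random.Random(seed)
    return sum(majority_vote(n_units, 0.7, rng) for _ in range(trials)) / trials

# Redundancy turns 70%-reliable units into a near-certain aggregate.
for n in (1, 11, 101):
    print(n, reliability(n))
```

A single unit is right about 70% of the time; 101 of them voting together are right essentially always, which is the sense in which concerted firing above a threshold yields a definite answer.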



Logged




Mefiante
Defollyant Iconoclast
Hero Member
Skeptical ability: +61/9
Offline
Posts: 3748
In solidarity with rwenzori: Κοπρος φανεται


« Reply #16 on: September 09, 2009, 10:40:43 AM » 

OK, so in techie terms it's a bit like error handling through redundancy?

Er, not quite. It’s rather more like a Monte Carlo simulation, where the solution emerges from an aggregate of a large number of statistically random trials or “samples” from the problem space. What happens, in a sense, is that individual errors are random but tend to cancel one another out in the long run, and said aggregate converges on the required solution. Just as in ordinary statistical sampling, the larger the number of trials, the more confident one can be that it is properly representative of the whole.

Not sure I understand this P complexity thing, but the crossword example doesn't seem all that complex when compared with problems like working out prime factors, for instance. Why could a clever subliminal search algorithm, or an exceptionally ingenious indexing system, not account for this?

The point, though, is that from the brain’s perspective, solving a tricky crossword clue is procedural (or algorithmic) only in a very loose sense. As said, you’ll try a few solutions before suddenly hitting on and recognising the correct one. Speaking strictly algorithmically, the process would be quite different. It might involve a base dictionary of all possible words, selecting the correct (or probable) language, counting letters, extracting a subset of candidate words based on their length and known letter positions, and so on. Finally, the correctness or otherwise would be judged by assessing the crossword puzzle as a whole, and the process may actually find more than one solution. The only essential difference between solving a crossword in this way and factoring a large composite number is in the size of the solution space. The totality of human words consists perhaps of a few million, whereas factoring a 100-digit composite number has a solution space of around 10^50 (a one followed by 50 zeroes) possibilities.

Both problems actually sit in NP complexity space because the solution space grows exponentially with problem size. Also, solving a crossword does involve some form of subliminal search algorithm. The question, however, is whether that algorithm is deterministic or probabilistic. There is good reason, as outlined earlier, to think that it is the latter, and quantum computers would also run algorithms. The use of an algorithm does not mean that a solution process is necessarily deterministic. Concerning an indexing system, this itself needs to be procedural/deterministic, otherwise it would at times fail to index the same thing consistently. In the context of the crossword problem, it seems obvious that the indexing would somehow need to index the meaning of words, phrases and sentences (and we’ll even ignore crosswords that provide cryptic clues only). But meaning is notoriously difficult to pin down in many cases, and furthermore the process of extracting such meaning would itself need to be deterministic/algorithmic for indexing to work correctly. Yet the brain appears to act associatively (as opposed to procedurally), often through pattern-matching on templates that can be somewhat fuzzy or fluid.

If these quantum effects are taking place [in the brain], they are necessarily subliminal, occurring in tiny quantum intervals far too brief for us to be consciously aware of.

Indeed, and that is one of the major objections against even just the concept of a QM-mediated model of consciousness.

'Luthon64
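The Monte Carlo convergence described above can be sketched in a few lines of Python (an illustration of mine, not from the thread): estimating π from random samples. Each trial is random, yet the aggregate converges on the answer, and more trials give a more representative estimate.

```python
import random

def estimate_pi(trials: int, seed: int = 42) -> float:
    """Estimate pi by sampling random points in the unit square and
    counting how many fall inside the inscribed quarter circle."""
    rng = random.Random(seed)
    inside = 0
    for _ in range(trials):
        x, y = rng.random(), rng.random()
        if x * x + y * y <= 1.0:
            inside += 1
    # Area of quarter circle / area of square = pi/4.
    return 4.0 * inside / trials

# Individual samples are "wrong", but their aggregate converges:
for n in (100, 10_000, 1_000_000):
    print(n, estimate_pi(n))
```

The errors of individual trials cancel out in the long run, exactly the sense in which a probabilistic process can still yield a reliable answer.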







Peter Grant


« Reply #17 on: September 09, 2009, 15:41:42 PM » 

Thanks Mefiante, it makes more sense now. I was naturally suspicious of any supposed link between QM and consciousness, but purely as an explanation for increased computational speed it really is quite exciting! I suppose this might help explain how savants do such complex calculations so quickly as well as some of their other amazing mental feats.







cyghost
Skeptically yours
Hero Member
Skeptical ability: +12/1
Offline
Posts: 1409
Carpe diem


« Reply #18 on: September 09, 2009, 15:52:47 PM » 

Bravo, Mefiante. Solid explanations even I can follow and understand. Thank you for that.







Peter Grant


« Reply #19 on: September 09, 2009, 23:07:10 PM » 

OK, have been rereading everything and have one, hopefully last, question. In what way would a quantum computer essentially differ from one of these: http://en.wikipedia.org/wiki/Hybrid_computer

From what I've been reading they seem really cool, but unfortunately there seems to be little interest in them lately.







Mefiante
Defollyant Iconoclast
Hero Member
Skeptical ability: +61/9
Offline
Posts: 3748
In solidarity with rwenzori: Κοπρος φανεται


« Reply #20 on: September 10, 2009, 16:55:17 PM » 

In what way would a quantum computer essentially differ from [a hybrid computer]?

As the linked-to article describes, a hybrid computer uses an analogue “computer” to generate a reasonably good initial value (or set of values) that forms the starting point for digital processing by a normal computer in cases where problem-solving requires iterative techniques. The reason for doing this is basically to reduce the solve time on the digital computer: the better the initial guess, the fewer iterations are needed to satisfy some predefined tolerance or accuracy criterion.

An analogue “computer” must not be thought of as comparable to a digital computer because it doesn’t per se perform any calculations. Instead, it simulates one physical process by another that is mathematically similar – hence the “analogue” descriptor. Something as simple as an electronic inductance-resistance-capacitance (LRC) circuit built with variable resistors, capacitors and/or inductors can qualify as an analogue “computer.” Such a circuit can, for example, be used to simulate forced damped vibration behaviour in materials (or mechanical wave propagation in them) by subjecting the circuit to an AC current of an appropriate frequency and measuring certain electrical responses in the circuit. Another example is to simulate stress/strain state changes in materials subjected to impulsive forces by measuring transient electrical behaviour in the circuit. Such problems have quite complex formulations that are of the same or a very similar mathematical form to that of the analogue used to simulate them, usually a set of linked nonlinear partial differential equations.

Put briefly, an analogue computer is a clever way of initialising the analysis of a problem in order to reduce the computational effort that a digital computer alone would need, and thereby reduce the processing time. Analogue computers are not general computing machines like a digital computer is. They are limited in their applicability to a small set of problems and are purpose-built for specific problems, which is perhaps the main reason why they, and also hybrid computers, are less favoured. So much for the background.

It should be clear from the above that a hybrid computer is fully deterministic because the digital aspect of it greatly refines the approximate answer provided by the analogue component. In contrast, and as described in an earlier post, a quantum computer is not deterministic, and that is the essential difference between the two types.

One possible way of thinking about a quantum computer is to picture it as an array of bits (i.e. binary digits) that have a curious property: until each bit is actually examined, it exists in a superposed state of both 0 and 1, and it becomes definitely 1 or 0 only once it is examined. Such special bits are called “qubits.” Moreover, the state that each qubit will assume upon examination depends on its neighbours and on what operations have been performed on the whole collection of them, and in what order (this only works if the qubits are entangled, and this is the main technical obstacle in the way of the “quamputer”). These operations and their sequence can be thought of as the algorithm.

Here’s a very simple example for the sake of illustration: Suppose you have a four-qubit computer. It has 16 (= 2^4) possible states. Suppose further that an algorithm is loaded that forces the third qubit always to be the complement (not = negation) of the exclusive-or (xor) result of the first two, and the fourth qubit is the logical-and (and) result of the second and third qubits. This algorithm has two degrees of freedom because the third and fourth qubits are fixed by the values of the first two. The algorithm also predisposes the first qubit to come up high (= 1) 80% of the time, and low (= 0) for the remaining 20%, while the second qubit comes up high 33% of the time and low for the remaining 67%.

Because this is a trivial problem, it’s not hard to work out the probability of each of the four possible states, but it should be clear that the complexity of the algorithm and the quantum computer can in theory be vastly increased to address more meaningful problems. While the quantum computer can be used to implement such a simulation directly, a deterministic computer must either analyse the problem symbolically or use a source of randomness (or, more usually, pseudorandomness) to compute the likelihood of the possible outcomes.

'Luthon64
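Since the four-qubit example is fully specified, its state probabilities can indeed be worked out on a deterministic machine, as the post says. A minimal Python sketch (function and variable names are my own; the "33%" is taken as exactly 1/3):

```python
from itertools import product

# Bias of the two free qubits, as given in the example:
# qubit 1 comes up high 80% of the time, qubit 2 one third of the time.
P1_HIGH = 0.80
P2_HIGH = 1.0 / 3.0

def state_probabilities() -> dict:
    """Enumerate the reachable states of the four-qubit example.
    Qubit 3 is forced to NOT(q1 XOR q2); qubit 4 to (q2 AND q3)."""
    probs = {}
    for q1, q2 in product((0, 1), repeat=2):
        q3 = 1 - (q1 ^ q2)   # complement of the exclusive-or of q1, q2
        q4 = q2 & q3         # logical-and of qubits 2 and 3
        p = (P1_HIGH if q1 else 1 - P1_HIGH) * (P2_HIGH if q2 else 1 - P2_HIGH)
        probs[f"{q1}{q2}{q3}{q4}"] = p
    return probs

print(state_probabilities())
```

Only four of the sixteen states carry nonzero probability, reflecting the two degrees of freedom: 1111 with 0.8 × 1/3, 1000 with 0.8 × 2/3, 0100 with 0.2 × 1/3 and 0010 with 0.2 × 2/3.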







Peter Grant


« Reply #21 on: September 13, 2009, 19:03:29 PM » 

It should be clear from the above that a hybrid computer is fully deterministic because the digital aspect of it greatly refines the approximate answer provided by the analogue component. In contrast and as described in an earlier post, a quantum computer is not deterministic, and that is the essential difference between the two types.
Hmm, I guess I just assumed that a quantum computer would also have a deterministic, digital component which we would use to interface with the quantum part. I'm more interested in the analogue part at the moment, though. Would it be deterministic? If not, is it non-deterministic for a very different reason than the quantum computer?

This is the part which really grabbed my attention:

Consider that the nervous system in animals is a form of hybrid computer. Signals pass across the synapses from one nerve cell to the next as discrete (digital) packets of chemicals, which are then summed within the nerve cell in an analog fashion by building an electrochemical potential until its threshold is reached, whereupon it discharges and sends out a series of digital packets to the next nerve cell. The advantages are at least threefold: noise within the system is minimized (and tends not to be additive), no common grounding system is required, and there is minimal degradation of the signal even if there are substantial differences in activity of the cells along a path (only the signal delays tend to vary). The individual nerve cells are analogous to analog computers; the synapses are analogous to digital computers.

Put briefly, an analogue computer is a clever way of initialising the analysis of a problem in order to reduce the computational effort that a digital computer alone would need and thereby reduce the processing time. Analogue computers are not general computing machines like a digital computer is. They are limited in their applicability to a small set of problems and are purpose-built for specific problems, which is perhaps the main reason why they and also hybrid computers are less favoured.

But wouldn't a programmable analogue computer be seriously cool? Imagine being able to write programs which ran simulations that were actually real! (At least in the numerical sense.) NEC just started miniaturisation in June this year: http://www.necel.com/news/en/archive/0906/1802.html

Also check out Factorizing RSA Keys, An Improved Analogue Solution: http://www.springerlink.com/content/k5355j45w452537m/







Mefiante
Defollyant Iconoclast
Hero Member
Skeptical ability: +61/9
Offline
Posts: 3748
In solidarity with rwenzori: Κοπρος φανεται


« Reply #22 on: September 14, 2009, 14:27:38 PM » 

Apologies for the delayed reply – I have been indisposed these past few days.

Hmm, I guess I just assumed that a quantum computer would also have a deterministic, digital component which we would use to interface with the quantum part.

Well, yes, probably there would be, but it would be a component merely serving an input-output (I/O) function. Its presence would most assuredly not suddenly change a quantum computer into a deterministic machine. That would be a bit like saying that the presence or absence of a speedometer in your car changes the type of fuel it needs between petrol and diesel.

Would [the analogue part of a hybrid computer] be deterministic?

Ideally, yes – or as close to it as macroscopic (i.e. non-quantum) models will allow in theory. In actuality, all real analogue devices are subject to certain inaccuracies, however small they may be. These inaccuracies are for all practical purposes random, if not entirely unknowable, and they could be the result of any number of environmental factors or conditions. The errors may be tiny, but no instrument can measure with 100% precision the real quantity it was designed to measure, and it is furthermore highly doubtful whether perfect accuracy is even achievable.

But wouldn't a programmable analogue computer be seriously cool?

Sure it would, but it is hard to see how one might go about constructing a general-purpose programmable analogue machine. As outlined earlier, an analogue machine is one that makes use of a process that is mathematically similar to the one of interest, whereas a digital computer (usually) treats the mathematics itself of the process of interest (which is why a digital computer isn’t inherently limited to a rather narrow range of simulations or physical problems). It is not apparent how one might find an analogue of sufficient generality (short of reality itself, which clearly isn’t an analogue) to cover all of the requirements adequately for a super analogue computer to be built.

'Luthon64







Peter Grant


« Reply #23 on: September 14, 2009, 19:24:15 PM » 

Apologies for the delayed reply – I have been indisposed these past few days.
No worries, gave me time to do some more reading.

Hmm, I guess I just assumed that a quantum computer would also have a deterministic, digital component which we would use to interface with the quantum part.

Well, yes, probably there would be, but it would be a component merely serving an input-output (I/O) function. Its presence would most assuredly not suddenly change a quantum computer into a deterministic machine. That would be a bit like saying that the presence or absence of a speedometer in your car changes the type of fuel it needs between petrol and diesel.

Agreed, in the same way that, with this quantum consciousness theory, the digital parts of our nervous system do not suddenly change our brains into deterministic machines. The same could be said for an analogue/digital hybrid consciousness theory.

Would [the analogue part of a hybrid computer] be deterministic?

Ideally, yes – or as close to it as macroscopic (i.e. non-quantum) models will allow in theory. In actuality, all real analogue devices are subject to certain inaccuracies, however small they may be. These inaccuracies are for all practical purposes random, if not entirely unknowable, and they could be the result of any number of environmental factors or conditions. The errors may be tiny, but no instrument can measure with 100% precision the real quantity it was designed to measure, and it is furthermore highly doubtful whether perfect accuracy is even achievable.

Are you sure? This quote from Wikipedia seems to say the opposite:

Although digital computer simulation of electronic circuits is very successful and routinely used in design and development, there is one category of analog circuit that cannot be simulated digitally, and that is an (analog) circuit made to exhibit chaotic behavior. Because everything in the analog circuit is essentially simultaneous, but a digital simulation is sequential, simulating a chaotic circuit fails.

http://en.wikipedia.org/wiki/Analog_computer

But wouldn't a programmable analogue computer be seriously cool?

Sure it would, but it is hard to see how one might go about constructing a general-purpose programmable analogue machine. As outlined earlier, an analogue machine is one that makes use of a process that is mathematically similar to the one of interest, whereas a digital computer (usually) treats the mathematics itself of the process of interest (which is why a digital computer isn’t inherently limited to a rather narrow range of simulations or physical problems). It is not apparent how one might find an analogue of sufficient generality (short of reality itself, which clearly isn’t an analogue) to cover all of the requirements adequately for a super analogue computer to be built.

'Luthon64

They have designed programmable analogue computers, but they still use punch cards! I'm talking about miniaturisation and bringing the complexity up to that of today's digital computers. As to finding analogues of problems, as with digital computing, one breaks them down into components. Analogue computers are great at:

* summation
* integration with respect to time
* inversion
* multiplication
* exponentiation
* logarithm
* division, although multiplication is much preferred

BTW did you get a chance to check out that Factorizing RSA Keys, An Improved Analogue Solution: http://www.springerlink.com/content/k5355j45w452537m/

Isn't this one of those BQP problems?
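The primitive operations in that list really do compose into solvers, much as blocks are patched together on an analogue computer. Here is a deliberately toy digital sketch of mine (all names are illustrative): an inversion block wired into the feedback path of an integration-with-respect-to-time block solves dx/dt = -x, whose exact solution is exp(-t).

```python
import math

def integrate(feedback, x0: float, t_end: float, dt: float) -> float:
    """Integration-with-respect-to-time block, modelled digitally with
    forward-Euler steps. The input signal is produced by `feedback`,
    which is fed the current output (as on an analogue patch panel)."""
    x, t = x0, 0.0
    while t < t_end:
        x += feedback(x) * dt   # summation of the (scaled) input signal
        t += dt
    return x

# Wire an inverter into the integrator's feedback path: dx/dt = -x.
x_numeric = integrate(lambda x: -x, x0=1.0, t_end=1.0, dt=1e-5)
print(x_numeric, math.exp(-1.0))  # the two should agree closely
```

An analogue machine performs this integration continuously and physically (e.g. as charge accumulating on a capacitor); the digital version has to step through it sequentially, which is the trade-off discussed above.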







Mefiante
Defollyant Iconoclast
Hero Member
Skeptical ability: +61/9
Offline
Posts: 3748
In solidarity with rwenzori: Κοπρος φανεται


« Reply #24 on: September 15, 2009, 11:13:13 AM » 

Are you sure? This quote from Wikipedia seems to say the opposite: …

Yes I am, and not really, respectively. The first thing to realise is that the terms “chaos” and “chaotic behaviour” have a precise mathematical meaning. Second, “chaos” does not imply indeterminism or non-computability or some such. Third, the phrasing of the cited Wikipedia excerpt is a little misleading because true chaotic behaviour can always be digitally simulated to arbitrary precision. If it were not so, then, for example, the Mandelbrot set would come out looking different every time it is computed. It’s just a question of how long you wish to wait for answers, as well as of the capability of the resources at your disposal.

That things happen essentially simultaneously in an analogue circuit of a certain kind also does not in principle preclude digital simulation. It is not a good reason at all. For example, in brittle failure modes (which are mathematically chaotic), things like stress redistributions and strain-energy releases also happen essentially simultaneously, yet we can model such scenarios quite satisfactorily on powerful digital machines. I think that what the article really means to say is just what I wrote earlier, namely that while analogues are fully deterministic in theory, it is an enormously difficult task to predict or model certain types’ behaviour in practice – so much so that it is fair to call them intractable or even infeasible. The important difference to bear in mind here is the distinction between “practically undoable” and “impossible even in principle.”

They have designed programmable analogue computers, but they still use punch cards! I'm talking about miniaturisation and bringing the complexity up to that of today's digital computers. As to finding analogues of problems, as with digital computing, one breaks them down into components.

Yes, true enough all around. However, the observation that your suggestions haven’t been happening much should tell you a few important things: for reasons of physics, analogues do not generally lend themselves well to miniaturisation; the individual components are very limited in their capabilities and ranges of application; assembling a solution involves a hands-on approach to arranging the components in a particular way according to some design (sort of like using bits and pieces from a programming library, except that the analogue constituents are palpable); constructing a general-purpose machine that automatically assembles an analogue and runs it according to some schematic concept merely shifts the problem back by one level; and so on. In short, the limitations of analogues and the practical difficulties of implementing and using them are daunting. That is not to say, however, that they do not find good use in certain dedicated niches, nor that these difficulties are technically insurmountable.

BTW did you get a chance to check out that [paper]?
…
Isn't [composite integer factorisation] one of those BQP problems?

Yes, I’ve read the analogue factorisation paper – thank you for locating it. It’s a very interesting theoretical exercise that has as much to say about complexity theory as about integer factorisation. As for what complexity class the problem of general integer factorisation belongs to, it’s still unknown. Most number theorists are reasonably sure that it is squarely NP. Certainly, all known algorithms place it there, but definitive proof is still lacking. If it is indeed NP, it probably falls outside the scope of the BQP class (because it is also not entirely clear just how far the BQP class extends).

'Luthon64
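The point about the Mandelbrot set coming out the same every time is easy to demonstrate: the escape-time computation is fully deterministic, so two independent runs produce bit-identical results. A minimal sketch of mine:

```python
def mandelbrot_grid(width=40, height=20, max_iter=50):
    """Escape-time counts for c on a grid over [-2, 1] x [-1.2, 1.2].
    Fully deterministic: the same inputs always give the same grid."""
    grid = []
    for j in range(height):
        row = []
        for i in range(width):
            c = complex(-2 + 3 * i / (width - 1), -1.2 + 2.4 * j / (height - 1))
            z = 0j
            n = 0
            while abs(z) <= 2 and n < max_iter:
                z = z * z + c   # the defining iteration z -> z^2 + c
                n += 1
            row.append(n)
        grid.append(row)
    return grid

# Two independent computations agree iteration for iteration.
assert mandelbrot_grid() == mandelbrot_grid()
```

Chaotic sensitivity and determinism coexist here: neighbouring points can have wildly different escape counts, yet recomputing any point always reproduces the same count.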







Peter Grant


« Reply #25 on: September 15, 2009, 20:43:21 PM » 

Are you sure? This quote from Wikipedia seems to say the opposite: …

Yes I am, and not really, respectively. The first thing to realise is that the terms “chaos” and “chaotic behaviour” have a precise mathematical meaning.

OK, went and read up on chaos theory, and it IS about a deterministic mathematical model which shows that "tiny differences in the starting state of the system can lead to enormous differences in the final state of the system". Chaotic behaviour is a real phenomenon, observed in natural systems like weather and, I would propose, analogue circuits.

Second, “chaos” does not imply indeterminism or non-computability or some such.

If used in the mathematical-model sense, no, but when applied to a real-life observation, why not?

Third, the phrasing of the cited Wikipedia excerpt is a little misleading because true chaotic behaviour can always be digitally simulated to arbitrary precision.

This is where chaos theory seems to fall short of observed chaos. Considering the exponential growth of error in a chaotic system, how can a digital simulation ever be precise enough?

If it were not so, then, for example, the Mandelbrot set would come out looking different every time it is computed. It’s just a question of how long you wish to wait for answers, as well as of the capability of the resources at your disposal.

They look kinda pretty, but I'm betting that they take long to compute, and they still look digital. Imagine what cool wavy patterns an analogue computer could make, and in a fraction of the time.

That things happen essentially simultaneously in an analogue circuit of a certain kind also does not in principle preclude digital simulation. It is not a good reason at all. For example, in brittle failure modes (which are mathematically chaotic), things like stress redistributions and strain-energy releases also happen essentially simultaneously, yet we can model such scenarios quite satisfactorily on powerful digital machines. I think that what the article means to say is really just what I wrote earlier, namely that while analogues are fully deterministic in theory, it is an enormously difficult task to predict or model certain types’ behaviour in practice – so much so that it is fair to call them intractable or even infeasible. The important difference to bear in mind here is the distinction between “practically undoable” and “impossible even in principle.”

Agreed, but it takes longer. That is why this quantum consciousness was proposed originally, wasn't it? Also, with an analogue simulation it is possible to manipulate the variables and see the results in real time. With digital you have to run the algorithm again from scratch.

They have designed programmable analogue computers, but they still use punch cards! I'm talking about miniaturisation and bringing the complexity up to that of today's digital computers. As to finding analogues of problems, as with digital computing, one breaks them down into components.

Yes, true enough all around. However, the observation that your suggestions haven’t been happening much should tell you a few important things: for reasons of physics, analogues do not generally lend themselves well to miniaturisation; the individual components are very limited in their capabilities and ranges of application; assembling a solution involves a hands-on approach to arranging the components in a particular way according to some design (sort of like using bits and pieces from a programming library, except that the analogue constituents are palpable); constructing a general-purpose machine that automatically assembles an analogue and runs it according to some schematic concept merely shifts the problem back by one level; and so on. In short, the limitations of analogues and the practical difficulties of implementing and using them are daunting. That is not to say, however, that they do not find good use in certain dedicated niches, nor that these difficulties are technically insurmountable.

Still think it would be easier than constructing quantum computers.

BTW did you get a chance to check out that [paper]?
…
Isn't [composite integer factorisation] one of those BQP problems?

Yes, I’ve read the analogue factorisation paper – thank you for locating it. It’s a very interesting theoretical exercise that has as much to say about complexity theory as about integer factorisation. As for what complexity class the problem of general integer factorisation belongs to, it’s still unknown. Most number theorists are reasonably sure that it is squarely NP. Certainly, all known algorithms place it there, but definitive proof is still lacking. If it is indeed NP, it probably falls outside the scope of the BQP class (because it is also not entirely clear just how far the BQP class extends).

'Luthon64

Still not too sure I get this P complexity thing. Is the analogue solution not at least as plausible as the quantum one?


« Last Edit: September 15, 2009, 21:31:25 PM by Peter Grant »





Mefiante
Defollyant Iconoclast
Hero Member
Skeptical ability: +61/9
Offline
Posts: 3748
In solidarity with rwenzori: Κοπρος φανεται


« Reply #26 on: September 15, 2009, 22:44:52 PM » 

Chaotic behaviour is a real phenomenon, observed in … [some] … analogue circuits. Yes, correct, but don’t confuse “chaos” with “unpredictability.” They’re not the same thing. [W]hen applied to a real life observation, why [does “chaos” not imply indeterminism or noncomputability or some such]? We have been talking about simulations all along, have we not? And simulations are by definition not the real thing. This is where chaos theory seems to fall short of observed chaos. How so? I don’t follow. Perhaps you should define “observed chaos” separate from mathematical chaos as relevant to the cited passage, in particular how one might distinguish the one from the other. Considering the exponential growth of error in a chaotic system, how can a digital simulation ever be precise enough? Very simply by defining the initial and governing conditions with sufficient accuracy to meet the precision requirements of the model in question. They look kinda pretty, but I'm betting they they take long to compute and they still look digital. Imagine what cool wavy patterns an analogue computer could make, and in a fraction of the time. Where to begin? Appearance is nothing in this context, merely exciting the sensibilities about an intriguing pattern. Mathematically, though, they’d be useless and – much worse – mostly uninformative because of the accuracy issues plaguing analogue computation. Then, there’s the question of recursion and iteration which analogues are quite clumsy at. Once you understand the mathematical nature of the Mandelbrot set, you should have no trouble seeing that high precision computation is the only way to generate it for any purpose beyond the aesthetic. Agreed but it takes longer. That is why this quantum consciousness was proposed originally, wasn't it? Not really. The Copenhagen interpretation (CI) of QM implies that nothing is definite until it is observed by a conscious entity. 
This led Schrödinger to propose his dead-alive cat thought experiment in order to illustrate the apparent absurdity of this conception. Based thereon, Roger Penrose much later hypothesised that quantum wave function collapse (reduction) – the technical term for finding a particle or group of coordinated particles in a definite state – is missing some essential ingredient that is perhaps also instrumental in manifestations of consciousness. Penrose conjectures that this is to be found in a proper quantum gravity formulation (still conspicuously lacking), and calls it “objective reduction” (OR), as opposed to the “subjective reduction” done by conscious observers. A neuroscientist, Hameroff, proposes that OR on a (relatively speaking) large coordinated scale within neural microtubules accounts for consciousness. That’s the picture painted in very broad strokes.
Still think it would be easier than constructing quantum computers.
At present, undoubtedly so, but quantum computers aren’t just some theoretical possibility that excites only nerds. If achievable, these machines will revolutionise computation as surely as the digital one has, and probably in ways we can barely imagine today.
Is the analogue solution not at least as plausible as the quantum one?
I’m not sure what to make of “plausible” in this context. Sure, the analogue variety is doable with today’s technology. After all, historically it precedes the digital kind, but the analogue computer is not a general-purpose machine for attacking a significant cross-section of real-world problems using a single device. That kind of flexibility is reserved for the digital computer and accounts in large part for its huge growth and success. The basic issue is that analogue computers are deterministic, extremely limited and very cumbersome to implement. In truth, and in light of the above responses, I am beginning to be somewhat doubtful whether you appreciate fully the severity of these objections and constraints.
Moreover (and not that analogue and quantum computers are properly comparable – it would be a bit like comparing an abacus with an IBM Roadrunner), assuming that the technical difficulties with quantum computers are soluble, these machines have the potential to obviate many of the limitations of both analogue and digital computers. That we don’t have them yet is perhaps the most frustrating thing in all of this. 'Luthon64
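[Editorial aside: the recursion/iteration point above is easy to see in the Mandelbrot set's defining rule, iterating z -> z*z + c and checking whether |z| stays bounded – exactly the kind of exact, repeated bookkeeping digital machines excel at. A minimal escape-time sketch in Python; the function name and the iteration cap of 100 are illustrative choices, not anything from this thread.]

```python
# Escape-time test for Mandelbrot membership: iterate z -> z*z + c and
# count the steps until |z| exceeds 2; points that never escape within
# the iteration cap are treated as members of the set.

def escape_time(c: complex, max_iter: int = 100) -> int:
    """Return the step at which |z| first exceeds 2, or max_iter if it never does."""
    z = 0j
    for n in range(max_iter):
        if abs(z) > 2.0:
            return n
        z = z * z + c
    return max_iter

print(escape_time(0j))      # c = 0 never escapes: prints 100
print(escape_time(2 + 0j))  # c = 2 escapes almost immediately: prints 2
```

Each pixel of the familiar pictures is just this count mapped to a colour – pure high-precision iteration, which analogue hardware handles poorly.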







Peter Grant


« Reply #27 on: September 16, 2009, 14:56:35 PM » 

Yes, correct, but don’t confuse “chaos” with “unpredictability.” They’re not the same thing.
But wouldn't the observed chaotic behaviour in the final state ultimately be influenced by quantum unpredictability in the starting state?
We have been talking about simulations all along, have we not? And simulations are by definition not the real thing.
OK, but digital simulations are less real. Analogue simulations are real physical processes which operate on real numbers, the results of which we can observe.
How so? I don’t follow. Perhaps you should define “observed chaos” separately from mathematical chaos as relevant to the cited passage, in particular how one might distinguish the one from the other.
Mathematical chaos is a deterministic model which shows that "tiny differences in the starting state of the system can lead to enormous differences in the final state of the system". When it is used to model a real chaotic system digitally, each iteration is performed on an approximation of the real values. Each of these approximations, in turn, is itself a tiny difference which can lead to enormous differences later on. Even if we were to increase the precision down to the quantum level, at enormous cost in processing time, we would still be left with further unpredictability.
Very simply: by defining the initial and governing conditions with sufficient accuracy to meet the precision requirements of the model in question.
But through those digital iterations we lose the intrinsic unpredictability in a naturally chaotic system. Analogue computers may not be as precise, but they perform the calculations themselves far more accurately than a digital computer could given a reasonable amount of time.
Where to begin? Appearance is nothing in this context, merely exciting the sensibilities about an intriguing pattern. Mathematically, though, they’d be useless and – much worse – mostly uninformative because of the accuracy issues plaguing analogue computation. Then there’s the question of recursion and iteration, which analogues are quite clumsy at. Once you understand the mathematical nature of the Mandelbrot set, you should have no trouble seeing that high-precision computation is the only way to generate it for any purpose beyond the aesthetic.
OK, will read up more on these Mandelbrot sets, but I think my comment still applies to CGI generally.
Not really. The Copenhagen interpretation (CI) of QM implies that nothing is definite until it is observed by a conscious entity. This led Schrödinger to propose his dead-alive cat thought experiment in order to illustrate the apparent absurdity of this conception. Based thereon, Roger Penrose much later hypothesised that quantum wave function collapse (reduction) – the technical term for finding a particle or group of coordinated particles in a definite state – is missing some essential ingredient that is perhaps also instrumental in manifestations of consciousness. Penrose conjectures that this is to be found in a proper quantum gravity formulation (still conspicuously lacking), and calls it “objective reduction” (OR), as opposed to the “subjective reduction” done by conscious observers. A neuroscientist, Hameroff, proposes that OR on a (relatively speaking) large coordinated scale within neural microtubules accounts for consciousness. That’s the picture painted in very broad strokes.
Mefiante on September 07, 2009, 15:46:49 PM
At present, undoubtedly so, but quantum computers aren’t just some theoretical possibility that excites only nerds. If achievable, these machines will revolutionise computation as surely as the digital one has, and probably in ways we can barely imagine today.
I don't doubt it, but isn't an analogue computer more likely to evolve naturally than a quantum one?
I’m not sure what to make of “plausible” in this context. Sure, the analogue variety is doable with today’s technology. After all, historically it precedes the digital kind, but the analogue computer is not a general-purpose machine for attacking a significant cross-section of real-world problems using a single device. That kind of flexibility is reserved for the digital computer and accounts in large part for its huge growth and success. The basic issue is that analogue computers are deterministic, extremely limited and very cumbersome to implement. In truth, and in light of the above responses, I am beginning to be somewhat doubtful whether you appreciate fully the severity of these objections and constraints. Moreover (and not that analogue and quantum computers are properly comparable – it would be a bit like comparing an abacus with an IBM Roadrunner), assuming that the technical difficulties with quantum computers are soluble, these machines have the potential to obviate many of the limitations of both analogue and digital computers. That we don’t have them yet is perhaps the most frustrating thing in all of this. 'Luthon64
I guess what I mean is: is the proposed analogue solution sufficiently efficient to explain our brain's computational speed? I agree it would be seriously cool if we built quantum computers.
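[Editorial aside: the sensitivity Peter quotes – "tiny differences in the starting state ... enormous differences in the final state" – is easy to demonstrate with the logistic map, the standard textbook chaotic system. A minimal Python sketch; the parameter r = 4, the perturbation size and the step count are illustrative choices, not anything from this thread.]

```python
# Logistic map x -> r*x*(1-x) with r = 4 (the chaotic regime). Two
# starting values differing by one part in a trillion are iterated in
# lockstep; after 50 steps their trajectories have decorrelated.

def logistic_orbit(x0: float, r: float = 4.0, steps: int = 50) -> float:
    """Iterate the logistic map `steps` times from x0 and return the result."""
    x = x0
    for _ in range(steps):
        x = r * x * (1.0 - x)
    return x

a = logistic_orbit(0.3)
b = logistic_orbit(0.3 + 1e-12)
print(abs(a - b))  # no longer tiny: the 1e-12 difference has been amplified
```

Note that this also illustrates Mefiante's counterpoint: the divergence is perfectly deterministic and reproducible on any digital machine, so chaos by itself implies practical unpredictability, not indeterminism.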







Mefiante


« Reply #28 on: September 16, 2009, 16:21:14 PM » 

Sorry, but I’m somewhat confused. Is there a specific point, or more than one, that you wish to make? Or are you just whipping up conversation for the interest of it? Because truthfully I’m beginning to feel just a little beset and harassed over matters that are either trivial, only peripherally relevant or misconstrued, or all of the above – matters that you could easily do research on by yourself.
Do you think that, failing a quantum computer, the analogue kind is the answer to our computing needs and consciousness, maybe? Or that in reality analogues are quasi-quantum computers, maybe? If so, then reality and the computing status quo the world over resoundingly refute that stance. Of course, you are welcome to persist in such beliefs (if indeed you hold them), but it would in my view be unwise to do so, considering these rather imposing countermanding indications.
It’s quite simple, really: For a variety of technical reasons already given, analogue computers are not the answer. If they were, we’d all be using them already.
'Luthon64







Peter Grant


« Reply #29 on: September 16, 2009, 19:55:39 PM » 

Sorry, but I’m somewhat confused. Is there a specific point, or more than one, that you wish to make? Or are you just whipping up conversation for the interest of it? Because truthfully I’m beginning to feel just a little beset and harassed over matters that are either trivial, only peripherally relevant or misconstrued, or all of the above – matters that you could easily do research on by yourself.
I'm really sorry, that was not my intention at all. If you come to the next Skeptics in the Pub I'll buy you a drink to try and make up for it. I'm genuinely interested in artificial intelligence and, I must admit, somewhat disappointed by the lack of progress in this field. I'd hate to have to wait for quantum computers before we can build machines that can think and feel, so I'm looking for alternatives. As for doing more research myself, I have been: reading up on QM and responding to your posts has been taking up most of my free time lately.
Do you think that, failing a quantum computer, the analogue kind is the answer to our computing needs and consciousness, maybe? Or that in reality analogues are quasi-quantum computers, maybe? If so, then reality and the computing status quo the world over resoundingly refute that stance. Of course, you are welcome to persist in such beliefs (if indeed you hold them), but it would in my view be unwise to do so, considering these rather imposing countermanding indications.
I'm more concerned with consciousness, and how we could simulate it artificially. We already know that the human brain has both digital and analogue components. The digital is obviously insufficient, considering we are starting to reach the limits of this technology and haven't found a solution yet. Analogue computing, however, hasn't really progressed much in the last 40 years. Concerning my understanding of quantum mechanics: it seems logical to me that if everything is based on quantum mechanics, then the entire universe is essentially one big quantum computer. This isn't a belief I hold, I just don't know what else to think. What is it I am missing?
It’s quite simple, really: For a variety of technical reasons already given, analogue computers are not the answer. If they were, we’d all be using them already.
'Luthon64
But if everyone thought that way, no one would ever invent (or, in my case, just imagine) anything! Also, considering how closely scientific and technological progress has been tied to digital computing over the last few decades, it might simply be a lack of interest or focus.


« Last Edit: September 16, 2009, 20:12:06 PM by Peter Grant »





