Saturday, 1 September 2012

The Science Behind The Chip That Could Restore Sight To The Blind

A microchip implant placed under the retina could revolutionize the lives of hundreds of thousands of people who have lost their ability to see.

The chip, developed by the German company Retina Implant AG, is still in clinical trials but has already shown promising results for patients suffering from retinitis pigmentosa, a genetic disease that leads to blindness and affects approximately 1.5 million people worldwide.

Here's how it works:

The retina is a tissue in the back part of the eye that contains cells called photoreceptors. The photoreceptors convert light rays into nerve signals, which are then processed by nerve cells in the inner retina, sent to the brain and translated as images. The two types of photoreceptor cells are known as rods (responsible for peripheral and night vision) and cones (responsible for color perception). A normal-sighted person has about 120 million rods and about 6 million cones.

In patients with retinitis pigmentosa, the rods and cones die. As a result, the retina's inner cells and the optic nerve fibers that normally send impulses to the brain stop receiving information.

The retinal implant works by replacing the photoreceptor cells that have been destroyed.

There are currently two main approaches to retinal implants: subretinal and epiretinal.

In the epiretinal approach, a wireless implant is surgically placed on top of the retina. A miniature video camera attached to the patient's eyeglasses captures light and transmits it wirelessly to the implant, which stimulates the retina's functioning inner nerve cells to send electrical impulses to the brain. The downside of this approach is that it requires several separate parts, including a battery pack worn on the patient's belt to power the whole system.

The subretinal implant developed by Eberhard Zrenner and his colleagues, on the other hand, is placed under the retina in a central spot called the macula. According to Retina Implant AG’s website, "the macular region is believed to be the ideal location because this is where light-sensitive photoreceptor cells are located which are responsible for producing clear images in normal-sighted people." The chip also moves with the eye, which means you don't have to move your head to recognize objects as you do with the epiretinal implant. It also uses far more electrodes than the epiretinal implants (1,500 vs. 64), which makes light and dark images appear more vibrant.

The subretinal implant chip's 1,500 light sensors are triggered by natural light. Electrical impulses stimulate the retina's inner neurons and signals are sent to the brain to produce sight.

The chip is powered by a small battery box carried in the pocket. Power, along with control signals for brightness and contrast adjustment, is forwarded via a wireless unit behind the ear to a sub-dermal coil, and from there along a cable that runs to the eye and into the implant.
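As a rough illustrative sketch (not Retina Implant AG's actual signal chain; the transfer function and parameters are invented), the per-pixel pipeline described above can be pictured as each photodiode turning local light intensity into a stimulation current, with the external brightness and contrast controls applied on top:

```python
def chip_response(light_levels, brightness=1.0, contrast=1.0):
    """Toy per-pixel pipeline: each photodiode converts the local light
    intensity into a stimulation current for the electrode beneath it,
    with brightness and contrast set by the external control unit.
    (The transfer function here is invented for illustration only.)"""
    assert len(light_levels) <= 1500        # the chip has 1,500 sensors
    mean = sum(light_levels) / len(light_levels)
    # Contrast scales each pixel's deviation from the local mean,
    # brightness scales the overall result, and currents cannot go negative.
    return [max(0.0, brightness * (mean + contrast * (v - mean)))
            for v in light_levels]

currents = chip_response([0.1, 0.5, 0.9], brightness=1.0, contrast=2.0)
```

Raising the contrast control pushes dark pixels darker and bright pixels brighter around the local average, which is one plausible reading of how the patient-adjustable settings shape the stimulation pattern.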

Sunday, 15 July 2012

The Dictator

The Dictator is simply funny! It's really random and unexpected, with genuinely funny jokes. The movie is fairly short, so it doesn't drag like many other movies do, and unexpectedly there's an idea behind it, a somewhat political one. If you want to see a movie that doesn't require anything but a sense of humor, you will love this.

Monday, 9 July 2012

Machine Head - BBC Review

On Burn My Eyes, Machine Head’s 1994 debut album, there featured a song with the not entirely user-friendly title of Real Eyes, Realize, Real Lies. Essentially a two-and-three-quarter-minute guitar riff, the track was rendered intriguing by the fact that its lyrics comprised soundbites recorded from the darker thoroughfares of America’s meanest streets: voices of the poor bemoaning police brutality, police radios alerting squad cars to explosions of gang violence, and gangbangers telling reporters why it was they hated other gangbangers who were, for all intents and purposes, identical to themselves. At the time, the song was startling: metal but not as it was known; urban rather than suburban; street and actually rather cool. Who knew?

Without mellowing one single beat-per-minute, a generation on and Machine Head’s once-fringe thrash has moved to the centre ground to such a degree that in December the quartet will headline a show at London’s Wembley Arena. And while the years between their first album and Unto the Locust haven’t linked together entirely seamlessly – the group endured a particularly unconvincing middle-period – the ferocity and precision displayed throughout this release’s seven tracks offer proof that, since their inception, Machine Head and others like them have dragged metal’s mainstream to them rather than them having made concessions to it. That fare as mean and ugly and unsparing as this can bask in the sunlight is heartening indeed.

Unto the Locust is a quite terrific release, and one which shows that while its creators can thrash as well as any – the forensic This Is the End offers ample evidence of this – this set is more than a one-dimensional dog and pony show. Tracks such as the subtle (you read that correctly) Darkness Within, and the climactic and contagious, even life-affirming, Who We Are display a band that have learned much about tonality; that, and the plain fact that power is nothing without control. Even so, Unto the Locust isn’t likely to be confused with Metallica – it has no crossover appeal. But for metalheads who like their music sharp and executed without recourse to compromise, this is a contender for genre album of the year. From: BBC Review

Saturday, 7 July 2012

Differences Between Brains and Computers

"A good metaphor is something even the police should keep an eye on." - G.C. Lichtenberg

Although the brain-computer metaphor has served cognitive psychology well, research in cognitive neuroscience has revealed many important differences between brains and computers. Appreciating these differences may be crucial to understanding the mechanisms of neural information processing, and ultimately for the creation of artificial intelligence. Below, I review the most important of these differences (and the consequences for cognitive psychology of failing to recognize them): similar ground is covered in this excellent (though lengthy) lecture (http://www.msri.org/cgi-bin/real.cgi?realhost=real.msri.org&realfile=/hosted/pmmb/2002/mumford/1).

Difference # 1: Brains are analogue; computers are digital

It's easy to think that neurons are essentially binary, given that they fire an action potential if they reach a certain threshold, and otherwise do not fire. This superficial similarity to digital "1's and 0's" belies a wide variety of continuous and non-linear processes that directly influence neuronal processing.

For example, one of the primary mechanisms of information transmission appears to be the rate at which neurons fire - an essentially continuous variable. Similarly, networks of neurons can fire in relative synchrony or in relative disarray; this coherence affects the strength of the signals received by downstream neurons. Finally, inside each and every neuron is a leaky integrator circuit, composed of a variety of ion channels and continuously fluctuating membrane potentials.
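A toy leaky integrate-and-fire model (with made-up constants; this is a sketch, not a physiological simulation) makes the point concrete: the spikes themselves are all-or-none, but the membrane variable underneath them, and the resulting firing rate, vary continuously with the input:

```python
def simulate_lif(inputs, dt=1.0, tau=20.0, threshold=1.0):
    """Toy leaky integrate-and-fire neuron (illustrative constants only).
    The membrane potential v is a continuously decaying accumulator; a
    spike is emitted only when v crosses the threshold."""
    v, spike_times = 0.0, []
    for t, current in enumerate(inputs):
        v += dt * (-v / tau + current)   # leak toward rest, plus input drive
        if v >= threshold:
            spike_times.append(t)
            v = 0.0                      # reset after each all-or-none spike
    return spike_times

# A stronger constant input drives a higher firing *rate*: the continuous
# variable riding on top of the binary spikes.
weak = simulate_lif([0.06] * 100)
strong = simulate_lif([0.12] * 100)
```

The output is a list of "1's" (spike times), yet everything that determines when they occur is analogue.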

Failure to recognize these important subtleties may have contributed to Minsky & Papert's infamous mischaracterization of perceptrons, neural networks without an intermediate layer between input and output. In linear networks, any function computed by a 3-layer network can also be computed by a suitably rearranged 2-layer network. In other words, combinations of multiple linear functions can be modeled precisely by just a single linear function. Since their simple 2-layer networks could not solve many important problems, Minsky & Papert reasoned that larger networks could not either. In contrast, the computations performed by more realistic (i.e., nonlinear) networks are highly dependent on the number of layers - thus, "perceptrons" grossly underestimate the computational power of neural networks.
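The XOR function is the classic illustration of this point. No single linear threshold unit can compute it, yet a tiny network of nonlinear threshold units with one hidden layer can. The weights below are hand-picked purely for illustration:

```python
def step(x):
    """All-or-none (nonlinear) threshold unit."""
    return 1 if x > 0 else 0

def xor_net(x1, x2):
    # Hidden layer: two nonlinear threshold units.
    h1 = step(x1 + x2 - 0.5)    # fires if at least one input is on
    h2 = step(x1 + x2 - 1.5)    # fires only if both inputs are on
    # Output unit computes "at least one, but not both".
    return step(h1 - h2 - 0.5)

truth_table = [xor_net(a, b) for a, b in [(0, 0), (0, 1), (1, 0), (1, 1)]]
# truth_table == [0, 1, 1, 0]
```

If `step` were replaced by a linear function, the whole network would collapse into a single linear map and the trick would fail, which is exactly why the nonlinearity (and hence the layer count) matters.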

Difference # 2: The brain uses content-addressable memory

In computers, information in memory is accessed by polling its precise memory address. This is known as byte-addressable memory. In contrast, the brain uses content-addressable memory, such that information can be accessed in memory through "spreading activation" from closely related concepts. For example, thinking of the word "fox" may automatically spread activation to memories related to other clever animals, fox-hunting horseback riders, or attractive members of the opposite sex.

The end result is that your brain has a kind of "built-in Google," in which just a few cues (key words) are enough to cause a full memory to be retrieved. Of course, similar things can be done in computers, mostly by building massive indices of stored data, which then also need to be stored and searched through for the relevant information (incidentally, this is pretty much what Google does, with a few twists).
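Content-addressable recall can be sketched in a few lines (a deliberately crude toy, not a model of spreading activation): instead of fetching whatever sits at a given address, return whichever stored item best overlaps the cue:

```python
def recall(cue_words, memories):
    """Toy content-addressable lookup (illustrative only): rather than
    fetching by address, return the stored item whose *content* overlaps
    the cue the most."""
    return max(memories, key=lambda m: len(cue_words & set(m.split())))

memories = [
    "the quick brown fox jumped over the lazy dog",
    "riders in red coats hunting across the fields",
    "a clever raven solving a puzzle box",
]
best = recall({"fox", "lazy"}, memories)   # a partial cue retrieves the whole trace
```

A couple of key words suffice to pull back the full stored sentence, which is the "built-in Google" behaviour described above; a conventional memory read, by contrast, would need the exact address.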

Although this may seem like a rather minor difference between computers and brains, it has profound effects on neural computation. For example, a lasting debate in cognitive psychology concerned whether information is lost from memory because of simple decay or because of interference from other information. In retrospect, this debate is partially based on the false assumption that these two possibilities are dissociable, as they are in computers. Many are now realizing that this debate represents a false dichotomy (http://act-r.psy.cmu.edu/papers/365/ema_cds_2002_a.pdf).

Difference # 3: The brain is a massively parallel machine; computers are modular and serial

An unfortunate legacy of the brain-computer metaphor is the tendency for cognitive psychologists to seek out modularity in the brain. For example, the idea that computers require memory has led some to search for the "memory area," when in fact these distinctions are far more messy. One consequence of this over-simplification is that we are only now learning that "memory" regions (such as the hippocampus) are also important for imagination (http://cubic-parsec.blogspot.com/2007/01/hippocampus-and-imagination.html), the representation of novel goals (http://forebrain.blogspot.com/2007/01/goal-related-activity-in-hippocampal.html), spatial navigation (http://www.ncbi.nlm.nih.gov/entrez/query.fcgi?cmd=Retrieve&db=PubMed&list_uids=5124915&dopt=Abstract), and other diverse functions.

Similarly, one could imagine there being a "language module" in the brain, as there might be in computers with natural language processing programs. Cognitive psychologists even claimed to have found this module, based on patients with damage to a region of the brain known as Broca's area. More recent evidence has shown that language too is computed by widely distributed and domain-general neural circuits, and Broca's area may also be involved in other computations (see here for more on this (http://www.psych.upenn.edu/stslab/Language_organ.pdf)).

Difference # 4: Processing speed is not fixed in the brain; there is no system clock

The speed of neural information processing is subject to a variety of constraints, including the time for electrochemical signals to traverse axons and dendrites, axonal myelination, the diffusion time of neurotransmitters across the synaptic cleft, differences in synaptic efficacy, the coherence of neural firing, the current availability of neurotransmitters, and the prior history of neuronal firing. Although there are individual differences in something psychometricians call "processing speed," this does not reflect a monolithic or unitary construct, and certainly nothing as concrete as the speed of a microprocessor. Instead, psychometric "processing speed" probably indexes a heterogeneous combination of all the speed constraints mentioned above.

Similarly, there does not appear to be any central clock in the brain, and there is debate as to how clock-like the brain's time-keeping devices actually are. To use just one example, the cerebellum is often thought to calculate information involving precise timing, as required for delicate motor movements; however, recent evidence suggests that time-keeping in the brain bears more similarity to ripples on a pond (http://www.scientificblogging.com/news/how_does_your_brain_tell_time_study_challenges_theory_of_inner_clock) than to a standard digital clock.

Difference # 5: Short-term memory is not like RAM

Although the apparent similarities between RAM and short-term or "working" memory emboldened many early cognitive psychologists, a closer examination reveals strikingly important differences. Although RAM and short-term memory both seem to require power (sustained neuronal firing in the case of short-term memory, and electricity in the case of RAM), short-term memory seems to hold only "pointers" to long term memory whereas RAM holds data that is isomorphic to that being held on the hard disk. (See here (http://www.ncbi.nlm.nih.gov/entrez/query.fcgi?cmd=Retrieve&db=PubMed&list_uids=15377128&dopt=Abstract) for more about "attentional pointers" in short term memory).
Unlike RAM, the capacity limit of short-term memory is not fixed; the capacity of short-term memory seems to fluctuate with differences in "processing speed" (see Difference #4) as well as with expertise and familiarity.

Difference # 6: No hardware/software distinction can be made with respect to the brain or mind

For years it was tempting to imagine that the brain was the hardware on which a "mind program" or "mind software" is executing. This gave rise to a variety of abstract program-like models of cognition, in which the details of how the brain actually executed those programs were considered irrelevant, in the same way that a Java program can accomplish the same function as a C++ program.

Unfortunately, this appealing hardware/software distinction obscures an important fact: the mind emerges directly from the brain, and changes in the mind are always accompanied by changes in the brain. Any abstract information processing account of cognition will always need to specify how neuronal architecture can implement those processes - otherwise, cognitive modeling is grossly underconstrained. Some blame this misunderstanding for the infamous failure of "symbolic AI (http://www.psych.utoronto.ca/%7Ereingold/courses/ai/symbolic.html)."

Difference # 7: Synapses are far more complex than electrical logic gates

Another pernicious feature of the brain-computer metaphor is that it seems to suggest that brains might also operate on the basis of electrical signals (action potentials) traveling between individual logic gates. Unfortunately, this is only half true. The signals which are propagated along axons are actually electrochemical in nature, meaning that they travel much more slowly than electrical signals in a computer, and that they can be modulated in myriad ways. For example, signal transmission depends not only on the putative "logic gates" of synaptic architecture but also on the presence of a variety of chemicals in the synaptic cleft, the relative distance between synapse and dendrites, and many other factors. This adds to the complexity of the processing taking place at each synapse - and it is therefore profoundly wrong to think that neurons function merely as transistors.
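The contrast can be caricatured in a few lines (every parameter here is invented for illustration): a gate's output is a fixed function of its inputs, while a synapse's effect on the next cell is graded and context-dependent:

```python
def transistor_gate(a, b):
    """A logic gate: output depends only on the inputs, every time."""
    return a and b

def synaptic_transmission(spike, weight, transmitter=1.0,
                          distance_factor=1.0, modulator=1.0):
    """Toy synapse (all parameters invented for illustration): the effect
    of the same spike is scaled by transmitter availability, geometry,
    and neuromodulators, so it is graded rather than 0/1."""
    return spike * weight * transmitter * distance_factor * modulator

# The identical spike at the identical synapse can have a different effect
# depending on the chemical context:
full = synaptic_transmission(1, 0.8, transmitter=1.0)
depleted = synaptic_transmission(1, 0.8, transmitter=0.3)
```

The gate is deterministic in its inputs alone; the synapse's "truth table" shifts with its chemical state, which is the modulation the paragraph above describes.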

Difference # 8: Unlike computers, processing and memory are performed by the same components in the brain

Computers process information from memory using CPUs, and then write the results of that processing back to memory. No such distinction exists in the brain. As neurons process information they are also modifying their synapses - which are themselves the substrate of memory. As a result, retrieval from memory always slightly alters those memories (usually making them stronger, but sometimes making them less accurate - see here (http://develintel.blogspot.com/2006/05/origins-of-memory-distortion.html) for more on this).
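A minimal sketch of the idea (the numbers and update rule are invented): in a store where reading also writes, the act of retrieval itself changes the trace:

```python
def retrieve(memory, key, learning_rate=0.1):
    """Toy demonstration that retrieval is not read-only: accessing a trace
    also strengthens it, because the same components store and process.
    (The strengthening rule is invented for illustration.)"""
    strength = memory[key]
    # Using the trace nudges its strength toward the maximum of 1.0.
    memory[key] = strength + learning_rate * (1.0 - strength)
    return strength

mem = {"fox": 0.5}
retrieve(mem, "fox")
# mem["fox"] is now greater than 0.5: reading altered the store.
```

A RAM read, by contrast, leaves the stored bits untouched; there is no neural equivalent of that guarantee.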

Difference # 9: The brain is a self-organizing system

This point follows naturally from the previous point - experience profoundly and directly shapes the nature of neural information processing in a way that simply does not happen in traditional microprocessors. For example, the brain is a self-repairing circuit - something known as "trauma-induced plasticity" kicks in after injury. This can lead to a variety of interesting changes, including some that seem to unlock unused potential in the brain (known as acquired savantism (http://www.sciam.com/article.cfm?articleID=0006216C-45CB-116C-85CB83414B7F0000&sc=I100322)), and others that can result in profound cognitive dysfunction (as is unfortunately far more typical in traumatic brain injury and developmental disorders).

One consequence of failing to recognize this difference has been in the field of neuropsychology, where the cognitive performance of brain-damaged patients is examined to determine the computational function of the damaged region. Unfortunately, because of the poorly-understood nature of trauma-induced plasticity, the logic cannot be so straightforward. Similar problems underlie work on developmental disorders and the emerging field of "cognitive genetics", in which the consequences of neural self-organization are frequently neglected (http://www.psychology.nottingham.ac.uk/staff/gs/Publications_files/TiCS_Final.pdf) .

Difference # 10: Brains have bodies

This is not as trivial as it might seem: it turns out that the brain takes surprising advantage of the fact that it has a body at its disposal. For example, despite your intuitive feeling that you could close your eyes and know the locations of objects around you, a series of experiments in the field of change blindness (http://viscog.beckman.uiuc.edu/djs_lab/demos.html) has shown that our visual memories are actually quite sparse. In this case, the brain is "offloading" its memory requirements to the environment in which it exists: why bother remembering the location of objects when a quick glance will suffice? A surprising set of experiments by Jeremy Wolfe (http://search.bwh.harvard.edu/new/pubs/targetsearch.pdf) has shown that even after being asked hundreds of times which simple geometrical shapes are displayed on a computer screen, human subjects continue to answer those questions by gaze rather than rote memory. A wide variety of evidence from other domains suggests that we are only beginning to understand the importance of embodiment in information processing.

Bonus Difference: The brain is much, much bigger than any [current] computer

Accurate biological models of the brain would have to include some 225,000,000,000,000,000 (225 million billion) interactions between cell types, neurotransmitters, neuromodulators, axonal branches and dendritic spines, and that doesn't include the influences of dendritic geometry, or the approximately 1 trillion glial cells which may or may not be important for neural information processing. Because the brain is nonlinear, and because it is so much larger than all current computers, it seems likely that it functions in a completely different fashion. (See here (http://develintel.blogspot.com/2006/01/complexity-and-biologically-accurate_04.html) for more on this.) The brain-computer metaphor obscures this important, though perhaps obvious, difference in raw computational power.
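One back-of-envelope way to appreciate the scale, using assumed round numbers rather than the article's exact accounting (the per-synapse multiplier in particular is chosen purely to show how quickly the count explodes):

```python
# Back-of-envelope scale check with assumed round figures:
# ~1e11 neurons, ~1e4 synapses per neuron, and suppose each synapse
# participates in a couple of hundred distinct interaction types
# (cell types x transmitters x modulators x branches...).
neurons = 10**11
synapses_per_neuron = 10**4          # assumed average
interaction_types = 225              # assumed multiplier, illustration only

synapses = neurons * synapses_per_neuron       # ~1e15 synapses
interactions = synapses * interaction_types    # ~2.25e17
print(f"{interactions:.2e}")                   # ~2.25e+17, i.e. 225 million billion
```

Even before counting glia or dendritic geometry, the combinatorics land in the hundreds of quadrillions, which is the order of magnitude the paragraph above cites.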

Friday, 6 July 2012

Video nasty

Last weekend I got around to seeing 'Ringu', the Japanese horror film that inspired an American remake, 'The Ring'. Having seen the latter when it came out in the cinema, there weren't many surprises, although 'Ringu' stays closer to the original novel and has more rudimentary special effects. For example, the facial contortions of the victims of the video curse are not quite so grotesque in 'Ringu'. That said, Sadako's jerky, shuffling walk (created by filming the actress walking backwards and then playing the film in reverse) is marvellously disturbing.

'The Ring' certainly gave me a few uneasy nights, despite the fact that the film's central premise is ludicrous. Of course, watching 'Ringu' is so unsettling because (a) the horror in the film derives from a film, which has the effect of putting the (real-life) viewer under the spell, and (b) the omnipresent and trusted home TV is reconfigured as a conduit for malice.

Iron Maiden - 'Afraid To Shoot Strangers'

The last album from the first Bruce Dickinson period, 'Fear of the Dark' may not be Maiden's finest full-length, but it certainly has its moments. The prominent synth on 'Afraid To Shoot Strangers' makes one nostalgic for the glory days of 'Seventh Son...' (particularly 'Infinite Dreams', another Steve Harris composition). The striking, solitary leads in the song's middle and end are simple but gloriously effective.

System of a Down

Back in about 1998, I slammed a System of a Down sampler tape that had been sent to me for review. Those were the dark days when nu-metal was beginning to make threatening noises, and early SOAD didn't seem to offer anything very different from that dubious template.

Despite this, it has to be said (having spun it for the first time today) that the band's breakthrough effort, 'Toxicity' is a strong record. Its songs are short, rhythmically interesting and pleasantly melodic, while the riffs arguably owe more to thrash than they do to nu-metal. 'Chop Suey!' is an obvious hit, although possibly the first US number one to feature baby blastbeats!

Personality plasticity in trout

Does experience influence the personality of fish? A paper by Frost et al. (2007, Proc R Soc B, 274, 333-9) found that 'bold' rainbow trout who lost fights or who observed other trout exhibiting 'shy' behaviour became less bold themselves. This was tested behaviourally by examining fishes' inquisitive reactions to Lego bricks or novel prey dropped in front of them. However, there was an asymmetry to this social learning: shy trout who observed bold trout did not subsequently become bolder. Fish who won fights tended to engage in more approach behaviour towards novel objects, although this was also true for shy fish who lost fights (the so-called 'Desperado' effect).

Egyptian unfreedom

Spare a thought for Abdel Karem Soliman, the Egyptian student and blogger who was recently sentenced to four years in jail for writing posts showing 'contempt for religion' (penalty: 3 years) and 'insulting the president' (penalty: 1 year). Ominously, the website campaigning for his release seems to be down at the moment, but this tyrannical treatment deserves widespread exposure.

'The Last King of Scotland'

The latest film about the brutal Ugandan dictator Idi Amin does not disappoint. Forest Whitaker's swivel-eyed enactment of the paranoid leader is magnificent, while James McAvoy also shines as the brave young doctor (Nicholas Garrigan) who becomes his personal physician. The movie focuses on Amin's personal magnetism and it is a while before his murderous tendencies begin to dawn on Garrigan. As a proud Scot, the doctor clearly sympathises with Amin, who put one over the British army (who trained him) when he became the leader of an independent Uganda. Garrigan's desire to escape from a puritanical and traditional family practice also helps to explain his initial credulity. The final scales fall from his eyes after seeing the results of a particularly nasty detruncation.

Ataraxia - 'Al Ballo Masquerato'

When the Italian neoclassical/gothic/folk/darkwave musicians Ataraxia released 'Il Fantasma dell'Opera' on Avantgarde Music in 1996, it brought them a little more attention from metal folk. Francesca Nicoli had recently helped out on MonumentuM's 'In Absentia Christi' opus, and her heavenly soprano on 'Al Ballo Masquerato' never fails to tug the heartstrings. Though the song could have benefitted from a genuine string accompaniment in place of synth, its melody combines sprightly grace with the faint taste of melancholy.

Elsewhere, the eerie, gothic cover of Kate Bush's 'Wuthering Heights' ('La Nouva Marguerita') isn't bad either.

'Shooting Dogs'

A depressing tale told in the midst of the Rwandan genocide, 'Shooting Dogs' shines a light on the impotence of the United Nations in a benighted country descending into hell. It's told simply, without digression or ornamentation, save for the brief TV footage showing a UN official doing her best to avoid using the 'G' word. John Hurt stars as the Catholic priest sheltering fleeing Tutsis, while Hugh Dancy plays the callow English youth teaching at his school. The brutal violence is shocking in a way that does not require excessive gore, and although more could probably have been made of the script, it's not a film that will leave your thoughts easily.

There's a moving moment in which Hurt's character, realising that he is a dead man, bids farewell to his protégé with the words: 'Find fulfillment in everything.' As Marie, one of the survivors of the genocide, reminds us, 'This time we have been given, we must use it well.'

(No thesp myself, this blogger once shared a stage with Hugh Dancy, though back at school we knew him as 'Jack'.)

Dusk/dawn asymmetry around the solstice

For all my amateurish enthusiasm for astronomy, I recently came across a puzzle relating to the solstice that I couldn't immediately fathom. Why is the shortest day not also the day of the latest sunrise? In fact, in the Northern Hemisphere, sunrise continues to get later as we pass the winter solstice, before it gradually begins getting earlier again.

Obviously there is some kind of asymmetry at work here. The first thing that came to mind was the fact that the Earth is nearest the Sun in January (perihelion), though on reflection this does not seem to be the main effect. The real explanation is (I think, at least mostly) the difference between the sidereal day and the solar day: because the Earth rotates in the same direction as it orbits the Sun, it has to rotate slightly more than 360 degrees between successive solar noons at a given location. Around the winter solstice the solar day runs a little longer than 24 hours, so solar noon, and with it both sunrise and sunset, drifts later each day; since the length of daylight changes only slowly near the solstice, this drift dominates and sunrise keeps getting later in the Northern Hemisphere even after the shortest day has passed.
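A rough numerical check bears this out. The sketch below uses the standard three-term Fourier approximation to the equation of time (accurate to a minute or two, not an ephemeris): as its value falls through late December, clock-time solar noon (12:00 minus the equation of time) drifts later from day to day, dragging sunrise with it.

```python
import math

def equation_of_time(day_of_year):
    """Approximate equation of time in minutes (sundial time minus clock
    time), via the common three-term Fourier approximation."""
    b = math.radians(360.0 / 365.0 * (day_of_year - 81))
    return 9.87 * math.sin(2 * b) - 7.53 * math.cos(b) - 1.5 * math.sin(b)

# Clock time of solar noon is 12:00 minus this value, so a falling
# equation of time means solar noon (and sunrise) arriving later each day.
for day in (350, 355, 360, 365):   # mid-December to New Year's Eve
    print(day, round(equation_of_time(day), 1))
```

Over this stretch the value drops from a few minutes positive to a few minutes negative, a drift of several minutes in solar noon, while the day length near the solstice changes by only seconds per day, which is why the drift wins.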

Neurobiology of intelligence

Not exactly hot off the press, but here's an interesting review article on neuroscientific insights into intelligence (Gray & Thompson, 2004, Nat Rev Neurosci, 5, 1-13). The authors focus on fluid intelligence (Gf), which is moderately correlated with brain volume. Neuroimaging and neuropsychological research implicate the lateral prefrontal cortex (PFC) as a brain region that is active during Gf tests and related to individual differences in Gf. The review explains that grey matter volume in the lateral PFC is both correlated with fluid intelligence and largely under genetic control, although it should be remembered that environmental input has been related to differences in brain morphology. Individual genes associated with Gf are difficult to pin down, while environmental factors are rather more easily detected (e.g., lead exposure).

On the vexed question of possible group-based differences, the authors propose that it is better to buttress ethical standards than to censor research in this field entirely. For example, they propose that data should only be retained if participants give their active support to a study's aims after a thorough debriefing.

A wing and no prayer

For over half my life I've been a Microsoft Flight Simulator pilot. After taking off from the chalk-on-grass airport runways of version 4.0, my virtual wings have flown me through to FS 2004, with its ATC chatter and AI aircraft. (My current spec would baulk at FSX). Because it's a simulation rather than a game, and there aren't any guns, missiles or bombs, the goal of FS is whatever the user wants it to be. For some it will be crazy jumbo jet aerobatics. For others it will be perfectly recreating their last holiday flight to Tenerife. For most people, horrible crashes and belly landings are inevitable. And sometimes, things aren't all that realistic.

This afternoon, for example, I flew a 737 from Exeter to Salzburg. Having visited the real Salzburg, I knew about the spectacular mountains, but didn't bother checking the approach charts. ATC cleared me for a straight-in visual approach to runway 34. It was dusk and I was not 'visual' with the surrounding terrain, but all seemed well as the runway lights appeared at 8 miles out and landing clearance was granted. There was a slight problem: the plane was 9,000 feet above the field after clearing the mountains and was now way too high. Solution? Close the throttle, deploy the spoilers and sink like a stone until I'm safely bouncing along the tarmac. In reality, straight-in approaches to runway 34 would be too dangerous and instead aircraft have to circle and land. Still, at least the virtual captain and passengers walked away.

Sometimes they don't. One time, a jet under my command was accelerating rather sluggishly down the shorter runway at Stockholm Skavsta. With no room to abort, I pulled the plane into the air at under 140 knots. It rose a few feet, stalled, then crashed into a forest on the airfield boundary. The problem? I'd forgotten to retract the spoilers after the previous landing. Doh!

Opeth - 'The Twilight is My Robe'

My first acquaintance with Opeth came about via a dodgy tape-traded copy of their 'Orchid' debut. The recording was effectively in mono so that only one of the lead guitars rang through, but despite this handicap the music proved to be absolutely beguiling.

As a sucker for dual guitar harmonies, the Swedes' first three albums have always appealed to me more than their subsequent output. 'The Twilight is My Robe' gets better and better as it goes on, ascending to a climactic powerchord (8:58) that clears the air before the closing refrain.