The future of computing

The Economist on the slowing pace of computer hardware improvement:

IN 1971 the fastest car in the world was the Ferrari Daytona, capable of 280kph (174mph). The world’s tallest buildings were New York’s twin towers, at 415 metres (1,362 feet). In November that year Intel launched the first commercial microprocessor chip, the 4004, containing 2,300 tiny transistors, each the size of a red blood cell.

Since then chips have improved in line with the prediction of Gordon Moore, Intel’s co-founder. According to his rule of thumb, known as Moore’s law, processing power doubles roughly every two years as smaller transistors are packed ever more tightly onto silicon wafers, boosting performance and reducing costs. A modern Intel Skylake processor contains around 1.75 billion transistors—half a million of them would fit on a single transistor from the 4004—and collectively they deliver about 400,000 times as much computing muscle. This exponential progress is difficult to relate to the physical world. If cars and skyscrapers had improved at such rates since 1971, the fastest car would now be capable of a tenth of the speed of light; the tallest building would reach halfway to the Moon.

The impact of Moore’s law is visible all around us. Today 3 billion people carry smartphones in their pockets: each one is more powerful than a room-sized supercomputer from the 1980s. Countless industries have been upended by digital disruption. Abundant computing power has even slowed nuclear tests, because atomic weapons are more easily tested using simulated explosions rather than real ones. Moore’s law has become a cultural trope: people inside and outside Silicon Valley expect technology to get better every year.

But now, after five decades, the end of Moore’s law is in sight. Making transistors smaller no longer guarantees that they will be cheaper or faster.

/Source
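The Economist's analogy can be verified with quick arithmetic: apply the quoted 400,000× gain to 1971's benchmarks (a sketch; the speed of light and Earth–Moon distance are standard values, not from the article):

```python
# Apply the chips' ~400,000x computing gain since 1971 to that year's
# fastest car and tallest building, as the excerpt does.
GAIN = 400_000                         # computing-muscle multiple quoted above
SPEED_OF_LIGHT_KPH = 1_079_252_848.8   # 299,792,458 m/s expressed in km/h
MOON_DISTANCE_KM = 384_400             # average Earth-Moon distance

car_kph = 280 * GAIN                   # Ferrari Daytona, 280 kph
building_km = 415 * GAIN / 1000        # twin towers, 415 m, result in km

print(car_kph / SPEED_OF_LIGHT_KPH)    # ~0.10 -> a tenth of light speed
print(building_km / MOON_DISTANCE_KM)  # ~0.43 -> roughly halfway to the Moon
```

Both figures land where the article says: about a tenth of light speed, and a little under halfway to the Moon.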

Antonio Ortiz

Antonio Ortiz has always been an autodidact with an eclectic array of interests. Fascinated with technology, advertising, and culture, he has forged a career that combines them all. In 1991 Antonio developed one of the very first websites to market the arts. It was text based, available only to computer scientists, and increased attendance at the Rutgers Arts Center, where his professional career truly began. Since then Antonio has been an early adopter and innovator, merging technology and marketing with his passion for art, culture, and entertainment. For a more in-depth look at those passions, visit SmarterCreativity.com.

Robots and Babies Both Use Curiosity to Learn

"Curiosity Depends on What You Already Know," Zach St. George for Nautilus:

Scientists who study the mechanics of curiosity are finding that it is, at its core, a kind of probability algorithm—our brain’s continuous calculation of which path or action is likely to gain us the most knowledge in the least amount of time. Like the links on a Wikipedia page, curiosity builds upon itself, every question leading to the next. And as with a journey down the Wikipedia wormhole, where you start dictates where you might end up. That’s the funny thing about curiosity: It’s less about what you don’t know than about what you already do.
...
Brain studies suggest that this “novelty bonus”—the additional weight we give to new options—stems at least in part from the euphoric feeling it gives us. For instance, a 2007 study found that, like Pavlov’s dog salivating at the ring of a bell, the part of our brain that processes rewards like love and sweets activates when we expect to find something new, even if that expectation doesn’t play out. These findings, the researchers conclude, “raise the possibility that novelty itself is processed akin to a reward.”
/Source
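The "probability algorithm" and "novelty bonus" described above have a close analogue in multi-armed bandit algorithms, where options that have been tried least receive an extra exploration bonus. A minimal sketch using the standard UCB1 formulation (my analogy, not the article's):

```python
import math

def ucb1_choice(counts, values, t):
    """Pick an option balancing known value against a novelty bonus.

    counts[i]: times option i was tried; values[i]: its average reward;
    t: total trials so far. Untried options get priority outright.
    """
    for i, n in enumerate(counts):
        if n == 0:                      # never tried: maximal novelty
            return i
    # The exploration bonus shrinks as an option becomes familiar,
    # mirroring the "novelty bonus" the studies describe.
    scores = [values[i] + math.sqrt(2 * math.log(t) / counts[i])
              for i in range(len(counts))]
    return max(range(len(scores)), key=scores.__getitem__)

# A familiar option vs. a barely-tried one with the same average value:
print(ucb1_choice(counts=[100, 2], values=[0.6, 0.6], t=102))  # -> 1
```

Given equal expected reward, the algorithm picks the newer option, just as the 2007 study found our reward circuitry does.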

Antonio Ortiz

Attention Residue Is Ruining Your Concentration

Tanya Basu, for The Science of Us:

Which means ... what, exactly? Newport explains it using a 2009 paper titled “Why Is It So Hard to Do My Work?” from Sophie Leroy, a business-school professor at the University of Minnesota. She studied a modern, daily workplace conundrum: switching between tasks and getting things done. In two experiments, Leroy finds that people are less productive when they are constantly moving from one task to another instead of focusing on one thing at a time.

...

Leroy calls this carryover from one task to another “attention residue,” where you’re still thinking of a previous task as you start another one. Even if you finish your task completely, you still have some attention residue swirling around your head as you embark on your next task, meaning that bullet point on your to-do list doesn’t start off on the right foot. In other words, as much as multitasking gets nods for being an asset in today’s time-crunched world, it’s not really a good thing when it comes to your productivity, and it's actually a time-waster.

/Source
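Leroy's finding can be illustrated with a toy model (my construction, not hers): assume every switch into a task leaves residue that taxes some fraction of the work, so the same total workload costs more when chopped into many switches.

```python
def time_to_finish(tasks, switch_penalty=0.2):
    """Total time for a list of task durations, charging a fixed
    fraction of each task as 'attention residue' after a switch.
    Purely illustrative numbers, not from Leroy's experiments."""
    total, prev = 0.0, None
    for t in tasks:
        total += t
        if prev is not None:          # switching in: residue tax
            total += t * switch_penalty
        prev = t
    return total

focused     = time_to_finish([60, 60])   # two tasks, done in long blocks
interleaved = time_to_finish([15] * 8)   # same 120 minutes, chopped up
print(focused, interleaved)  # 132.0 vs 141.0: more switches, more residue
```

The exact penalty is made up, but the shape of the result matches the research: identical work takes longer the more often attention has to restart.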

Antonio Ortiz

A Neuroscientist Explains 'Why We Snap'

Melissa Dahl interviews R. Douglas Fields, a senior investigator at the National Institutes of Health, about why we sometimes snap:

Fields argues that there are nine major triggers that invoke the rage response, which he has assembled into the acronym LIFEMORTS: life and limb, as in your physical safety; and insult, meaning a verbal threat. The next six are self-explanatory: family, environment, mate, order in society, resources, tribe — you already know each of these are things you’d fight for if you felt they were in danger. The last is stopped, the idea that any animal (humans included) will ready itself to fight if it feels restrained or trapped. So for each of these nine triggers, the rage kicks in to prepare you for a potential fight, because you feel like something essential has been threatened.

These triggers evolved in our brains for a reason, and at times they give rise to defensive action that is as necessary for modern humans as it was for our early ancestors. But they can misfire, too, sometimes to violent, irreversible effect. Fields spoke with Science of Us about what happens inside the brain when we flip our proverbial lids, and how we can begin to control this impulse.

/Source
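As a quick reference, the nine triggers and the acronym they spell can be laid out as a mapping (labels paraphrased from the interview):

```python
# The nine LIFEMORTS rage triggers, one per letter, as Fields describes them.
LIFEMORTS = {
    "L": "Life and limb (physical safety)",
    "I": "Insult (a verbal threat)",
    "F": "Family",
    "E": "Environment",
    "M": "Mate",
    "O": "Order in society",
    "R": "Resources",
    "T": "Tribe",
    "S": "Stopped (feeling restrained or trapped)",
}
assert "".join(LIFEMORTS) == "LIFEMORTS"  # the mnemonic checks out
```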

Antonio Ortiz

Study Finds That Smart People Live Longer Than Not-Smart People

Fast Company reports:

People are living longer than ever, says a report from the World Health Organization (WHO), but smart people live longer yet. People who are, shall we say, less smart die younger than more intelligent folks, and various studies around the world are attempting to find out why.

IQ affects how long you manage to stick around in this life, with a 15% increase in IQ giving a 21% better chance of not dying. These numbers come from a cohort study by researchers Lawrence Whalley and Ian Deary, using the Scottish Mental Surveys, a historic survey in which almost all 11-year olds in Scotland got the same IQ test on the same day in 1932. The new study found out which of these subjects were still alive, and at which age others had died.

/Source
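The quoted numbers imply a simple scaling if the effect is assumed to compound multiplicatively per quoted 15-unit IQ step (an assumption for illustration; the study reports only the single figure):

```python
def relative_mortality(iq_advantage, step=15, reduction=0.21):
    """Relative mortality risk for a given IQ advantage, assuming the
    quoted 21%-per-15-units effect compounds multiplicatively.
    (Illustrative extrapolation, not a claim from the study.)"""
    return (1 - reduction) ** (iq_advantage / step)

print(relative_mortality(15))   # 0.79 -> the quoted 21% lower risk
print(relative_mortality(30))   # ~0.62 under the compounding assumption
```

Under this reading, a 30-unit advantage would correspond to roughly 38% lower mortality, though nothing in the excerpt says the effect actually extrapolates linearly on the log scale.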

Antonio Ortiz