Good Reads

http://blogs.scientificamerican.com/octopus-chronicles/files/2013/12/octopus_sucker_label.jpg

We all know that the male octopus uses his third right arm as a penis. (Oh, you didn’t? It’s true. Sometimes he even detaches it to give to the female.)

In fact, all of the arms, if not so specialized, are easily identifiable—as numbers one, two, three or four on the left or right side. This simple scheme helps scientists track whether an octopus prefers a left or right side (they do seem to exhibit a “handedness”) and how frequently it uses different arms for various tasks—from crawling to feeling.

But we have no such way to identify each arm’s hundreds of suckers. This is a sticking point for scientists, who are not able to dive into more specific research into the capabilities, sizes and various uses of these impressive organs—which can range in size from a matter of millimeters to several centimeters across.

A team of researchers is thus calling for a universal code to ID specific octopus suckers. “In scientific communication, a common terminology is essential,” the team wrote in a paper on the matter, published last month in Marine and Freshwater Behavior and Physiology.

But it is not so simple as appending numbers one through 2,240 to these appendages.​
- read the full article Octopus, How Do You Count Your Suckers? (from Scientific American)
 
http://farm4.staticflickr.com/3260/2692068440_d707d77375.jpg
image courtesy Dan Pope (Flickr)

Thus, for some of us, the night of carousing also means a morning of hangovers.

Just in the nick of time, here’s our complete guide to the science of hangovers—what we know, what we don’t know, and how you can use this information to minimize your suffering.

Why Do Hangovers Happen?

Given that they’re such a widespread health phenomenon, it’s perhaps a bit surprising that scientists still don’t fully understand the causes of a hangover. (They do, however, have a scientific name for them: veisalgia.) It’s far from clear why, after all traces of alcohol have been fully expelled from your body, you can still experience a load of awful symptoms, including headache, dizziness, fatigue, nausea, stomach problems, drowsiness, sweating, excessive thirst and cognitive fuzziness.

The simplest and most familiar explanation is that drinking alcohol causes dehydration, both because it acts as a diuretic, increasing urine production, and because people who are drinking heavily for multiple hours probably aren’t drinking much water during that time period. But studies examining the link between dehydration and hangovers have turned up some surprising data. One, for instance, found no correlation between high levels of the hormones associated with dehydration and the severity of a hangover. It’s most likely that dehydration accounts for some of the symptoms of a hangover (dizziness, lightheadedness and thirst) but that there are other factors at work as well.

Most scientists believe that a hangover is driven by alcohol interfering with your body’s natural balance of chemicals in a more complex way. One hypothesis is that in order to process alcohol, your body must convert the coenzyme NAD+ into an alternate form, NADH. With an excess buildup of NADH and insufficient quantities of NAD+, the thinking goes, your cells are no longer capable of efficiently performing a number of metabolic activities—everything from absorbing glucose from the blood to regulating electrolyte levels. But this hypothesis, too, has been contradicted by data: In studies, people with severe hangovers weren’t found to have lower levels of electrolytes or glucose in their blood.

The most compelling theory, at the moment, is that hangovers result from a buildup of acetaldehyde, a toxic compound, in the body. As the body processes alcohol, acetaldehyde is the very first byproduct, and it’s estimated to be between 10 and 30 times as toxic as alcohol itself. In controlled studies, it’s been found to cause symptoms such as sweating, skin flushing, nausea and vomiting.​
- read the full article Your Complete Guide to the Science of Hangovers (from Smithsonian Magazine)
 


http://wattsupwiththat.files.wordpress.com/2014/01/clip_image022.jpg

By Bob Tisdale

Even before the study of human-induced global warming became fashionable, tax dollars had funded a major portion of that research. Government organizations continue to supply the vast majority of the money for those research efforts. Yet despite the tens of billions of dollars expended over the past couple of decades, there has been little increase in our understanding of what the future might bring.

The recent 5th Assessment Report from the Intergovernmental Panel on Climate Change (IPCC) proclaims that global surface temperatures are projected to increase through the year 2100, that sea levels will continue to rise, that in some regions rainfall might increase and in others it will decrease, etc. But those were the same basic messages as in the 4th Assessment Report in 2007, the 3rd Assessment Report in 2001, and the 2nd Assessment Report in 1995. So we’ve received little benefit for all of those tax dollars spent over the past few decades.

Those predictions of the future are based on simulations of climate using numerical computer programs known as climate models. Past and projected factors that are alleged to impact climate on Earth (known as forcings) serve as inputs to the models. Then the models, based on more assumptions made by programmers, crunch a lot of numbers and regurgitate outputs that are representations of what the future might hold in store, with the monumental supposition that the models properly simulate climate.

But it is well known that climate models are flawed, that they do not properly simulate the climate metrics that are of interest to policymakers and the public—like surface temperatures, precipitation and sea ice area. And in at least one respect the current generation of climate models performs more poorly than the earlier generation. That is, climate models are getting worse, not better, at simulating Earth’s climate.

With that in mind, the following are sample questions that policymakers should be asking climate scientists and agencies who receive government funding for research into human-induced global warming—along with information to support the questions...
 
http://www.washingtonpost.com/rf/image_606w/2010-2019/WashingtonPost/2013/11/07/Others/Images/2013-11-06/_CMX89721383792786.jpg

ZHUZHOU, China — Regulars of the Jianba barbershop in this southern city recently found it shuttered, with a curious note taped to the door.

Dear customers, I got a call from my daughter yesterday. I have been away from her so long, she doesn’t even know how to call me ‘Daddy’ anymore . . . I beg you for a week off to visit my family.

The letter, photographed by a passerby, was posted on the Chinese equivalent of Twitter and quickly went viral. It reflected a growing angst in this country over “left-behind children.”

More than 61 million children — about one-fifth of the kids in China — live in villages without their parents. Most are the offspring of peasants who have flocked to cities in one of the largest migrations in human history. For three decades, the migrants’ cheap labor has fueled China’s rise as an economic juggernaut. But the city workers are so squeezed by high costs and long hours that many send their children to live with elderly relatives in the countryside.

The barber who posted the note, Wu Hongwei, and his wife, Wang Yuan, had left their daughter with her grandparents in a remote village when she was 9 months old. The couple thought the 340-mile distance was a challenge they could overcome.

Every day, they phoned and told the little girl that “Mommy loves you” and “Daddy misses you.” They taped photos of themselves on the concrete walls of her room at her grandparents’ house.

But after almost two years, they have come to a stark realization.

“We are complete strangers to her,” Wu said.​
- read the full article In China, one in five children live in rural villages without their parents (from The Washington Post)
 
http://a3.img.talkingpointsmemo.com/image/upload/c_fill,fl_keep_iptc,g_faces,h_365,w_652/pjvcatxdydvayo9iz2g9.jpg

Donna received the letter canceling her insurance plan on Sept. 16. Her insurance company, LifeWise of Washington, told her that they'd identified a new plan for her. If she did nothing, she'd be covered.

A 56-year-old Seattle resident with a 57-year-old husband and 15-year-old daughter, Donna had been looking forward to the savings that the Affordable Care Act had to offer.

But that's not what she found. Instead, she'd be paying an additional $300 a month for coverage. The letter made no mention of the health insurance marketplace that would soon open in Washington, where she could shop for competitive plans, and made only an oblique reference to financial help that she might qualify for, if she made the effort to call and find out.

Otherwise, she'd be automatically rolled over to a new plan -- and, as the letter said, "If you're happy with this plan, do nothing."

If Donna had done nothing, she would have ended up spending about $1,000 more a month for insurance than she will now that she went to the marketplace, picked the best plan for her family and accessed tax credits at the heart of the health care reform law.

"The info that we were sent by LifeWise was totally bogus. Why the heck did they try to screw us?" Donna said. "People who are afraid of the ACA should be much more afraid of the insurance companies who will exploit their fear and end up overcharging them."

Donna is not alone.

Across the country, insurance companies have sent misleading letters to consumers, trying to lock them into the companies' own, sometimes more expensive health insurance plans rather than let them shop for insurance and tax credits on the Obamacare marketplaces -- which could lead to people like Donna spending thousands more for insurance than the law intended. In some cases, mentions of the marketplace in those letters are relegated to a mere footnote, which can be easily overlooked.​
 
http://cdn.static-economist.com/sites/default/files/imagecache/original-size/images/print-edition/20131221_STC950.png

Green corridors once crossed the Sahara, allowing mankind to do so, too

THE “Out of Africa” model of humanity’s spread is now about as well established as anything can be which attempts to describe what happened 60,000 years ago. Human beings, it posits, evolved on that continent some 200,000 years in the past and then, millennia later, a band of intrepid migrants crossed the straits of Bab el Mandeb into what is now Yemen to colonise the rest of the planet.

This does not, though, explain how Africa itself was colonised—in particular, how people crossed from the sub-Saharan part of the continent, where the evidence suggests Homo sapiens originated, to its northern fringes. That happened much earlier, around 125,000 years ago.​
- read the full article A river ran through it (from The Economist)
 
http://s.wsj.net/public/resources/images/P1-BO546_SPEED__G_20131229190212.jpg

SILVER SPRING, Md.—When it comes to playing fast, few can beat Gene "The Human Drum Machine" Hoglan.

"There are young dudes coming up behind me who want to take my throne, but I'm not going to give it up that easy," says Mr. Hoglan, 46 years old, who warms up with drum sticks twice as heavy as usual, a trick he learned from baseball that makes his normal sticks seem lighter. To tone his legs, crucial for foot-drumming, he wears 3-pound ankle weights. When he pops these off, he can really fly.

Though he weighs nearly 300 pounds and is, in his own words, "really lethargic," Mr. Hoglan has been called one of the quickest and most precise drummers in heavy metal.

Ever since spinning out of rock 'n' roll in the 1970s, metal has gotten faster and faster. Like many drummers of his generation, Mr. Hoglan left the drum-pounding abilities of his heroes in the dust, fueling an arms race that has sparked an unlikely crisis. Speed metal, as this subgenre is called, has become so fast that drummers can't keep up. Instead, more bands have quietly switched to using computerized drum machines.

Mark Mynett, a British music producer and lecturer at the University of Huddersfield's Department of Engineering and Technology near Manchester, England, says he once "produced a whole album where the drummer didn't play a single bass-drum track on a single song." He likens it to "magazines airbrushing models."

How did heavy-metal drumming get so fast?

Ian Christe, author of "Sound of the Beast: The Complete Headbanging History of Heavy Metal," says the genre speeded up in the 1980s, when drummers for bands Metallica, Slayer and Testament one-upped older groups by making metal more about fast rhythms than melody. Over the next decade, "grindcore" and "death metal" groups got even more extreme. Musicians "took to wearing sweatpants on stage" because the playing was so athletic, Mr. Christe writes.

When new technologies arrived, metal drumming standards entered the realm of the physically impossible. Today, many bands write songs using computers without even rehearsing them. When an English band recently came to Mr. Mynett's studio, "none of the musicians could play the parts they'd written," he said. The band's bass-drum tracks—the foundation of metal songs—had to be digitally constructed.​
- read the full article In Speed Metal, Fastest Drummers Take a Beating (from The Wall Street Journal)
 
http://cdn.static-economist.com/sites/default/files/imagecache/original-size/images/print-edition/20131221_STC950.png

Green corridors once crossed the Sahara, allowing mankind to do so, too

THE “Out of Africa” model of humanity’s spread is now about as well established as anything can be which attempts to describe what happened 60,000 years ago. Human beings, it posits, evolved on that continent some 200,000 years in the past and then, millennia later, a band of intrepid migrants crossed the straits of Bab el Mandeb into what is now Yemen to colonise the rest of the planet.

This does not, though, explain how Africa itself was colonised—in particular, how people crossed from the sub-Saharan part of the continent, where the evidence suggests Homo sapiens originated, to its northern fringes. That happened much earlier, around 125,000 years ago.​
- read the full article A river ran through it (from The Economist)

Since you asked: I follow civilization, and I imagine that migration was significantly different from what we think happened. William J. Calvin theorizes that the tropics aren't good for nurturing intellect, because life is too easy if eating and breeding are your goals, and no civilization has ever erupted between the Tropics of Cancer and Capricorn. Egypt, India, China, Japan, and the Middle East are pretty much where the fun was. The folks who emigrated from Ethiopia, Kenya, and Tanzania probably followed the coast to Egypt and spread. Those who went west and south are the people we see today. As long as the monkeys and bananas last, they have no impulse to evolve civilizations.
 




Bill Overstreet, Famed WWII Fighter Pilot, Dies At 92

January 03, 2014

NPR
All Things Considered


World War II fighter pilot William Overstreet Jr. passed away Sunday. He was 92. Overstreet gained fame for flying beneath the Eiffel Tower's arches during the war in pursuit of a German aircraft.

AUDIE CORNISH, HOST:

We end this hour with a remembrance of a daring World War II flight that lifted the spirits of the French people and of the humble man who flew it. In 1944, American fighter pilot William Overstreet of the 357th Fighter Group was on a mission in Nazi-occupied territory. Flying his P-51 Mustang, Overstreet was escorting American bombers through France when a dogfight broke out. Overstreet broke away to pursue an enemy German plane.

PASTOR JEFF CLEMMONS: It started at 30,000 feet.

CORNISH: That's Pastor Jeff Clemmons, a combat veteran of the U.S. Army Chaplain Corps and a close friend of William Overstreet.

CLEMMONS: This was a half-hour dogfight which would end up going through the streets of Paris and conclude itself through a pursuit through the Eiffel Tower where Bill shot down the German pilot.

CORNISH: Yes, you heard that right, from the stratosphere, down through the arches of the Eiffel Tower. Here's how Overstreet himself described the chase in an interview posted online.

WILLIAM OVERSTREET: He figured I'd try to get around and he'd have time to get away. He was wrong. I was right behind him, right under the Eiffel Tower with him. And when he pulled up, I did get him. But that's a huge space. That's not close at all. It's plenty of room to go under the Eiffel Tower. But it makes a good story.

CORNISH: More like a great one. Clemmons says the pursuit inspired thousands of people below who witnessed the feat.

CLEMMONS: The Paris citizenry actually rose up in defiance of the Germans for a period of three days, celebrating that victory. And they knew the Germans would lose the war.

CORNISH: The French people never forgot Overstreet's courage. In 2009, France presented William Overstreet with that country's highest award, the Legion of Honor. Clemmons tells us that Overstreet was a very modest man. He accepted the medal in memory of servicemen who died in the war.

CLEMMONS: Bill was selfless. He was authentic. He knew who he was. I was there when he died. I felt his last heartbeat. We will never see the likes of these men again.

CORNISH: World War II veteran William Overstreet passed away in Roanoke, Virginia this past Sunday. He was 92 years old.
 




Bill Overstreet, Famed WWII Fighter Pilot, Dies At 92

January 03, 2014

NPR
All Things Considered

They really were the greatest generation. Living through the depression, fighting WW2 and Korea. A fuck tonne didn't make it. A supremely heavy price for our freedom. Not a lot of days go by that I don't ponder that.

Couldn't do that type of flying today as there's a lot of scaffolding in the way at the moment.
 
http://farm4.staticflickr.com/3820/9852656445_d389c4f432.jpg
image courtesy i1473 (Flickr)

Not long ago, in a room very like this one (the setting of most of these stories is both familiar and vague), I was looking for the origin of a video game. Polybius was a coin-operated arcade shoot-’em-up released in 1981 to a few suburban locations in Portland, Oregon. The player experience was abstract and psychedelic, involving geometric patterns, bright colours and arcane rules. It was also psychoactive, triggering insomnia, hallucinations and amnesia in its players. These side effects might not have been accidental. Hidden menu operations suggested they were designed in. At night, anonymous officials collected data from the machines.

Polybius the game never existed. Polybius the urban legend is, however, as real as these things get. It’s a persistent thread of internet lore, passed from message board to message board, elaborated over time. The myth of a sinister black-ops mind-control arcade game is so appealing that people have built and photographed ‘real’ Polybius units. Others have manufactured menu screenshots; on YouTube you can find ‘real’ Polybius gameplay footage. As a subcultural phenomenon, Polybius is significant enough to have produced at least one eddy in the mainstream: the game can be glimpsed in the background of a 2006 episode of The Simpsons.

As with most urban legends, the origins of the story are obscure and tediously contested. But hunting around for more information, I found that Polybius was simply the most prominent among scores of similar accounts of haunted or malign video games. Subliminal high-frequency sound in the Lavender Town level of Pokemon (1996) was linked to the suicides of more than 200 children. A ‘possessed’ version of Mario 64 (1996) has someone whispering in Japanese over the titles, and aberrant graphics and sound, which include disturbing images of a hanged Luigi and — somehow — the player’s own family. A bootleg copy of Spyro 2 (1999) turns out to be menacing and vengeful.
[...]
The word ‘creepypasta’ derives from ‘copypasta’, a generic term for any short piece of writing, image or video clip that is widely copy-and-pasted across forums and message boards. In its sinister variant, it flourishes on sites such as 4chan.org and Reddit, and specialised venues such as creepypasta.com and the Creepypasta Wiki (creepypasta.wikia.com), which at the time of writing has nearly 15,000 entries (these sites are all to be avoided at work). Creepypasta resembles rumour: generally it is repeated without acknowledgement of the original creator, and is cumulatively modified by many hands, existing in many versions. Even its creators might claim they heard it from someone else or found it on another site, obscuring their authorship to aid the suspension of disbelief. In the internet’s labyrinth of dead links, unattributed reproduction and misattribution lends itself well to horror: creepypasta has an eerie air of having arisen from nowhere.​
 
Evgeny Morozov wants to convince us that digital technology can’t save the world, and he’s willing to burn every bridge from Cambridge to Silicon Valley to do it

http://cjrarchive.org/img/posts/morozov.jpg

Depending on whom you ask, Evgeny Morozov is either the most astute, feared, loathed, or useless writer about digital technology working today. Just 29 years old, from an industrial town in Belarus, he appeared as if out of nowhere in the late aughts, amid the conference-goers and problem solvers working to shape our digital futures, a hostile messenger from a faraway land brashly declaring the age of big ideas and interconnected bliss to be, well, bullshit.

To say that Morozov has gone out of his way to irritate powerful and influential people in the tech world doesn’t quite capture it. Doing so is his primary occupation. In the Morozovian worldview, New York University professor and social-media theorist Clay Shirky is a “consultant-cum-intellectual”; Google’s mission is to “monetize all of the world’s information and make it universally inaccessible and profitable”; and Tim O’Reilly, the Silicon Valley publisher and venture capitalist who coined “Web 2.0,” is an Orwellian “meme hustler” and the main culprit behind “the enduring emptiness of our technology debates.” To millions of viewers, TED talks are inspirational speeches about “ideas worth spreading” in science and technology. To Morozov they are a “sinister” hyping of “ideas no footnotes can support.”

Or try this passage. It’s a takedown of a work of technological triumphalism called Hybrid Reality, but it doubles as a summary of his thinking about the entirety of the tech discourse: “[P]erhaps this is what the Hybrid Age is all about: marketing masquerading as theory, charlatans masquerading as philosophers, a New Age cult masquerading as a university, business masquerading as redemption, slogans masquerading as truths.”​
- read the full article Evgeny vs. the internet (from Columbia Journalism Review)
 
Lera Boroditsky once did a simple experiment: She asked people to close their eyes and point southeast. A room of distinguished professors in the U.S. pointed in almost every possible direction, whereas 5-year-old Australian aboriginal girls always got it right.

Boroditsky, an associate professor of cognitive science at the University of California, San Diego, says the Australian aboriginal language doesn't use words like left or right. It uses compass points, so they say things like "that girl to the east of you is my sister."

http://www.npr.org/blogs/health/201...anguage-seems-to-shape-ones-view-of-the-world
 
http://farm4.staticflickr.com/3823/9637977697_bd9de0e619.jpg
image courtesy wwarby (Flickr)

There’s a lot we don’t know about time travel. Whether it exists, for example.

Just for the sake of argument, let’s assume time travel is theoretically possible. Even so, the fact that we aren’t aware of any time travelers isn’t particularly surprising. Making a big change far in the past, one that would conclusively announce to the world that time travel is real, could potentially change history such that the time traveler would never be born in the first place.

But according to a real study conducted by a pair of physics professors at Michigan Technological University, there may be a way to locate time travelers—and it involves Twitter.

Released late last month, ‟Searching the Internet for evidence of time travelers” attempts to find real-world Marty McFlys by searching for information online that couldn’t have been posted without foreknowledge of the future.

‟Were a time traveler from the future to access the Internet of the past few years, they might have left once-prescient content that persists today,” the authors speculate. ‟Alternatively, such information might have been placed on Internet by a third party discussing something unusual they have heard. Such content might have been catalogued by search engines such as Google...or Bing...or remain in posts left on Facebook...Google Plus...or Twitter.”

For their analysis, the authors selected a pair of search terms that would have been completely unknown before a specific date in recent history and then looked for Internet traffic relating to those terms before those dates. The terms they looked at were ‟Comet ISON”, a comet first discovered on September 21, 2012, and ‟Pope Francis,” the name selected by Jorge Mario Bergoglio on March 13, 2013, when he ascended to the head of the Catholic Church.
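
For anyone curious how such a search boils down in practice, here is a minimal Python sketch of the idea under stated assumptions, not the authors' actual method or code: scan a collection of timestamped posts and flag any that mention a term before the date the term could first have been known. The post format and the helper name prescient_posts are illustrative assumptions; the cutoff dates are the discovery and election dates mentioned above.

from datetime import datetime, timezone

# Cutoff dates: the discovery/announcement dates for the two terms in the study.
CUTOFFS = {
    "comet ison": datetime(2012, 9, 21, tzinfo=timezone.utc),
    "pope francis": datetime(2013, 3, 13, tzinfo=timezone.utc),
}

def prescient_posts(posts):
    """Yield (term, post) pairs where a post mentions a term before its cutoff.

    Each post is assumed to be a dict with a 'text' string and a timezone-aware
    'timestamp', e.g. rows pulled from an archive of tweets or forum posts.
    """
    for post in posts:
        text = post["text"].lower()
        for term, cutoff in CUTOFFS.items():
            if term in text and post["timestamp"] < cutoff:
                yield term, post

# Example with made-up data: only the second post is flagged, because it
# names Comet ISON months before the comet was actually discovered.
sample = [
    {"text": "Pope Francis greets the crowd",
     "timestamp": datetime(2013, 3, 14, tzinfo=timezone.utc)},
    {"text": "Can't wait to see Comet ISON next year!",
     "timestamp": datetime(2012, 1, 1, tzinfo=timezone.utc)},
]
for term, post in prescient_posts(sample):
    print(f"Possible prescience: '{term}' mentioned on {post['timestamp']:%Y-%m-%d}")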
 
http://www.aralseadisaster.org/sites/g/files/g369661/f/styles/large/public/aral_sea_disaster_3_nasa.jpg?itok=_LpZtLHa


The Aral Sea is situated in Central Asia, between the southern part of Kazakhstan and northern Uzbekistan. Up until the third quarter of the 20th century it was the world's fourth largest saline lake, containing 10 grams of salt per liter. The two rivers that feed it are the Amu Darya and the Syr Darya, reaching the sea from the south and the north respectively. In the 1960s the Soviet government decided to divert those rivers to irrigate the desert region surrounding the sea, in order to favor agriculture rather than supply the Aral Sea basin. We decided to explore the present-day implications of this human alteration of the environment precisely because certain characteristics of the region, from its geography to its population growth, have made the consequences of digging the canals so dramatic. Those consequences range from unexpected climate feedbacks to public health issues, affecting the lives of millions of people in and out of the region.

- read the full article The Aral Sea Crisis (from aralseadisaster.org)
 
http://dinmerican.files.wordpress.com/2012/08/the-revolt-of-the-masses2.jpg

I first read José Ortega y Gasset’s The Revolt of the Masses more than thirty years ago. I still remember how disappointed I was by this cantankerous book. I’d read other works by Ortega (1883-1955), and been impressed by the Spanish philosopher’s intelligence and insight. But this 1929 study of the modern world, his most famous book, struck me as hopelessly nostalgic and elitist.

Yet I recently read The Revolt of the Masses again, and with a completely different response. The same ideas I dismissed as old-fashioned and out-of-date back in the 20th century now reveal an uncanny ability to explain the most peculiar happenings of the digital age.
[...]
The key driver of change, as Ortega sees it, comes from a shocking attitude characteristic of the modern age—or, at least, Ortega was shocked. Put simply, the masses hate experts. If forced to choose between the advice of the learned and the vague impressions of other people just like themselves, the masses invariably turn to the latter. The upper elite still try to pronounce judgments and lead, but fewer and fewer of those down below pay attention.

Above all, the favorite source of wisdom for the masses, in Ortega’s schema, is their own strident opinions. “Why should he listen, when he has all the answers, everything he needs to know?” Ortega writes. “It is no longer the season to listen, but on the contrary, a time to pass judgment, to pronounce sentence, to issue proclamations.”

Ortega couldn’t have foreseen digital age culture, but he is describing it with precision. He would recognize the angry, assertive tone of comments on web articles as the exact same tendency he identified in 1929. He would understand why Yelp reviews have more influence than the considered judgments of restaurant reviewers. He would know why Amazon customer comments have more clout than critics in The New Yorker. He would attend an angry town hall meeting or listen to talk radio, and recognize the same tendencies he described in his book.
- read the full article The Smartest Book About Our Digital Age Was Published in 1929 (from The Daily Beast)
 
http://img.gawkerassets.com/img/19bqel1yybxj3jpg/ku-xlarge.jpg

It's a question that's been asked since time immemorial: Could a woman create an online dating profile "so loathsome that no man would message it"?

Cracked's crack investigative reporter Alli Reed decided to solve the "omnipervet paradox" once and for all by going on OKCupid and posting what she firmly believed was the "worst online dating profile ever."

Her thesis: There did indeed exist a woman "so awful, so toxic, so irredeemably unlikeable that no one would message her, or if they did, at least they would realize they never, ever wanted to meet her."

As it turns out, there wasn't.

In creating her profile for "AaronCarterFan," Reed used the real photo of her model friend Rae to entice gentlemen callers. But that was exactly where AaronCarterFan's attractiveness ended and her repugnant personality began.

Her self-summary was the infamous "if u cant handle me at my worst u dont deserve me at my best"; the thing she was really good at was "convincing people im pregnat lol"; and under "favorite books, movies, shows, music, and food," AaronCarterFan wrote "soooooo glad their making another Grown Ups."

"I figured any profile with photos of a beautiful woman would get a few messages from men whose boners were willing to overlook her personality," Reed wrote in her breakdown of the experiment. "She got 150 messages in 24 hours."​
 
http://graphics8.nytimes.com/images/2014/01/05/magazine/05montauk1/05montauk1-articleLarge.jpg

Looking back, John Aldridge knew it was a stupid move. When you’re alone on the deck of a lobster boat in the middle of the night, 40 miles off the tip of Long Island, you don’t take chances. But he had work to do: He needed to start pumping water into the Anna Mary’s holding tanks to chill, so that when he and his partner, Anthony Sosinski, reached their first string of traps a few miles farther south, the water would be cold enough to keep the lobsters alive for the return trip. In order to get to the tanks, he had to open a metal hatch on the deck. And the hatch was covered by two 35-gallon Coleman coolers, giant plastic insulated ice chests that he and Sosinski filled before leaving the dock in Montauk harbor seven hours earlier. The coolers, full, weighed about 200 pounds, and the only way for Aldridge to move them alone was to snag a box hook onto the plastic handle of the bottom one, brace his legs, lean back and pull with all his might.

And then the handle snapped.

Suddenly Aldridge was flying backward, tumbling across the deck toward the back of the boat, which was wide open, just a flat, slick ramp leading straight into the black ocean a few inches below. Aldridge grabbed for the side of the boat as it went past, his fingertips missing it by inches. The water hit him like a slap. He went under, took in a mouthful of Atlantic Ocean and then surfaced, sputtering. He yelled as loud as he could, hoping to wake Sosinski, who was asleep on a bunk below the front deck. But the diesel engine was too loud, and the Anna Mary, on autopilot, moving due south at six and a half knots, was already out of reach, its navigation lights receding into the night. Aldridge shouted once more, panic rising in his throat, and then silence descended. He was alone in the darkness. A single thought gripped his mind: This is how I’m going to die.​
- read the full article A Speck in the Sea (from The New York Times)
 

A single thought gripped his mind: This is how I’m going to die.​
- read the full article A Speck in the Sea (from The New York Times)

I have the same green Dunlop boots. They're pretty awesome actually. Kept my feet warm and dry at - 10°C. Never tried to sea if they float though.
 
I have the same green Dunlop boots. They're pretty awesome actually. Kept my feet warm and dry at - 10°C. Never tried to sea if they float though.

Dunlop should get that guy to endorse those boots.
 
http://farm1.staticflickr.com/72/160632575_bb4fec174f.jpg
image courtesy John Lester (Flickr)

If the icy blast of polar air that's descended upon much of the U.S. over the last couple of days has you reaching for the cookie jar for comfort — and ready to give up on those New Year's resolutions — then seriously? It's time to toughen up. Just think: At least you're not in the Antarctic.

That icy blast putting the deep freeze on America comes from the Arctic, but the coldest official temperature ever recorded on Earth — minus 128.6 Fahrenheit — was actually at the other end of the world. Even so, Antarctica's vast, frozen, barren landscape has beckoned scholars and adventurers alike for more than a century. And one thing we've learned from them is: When life is stripped down to man versus the most brutal elements, bring plenty of snacks.

Indeed, the history of exploration on the continent is as much about hunger as heroism, as Jason Anthony explores in his book Hoosh: Roast Penguin, Scurvy Day, and Other Stories of Antarctic Cuisine.

"Hunger," Anthony writes, "was the one spice every expedition carried."
[...]
But it was all a sort of cruel survival math. Even today, it takes roughly 5,000-plus calories a day to feed a person doing outdoor work, Dr. Gavin Francis, who spent a year as the medical officer at the British Antarctic Survey's remote Halley Research Station, tells me.

Those involved in manhauling — i.e., pulling sleds across the ice and snow with their bodies — need more like 6,500 calories a day. (Those of us shivering back home stateside may also be tempted to munch more right about now — some say for evolutionary reasons — but we don't have the same excuse to chow down.)

The mix of calories matters, too: Ironically, before British explorer Robert F. Scott famously starved and froze to death on his way back from the South Pole, his party conducted a study that suggested a high-carb, high-fat diet to be optimal for the harsh climes.

- read the full article Think You're Cold And Hungry? Try Eating In Antarctica (from NPR)
 
http://pixel.nymag.com/imgs/daily/intelligencer/2014/01/06/06-ces-lg.o.jpg/a_560x375.jpg

You'll see a lot of breathless reports from this year's CES about all the new ways TV-makers are trying to recapture the market's attention. LG is bringing back webOS, a failed product that promised to merge the TV and the computer back in 2009. Samsung is introducing a new TV remote that has a small trackpad on it. And both companies are talking up their new curved-screen displays, which apparently reduce glare and make it easier to see the screen from side angles. None of these are tremendously groundbreaking innovations, but then again, groundbreaking innovations may not be possible for TVs these days.

The TV, like the computer mouse or the inkjet printer, has run up against a kind of creative asymptote. There was once a period when it made sense to upgrade your TV every few years, because the technology was improving by leaps and bounds. New models had HD, or USB ports, or just obviously better screen quality. But now, they've become something close to a commodity. You can get a 50-inch, high-def LED flat screen from a major manufacturer for well under $1,000. (Here's one for $648.) That's more than enough for most people. And unless you're a real screen geek, you probably won't notice all that much difference in a new model that costs six or eight times as much.​
- read the full article Nobody Needs a New TV Anymore (from NY Magazine)
 
http://www.dissentmagazine.org/wp-content/files_mf/1388733275montparnasseflowerschambresnoiresfredericrole666.jpg

In recent years, as death has silenced the critical voices of too many people who have been important to me, I’ve found it necessary to read obituaries. Though I never took walks with philosopher Barbara Johnson (d. 2009) and her canine companion Nietzsche, lunched with “death of God” theologian Gabriel Vahanian (d. 2012), attended a jazz concert with the writer and music critic Albert Murray (d. 2013), or invited the sociologist Robert Bellah (d. 2013) to my family’s very American interfaith “Chanukmas” holiday celebration, their ideas were, as one friend wrote of Christopher Hitchens (d. 2011) in his obituary, “a central part of the landscape” of my life.

None of these people were my friends. But they were, like so many other significant thinkers who died in the last decade and whose books line my bookshelves, my interlocutors, my prods, my disturbers of the peace, and my sources of consolation. As literary critic Wayne Booth (d. 2005) put it, our books and our imagined relationship with their authors are the “company we keep.” So I’ve taken up the unhappy practice of trolling obituaries to find lost companions.​
- read the full article Over Our Dead Bodies (from Dissent)
 
http://www.ox.ac.uk/images/maincolumn/18010_social_networks.jpg

Despite the way that mobile technologies and social networking sites have made it easier to stay in touch with large numbers of acquaintances, a new study has shown that people still put most of their efforts into communicating with small numbers of close friends or relatives. We often operate unconscious one-in, one-out policies so that communication patterns remain the same even when friendships change.

'Although social communication is now easier than ever, it seems that our capacity for maintaining emotionally close relationships is finite,' said Dr Felix Reed-Tsochas, James Martin Lecturer in Complex Systems at Saïd Business School, University of Oxford. 'While this number varies from person to person, what holds true in all cases is that at any point individuals are able to keep up close relationships with only a small number of people, so that new friendships come at the expense of "relegating" existing friends.'​
- read the full article One in, one out: Oxford study shows people limit social networks (from University of Oxford)
 
http://mag.uchicago.edu/sites/default/files/1402_Okrent_Listicle-as-literary-form.png

Every form of writing has its established conventions, and writers have to learn the nature of those conventions as they go. I’ve written scientific summaries, academic articles, journalistic essays, and a book, but these days, as a language columnist for online publications the Week and Mental Floss, I mostly write listicles.

A listicle is an article in the form of a list. I’d like to think the ones I do count among the nobler examples of the genre—less “15 Best Butts in Hollywood,” more “12 Mind-Blowing Number Systems from Other Languages.” There’s nothing about the form of the listicle itself that prevents it from dealing with highbrow or important subjects, and increasingly, news of all kinds is being delivered in this form (“11 Architectural Innovations that Made the Modern City Possible,” “7 Supreme Court Cases that Could Change the Country,” and so on). Still, there are good reasons to object to the rising ubiquity of the listicle. It caters to our Internet-fed distractible tendencies, critics say, replacing complex arguments and reasoned transitions with snack-packs of bullet points.

We don’t want to get all of our news through listicles for the same reason we don’t want to get all of our news in haiku or limerick form. But that doesn’t mean the haiku and the limerick aren’t fine forms to work in. Like listicles, they are also compact packages of predictable structure that people enjoy reading. On second thought, maybe people would like to get all their news in these forms. The Haiku Daily News or the Limerick Law Review would probably go over extremely well.​
- read the full article The listicle as literary form (from The University of Chicago Magazine)
 