
deep deep dark dark deep dark pit

Commission status: CLOSED


Mostly SFW.

Personal blog with occasional original content, usually in the form of fan art (currently Mass Effect-related). Loads of reblogs of cats, tech, SCIENCE!, cats, politics, cats, feminism, cats, cats, and whatever else cats strikes my fancy. Cats, mostly. Flamingly liberal, just a heads-up if that sort of thing turns you off.

Tumblr Savior users can block the tag "quibbles" to avoid my many pointless text posts and reblogs. "politics", "feminism", and "SCIENCE!" are my most frequently used text tags if you don't want to block out everything.

Posts tagged perception

Apr 20 '14
"High-tech civilizing missions, like Patrick McConlogue’s adoption of Leo, rely on two common assumptions. The first is an unwavering belief in the virtues of self-help over just being helpful. The second is the idea that technology can solve almost anything. By this logic, the onus is on the homeless person to hack the system—to gain entry into polite society and adapt to its ways. Such a worldview cannot acknowledge that polite society may have played a large part in contributing to the homeless person’s plight. Nor does this philosophy hold that humans deserve homes. It’s worth noting that during the tenure of New York City mayor Michael Bloomberg—a data-crazy technocrat if there ever was one—homelessness shot up by 73 percent, according to the Coalition for the Homeless, in part because he tried to remove incentives for people to use public assistance and, instead of making it easier to find housing, made New Yorkers jump through hoops to secure a temporary and often crumbling roof over their heads. Homelessness is a statistically confounding problem—a perfect example of when the politics of upward redistribution trump math and reason. There is a glut of housing in this country—by Amnesty International USA’s count, there are five empty homes in the United States for every person who lacks one—and yet some 3.5 million people inhabit streets, shelters, or whatever refuge they can find. The paradox of homelessness is reminiscent of another equally absurd problem: hunger. Tons upon tons of food get thrown out every day, most of it perfectly edible, yet according to the U.S. Department of Agriculture, 49 million people, including 8.3 million children, were living in food-insecure households in 2012."
Apr 27 '13

argh, sorry, one last mini-rant re: poverty

Let me clarify one last thing: to poor folks, “rich” doesn’t necessarily mean “Scrooge McDuck swimming in gold”-level opulence. To me personally, being rich means:

  • being able to afford fresh food on a regular basis
  • being able to pay all your bills on time
  • being able to fix your car when it needs it instead of delaying repairs
  • being able to afford any kind of professional equipment, be it computers or pencils or typewriter ribbons or whatever, without having to take the funds from another regular commitment like bills
  • not worrying about having to go into bankruptcy when you get sick
  • not worrying about getting sick
  • not dreaming every other night about being short on money or being stranded somewhere with your account in the negative
  • not having to time your checks and payments around how long your paycheck takes to be credited and the payment to be debited

To me, a salary of $25,000 is stinking rich, and I would absolutely love even to break the $20,000 mark. That is the perspective I come from when I talk about being “rich enough” or joke about being “white enough” to afford things others take for granted. That’s it.

Jan 12 '13

oldmanyellsatcloud:

neurosciencestuff:

Researchers Find Causality in the Eye of the Beholder

We rely on our visual system more heavily than previously thought in determining the causality of events. A team of researchers has shown that, in making judgments about causality, we don’t always need to use cognitive reasoning. In some cases, our visual brain—the brain areas that process what the eyes sense—can make these judgments rapidly and automatically.

The study appears in the latest issue of the journal Current Biology.

“Our study reveals that causality can be computed at an early level in the visual system,” said Martin Rolfs, who conducted much of the research as a post-doctoral fellow in NYU’s Department of Psychology. “This finding ends a long-standing debate over how some visual events are processed: we show that our eyes can quickly make assessments about cause-and-effect—without the help of our cognitive systems.”

Rolfs is currently a research group leader at the Bernstein Center for Computational Neuroscience and the Department of Psychology of Berlin’s Humboldt University. The study’s other co-authors were Michael Dambacher, post-doctoral researcher at the universities of Potsdam and Konstanz, and Patrick Cavanagh, professor at Université Paris Descartes.

We frequently make rapid judgments of causality (“The ball knocked the glass off the table”), animacy (“Look out, that thing is alive!”), or intention (“He meant to help her”). These judgments are complex enough that many believe that substantial cognitive reasoning is required—we need our brains to tell us what our eyes have seen. However, some judgments are so rapid and effortless that they “feel” perceptual – we can make them using only our visual systems, with no thinking required.

It is not yet clear which judgments require significant cognitive processing and which may be mediated solely by our visual system. In the Current Biology study, the researchers investigated one of these—causality judgments—in an effort to better understand the division of labor between visual and cognitive processes.

Oh man, that's actually a super critical neurocognitive distinction, considering it might explain WHY we think so little about what we see and observe in objects as well as other people… and, in turn, the opinions we form.

Gotta keep an eye out for future research into more specific examples on this.
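For the curious: below is a minimal Python sketch of a Michotte-style "launching" event, the classic two-object display used in this line of causality-perception research. The article above doesn't describe the study's actual stimuli, so every parameter here (frame counts, speeds, the delay knob) is an illustrative assumption, not the authors' method.

```python
# Toy sketch of a Michotte-style "launching" display: disc A slides toward
# disc B, stops on contact, and B moves off. With no delay at contact the
# motion is typically seen as causal ("A launched B"); inserting a pause
# weakens that impression. All numbers are illustrative assumptions.

def launching_event(n_frames=60, contact_frame=30, speed=5.0, delay=0):
    """Return per-frame x-positions (a, b) of the two discs."""
    a_start = 0.0
    b_start = contact_frame * speed  # B waits exactly where A will stop
    frames = []
    for t in range(n_frames):
        if t <= contact_frame:                  # A approaches B
            a, b = a_start + t * speed, b_start
        elif t <= contact_frame + delay:        # optional pause at contact
            a, b = b_start, b_start
        else:                                   # B departs after contact
            a = b_start
            b = b_start + (t - contact_frame - delay) * speed
        frames.append((a, b))
    return frames

if __name__ == "__main__":
    for t, (a, b) in enumerate(launching_event()):
        if t % 10 == 0:
            print(f"frame {t:2d}: A={a:6.1f}  B={b:6.1f}")
```

The point of the sketch is that nothing in the display is "causal" per se: it's just two position traces. Whether the visual system reads it as one object launching another is exactly the kind of rapid judgment the study probes.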

Aug 7 '12

Disney researchers add sense of touch to augmented reality applications 

Technology developed by Disney Research, Pittsburgh, makes it possible to change the feel of real-world surfaces and objects, including touch-screens, walls, furniture, wooden or plastic objects, without requiring users to wear special gloves or use force-feedback devices. Surfaces are not altered with actuators and require little if any instrumentation. 

Instead, Disney researchers employ a newly discovered physical phenomenon called reverse electrovibration to create the illusion of changing textures as the user’s fingers sweep across a surface. A weak electrical signal, which can be applied imperceptibly anywhere on the user’s body, creates an oscillating electrical field around the user’s fingers that is responsible for the tactile feedback.

The technology, called REVEL, could be used to create “please touch” museum displays, add haptic feedback to games, apply texture to projected images on surfaces of any size and shape, provide customized directions on walls for people with visual disabilities and enhance other applications of augmented reality.

(Source: neurosciencestuff)
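To make the texture illusion a bit more concrete, here's a minimal Python sketch of how a spatial "bump" pattern under the finger might modulate the weak oscillating drive signal. REVEL's actual waveforms, frequencies, and hardware interface aren't given in the post, so the sinusoidal grating, `base_freq`, and `bumps_per_unit` below are hypothetical choices for illustration only.

```python
import math

def bump_envelope(finger_x, bumps_per_unit=8.0):
    # Hypothetical spatial texture: a sinusoidal "bump" grating in surface
    # coordinates. 0.0 = smooth trough, 1.0 = peak of a bump.
    return 0.5 * (1 + math.sin(2 * math.pi * bumps_per_unit * finger_x))

def texture_signal(finger_x, t, base_freq=120.0, amp=1.0):
    # Drive waveform at time t: the bump envelope under the finger
    # amplitude-modulates a fixed-frequency carrier. As the finger sweeps,
    # the envelope rises and falls, which the skin reads as texture.
    return amp * bump_envelope(finger_x) * math.sin(2 * math.pi * base_freq * t)

if __name__ == "__main__":
    # Simulated sweep at constant speed: print the envelope the finger
    # would feel at each position (the carrier just rides underneath).
    for step in range(11):
        x = step * 0.025
        print(f"x={x:5.3f}  bump strength={bump_envelope(x):.2f}")
```

The key design idea the post describes is that the signal goes into the user's body rather than the surface, so the same sweep over an uninstrumented wall or tabletop can feel different depending on what envelope the system applies.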