
Posts Tagged ‘robots’

iPad-Controlled Quadricopter Surveys Tuscaloosa Storm Damage

May 8th, 2011

Screencap from the Parrot AR Drone photo gallery.

The mainstream news is finally catching up on the robot takeover of the globe — and I, for one, welcome our robot overlords.

This past week CNN featured video from a Parrot quadricopter as it was flown by CNN reporter Aaron Brodie over tornado-ravaged Tuscaloosa, Alabama, following last week’s storms. It’s pretty amazing footage, and surely it’s only sob sisters like me who worry about getting excited over new technology when so many of my fellow Americans have had their lives completely f*cked by Mother Nature. But for what it’s worth, the technology is amazing, not because of its absolute value but because of how easily available it is now.

Sold as a “flying video game,” the Parrot A.R. Drone utilizes an intuitive piloting system that makes it reportedly easy as pie to use. It doesn’t just run on Apple products, by the way; it also works with Android. The amazing thing is that it doesn’t just operate from the iPad/iPod Touch/iPhone — it operates from those platforms’ motion sensors:

The cockpit of the AR.Drone includes an inertial unit, ultrasound sensors and a vertical camera…The combination of these elements which are controlled by an autopilot program allows extremely accurate piloting of the quadricopter. The AR.Drone detects the movements of your iPod Touch®/iPhone® (to go up, down, turn, reverse, go forwards etc.). Anyone can pilot the AR.Drone, it is extremely simple to use.

[Link.]
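In case you’re wondering what piloting a drone with your phone’s motion sensors actually boils down to, here’s a minimal sketch of the idea in Python: clamp the device tilt, map it to roll and pitch commands, repeat. The read_tilt and send_command helpers are hypothetical stand-ins, not anything from Parrot’s actual SDK.

```python
# Minimal sketch of tilt-to-command piloting, per the description above.
# Device pitch/roll become normalized drone commands. read_tilt() and
# send_command() are hypothetical stand-ins, not Parrot's SDK.

def tilt_to_command(pitch_deg, roll_deg, max_tilt=30.0):
    """Map device tilt to drone roll/pitch commands in [-1, 1]."""
    def clamp(v):
        return max(-1.0, min(1.0, v / max_tilt))
    return {"pitch": clamp(pitch_deg),  # tilt phone forward -> fly forward
            "roll": clamp(roll_deg)}    # tilt phone left -> bank left

while True:
    pitch, roll = read_tilt()                    # hypothetical sensor read
    send_command(tilt_to_command(pitch, roll))   # hypothetical radio link
```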

There’s even a slight flavor of open source about it:

You can also control the Parrot AR.Drone from a Linux PC and a joystick with the software AR.Drone Navigation, designed for application developers and available for free.

The quadricopter runs about $300 and has two cameras — forward and down — but the CNN reporter added an additional high-definition camera, to the tune of about another $250.

Brodie cogently observed of the technology:

This is really at the low end of what’s possible…There’s much more sophisticated drone technology out there that is now available to really anybody, including us in the news media, and I think this is going to continue to provide a whole new perspective on things.

[Link.]

You can check out the photo gallery at the Parrot A.R. Drone site here — and guess what? If you’ve become enamored of Parrot, you can even like them on Facebook and follow them on Twitter. See how easy it is to follow the galloping pace of technology?

Incidentally, one of the significant advantages of a quadricopter is that each individual rotor can be smaller, reducing the kinetic energy it stores. That limits damage if the rotors hit something. The platform is also less expensive, because it keeps itself stable by varying each rotor’s speed rather than through the complex mechanical linkages of a conventional helicopter.
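If you want the back-of-envelope version of that kinetic-energy argument, here’s a quick sketch. All the numbers are invented for illustration, not Parrot’s specs, and the scaling assumptions (geometrically similar blades, equal tip speed, equal total disk area) are spelled out in the comments.

```python
# Back-of-envelope: rotor kinetic energy, one big rotor vs. four small
# ones with the same total disk area. Assumes geometrically similar
# blades (inertia ~ R^5) spun at the same tip speed (omega ~ 1/R).
# Reference numbers are invented for illustration, not Parrot's specs.

def rotor_kinetic_energy(radius_m, ref_radius=0.5, ref_inertia=0.05,
                         tip_speed=60.0):
    """KE = 1/2 * I * omega^2 for a rotor scaled from a reference design."""
    inertia = ref_inertia * (radius_m / ref_radius) ** 5  # I scales as R^5
    omega = tip_speed / radius_m                          # rad/s, fixed tip speed
    return 0.5 * inertia * omega ** 2

one_big = rotor_kinetic_energy(0.5)           # conventional single rotor
four_small = 4 * rotor_kinetic_energy(0.25)   # quad, same total disk area

print(f"one rotor: {one_big:.0f} J, four rotors: {four_small:.0f} J")
# Four half-size rotors store about half the energy: gentler collisions.
```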

Unmanned Aerial Vehicles (UAVs) have become critical in high-tech military engagements — often with somewhat freaky results. Just last week, a U.S. drone attack in Pakistan was reported to have killed eight; drone attacks are being staged against reputed Al Qaeda figures in Yemen, and U.S. Predators armed with Hellfire missiles are an increasingly important part of U.S. military strategy.

But, of course, the technology’s simplicity is also a vulnerability. As far back as 2009, insurgents in Iraq were reported to have hacked U.S. drones, accessing the video feeds to get their own intel — and determine what U.S. forces could see — using $26 off-the-shelf software.

But UAVs have also become increasingly important in civilian applications, marking the confluence of cheap-and-easy video, wireless communications and increasingly affordable model airplane tech. Once you start talking about the application of drones to “semi-civilian” fields like law enforcement and fire abatement, things get really interesting. And did someone mention border control? Devoted Techyum readers might remember when a Mexican border surveillance drone crashed in El Paso, a crash the Mexican government at first denied. An Australian archaeology team used a DIY paraglider drone to survey an ancient site in Thailand. And you might recall the incredible video from a drone flying around New York City.

This is Not the Michael Bay Zombies Vs. Robots Trailer…

February 23rd, 2011


…but you can rest assured that Chris Ryall and Ashley Wood’s IDW Publishing comic book Zombies vs. Robots will get a treatment pretty similar to Michael Bay directing a bowl of cereal, now that the famous Hollywood douchebag has signed on to give ZvR a hot steaming dose of his artistic vision.

To be fair, there’s a long smarmy road of hot-under-the-collar rock-video drama to walk in slow-motion in the rain before the film actually reaches the big screen. All that’s happened is that Sony’s acquired the rights to a spec script based on the comic book, for a co-production with Bay’s Platinum Dunes Pictures. About the comic book, the spinoff blog at Comic Book Resources says this:

Debuting in 2006, Zombies Vs. Robots is set on a post-apocalyptic Earth overrun by zombies whose only chance of recovery is a team of robots that must protect and clone the last surviving human baby. It’s unknown how Inherit the Earth differs from the original premise.

Another article on the same site tells me:

The film focuses on a young girl who is the last survivor on earth. She is protected by a group of robots from a pack of zombies that are intelligent and evolved.

Awesome! I’m really looking forward to watching zombies and robots overacting copiously to the sound of wailing cock-rock guitar solos.

Best of all, the project has fan cred. I mean, just check out this insightful comment from the Deadline story:

Very happy for this project. and couldn’t be happier about this for Brad Fuller! he always wears the most interesting shirts. he’s a very handsome man.

There you go…“interesting shirts.” Glad to know this will be a project of substance. That’s Platinum Dunes co-owner Bradley Fuller, incidentally, responsible for the recent remakes of The Texas Chainsaw Massacre, The Hitcher, The Amityville Horror, A Nightmare on Elm Street, and Friday the 13th. This project just looks better and better, doesn’t it? I couldn’t find a single picture of Fuller wearing an interesting shirt, though. It must just be a Hollywood insider thing.

Robot Hand is a Balloon Filled With Coffee Grounds

October 25th, 2010

John Amend and Hod Lipson with the robot hand. Photo by Robert Barker, University Photography, via the Cornell Chronicle Online.

Fast Company’s Ariel Schwarz has a post about an amazing robot hand developed by researchers at Cornell, iRobot Corporation and the University of Chicago. It’s made out of what amounts to a balloon filled with coffee grounds.

You heard me! A balloon filled with coffee grounds. And it turns out that this particulate-matter sort of robot-hand concept is almost old hat in the robot-concept department, though this is a new application.

Before you start digging through your compost, read on. Schwarz describes this amazing concept with a quote from one of the designers:

Hod Lipson, Cornell associate professor of mechanical engineering and computer science, explained in a statement, “The ground coffee grains are like lots of small gears. When they are not pressed together they can roll over each other and flow. When they are pressed together just a little bit, the teeth interlock, and they become solid.” Any particulate matter that jams well can be used in theory; the researchers chose coffee after experimenting with ground-up tires, rice, and couscous.

Couscous. Is science delicious, or what?
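For the tinkerers in the audience, the grip cycle Lipson describes is simple enough to sketch in code. The arm and pump objects below are hypothetical stand-ins for hardware, not anything from the Cornell/iRobot/Chicago rig:

```python
import time

# Sketch of a particle-jamming grip cycle, per Lipson's description above:
# vented, the grounds flow and the membrane molds around the object; under
# vacuum, the grains interlock and the gripper goes rigid. The arm and
# pump interfaces are hypothetical, not the actual research hardware.

class JammingGripper:
    def __init__(self, arm, vacuum_pump):
        self.arm = arm
        self.pump = vacuum_pump

    def grip(self, target):
        self.pump.vent()             # grounds loose: membrane conforms
        self.arm.press_onto(target)  # mold the balloon around the object
        self.pump.evacuate()         # suck the air out: grains jam solid
        time.sleep(0.5)              # give the packing a beat to settle
        return self.arm.lift()

    def release(self):
        self.pump.vent()             # grains flow again; object lets go
```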

Fast Company’s piece features a New Scientist video, and the Cornell Chronicle has more info about the robot hand’s development.

Furthermore, a Hizook post informs me that this type of technology is (or was) called “particle-jamming skin,” and nicknamed the “blob bot,” which I like a lot better.

A user comment on Schwarz’s post mentions that iRobot CEO Colin Angle demonstrated a prototype of this concept back in 2009 at TEDMED, in a talk about robots for medical applications. It’s a talk filled with fascinating robot love.


World’s First Robot Pop Star Debuts in Japan

October 18th, 2010

You heard me: World’s first robot pop star. She must be seen to be believed. She must be heard to induce violent projectile vomiting in Techyum bloggers who have still never forgiven Bowie for “Let’s Dance.”

At the Digital Content Expo in Japan, the HRP-4C made her first public appearance. According to an article in Popular Science, Masataka Goto, head of Japan’s Institute of Advanced Industrial Science and Technology’s media interaction group, said the robot isn’t just programmed to mimic the physical gestures of a human singer; she learns them on her own.

Mr. Goto, whose name I can’t type without automatically wanting to put “10” after it, said the HRP-4C uses “a program called Vocawatcher to analyze a singer’s facial tics as she belts out a tune. The robot’s head therefore follows the roll, pitch and yaw movements of the real singer.” This goes not only for facial movements and gestures; the robot also learns and mimics human breathing patterns.
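The head-following part, at least, is simple enough to caricature in code: estimate the singer’s head pose in each video frame, smooth it, and steer the robot’s neck to match. To be clear, the sketch below is a guess at the general shape of the thing, not Vocawatcher itself; the pose estimator and neck interface are hypothetical.

```python
# Rough sketch of pose mimicry as described above: per-frame head pose,
# low-pass filtered so the robot doesn't twitch, then sent to the neck.
# estimate_head_pose() and robot_neck are hypothetical, not Vocawatcher.

def follow_singer(video_frames, robot_neck, alpha=0.3):
    smoothed = (0.0, 0.0, 0.0)  # roll, pitch, yaw in degrees
    for frame in video_frames:
        roll, pitch, yaw = estimate_head_pose(frame)  # hypothetical
        smoothed = tuple(alpha * new + (1 - alpha) * old
                         for new, old in zip((roll, pitch, yaw), smoothed))
        robot_neck.set_orientation(*smoothed)         # hypothetical servo API
```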

Further excitement ensues in PopSci:

Masataka says he believes the entertainment industry must embrace robots if they are ever to become widely accepted.

“We hope the entertainment industry will be able to make widespread use of robots,” he says.

This comes to me via an “Open Post” on DListed, which means the comments are open to any topic, which should terrify you if you read DListed. It makes said comments even more interesting than usual. Run! Hide!

The HRP-4C, the world's first robotic pop star. Image from Popular Science.

As I was saying: Headlining his piece “Open Post, Hosted by the Pop Star Of Your Nightmares,” DListed’s Michael K makes a telling typo (as he tends to) in the second graf reproduced here:

If a robot can dance and lip-synch to songs, she can also creep up on you while you’re sleeping, wave a kitchen knife all over your face and write a suicide note in your own handwriting.

We already have too many factory built poop stars with microchip brains and mouths operated by remote control, we don’t need another one. This is seriously how it’s going to end.

If it hasn’t already, Michael. If it hasn’t already. I am a computer…


Scientists Still Working on Slovenian Punchbot’s “Bitchslap Mode”

October 16th, 2010

Image: B. Povše, D. Koritnik, T. Bajd, M. Munih, via New Scientist.

On the sunny shores of the Adriatic Sea, they build my kinda robots. They also clearly know what academia’s all about.

Borut Povše at the University of Ljubljana in Slovenia “persuaded six male colleagues to let a powerful industrial robot repeatedly strike them on the arm,” according to New Scientist.

No, this isn’t an attempt to streamline the dissertation review process in PhD programs. The study is being done “to assess human-robot pain thresholds.” But seriously now, folks, wouldn’t you have loved to be a fly on the wall for that conversation in the faculty lounge?

New Scientist quotes Povše:

Even robots designed to Asimov’s laws can collide with people. We are trying to make sure that when they do, the collision is not too powerful. …We are taking the first steps to defining the limits of the speed and acceleration of robots, and the ideal size and shape of the tools they use, so they can safely interact with humans.

Povše refers, of course, to the Three Laws of Robotics brought down by Moses from Mount Asimov as related in Genesis 19:42. In case you’re wondering, they are as follows:

1. A robot may not injure a human being or, through inaction, allow a human being to come to harm.
2. A robot must obey any orders given to it by human beings, except where such orders would conflict with the First Law.
3. A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.

I’m sorry, did I say they were brought down by Moses in the Bible? What I meant, of course, is that they were pulled out of Russian-born American science fiction author Isaac Asimov’s ass in his 1942 story “Runaround,” and thereafter used throughout Asimov’s “Robot” stories and also his space opera “Lucky Starr” series.

In futurist and science fiction fan circles, Asimov’s laws are vastly more sacred than The Bible. After all, stories from The Bible were retold throughout the 1950s, 1960s and 1970s with, y’know, Adam and Eve replaced with slime molds, Pharaoh as a thirty-tentacled alien that craves human spleen, and Jesus rising from the dead as a satellite-based AI. Asimov’s laws, on the other hand, are treated routinely by fiction and nonfiction writers alike as if they were real laws, as opposed to fictional constructs used by Asimov to consider philosophical questions about the nature of consciousness and individuality. (Asimov also didn’t like Hair, by the way — would you trust him?)

Anyway, as New Scientist was saying:

Povše and his colleagues borrowed a small production-line robot made by Japanese technology firm Epson and normally used for assembling systems such as coffee vending machines. They programmed the robot arm to move towards a point in mid-air already occupied by a volunteer’s outstretched forearm, so the robot would push the human out of the way. Each volunteer was struck 18 times at different impact energies, with the robot arm fitted with one of two tools – one blunt and round, and one sharper.

Yes, that’s the same Epson that makes printers; they’ll be integrating this technology into their next round of laser printers, so make sure you watch those serial commas. Anyway:

The volunteers were then asked to judge, for each tool type, whether the collision was painless, or engendered mild, moderate, horrible or unbearable pain…Ultimately, the idea is to cap the speed a robot should move at when it senses a nearby human, to avoid hurting them.
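That capping idea reduces to a few lines of control logic: full speed when nobody’s around, a dead stop when someone’s on top of the arm, a linear ramp in between. The distances below are made-up illustrative numbers, not findings from the Ljubljana study:

```python
# Sketch of proximity-based speed capping, per the quote above. The
# thresholds are invented for illustration, not the study's values.

def capped_speed(requested_mps, human_distance_m,
                 full_speed_dist=1.5, stop_dist=0.2):
    if human_distance_m >= full_speed_dist:
        return requested_mps   # nobody close: move at the requested speed
    if human_distance_m <= stop_dist:
        return 0.0             # someone's right there: freeze
    # linear ramp between the stop distance and the full-speed distance
    scale = (human_distance_m - stop_dist) / (full_speed_dist - stop_dist)
    return requested_mps * scale

print(capped_speed(0.8, 0.6))  # ~0.25 m/s with a person 60 cm from the arm
```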

However, New Scientist quotes Baylor College of Medicine biomechanics specialist Michael Liebschner as criticizing the study: “Pain is very subjective. Nobody cares if you have a stinging pain when a robot hits you – what you want to prevent is injury, because that’s when litigation starts.”

But I think Liebschner is missing the point; this isn’t about the punchbots at all. It’s about what scientists will do to get answers.

The whole thing could be a godsend for science education. Can you imagine the good that could be done by melding this concept with Robot Wars? A Smack A Scientist reality show on NBC or, better yet, Spike, would get kids interested in science again. It’d be a sort of primetime Stanford Prison Experiment with punchbots. Punchbots that hopefully run amok right around Sweeps Week. Oh, wait…Haven’t I seen that show?

Carnegie Mellon’s Robot Census

September 21st, 2010

Photo by Celia Ludwinski, Carnegie Mellon Tartan Photo Editor.

Meanwhile, in other robot news today, a PhD student at Carnegie Mellon’s Robotics Institute is taking time off from her interactive robot theater projects to perform a “robot census” of every ‘bot at her institution.

Student Heather Knight, whom the Carnegie Mellon Tartan says “is interested in making interactive theater productions featuring people and acting robots, in which an audience can teach robots how to be more like humans,” was inspired to undertake the project after wrestling with the complicated grad student problem of choosing a research adviser. “It’s like, how can you choose your adviser if you don’t know what robots they have?” she told the Tartan. I remember asking myself the same question in my undergrad days — and I was a History major. Knight launched the “Carnegie Mellon 2010 Robot Census” to correspond with this year’s US Census, perhaps in hopes of creating a Harmonic Robot Convergence that will soon have us bowing to our future Robot Overlords.

So how many damn robots roam the halls at Carnegie Mellon, anyhoo? The answer might surprise you. Knight found 460 ‘bots “and counting” at Carnegie Mellon, a tentative number for the in-progress survey that, at press time, is thirty-eight-point-three times the number of followers at the Robot Census Twitter Account. Hell’s bells, why is it so hard to get ‘bots motivated to join social networks? It’s your education, geniuses. Get off your lazy alloy cans, will ya?

The robots documented by Knight range from “a single spinning wheel that can traverse rough terrain” to “artbots, like artist Golan Levin’s interactive eye,” and “snakebots…some of which can climb tree trunks.” One computer science professor, Manuela Veloso, gets the Robot Gold Star, with a total of 116 robots listed. But the Tartan readers’ fave robot is sure to be the one we all know best (“we” being all Carnegie-Mellon students, of course): “Marion ‘Tank’ LeFleur, roboceptionist of Newell-Simon Hall.”

Those Pittsburgh cats have robo-freakin’-ceptionists?

All right, then, smartasses. Where’s my flying car?

Researchers Developing Robot Chair

September 21st, 2010


Photo by Brianne Bowen, Yale Daily News.

…working on a robot World of Warcraft legend to sit in it and pwn all your asses. Actually, what the robot lab at Yale is working on is really more like a robot Sister Mary Katherine SitUpStraight, complete with vibrating ruler.

But seriously, folks: the student newspaper at Quinnipiac University in Connecticut informs me that the robotics department at Yale is working on a robotic chair that zaps you when your posture goes south:

The department has taken a simple office chair and placed sensing resistors at four different spots where pressure is highest when sitting in proper posture. When the sitter’s posture shifts out of the ideal position, the chair vibrates as a reminder to sit up straight. Depending on the sensors, different parts of the chair will vibrate.

Yale engineering and applied science professor John Morrell and [industrial design] graduate student Ying Zheng…decided to take a regular office chair with a mesh back and tweak it, applying various sensors to it to create the posture alert chair that is in the works at the Yale Human Machine Interface Laboratory.
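The logic is pleasantly simple, so here’s a rough sketch of what firmware like the chair’s might look like: compare each sensor’s share of the total load against a good-posture baseline, and buzz the motor nearest any spot that drifts. The sensor names, load shares and tolerance are invented for illustration, not Morrell and Zheng’s actual calibration.

```python
# Sketch of posture-chair logic per the description above: four pressure
# readings, compared against a good-posture load pattern; the vibration
# motor at a drifting spot fires. All numbers are invented placeholders.

GOOD_POSTURE = {"seat_left": 0.25, "seat_right": 0.25,
                "back_low": 0.30, "back_mid": 0.20}  # expected load share

def check_posture(readings, motors, tolerance=0.10):
    total = sum(readings.values()) or 1.0   # avoid dividing by zero
    for spot, target in GOOD_POSTURE.items():
        share = readings[spot] / total
        if abs(share - target) > tolerance:
            motors[spot].vibrate()          # nudge the sitter at that spot
```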

The Yale Daily News reported this back in April, when said Nun Chair was presented at the 2010 Haptics Symposium, but it’s HermanMiller.com that probably got most excited about this development, since Morrell et al. started their project with a Herman Miller Aeron Chair — yes, the same damn ones you hear about constantly if you listen to San Francisco’s KQED, where they tell you to pop over to Sit4Less.com and order that puppy in the new color, “True Black,” whatever the hell that’s supposed to mean.

Then if you want to play ball like the Yale kids, you break out about a dozen Hitachi Magic Wands and rig their switches to your back, neck, elbows, knees and tookus with some crochet yarn and duct tape, and…

Look, I’d continue — but when you’ve got a pedigree like mine, you just don’t crack wise about things that vibrate.

Yale Developing Computers Smart Enough to Drive Cars

September 16th, 2010

If you’ve ever been nearly run off the road by some Sunday driver, consider this: Would you feel safer or less safe knowing you were flipping off a robot?

Yale’s website features a release on a computer system being developed by Eugenio Culurciello at Yale and Yann LeCun at NYU, who presented their research yesterday at the High Performance Embedded Computing workshop in Lexington, Massachusetts. Dubbed NeuFlow, it’s a computer that processes information visually and, it’s hoped, may someday drive a car.

Although this feat is a standard in science fiction, it’s actually proven elusive to robot-builders. A three-dimensional world is, apparently, more than previous robot brains can handle. As the Yale release puts it:

Navigating our way down the street is something most of us take for granted; we seem to recognize cars, other people, trees and lampposts instantaneously and without much thought. In fact, visually interpreting our environment as quickly as we do is an astonishing feat requiring an enormous number of computations—which is just one reason that coming up with a computer-driven system that can mimic the human brain in visually recognizing objects has proven so difficult.

Now [Culurciello] has developed a supercomputer based on the human visual system that operates much more quickly and efficiently than ever before. Dubbed NeuFlow, the system takes its inspiration from the mammalian visual system, mimicking its neural network to quickly interpret the world around it….The system uses complex vision algorithms developed by [LeCun] to run large neural networks for synthetic vision applications. One idea, the one Culurciello and LeCun are focusing on, is a system that would allow cars to drive themselves. In order to be able to recognize the various objects encountered on the road—such as other cars, people, stoplights, sidewalks, not to mention the road itself—NeuFlow processes tens of megapixel images in real time.
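NeuFlow itself is custom silicon, but the kind of convolutional network it’s built to run is easy to caricature in software. Here’s a toy version in PyTorch (my choice of tool, purely for illustration; it has nothing to do with the Yale/NYU code) that turns a video frame into a coarse map of object labels:

```python
# Not NeuFlow -- just a toy version of the kind of convolutional network
# described above. NeuFlow's contribution is running networks like this
# on dedicated hardware fast enough for megapixel video.

import torch
import torch.nn as nn

class TinySceneNet(nn.Module):
    def __init__(self, num_classes=5):   # car, person, road, sign, other
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, 7, stride=2, padding=3), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 5, padding=2), nn.ReLU(),
            nn.MaxPool2d(2),
        )
        self.classifier = nn.Conv2d(32, num_classes, 1)  # per-region labels

    def forward(self, x):
        return self.classifier(self.features(x))

labels = TinySceneNet()(torch.rand(1, 3, 480, 640))  # one video frame
print(labels.shape)  # torch.Size([1, 5, 60, 80]) -- coarse label map
```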

Check out the video of NeuFlow in action above, or test your comprehension of wacky robot design schematics by looking at some of the details at Yale’s site.

Nanowire E-Skin to Help Robots Feel

September 13th, 2010 No comments

An artist’s illustration of an artificial e-skin with nanowire active matrix circuitry covering a hand. The fragile egg illustrates the functionality of the e-skin device for prosthetic and robotic applications.

Researchers at UC Berkeley have developed a nanowire material that may provide a sense of touch for robots. Unlike current attempts to create e-skin, it uses inorganic materials.

Described in a press release on the UCB website (which I got to through yesterday’s article in PC World), the material is detailed in this week’s electronic version of the science-engineering journal Nature Materials, a spinoff of Nature.

The material is made of germanium-silicon semiconductor nanowires, or as UCB likes to put it: “It is the first such material made out of inorganic single crystalline semiconductors.”

It’s a departure from previous attempts to develop e-skin, which used organic materials. But organic materials are poor conductors and need high voltages to transmit sensory information; this material doesn’t, and it’s also stronger than its organic counterparts. Stronger. Faster. Better.

As the eggheads put it:

A touch-sensitive artificial skin would help overcome a key challenge in robotics: adapting the amount of force needed to hold and manipulate a wide range of objects.

“Humans generally know how to hold a fragile egg without breaking it,” said [team leader Associate Professor Ali Javey]… “If we ever wanted a robot that could unload the dishes, for instance, we’d want to make sure it doesn’t break the wine glasses in the process. But we’d also want the robot to be able to grip a stock pot without dropping it.”

A longer term goal would be to use the e-skin to restore the sense of touch to patients with prosthetic limbs, which would require significant advances in the integration of electronic sensors with the human nervous system.

…The UC Berkeley engineers utilized an innovative fabrication technique that works somewhat like a lint roller in reverse. Instead of picking up fibers, nanowire “hairs” are deposited.

[Link.]
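The egg-handling logic in that quote boils down to a feedback loop: squeeze until the skin reports firm contact, ease off if the pressure spikes toward crushing. The eskin and gripper interfaces below are hypothetical, not Berkeley’s:

```python
# Sketch of e-skin-guided gripping, per the quote above: read the peak
# pressure across the skin's sensor matrix and servo the grip force into
# the firm-but-gentle band. Interfaces and thresholds are hypothetical.

def grip_gently(gripper, eskin, contact_pa=500, crush_pa=5000,
                step=0.05, max_steps=200):
    force = 0.0
    for _ in range(max_steps):
        pressure = max(eskin.read_matrix())   # peak pressure over all taxels
        if contact_pa <= pressure < crush_pa:
            return force                      # firm contact, nothing broken
        force += step if pressure < contact_pa else -step
        gripper.set_force(force)
    raise RuntimeError("never settled into a stable grip")
```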

Google Instant: You’re Searching NOW.

September 9th, 2010

Look, it’s not like I have a problem with Google. I keep hating on the galaxy-sized faceless soulless vat-grown bot-controlled terrifying Megacorporation of Doom only because I love them so much.

But I do find it ever-so-slightly creepy that the Monsters of Mountain View KNOW I’ve been watching Season 1 of Terminator: The Sarah Connor Chronicles this week.

How did Google find out what I’m watching on TV? Surely not by sending a little person to crouch on my fire escape jotting notes on a steno pad every time someone says “Come with me if you want to live,” which is how I’d do it. Google probably does it the old-fashioned way — by remotely activating my webcam, like a high school teacher.

Anyway, knowing that this is Terminator week at Casa Roche, Google delivered the personalized cloud computing it’s decided we’ve been looking for (but just didn’t know it yet) since the first time Harlan Ellison had no mouth and had to scream with two fingers on a manual Olivetti at 120 words a minute (poor bastard). They know I’ve been thinking to myself, “Okay, so machines are evil — but that Summer Glau chick. She’s not evil…right?”

Ask, and you shall receive! Google delivered a fire-and-brimstone sermon to me about the dangers of letting your machines get too smart; said sermon wore a name badge proclaiming in dot-matrix letters: “HI! MY NAME IS GOOGLE INSTANT.”

That’s right. If you’ve been meaning to Google something, anything — you’re not sure what, but you’ll get to it, you guess, sooner or later — but you’re not so sure about the whole complicated pressing return at the end thing — what an ordeal! — you can finally liberate your creative impulse, leaving your pinky finger up your butt where it belongs.
