I’ve been working on my master’s thesis and I’m finally seeing some progress in my project. I’m implementing a stereovision algorithm on an FPGA and it’s killing me.
The pictures show results for synthetic images hard-coded into the program. The first picture shows a waveform taken from the simulator; the highlighted values match the values displayed by the hardware shown in the next three pictures.
I’ve spent the past two days fixing the interface part of this project (that is: handling buttons, driving the display output, and pausing the algorithm so I can read the results). The algorithm itself appears to have been working correctly ever since it first became valid for synthesis. :)
There is still a lot to do, since the presented algorithm operates on synthetic images (9 pixels wide and 9 pixels high) rather than on images from digital cameras. I also need to add external RAM and the cameras themselves. Some machine-readable output would be nice too.
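The post doesn’t say which stereovision algorithm is used, but a common FPGA-friendly choice is sum-of-absolute-differences (SAD) block matching, so here is a minimal software sketch of that technique. The window size, disparity range, and images are my own illustrative assumptions, not the author’s actual design:

```python
# Minimal SAD block-matching sketch (hypothetical parameters, NOT the
# author's FPGA design). For each pixel in the left image, the best
# disparity is the horizontal shift that minimizes the sum of absolute
# differences over a small window around that pixel.

def sad_disparity(left, right, window=1, max_disp=3):
    """Return a per-pixel disparity map for two same-sized grayscale
    images given as lists of lists of ints."""
    h, w = len(left), len(left[0])
    disp = [[0] * w for _ in range(h)]
    for y in range(window, h - window):
        for x in range(window + max_disp, w - window):
            best_d, best_cost = 0, float("inf")
            for d in range(max_disp + 1):
                # Compare the left window with a right window shifted
                # left by the candidate disparity d.
                cost = sum(
                    abs(left[y + dy][x + dx] - right[y + dy][x + dx - d])
                    for dy in range(-window, window + 1)
                    for dx in range(-window, window + 1)
                )
                if cost < best_cost:
                    best_cost, best_d = cost, d
            disp[y][x] = best_d
    return disp
```

On a 9×9 pair where a bright vertical stripe is shifted two pixels between the images, this recovers a disparity of 2 along the stripe. In hardware the same idea maps well to a pipeline of adders and comparators, which is one reason SAD is popular on FPGAs.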
Finally, both Red Angry Bird/Luke Skywalker and Hedgehog are important parts of the science team ;)
The Controllino is a Kickstarter project for an Arduino-based and -compatible industrial programmable logic controller (PLC). The DIN-rail-mounted Controllino comes in three varieties and can be programmed with the usual software, including Arduino sketches. It has connectors for direct access to the MCU ports, and you can use your old Arduino shields to expand its functionality.
I hope the founders can make this a reality, expand its support, and build an open-source hardware ecosystem; then the marketing people at SIEMENS might just start to cry blood.
At the Palais de la Découverte in Paris, a 1 kg aluminium plate is levitated above a large coil of wire supplied with 800 A of alternating current at 900 Hz.
Note how the lamp’s bulbs flare when he moves it into the magnetic field, and the steam coming off the plate’s surface - it also hums at the frequency of the alternating current. If you have time, I recommend watching the video - it is very entertaining.
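The lift comes from eddy currents induced in the plate, and at 900 Hz those currents are confined to a thin surface layer. A quick skin-depth estimate shows how thin (the resistivity is a textbook value for pure aluminium, my assumption rather than a figure from the demo):

```python
import math

# Skin depth of the induced eddy currents in an aluminium plate at
# 900 Hz: delta = sqrt(rho / (pi * f * mu0)). The resistivity is an
# assumed textbook value for pure aluminium.

rho = 2.65e-8             # resistivity of aluminium, ohm*m (assumed)
mu0 = 4 * math.pi * 1e-7  # permeability of free space, H/m
f = 900.0                 # drive frequency, Hz

delta = math.sqrt(rho / (math.pi * f * mu0))
print(delta * 1000)  # skin depth in mm, roughly 2.7 mm
```

A few millimetres of skin depth means a plate of comparable thickness carries the induced current efficiently, which is why a flat aluminium plate works so well in this demonstration.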
Given their extreme vulnerability, the vastness of city space, the dangers posed by traffic, suspicion of terrorism, and the possibility that no one would be interested in helping a lost little robot, I initially conceived the Tweenbots as disposable creatures which were more likely to struggle and die in the city than to reach their destination. Because I built them with minimal technology, I had no way of tracking the Tweenbot’s progress, and so I set out on the first test with a video camera hidden in my purse. I placed the Tweenbot down on the sidewalk, and walked far enough away that I would not be observed as the Tweenbot––a smiling 10-inch tall cardboard missionary––bumped along towards his inevitable fate.
The results were unexpected. Over the course of the following months, throughout numerous missions, the Tweenbots were successful in rolling from their start point to their far-away destination assisted only by strangers. Every time the robot got caught under a park bench, ground futilely against a curb, or became trapped in a pothole, some passerby would always rescue it and send it toward its goal. Never once was a Tweenbot lost or damaged. Often, people would ignore the instructions to aim the Tweenbot in the “right” direction, if that direction meant sending the robot into a perilous situation. One man turned the robot back in the direction from which it had just come, saying out loud to the Tweenbot, “You can’t go that way, it’s toward the road.”
The Tweenbot’s unexpected presence in the city created an unfolding narrative that spoke not simply to the vastness of city space and to the journey of a human-assisted robot, but also to the power of a simple technological object to create a complex network powered by human intelligence and asynchronous interactions. But of more interest to me was the fact that this ad-hoc crowdsourcing was driven primarily by human empathy for an anthropomorphized object. The journey the Tweenbots take each time they are released in the city becomes a story of people’s willingness to engage with a creature that mirrors human characteristics of vulnerability, of being lost, and of having intention without the means of achieving its goal alone. As each encounter with a helpful pedestrian takes the robot one step closer to attaining its destination, the significance of our random discoveries and individual actions accumulates into a story about a vast space made small by an even smaller robot.
Man this is still one of my favorite little social projects/experiments.
This high-speed video of a bullet fired into a water balloon shows how dramatically drag forces can affect an object. In general, drag is proportional to fluid density times an object’s velocity squared. This means that changes in velocity cause even larger changes in drag force. In this case, though, it’s not the bullet’s velocity that is its undoing. When the bullet penetrates the balloon, it transitions from moving through air to moving through water, which is about 1000 times more dense. In an instant, the bullet’s drag increases by three orders of magnitude. The response is immediate: the bullet slows down so quickly that it lacks the energy to pierce the far side of the balloon. This is not the only neat bit of fluid dynamics in the video, though. When the bullet enters the balloon, it drags air in its wake, creating an air-filled cavity in the balloon. The cavity seals near the entry point and quickly breaks up into smaller bubbles. Meanwhile, an unstable jet of water streams out of the balloon through the bullet hole, driven by hydrodynamic pressure and the constriction of the balloon. (Video credit: Keyence)
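The orders-of-magnitude claim is easy to check with the standard drag equation F = ½ ρ C_d A v². The bullet size, drag coefficient, and velocity below are illustrative assumptions on my part, not measurements from the video:

```python
# Quick check of the drag jump when a bullet crosses from air into
# water, using the standard drag equation F = 0.5 * rho * Cd * A * v^2.
# Bullet dimensions, Cd, and velocity are illustrative assumptions.

def drag_force(rho, v, cd=0.3, area=4.9e-5):
    """Drag force in newtons; area ~ frontal area of a ~9 mm bullet."""
    return 0.5 * rho * cd * area * v**2

rho_air = 1.2       # kg/m^3
rho_water = 1000.0  # kg/m^3, ~1000x denser than air

v = 350.0  # m/s, a typical handgun muzzle velocity
ratio = drag_force(rho_water, v) / drag_force(rho_air, v)
print(ratio)  # ~833: roughly three orders of magnitude
```

Since velocity, drag coefficient, and frontal area are the same at the moment of entry, the ratio collapses to the density ratio itself, which is exactly the three-orders-of-magnitude jump the post describes.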
Acoustic sound is a pressure wave propagating through air or another fluid. Place a speaker opposite a plate, and its sound will reflect off the surface; the original pressure wave and its reflection form a standing wave. With intense enough sound waves, the acoustic radiation pressure can be large enough to counter the force of gravity on an object, causing it to levitate. We’ve shown you several examples of acoustic levitation before, including squished and vibrating droplets and applications for container-free mixing. Today’s video, however, shows the first acoustic levitation system capable of manipulating objects in three dimensions, an important step in developing the technology for practical applications. (Video credit: Y. Ochiai et al.; via NatGeo)
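In such a standing wave, small objects are trapped near the pressure nodes, which sit half a wavelength apart. A quick estimate of that spacing for a 40 kHz drive frequency (a common choice for ultrasonic levitators, assumed here rather than taken from the video):

```python
# Spacing of trapping nodes in an acoustic standing wave: nodes sit
# half a wavelength apart. 40 kHz is a common ultrasonic-levitator
# frequency, assumed here (not stated in the video).

c = 343.0     # speed of sound in air at ~20 C, m/s
f = 40_000.0  # drive frequency, Hz

wavelength = c / f
node_spacing = wavelength / 2
print(node_spacing * 1000)  # ~4.29 mm between trapping nodes
```

Millimetre-scale node spacing is why these systems handle droplets and small beads; moving the nodes themselves, by shifting the phase between opposed transducers, is what enables the three-dimensional manipulation shown in the video.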