New service links facial recognition to social media.
At this week's Mobile World Congress 2011 in Barcelona, Silicon Valley mobile start-up Viewdle showed off its visual analysis technology linking facial recognition to social media. Viewdle sits between the camera and the user, analyzing faces in the camera stream, identifying them, and then offering links to Facebook, YouTube, LinkedIn, and other social media platforms. A user can identify and tag people in pictures & videos, then pass the information to their social networks. As they tag others, the software learns to recognize them, and can even share these new visual profiles with other users. The live view also offers an augmented reality tagging overlay that reveals information about the people around you.
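Viewdle's actual pipeline is proprietary, but a minimal sketch of the general idea looks something like the following: detect faces in the live camera stream, then hand each face crop to a matcher that maps it to a profile the user has already tagged. The lookup_profile() helper here is hypothetical; only the OpenCV face detection is real.

```python
import cv2

def lookup_profile(face_crop):
    """Hypothetical placeholder: match a face crop against the user's
    previously tagged profiles and return a name, or None if unknown."""
    return None

# Standard OpenCV Haar-cascade face detector
cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

cap = cv2.VideoCapture(0)          # sit "between the camera and the user"
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    for (x, y, w, h) in cascade.detectMultiScale(gray, 1.3, 5):
        name = lookup_profile(frame[y:y + h, x:x + w]) or "unknown"
        cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
        cv2.putText(frame, name, (x, y - 8),
                    cv2.FONT_HERSHEY_SIMPLEX, 0.6, (0, 255, 0), 2)
    cv2.imshow("tag overlay", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break
cap.release()
cv2.destroyAllWindows()
```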
The radio personality made waves yesterday by live Tweeting scene-by-scene commentary while watching his movie, Private Parts.
Shock jock & XM Radio personality Howard Stern made waves yesterday by live Tweeting scene-by-scene commentary while watching his movie, Private Parts. By providing behind-the-scenes anecdotes, personal asides, and hints at archival footage not included in the final cut, Stern gave his fans a new layer of value and added dimensionality to the movie property simply by hanging out at home on a Saturday, watching his own film and tweeting along. This seemingly simple & obvious act further erodes the barriers between celebrities & their fans, side-stepping the industry gatekeepers who might seek to directly monetize such commentary. By offering his thoughts for free, Stern has given new life to the film, grabbing headlines (and no doubt new viewers) almost 15 years after its original release.
Researchers at the University of Colorado have developed a process for effectively detecting & deleting "objectionable" (i.e. nude) content in videos.
Marking another step forward in the march of algorithmic & embedded governance, researchers at the University of Colorado have developed a process for effectively detecting & deleting “objectionable” (i.e. nude) content in videos. By running facial detection, determining whether or not the upper body is in view, and looking for signature motions of skin & hands, the suite of algorithms can determine a statistical likelihood – a “misbehaving probability” – of naughty behavior in the stream. After a quick analysis, the offending video stream can be automatically killed.
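The researchers' actual algorithms aren't spelled out in detail, so the following is only a minimal sketch of the approach as described: per-detector scores (face in view, upper body visible, skin & hand motion signatures) are assumed to come from elsewhere and are simply combined into a single "misbehaving probability" that, past a cutoff, kills the stream.

```python
def misbehaving_probability(face_score, upper_body_score, skin_motion_score,
                            weights=(0.2, 0.3, 0.5)):
    """Weighted combination of per-detector scores, each in [0, 1].
    The weights are illustrative, not the published model."""
    scores = (face_score, upper_body_score, skin_motion_score)
    return sum(w * s for w, s in zip(weights, scores))

def review_stream(frame_scores, threshold=0.8):
    """Kill the stream as soon as the combined probability crosses the cutoff."""
    for face, body, skin in frame_scores:
        if misbehaving_probability(face, body, skin) >= threshold:
            return "terminate"
    return "allow"

# Example: a short window of per-frame detector outputs
print(review_stream([(0.9, 0.2, 0.1), (0.9, 0.8, 0.9)]))  # -> "terminate"
```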
The company best known for arming both citizenry & militias with tear gas & pepper spray is now extending its line of personal security equipment.
The company best known for arming both citizenry & militias with tear gas & pepper spray is now extending its line of personal security equipment to include small GPS tracking devices. Mace is partnering with assistive device maker buddi to distribute the MaceBuddi tracking device in North America. The small, matchbook-sized device reports the wearer's position, has voice communication built in, and can send a report to a response center when the wearer has fallen. The device also features a panic-alert button and can be programmed to send an alert when the wearer has crossed a virtual perimeter.
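As a rough illustration of the virtual-perimeter feature, the sketch below checks a GPS fix against a circular safe zone; the coordinates, radius, and alert handling are invented, not details of the MaceBuddi device.

```python
from math import radians, sin, cos, asin, sqrt

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in metres."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = (sin((lat2 - lat1) / 2) ** 2
         + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2)
    return 2 * 6371000 * asin(sqrt(a))

def crossed_perimeter(position, centre, radius_m):
    """Return True if the wearer has left the virtual perimeter."""
    return haversine_m(*position, *centre) > radius_m

home = (40.7580, -73.9855)                      # perimeter centre (illustrative)
if crossed_perimeter((40.7680, -73.9855), home, radius_m=500):
    print("alert: wearer has left the safe zone")  # e.g. notify the response centre
```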
A specialized app works with a dash-mounted radar & laser detector to deliver radar alerts, enable tracking, and augment trap detection with mapping of speed & red-light cameras.
In another sign of the ongoing convergence of smartphones and vehicles, car radar detector manufacturer Cobra has extended its line of radar detectors to integrate with the iPhone. The Cobra iRadar pairs a dash-mounted radar & laser detector with a specialized iPhone app via Bluetooth to deliver radar alerts, enable tracking, and augment trap detection with mapping of speed & red-light cameras. The app uses the phone's GPS and the AURA database of verified traps & cameras to deliver a regularly updated list of hazards to the device. The user can also manually log traps as they encounter them, building a personal database of maps & alerts.
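The AURA database format and the app's internals aren't public in this piece, so the following sketch only illustrates the core lookup: check the phone's GPS fix against a locally cached list of verified traps & cameras, and let the user log new ones. All entries and distances are invented.

```python
from math import cos, radians

# (type, latitude, longitude) -- entries invented for the example
hazards = [
    ("red-light camera", 41.8790, -87.6250),
    ("speed camera",     41.9000, -87.6300),
]

def approx_km(lat1, lon1, lat2, lon2):
    """Flat-earth distance approximation, fine at city scale."""
    dy = (lat2 - lat1) * 111.32
    dx = (lon2 - lon1) * 111.32 * cos(radians(lat1))
    return (dx * dx + dy * dy) ** 0.5

def nearby(lat, lon, radius_km=1.0):
    """Hazards within range of the current GPS fix."""
    return [(kind, round(approx_km(lat, lon, hlat, hlon), 2))
            for kind, hlat, hlon in hazards
            if approx_km(lat, lon, hlat, hlon) <= radius_km]

def log_trap(kind, lat, lon):
    """Manually logged trap, added to the personal database."""
    hazards.append((kind, lat, lon))

print(nearby(41.8800, -87.6260))   # -> [('red-light camera', 0.14)]
```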
Nanotechnology researchers at the Nanotech Institute at the University of Texas have developed a method to make smart fibers much more resilient and durable.
One of the challenges of creating smart fabrics has been figuring out how to introduce nanomaterials & conductive pathways that stand up to daily wear-and-tear and the rigors of machine washing. Now, nanotechnology researchers at the Nanotech Institute at the University of Texas have developed a method to make their fibers much more resilient and durable.
An ambitious project aims to gather the world's knowledge & data streams to drive a single, real-time global model of our planet.
While the world reveals ever increasing volumes of data about itself, the behavior of complex living systems like the global economy remains mostly mysterious. Now, an ambitious project aims to gather the world's knowledge & data streams to drive a single, real-time global model of our planet. The Living Earth Simulator (LES) is an international project managed by FuturICT chair Dr. Dirk Helbing of the Swiss Federal Institute of Technology to advance our understanding of human & natural systems. Running on a super-computer cluster, the simulation would mine the web, gathering real-time data streams such as those pouring out of social networks, financial tickers, and embedded systems, as well as fixed-knowledge databases like Wikipedia. The model would be optimized to sort & categorize information, and to derive relationships between components across diverse data sets. Dr. Helbing hopes that such a massive simulation would help social & economic scientists identify the patterns & behaviors underlying living human systems, perhaps revealing the drivers behind such challenges as social & economic volatility, regional conflicts, and the spread of disease.
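Nothing on the scale of the Living Earth Simulator, of course, but a toy illustration of one stated goal, deriving relationships between components across diverse data sets, might look like this: two invented daily series (social network message volume and a market index) compared with a simple correlation coefficient.

```python
import numpy as np

# Invented daily series standing in for two very different data streams
social_volume = np.array([120, 135, 160, 150, 190, 240, 230], dtype=float)
market_index  = np.array([101,  99, 103, 102, 108, 115, 112], dtype=float)

# Pearson correlation as the simplest possible "relationship" measure
r = np.corrcoef(social_volume, market_index)[0, 1]
print(f"correlation between the two streams: {r:.2f}")
```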
With cheaper & more sophisticated tools readily available, parents & hobbyists are setting up simple home laboratories.
With cheaper & more sophisticated tools readily available, parents & hobbyists are setting up simple home laboratories. While some wish to educate their children and others to grow their personal interests, the convergence of traditional scientific instruments and digital technology is enabling users to investigate their world, and even troubleshoot personal health, without leaving their home. Edmund Scientific & ThinkGeek.com offer microscopes and telescopes that send images to the desktop through a USB connection. Parents can show their kids live views of the petri dish, then capture video on-screen for school work & social sharing. Hobbyist astronomers can record multiple frames and overlays of distant planets and star clusters, even automating capture to build complex layered images of stellar objects. Homeowners are using infrared thermometers to identify heat loss and gaps in insulation. And some are analyzing their own bodies, experimenting with DNA using the OpenPCR platform for simple PCR (polymerase chain reaction) analysis.
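As a hedged sketch of the "layered images" idea, hobbyist capture software commonly stacks many short exposures so that noise averages out and faint detail survives; the device index and frame count below are illustrative, not tied to any particular product.

```python
import cv2
import numpy as np

cap = cv2.VideoCapture(0)          # USB telescope or microscope camera
frames = []
for _ in range(50):                # capture a short burst of frames
    ok, frame = cap.read()
    if not ok:
        break
    frames.append(frame.astype(np.float32))
cap.release()

if frames:
    # Averaging the burst suppresses sensor noise in the stacked result
    stacked = np.mean(frames, axis=0).astype(np.uint8)
    cv2.imwrite("stacked.png", stacked)
```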
A recent experiment asks: who's better, "the crowd" or the computer?
Researchers at the review site Yelp wanted to see whether Amazon's Mechanical Turk was indeed better at cataloging than a machine algorithm. They tasked 4,660 Turkers with identifying the proper category for a business in a multiple-choice test; only 79 of the sample were successful. Those 79 Turkers were then split into groups of three, and each group was given a second round of businesses to categorize, scoring around 62% accuracy. Yelp then sent the same set of tasks to a supervised learning algorithm, a Naive Bayes classifier to be precise, which succeeded about 80% of the time, handily beating the Turkers. The results suggest one of two things (or possibly both): that the Mechanical Turk process needs to be refined to encourage better work, or that crowdsourced cataloging is steadily being threatened by the growing sophistication of machine algorithms. In either case, the future of work is likely to be influenced by more such challenges pitting humans against their machine counterparts.
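For readers curious what the machine side of the contest looks like, here is a minimal Naive Bayes sketch that assigns a business to a category from a text description. The training examples and categories are invented; Yelp's actual features and corpus aren't described in the article.

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

# Toy training set: business descriptions and their categories
train_text = [
    "late night taco truck burritos",
    "wood-fired pizza and pasta",
    "barber shop haircuts and shaves",
    "oil change brakes and tires",
]
train_labels = ["Restaurants", "Restaurants", "Barbers", "Auto Repair"]

# Bag-of-words features feeding a multinomial Naive Bayes classifier
model = make_pipeline(CountVectorizer(), MultinomialNB())
model.fit(train_text, train_labels)

print(model.predict(["neighborhood barber shop fades and trims"]))
# -> ['Barbers'] on this toy training set
```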
A kaleidoscope of patterns, trails, starbursts, and text elements flows with the movement of the skaters.
Australian interaction designers ENESS worked with Disney to create an interactive half-pipe that expresses the stylized visual aesthetic of TRON: Legacy. The team uses overhead projection to map visuals onto the ramp and to track the skaters' movements. Riders wear iPods running a custom ENESS app that measures air time and coordinates additional motion graphics with aerials & landings. The result is a kaleidoscope of patterns, trails, starbursts, and text elements flowing with the movement of the skaters. Tron Legacy Premiere – A Light Session from ENESS on Vimeo.
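The ENESS app itself isn't documented here, but one common way to measure air time from a device's accelerometer is to watch for near-free-fall: while the rider is airborne, total acceleration drops toward zero. The sample data and thresholds below are invented for illustration.

```python
SAMPLE_RATE_HZ = 100
FREEFALL_G = 0.3          # below this magnitude (in g) we call the rider airborne

def air_times(accel_magnitudes_g, rate_hz=SAMPLE_RATE_HZ, threshold=FREEFALL_G):
    """Return the duration in seconds of each airborne stretch."""
    times, run = [], 0
    for a in accel_magnitudes_g:
        if a < threshold:
            run += 1            # still in free fall
        elif run:
            times.append(run / rate_hz)
            run = 0             # landed
    if run:
        times.append(run / rate_hz)
    return times

# ~1 g on the ramp, ~0 g during the aerial, a spike on landing
samples = [1.0] * 50 + [0.1] * 80 + [2.5] * 5 + [1.0] * 50
print(air_times(samples))   # -> [0.8]
```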
The Institute for the Future has released a map of "The Future of Cities, Information, and Inclusion."
The Institute for the Future has released a map from its most recent research on urban development, "The Future of Cities, Information, and Inclusion." The research looks at the converging impacts of urban demographics, social collaboration, economic disparity, and the rapid digitization of the cityscape. IFTF envisions the city as a massive laboratory for innovation in addressing the challenges of our times. Within this landscape, communities adopt resilient strategies, encourage makers & tinkerers, and re-purpose digital tools to create local solutions to problems that are often the result of macrotrends. Managers of the emerging city-states look to instrumentation and monitoring as tools for gathering run-time data about civic processes, crunching massive data streams to optimize civic efficiencies and model future possibilities.
A touch-sensitive music controller explores the interface between traditional Japanese lacquer art and modern interactive surfaces.
Japanese designer Yuri Suzuki and British composer Matthew Rogers have developed a touch-sensitive music controller that explores the interface between traditional Japanese lacquer art and modern interactive surfaces. The Urushi interface embeds capacitive gold inlays into lacquered wood hand-crafted in the Wajima Urushi technique. When touched, the inlays communicate with a MIDI controller, enabling the interface to drive any MIDI-capable instrument. The result is an aesthetic artifact that captures the traditional lacquer techniques of Wajima while inviting people to interact with it as a musical instrument. The technology itself is not especially advanced, but the presentation offers a model for bringing computation & interaction into otherwise purely visual art forms and household artifacts. Urushi Musical Interface from Yuri Suzuki on Vimeo.
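The Urushi hardware details aren't published here, so this is only a sketch of the final step described above: turning a capacitive touch event into a MIDI note that any MIDI-capable instrument can play. It uses the mido library; read_touched_inlay() is a hypothetical stand-in for the capacitive sensor.

```python
import time
import mido

INLAY_TO_NOTE = {0: 60, 1: 62, 2: 64, 3: 67}   # gold inlays -> MIDI note numbers

def read_touched_inlay():
    """Hypothetical: poll the capacitive sensor and return the index of
    the inlay currently being touched, or None."""
    return None

out = mido.open_output()               # default MIDI output port
try:
    while True:
        inlay = read_touched_inlay()
        if inlay is not None:
            note = INLAY_TO_NOTE[inlay]
            out.send(mido.Message('note_on', note=note, velocity=90))
            time.sleep(0.2)            # short, fixed note length for the sketch
            out.send(mido.Message('note_off', note=note))
        time.sleep(0.01)
finally:
    out.close()
```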