Doesn't text indexing of video assets sound a little like magic? Last year, when YouTube (Google) added automatic captioning for videos, employing some of the speech-to-text algorithms behind Google's Voice Search, critics wondered how well it would work.
Another DAM product touting a "meaning-based rich media management platform" is Autonomy's MediaBin. At the heart of this innovation is "the Intelligent Data Operating Layer (IDOL) which allows businesses to automate the processing of all rich media assets. IDOL forms a conceptual understanding that allows marketers to automatically tag and classify any rich media asset, regardless of format or language."
Sounds incredible, right? I would imagine both of these processes work about as well as any automated metadata generation, which means the results are probably impressive in parts and accurate only some of the time. While I agree that the exponential growth of disparate media ingested into corporate DAM systems requires automation with minimal manual intervention, I just do not think an entirely automated solution can really interpret nuances like an actual person with a live brain.
Hence, the need for live people to manage DAM within organizations. Digital asset managers can maintain consistency, spearhead innovation, streamline processes, and ensure asset findability.