DAM: A modular solution open to other applications

DAM solutions are increasingly connected to third-party applications in both directions. “A DAM no longer operates on its own. It may be connected to a CMS (Content Management System), a PIM (Product Information Manager), or any other software that uses files and media,” says Jonathan Kus, sales manager at Orphea. On connectivity with a PIM, Céline Renaudie, Digital & E-Commerce Project Manager, adds: “A PIM is useful for centralizing all product information. Once we have a DAM, there’s no need to put the same photos and media on another platform. Linking the DAM to the PIM is the best solution for streamlining the PIM’s population and operation process.”

Malika Kechich, sales director at Orphea, emphasizes DAM’s ability to interact with new platforms such as video creation tools (e.g. Pitchy, EasyMovie). “Upstream, it makes it possible to interface with content production and population tools. The platform pulls ingredients, files, logos, and images from the brand center in order to produce and generate videos and animations with ease. Once the content has been generated, it is integrated into the company’s Media Center to populate the publications of communicators, marketers, community managers, and so on,” she says. Upstream, there are also complementary tools such as LAMARK technology, which makes it possible to indelibly mark files and images in order to protect them, track them, and prove their origin.

The major topics raised during the discussions also included workflow tools and Digital Asset Compliance, meaning digital content certification systems. Veriflies, a major brand in this market, ensures the quality and compliance of media and digital content production, both technically and in terms of copyright. Content distributed from the Orphea DAM that passes through Veriflies carries this guarantee. “This aspect of file and media compliance is increasingly important given the speed at which digital files spread nowadays,” says Malika Kechich.

Artificial intelligence and Deep Learning will transform DAM

“The evolution of artificial intelligence into Deep Learning is providing, and will keep providing, a great deal of performance and innovation to DAM tools,” according to Malika Kechich. Although artificial intelligence (AI) first appeared in the 1950s, the conversation has now shifted to Deep Learning. The computing power of machines and their ability to mimic human intelligence and behavior have enabled the development of Machine Learning (ML). These learning machines explore large volumes of data, analyze them, and learn from them on their own in order to assist humans in carrying out repetitive tasks and in solving problems that were once considered out of reach.

Pattern recognition in the broad sense, facial recognition, natural language recognition, and optical character recognition (OCR) are all technologies that fall under Deep Learning and will add power to DAM.
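Of these, OCR is perhaps the easiest to picture in a DAM context: text extracted from a scanned document or a visual can feed the asset’s searchable metadata. Here is a minimal sketch using the open-source Tesseract engine via pytesseract; the file name and the idea of storing the raw text as a metadata field are illustrative assumptions, not a description of any particular DAM product.

```python
# Minimal sketch: extract text from a scanned asset with Tesseract OCR and
# keep it as searchable metadata. File name and metadata layout are assumptions.
from PIL import Image   # pip install pillow
import pytesseract      # pip install pytesseract (requires the tesseract binary)

image = Image.open("scanned_press_release.png")   # hypothetical asset
extracted_text = pytesseract.image_to_string(image, lang="eng")

# A DAM could store this alongside the file so full-text search can find it.
asset_metadata = {
    "filename": "scanned_press_release.png",
    "ocr_text": extracted_text.strip(),
}
print(asset_metadata["ocr_text"][:200])
```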

“Based on neural network algorithms, Deep Learning makes it possible to combine billions of pieces of data, media, images, and behaviors in seconds, mimicking the functioning of our neurons to deliver analysis of photos and videos, sound and voice recognition, automatic transcription in all languages, and more. This is where the future of DAM lies,” says Malika Kechich. She is careful to add, however, that “DAM combined with these technologies cannot work without human emotional intelligence.” Here is a closer look at three AI tools for the future of DAM:

1. Speech-to-text

Several AI-related advances were discussed, including speech-to-text, a solution for transcribing audio or video files. Given the growth in video production, this service is popular among companies and institutions, and it is starting to spread into the professional environment of DAM. “Speech-to-text makes it possible not just to transcribe voice into text, but also to extract keywords from it and to learn over time, taking accents into account, for example,” explains Malika Kechich. Speech-to-text uses artificial intelligence and deep learning to combine knowledge of grammatical and linguistic structure with knowledge of how the audio signal is composed, generating accurate transcripts and machine translations.
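As a concrete illustration, here is a minimal sketch of what such a transcription step could look like upstream of a DAM, using the open-source Whisper model as a stand-in for whichever speech-to-text engine a given platform actually embeds; the file name and the naive keyword heuristic are illustrative assumptions.

```python
# Minimal sketch: transcribe a media file and derive candidate keywords,
# using the open-source Whisper model as a stand-in speech-to-text engine.
from collections import Counter

import whisper  # pip install openai-whisper (also requires ffmpeg)

model = whisper.load_model("base")          # small general-purpose model
result = model.transcribe("interview.mp4")  # hypothetical asset from the DAM

transcript = result["text"]
print(transcript)

# Naive keyword extraction: most frequent words longer than 4 characters.
# A real DAM would plug in a proper keyword or entity extractor here.
words = [w.strip(".,!?").lower() for w in transcript.split()]
keywords = [w for w, _ in Counter(w for w in words if len(w) > 4).most_common(10)]
print(keywords)
```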

2. Similarity search and facial recognition

The DAM of the future also involves the appearance of new search methods, such as similarity search. “It makes it possible to run a search in order to find an image similar to the original in terms of colors, textures, or shapes. Carefully weighting these three criteria refines the search results,” says Jonathan Kus. Ultimately, this feature will make it possible to automatically associate one or more images or photos in the DAM with a subject or a selection on the DAM platform. As for visual recognition, it makes it possible to scan an image and extract a series of metadata for automatic indexing (keywords, categories, etc.), which is then supplemented and validated by document specialists and administrators on the Digital Asset Management platform.

In a similar vein, there is facial recognition. “This is a powerful type of search that can be seen on Facebook. It consists of measuring the positions of a series of characteristic facial points, such as the spacing of the eyes, the ears, and the bridge of the nose, in order to establish the face’s geometry and therefore identify it,” according to Jonathan Kus. These methods may be applied to DAM in order to automatically identify public figures or employees who are already indexed in the database.
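To make the color criterion tangible, here is a minimal sketch of a similarity search based purely on color histograms, using OpenCV; the texture and shape criteria, and the weighting between the three, are left out, and the file paths are hypothetical.

```python
# Minimal sketch of a color-based similarity search over a folder of images,
# using histogram comparison in OpenCV. Paths and library choice are assumptions.
import glob

import cv2  # pip install opencv-python

def color_histogram(path):
    """Return a normalized HSV color histogram for one image."""
    image = cv2.imread(path)
    hsv = cv2.cvtColor(image, cv2.COLOR_BGR2HSV)
    hist = cv2.calcHist([hsv], [0, 1], None, [50, 60], [0, 180, 0, 256])
    return cv2.normalize(hist, hist).flatten()

query = color_histogram("query.jpg")          # hypothetical reference asset
candidates = glob.glob("dam_assets/*.jpg")    # hypothetical DAM export

# Higher correlation means a more similar color distribution.
scores = sorted(
    ((cv2.compareHist(query, color_histogram(p), cv2.HISTCMP_CORREL), p)
     for p in candidates),
    reverse=True,
)
for score, path in scores[:5]:
    print(f"{score:.3f}  {path}")
```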

One example is Imagga, a machine-learning-based auto-tagging solution that makes it possible to assign keywords to images automatically, in multiple languages.
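For illustration, here is a minimal sketch of calling such an auto-tagging service over HTTP. The endpoint and response shape follow Imagga’s public v2 tagging API as commonly documented, but the credentials and image URL are placeholders, and the exact parameters should be verified against the current documentation before use.

```python
# Minimal sketch of auto-tagging an image via Imagga's REST API.
# Credentials and the image URL are placeholders; verify endpoint and
# parameter names against the current Imagga documentation.
import requests  # pip install requests

API_KEY = "your_api_key"        # placeholder credentials
API_SECRET = "your_api_secret"

response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.com/asset.jpg"},  # hypothetical asset
    auth=(API_KEY, API_SECRET),
    timeout=30,
)
response.raise_for_status()

# Print the ten most confident tags returned by the service.
for tag in response.json()["result"]["tags"][:10]:
    print(f'{tag["confidence"]:.1f}  {tag["tag"]["en"]}')
```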

3. Semantic search

This technology is aimed at improving search accuracy. By understanding the searcher’s intent and the contextual meaning of the terms as they appear in the searched data space, whether on the web or in a closed system, it generates more useful results. In the context of DAM, it involves increasing the power of search engines so that they take into account not just words but also their contextual meanings.

In order to achieve this, the engine must understand natural language the same way a human being does.

Jonathan Kus explains this nuance: “Natural language searching makes it possible to run a search using a more natural expression as a criterion in the search engine, such as ‘video files downloaded since the start of the month’.” These natural language engines must, in theory, be able to understand any question asked in everyday words.
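As an illustration of the underlying idea, here is a minimal sketch of semantic search over asset descriptions using sentence embeddings; the model, the sample catalogue, and the query are assumptions chosen for the example, not a description of any vendor’s engine.

```python
# Minimal sketch of semantic search over asset descriptions using sentence
# embeddings. Model name, catalogue, and query are illustrative assumptions.
from sentence_transformers import SentenceTransformer, util  # pip install sentence-transformers

model = SentenceTransformer("all-MiniLM-L6-v2")

# Hypothetical asset descriptions as they might be stored in a DAM.
descriptions = [
    "Video interview with the CEO, recorded at the Paris office",
    "Product packshot, white background, high resolution",
    "Drone footage of the new logistics warehouse",
]
query = "films shot with the company's management"

corpus_emb = model.encode(descriptions, convert_to_tensor=True)
query_emb = model.encode(query, convert_to_tensor=True)

# Rank assets by cosine similarity between the query and each description.
hits = util.semantic_search(query_emb, corpus_emb, top_k=3)[0]
for hit in hits:
    print(f'{hit["score"]:.2f}  {descriptions[hit["corpus_id"]]}')
```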

DAM must be at the heart of the company’s digital ecosystem

At the conclusion of the roundtable, Isabelle Roy gave her vision of DAM and the keys to getting started with it: “For us, DAM tools must be at the heart of the company’s digital ecosystem. DAM must be the storage, publication, and distribution base for all media, because even today we can see media being replicated on the web, on the intranet, on company servers, and in different media centers. When it comes to categorizing or archiving this content, we have to do it in each tool where it exists.” She adds that “DAM does not necessarily need to have every feature, but it must be well positioned with respect to the company’s entire digital ecosystem.”

To achieve this, Isabelle Roy recommends adopting a co-construction logic with all of the company’s internal and external stakeholders. “This is a participatory, collaborative approach using agile project management methods. Groundwork on the user journey must be laid from the beginning, rather than starting with a large functional scope, as we so often see.”

Finally, she recommends “establishing a partnership with your DAM publisher, which can provide you with new ideas for enhancing your tool.”

Claire Lissalde added to Ms. Roy’s statements, specifying that “The user must feel at home when they go to a DAM platform. It’s not access to a database as we normally understand it, but a customized space for contribution, collaboration, and search.”

“DAM has a bright future in store,” says Malika Kechich. New practices will develop around the production and circulation of media. “New formats will appear, and Digital Asset Management solutions will need to know how to manage, protect, and distribute them,” she adds. According to her, new features will emerge, many of them derived from the promising contributions of deep learning and AI.

DAM is a major market that has been growing for years, with expected growth of over 30% per year worldwide (MarketsandMarkets). The production of media and data can no longer be managed in simple, static databases. The market is therefore becoming more professional, developing, and being enhanced by artificial intelligence and deep learning, key capabilities for making DAM platforms more open, ergonomic, and powerful.

Orphea, which constantly monitors these technologies, invests nearly 20% of its revenue in research and development in order to upgrade its solutions and provide its clients, users, and partners with lasting, solid innovations.
