Galt Museum welcomes AI to the archives team

By Alejandra Pulido-Guzman - Lethbridge Herald on February 10, 2024.

LETHBRIDGE HERALD - apulido@lethbridgeherald.com

Artificial Intelligence has joined the archives team at the Galt Museum and Archives through an innovative AI model developed by a Swiss company specifically for archives.
Head Archivist Andrew Chernevych said this week that the AI model, Archipanion, offers users a more effective way to search for images in the museum’s database.
“This project has been coming for probably nine months since we started, and finally it’s ready to be used by anybody,” said Chernevych.
He explained that the Galt has a large number of images in its archives, many of them already on the website, but plenty more could be showcased and used by the community. To get there, they need to be organized in a way that makes them accessible, searchable and properly documented.
Chernevych said the process is unfortunately time-consuming, because staff have to scan the images and enter metadata for each picture individually, describing who is in the picture, when it was taken, what it shows and the surrounding context. If a picture comes from the Lethbridge Herald, they also track down the article associated with it.
“We don’t have enough resources to do all that work. Various volunteers have been doing it at various times, but it is very slow. We managed to scan thousands of pictures and we have them, but we cannot really showcase them because the descriptions are not available,” said Chernevych.
He said this is where AI comes in. Staff can now upload those images into Archipanion, where they are analyzed and interpreted without any written description, and the system turns them into a searchable online collection, skipping the time-consuming archival step that normally keeps records inaccessible for years.
“Users can simply input a question or phrase, such as ‘people eating,’ and the system instantly retrieves a wide range of social situations related to meals, encompassing restaurants, picnics, and country fairs,” said Chernevych.
He said users can find images even without a specific description, as Archipanion comprehends the underlying concept and delivers relevant images within seconds.
“You can search for dangerous activities, and it will show you everything from the electricians climbing the poles to kids on the swing, whatever AI considers to be a dangerous activity in some way,” said Chernevych.
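The kind of search Chernevych describes is typically built on a vision-language embedding model that places images and free-text queries in the same mathematical space, so a phrase like “people eating” can be matched against photos that were never captioned. The sketch below illustrates that general technique using the publicly available CLIP model through the Hugging Face Transformers library; it is not Archipanion’s own code, and the model name, the “scans” folder and the example query are assumptions made purely for illustration.

    # Sketch of concept-based image search with a vision-language model (CLIP).
    # Illustrative only: not Archipanion's implementation; the model choice,
    # the "scans" folder and the query text are assumptions for this example.
    from pathlib import Path

    import torch
    from PIL import Image
    from transformers import CLIPModel, CLIPProcessor

    model = CLIPModel.from_pretrained("openai/clip-vit-base-patch32")
    processor = CLIPProcessor.from_pretrained("openai/clip-vit-base-patch32")

    # Embed every scanned photo once; no hand-written description is required.
    image_paths = sorted(Path("scans").glob("*.jpg"))
    images = [Image.open(p) for p in image_paths]
    image_inputs = processor(images=images, return_tensors="pt")
    with torch.no_grad():
        image_embeds = model.get_image_features(**image_inputs)
    image_embeds = image_embeds / image_embeds.norm(dim=-1, keepdim=True)

    # Embed a free-text query and rank the photos by cosine similarity,
    # so conceptually related images surface even without matching keywords.
    query = "people eating"
    text_inputs = processor(text=[query], return_tensors="pt", padding=True)
    with torch.no_grad():
        text_embeds = model.get_text_features(**text_inputs)
    text_embeds = text_embeds / text_embeds.norm(dim=-1, keepdim=True)

    scores = (image_embeds @ text_embeds.T).squeeze(1)
    ranked = sorted(zip(scores.tolist(), image_paths), reverse=True)
    for score, path in ranked[:5]:
        print(f"{score:.3f}  {path.name}")

In a system like this, the image embeddings would normally be computed once and stored in an index, so each new question from a user needs only a single text embedding and a quick similarity lookup.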
He said one thing the model struggles with is recognizing images related to Indigenous people, because it is not familiar with Blackfoot culture.
“We’re actively fixing this problem right now. We launched a separate sub-project specifically to deal with this problem. There is a team in partnership and collaboration with the Archipanion and the university in Switzerland and they are using their scholars’ team to do a model training,” said Chernevych.
He explained this is being done in partnership with Swiss teams because that is where the AI model was developed.
“This advancement ensures that the rich visual heritage preserved within the Galt archive can be easily explored and enjoyed by individuals from all walks of life. In the future, this capacity will be built into new-generation content management systems, like the InMagic Discovery Interface that we are currently using now,” said Chernevych.
He said they anticipate that 16,000 Herald images will be made available on Archipanion in a short timeframe, with ongoing expansion in the years ahead.
“I would like to emphasize the crucial role that the Friends of the Galt played in this project actually taking place. They’ve been on board with the project from the very beginning, they provided funding for actually making it happen and to make sure we have the capacity to continue with that, to have the annual subscription and so that it becomes available for everybody in the community,” said Chernevych.
To access the AI model and search for images, visit http://www.galt.archipanion.com
